II.2. Evidence from researcher interactions with human participants


Anastasia Shesterinina
Yale University
Posts: 17
Joined: Thu Apr 07, 2016 11:51 am

How and when can and should we be transparent about the process of collecting evidence with human participants?

Posted: Mon Sep 05, 2016 8:38 pm

When is it appropriate, ethical, and feasible to make explicit the strategies used to generate evidence with human participants? Are there issue-areas or research contexts that pose particular challenges in making the data generation process open to the broader public? What aspects of the data generation process have been more amenable to sharing in your experience? What ethical considerations have limited your ability to be fully transparent about your research methods? What practical costs (resources, time, etc.) have you incurred in making your data generation process accessible?

Guest

Re: How and when can and should we be transparent about the process of collecting evidence with human participants?

Posted: Tue Sep 20, 2016 6:42 pm

[quote="AnastasiaSh"]When is it appropriate, ethical, and feasible to make explicit the strategies used to generate evidence with human participants? Are there issue-areas or research contexts that pose particular challenges in making the data generation process open to the broader public? What aspects of the data generation process have been more amenable to sharing in your experience? What ethical considerations have limited your ability to be fully transparent about your research methods? What practical costs (resources, time, etc.) have you incurred in making your data generation process accessible?[/quote]

My research focuses on forced labour in global supply chains, a topic and context that create serious challenges for data generation and sharing. Because forced labour is illegal in most countries, and because companies and governments are hesitant to grant researchers access to their workforces, this issue area is extremely difficult to research. My research combines a range of qualitative methods, including ethnography, participant observation, elite interviews, and interviews with vulnerable populations, to understand how forced labour operates in modern industry.

Each method carries specific challenges when it comes to ethically disclosing the strategies used to generate evidence. In research focused on workers and others vulnerable to forced labour, there is a danger of jeopardising the safety and job security of the gatekeepers (such as workers, union representatives, or managers) who granted access to the research participants. In elite interviews, disclosing details about the data generation process risks damaging the reputation of the organisations that facilitated access to participants and can compromise anonymisation techniques. Across all of my research, disclosing the strategies used to generate evidence risks undermining future research projects, since powerful organisations and individuals could seek to block future access, and since research participants may feel their anonymity has not been appropriately safeguarded.

Anastasia Shesterinina
Yale University
Posts: 17
Joined: Thu Apr 07, 2016 11:51 am

Re: How and when can and should we be transparent about the process of collecting evidence with human participants?

Posted: Thu Sep 22, 2016 1:47 am

Thank you. These are important points from research on forced labor that have broader implications for qualitative research in sensitive settings with vulnerable populations. While we might not always think of gatekeepers and elites as "vulnerable," the issues you highlight regarding these research participants' anonymity point to the care required in making the data generation process explicit. Can we make that process transparent while still protecting the identities of the gatekeepers and elites who participated in our research?

Matt Wood
University of Sheffield
Posts: 1
Joined: Fri Sep 09, 2016 6:36 am

Re: How and when can and should we be transparent about the process of collecting evidence with human participants?

Posted: Wed Sep 28, 2016 6:35 pm

In my own research I primarily focus on 'process tracing' in the sense of reconstructing the causal process that leads to particular political and governance outcomes, at transnational and national levels. I am particularly focused on 'crises' in public policy and how they are resolved. In order to triangulate my findings effectively, I need to interview very specific people - usually elite politicians and civil servants, but also CEOs of private companies and newspaper reporters - who have knowledge about a particular event at a specific moment, or series of moments, in order to reconstruct the story of what happened and gain detailed causal knowledge. Often, because of the contentious 'crisis' moment in question, elites, especially those still in office, are very difficult to secure for interview without clear guarantees of confidentiality. Releasing the full transcript of an interview in this case would clearly reveal what the person knew about the specific events, whom they knew, and in relation to what processes. It would be extremely easy to link the 'raw' data back to them personally if it were made available even in partial fragments. Nonetheless, this elite evidence is absolutely crucial in reconstructing some of the most important political moments of our time, and how decisions came to be made during them.

The aspects of my data most amenable to sharing are those where interviewees have made broader assertions or statements about general aspects of their work or common practices in their department or organisation. These are not, however, the focus of the DA-RT initiative, which is concerned with transparency of the data that enables researchers to make causal claims. That would entail making very specific knowledge public, knowledge easily traceable to its source: the individual or group of individuals present at the time of the decision. If I had to make full interviews publicly available, this would place a substantial barrier in the way of process tracing research into crises in public policy, research that gets at these specific organisations and individuals and their decision-making processes. It would substantially weaken the data I could include in an article, and therefore the power of the claims I could make, which would have to be far more abstracted. Contrary to the claims of the DA-RT initiative, these rules would in fact make process tracing research less, not more, rigorous.

In process tracing research, the most important aspect is the triangulation of findings, so that multiple sources are used to secure a credible account of the series of causal factors leading to a particular outcome. The 'transparency' issue here is therefore a question of the 'weight of evidence' for one interpretation over another, which simply requires that a reviewer judge how much evidence (often quantitative as well as qualitative) has been used to make the claims. I would therefore suggest focusing any 'transparency' initiative on asking authors to report their schedule of data collection rigorously and, if any interview data is made available, sending it in anonymised form to peer reviewers so they can make that judgement in confidence. Making the data publicly available would mean that vast swathes of research into crisis management in public policy and political science would be near impossible to conduct.

Anastasia Shesterinina
Yale University
Posts: 17
Joined: Thu Apr 07, 2016 11:51 am

Re: How and when can and should we be transparent about the process of collecting evidence with human participants?

Posted: Sat Oct 01, 2016 11:41 pm

Thank you, Dr. Wood. Your insightful response from elite-based research on 'crises' in public policy raises a number of important issues. One point I would like to highlight is the broader knowledge that the researcher develops in the course of fieldwork. Even when providing short quotes or longer excerpts from our interviews is not advisable for human subject protection reasons, this grounded knowledge greatly impacts our understanding of the issue and the processes underlying it. While we may not be able to disclose the sources of this knowledge or the ways in which we arrived at it in full, it is an essential aspect of the generation of our findings.

Rachel Ellett and Mark Fathi Massoud

Re: How and when can and should we be transparent about the process of collecting evidence with human participants?

Posted: Mon Nov 07, 2016 4:56 pm

[From Rachel Ellett (Beloit College) and Mark Fathi Massoud (UC Santa Cruz):]

Confidentiality and Interview Data Dissemination

Evaluating data from qualitative interviews occurs on three levels – sampling, validity and reliability (Bleich and Pekkanen 2013). While validity is established through evidence triangulation, the reliability of an interviewee may be difficult to ascertain without compromising anonymity. With regard to sampling, there is scope for increasing transparency without compromising confidentiality, including through the creation of an interview appendix (Id.). But even the most faithful transcriptions cannot capture the depth of silences, confusion, laughter, or hostility during an interview. Here, carefully prepared and redacted field notes placed in a methodological appendix may capture the ways that context matters. (Due to length constraints, such appendices would differ for article- and book-length projects.) Collecting interview metadata may prove just as important as collecting interviewees’ reflections.

In addition, the challenge of confidentiality operates acutely in societies with a small professional class concentrated in one or two metropolitan areas. In these settings, each respondent who chooses not to remain anonymous shrinks the pool of people to whom anonymous individuals could belong. A researcher’s commitment to confidentiality, even when a respondent prefers to speak publicly, enables scholars to protect those who want – or need – to remain anonymous.

Even settings with relative political stability may later collapse into political disorder and conflict, and those in power may suddenly find themselves outside the state’s protection. Though generating accurate transcriptions is costly and time-consuming, transcripts offer an additional layer of protection compared with audio recordings.

For interview-based research, ethical concerns around interviewee anonymity need to be clarified, particularly in fragile political settings. The assumption that interview data must be shared may not be viable in volatile settings with small professional classes, particularly where seemingly innocuous data may become a political weapon down the road (Lynch 2016). Furthermore, good data transparency does not necessarily produce good data analysis, which involves the careful documentation of interview context – metadata – and the construction of interview appendices. In short, thinking creatively about how to conduct and disseminate interview-based research is critical to strengthening the inferential value of qualitative data.

Sources:
Bleich, Erik and Robert Pekkanen. 2013. “How to Report Interview Data.” In Interview Research in Political Science, ed. L. Mosley, pp. 84-108. Ithaca, NY: Cornell University Press.
Lynch, Marc. 2016. “Area Studies and the Cost of Prematurely Implementing DA-RT.” Comparative Politics Newsletter, Spring 2016, p. 36.

Note: This post is adapted from the article, “Not all Law is Public: Reflections on Data Transparency for Law and Courts Research in Africa,” by Rachel Ellett and Mark Fathi Massoud, African Politics Conference Group Newsletter, August 2016. Available at https://dialogueondartdotorg.files.word ... er12_2.pdf

Anastasia Shesterinina
Yale University
Posts: 17
Joined: Thu Apr 07, 2016 11:51 am

Re: How and when can and should we be transparent about the process of collecting evidence with human participants?

Posted: Wed Nov 16, 2016 11:30 pm

Thank you for your response, especially for drawing attention to interview context, or metadata. The article appended to your response provides further insight and is an important contribution to the conversation.

Alison Post
U.C. Berkeley
Posts: 3
Joined: Fri Nov 18, 2016 12:39 pm

Re: How and when can and should we be transparent about the process of collecting evidence with human participants?

Posted: Fri Nov 18, 2016 3:10 pm

I concur with the constructive spirit of many of the contributions to this forum, especially those which have focused on identifying ways in which researchers can be more explicit about their research design and methods. Providing explicit and thorough discussions of research design and procedures (e.g. how interview subjects are recruited, the percentage of a desired sample one was in fact able to interview, etc.) is distinct from sharing “data” from interactions with human subjects. Other contributors have made very sensible suggestions regarding how standard practices for such documentation could be improved. If this is to be expected of most qualitative articles, journals that do not currently accommodate online appendices will need to become more flexible on this matter to ensure there is sufficient space for such materials, or else increase article word limits.

Anastasia Shesterinina
Yale University
Posts: 17
Joined: Thu Apr 07, 2016 11:51 am

Re: How and when can and should we be transparent about the process of collecting evidence with human participants?

Posted: Fri Nov 18, 2016 11:00 pm

Thank you for your response, Prof. Post. If the transparency requirements for qualitative research are indeed different, then the issue of sufficient space, and of the additional time and effort invested in methodological discussions and appendices, should be an important part of the conversation.
