In inviting input into the Qualitative Transparency Deliberations, members of the Steering Committee asked a number of questions about “not just data access but also, for instance, transparency about how we’ve gathered the empirical information on which we rely, about how we have analyzed or interpreted that information” (viewtopic.php?f=10&t=56
). I would like to respond briefly to some of these questions, drawing on my recent experience of publishing in the American Political Science Review.*
I am a qualitative researcher of the internal dynamics of civil war and focus on questions of social mobilization and participation in armed conflict. My current research is based on immersive fieldwork carried out in 2011-2013 in the highly politicized and isolated environment of Abkhazia—a partially recognized, breakaway territory of Georgia—where I conducted 150 in-depth, semi-structured interviews with a range of participants and non-participants in the Georgian-Abkhaz war of 1992-1993, engaged in daily participant observation, and collected extensive additional primary and secondary materials, including 30 interviews in Georgia and Russia.
My work speaks to the issues of sensitive human subject research in violent conflict settings brought up in other contributions, including trust in the researcher necessary for field access (viewtopic.php?f=10&t=41#p64
), grounded knowledge of the context and reflexivity involved in the interpretation and analysis (viewtopic.php?f=10&t=50; viewtopic.php?f=10&t=47
), and unintended consequences of making field materials publicly available (viewtopic.php?f=10&t=79;
see also Parkinson and Wood, 2015).
Here I organize my comments around three aspects of the research process, namely, data access, production transparency, and analytic transparency, addressed in the 2012 Revisions to APSA’s Guide to Professional Ethics in Political Science, the 2013 Guidelines for Data Access and Research Transparency, and the 2014 Journal Editors Transparency Statement. While I addressed the issue of data access mainly in the manuscript itself, production and analytic transparency were facilitated by the inclusion of online methodological appendices. This approach helped me protect my respondents in an ongoing way (Fujii, 2012). I take each of these issues in turn.
1. Data access
I addressed the issue of data access in two stages. First, at the time of manuscript submission, I informed the editors that, in compliance with my ethics protocols, interview transcripts and participant observation notes could not be made publicly available, in order to ensure the security and confidentiality of my respondents. Even if anonymized, disclosure of field materials through a digital repository or online appendix risked compromising the identity of my respondents, since personal identifiers could be pieced together given the high network density and relatively small size of Abkhazia. This would be especially detrimental to those respondents who participated in the war in various capacities, but also to respondents more generally who came in contact with an international researcher of Canadian-Russian-Ukrainian background in a sensitive political environment complicated by the Russian presence and current relations between Georgia and Russia. Furthermore, disclosure could jeopardize not only my respondents’ trust in my ability to protect, in an ongoing way, the information that they shared with me, but also my future security as a researcher in the area—a significant issue that deserves greater attention as part of the DA-RT deliberations (see, for example, viewtopic.php?f=10&t=47&p=203#p203
). As an alternative to making my materials available in full, I offered a detailed description of my data collection and analysis procedures in online methodological appendices.
The second stage of addressing data access involved providing extended, often paragraph-length, interview excerpts in support of my manuscript. In a separate post, Alan Jacobs raised an important question in this regard: “Does providing key pieces of evidence in more extensive form help readers better understand and evaluate the empirical basis of findings?” (viewtopic.php?f=10&t=70
). In my case, presenting extended interview excerpts helped me address the potential problem of evidence that is too brief and decontextualized for the reader to evaluate. At the same time, it posed challenges similar to those of making interview transcripts fully available: I had to present extended evidence as part of the typical mobilization trajectories while protecting the individual details of my respondents. Moreover, providing extended evidence required significant additional space. This experience points to a broader issue faced by qualitative researchers who rely on extensive textual data, such as interview transcripts, as the empirical basis of their findings. The conditions of publication for this type of qualitative work can impose a different set of requirements not only on researchers (for a discussion, see, for example, viewtopic.php?f=10&t=59
), but also on the editors, whose willingness to offer additional space can be decisive in facilitating the publication of our work.
Online methodological appendices
The issues outlined above relate both to the manuscript itself and to the online methodological appendices that support it, to which I now turn. I used the appendices to provide additional detail on my data generation and analysis procedures, which increased the transparency of my research while protecting its participants by focusing on my choices in and out of the field. As a result, while the manuscript contains the central aspects of data generation and analysis, the detailed statement of my production and analytic transparency can be found in the online methodological appendices. This came at the price of an additional major piece of writing in support of the manuscript, along with the effort and time it took to produce. Below I provide a number of examples of the production and analytic transparency tools that I used in my methodological appendices.
2. Production transparency
Production transparency in my methodological appendices meant explaining the choices that I made in the field, particularly how I selected my research locales and respondents and how I used various interview strategies, participant observation, and additional primary and secondary materials to develop a grounded understanding of my case and address potential sources of bias in my data.
First, my long-term fieldwork benefited greatly from an exploratory field trip, during which I probed the theoretical foundations and feasibility of my research and established contacts necessary for future fieldwork in Abkhazia. My selection of research locales depended on this trip: I was able to test my initial assumptions and refine my research design based on the understanding I developed of the spatial and temporal variation at the onset of the Georgian-Abkhaz war that could have produced distinct patterns of mobilization, as well as of the security conditions that limited my ability to conduct primary research in certain areas—an example of the back-and-forth between theory and data in the practice of research raised earlier (viewtopic.php?f=10&t=56&sid=85ff77d88e59dcec2877a515a90a1111#p185
). This required me to devise creative strategies of locating comparable interviews conducted in these areas by other researchers and triangulating across extensive archival and secondary materials.
Second, I followed a number of strategies in accessing and selecting respondents under the particular conditions of my research. I worked independently, avoiding local government, non-governmental, or university affiliation, but sought permission for my research from the local authorities—an important choice in the sensitive political context of Abkhazia. I devised a combined snowball and targeted selection strategy to ensure access to respondents with varied participation records along the continuum of mobilization roles that I developed in advance of field research and refined during fieldwork in interaction with my data. My sustained presence and research activity allowed me to build the trust necessary for access and to extend my initial networks, from which I selected subsequent respondents in each of my four research locales. I approached respondents in participation categories not reached through snowball sampling directly at their places of employment, which increased the representativeness of my sample.
Finally, while my interviews spanned respondents’ life histories, I focused on the events of the war, which took place two decades prior to my research. This created specific problems of potential bias related to the reliability of recollections, the endogeneity of memory to war-time processes, and the homogeneity of responses due to common political loyalties (see Wood, 2004). I tackled these problems in multiple ways. My informed consent procedure stressed that no benefits other than academic writing were available, reducing incentives to misrepresent war-time mobilization. I selected respondents with a broad range of pre- and post-war political loyalties to address the potential homogeneity problem. I used a combination of event and narrative questions in the interviews and drew on preceding interviews, participant observation, and the meta-data that emerged during the interviews (Fujii, 2010) to develop probes and follow-up questions. These strategies helped address issues of memory and of suspected incomplete or misleading information, and advanced the conversation beyond the dominant narrative of the conflict. Extensive triangulation, including with alternative interview archives collected by other researchers during the war and midway between the war and my research, allowed me to cross-check individual and collective mobilization trajectories and conflict narratives. I provided a full list of the (de facto) state, private, and news archives and libraries that I accessed and detailed how each of the sources I used tackled these problems from a different angle.
3. Analytic transparency
I employed two sets of analytic transparency tools in my methodological appendices. First, I clarified my three-stage coding strategy, whereby I coded the interviews according to broad background characteristics, recollections of events, and narratives of conflict, and provided a sample for each of the three stages of coding. Second, I specified the sequence of my causal mechanism and compared it to alternative explanations in process tracing.
Overall, these practices and tools of transparency provided a foundation for the evaluation of my findings while working to protect the participants in my research. It is important to emphasize that, while the strategies I adopted were available in the context of my research, they may not be transferable to other cases or modes of research, which suggests the need for greater trust in the researcher’s knowledge of the context (viewtopic.php?f=10&t=40&p=63#p63;
on ethnographic sensibility, see Schatz, 2009) and warns against a uniform approach to evaluating transparency in qualitative research more broadly.
*“Collective Threat Framing and Mobilization in Civil War,” American Political Science Review, Vol. 110, No. 3 (2016): 411-27.