University of Nebraska-Lincoln
- Posts: 2
- Joined: Sat May 14, 2016 11:17 am
Please find my comments below:
1. The report does an excellent job of representing the concerns of scholars working across a wide range of epistemologies. The need to seriously consider and respect epistemological diversity in the discipline of political science is paramount. I think that the differences between positivist and interpretivist research are an underlying source of miscommunication in debates over transparency and human participant research. The report shows an awareness and respect for this diversity throughout.
2. Assessment of benefits: "transparency could help guard against dishonesty". I would like to see more discussion of the debate that is referenced in footnotes 25 and 26. Ultimately, does it seem like there is agreement among scholars that dishonesty is an actual problem? Should this be listed as the first benefit if participants didn't agree?
3. Assessment of benefits: I think the first and second benefits (or purposes) of transparency are that it helps others evaluate your claims and understand your research. Right now, these benefits are listed as the fourth and fifth ones. Ultimately, I wasn't clear on how the 1st-5th benefits differed. They seem to overlap a lot.
4. Assessment of costs/Human subjects protections and Access to human subjects: These are important sections. Definitely keep!
5. p. 15 The bullet points of transparency suggestions didn't seem very compelling to me. I would keep the paragraph on p. 15 that warns against "best practices" and more quickly get to the section, "In-article Transparency Discussion".
6. Advancing Research Reliability: When I read this section, which I took to be the conclusion, I was confused about how this fits with the themes of data access and research transparency. Maybe the importance of advancing reliability was discussed in the introduction and I missed it. In any case, reliability and transparency to me are different things. So it might be better to end by returning to the definition, purpose, benefits, and costs of transparency when doing research with human participants.
University of Utah
- Posts: 8
- Joined: Thu Oct 12, 2017 6:03 pm
Colleagues, I am writing to comment on the 8-25-2017 draft report of QTD Working Group II.2, Evidence from Researcher Interactions with Human Participants. I want to recognize all of the thoughtful and careful work of Professors Arriola, Pollack, and Shesterinina that has gone into this document. I sincerely believe that all of you are acting in good faith in taking up the call by QMMR (as articulated by Alan Jacobs and Tim Büthe) to adapt DA-RT to the kinds of scholarship conducted among those doing qualitative and interpretive scholarship in political science. Moreover, there is much that I agree with—even as I take issue with a number of assumptions that seem to permeate the document.
First, the very last paragraph of the report states (p. 20, emphases added):
“The variety of practices discussed, ranging from the design to the write-up phases, can be readily implemented by most scholars to make their findings easier to evaluate in peer review. Greater recognition by journals of these practices as being consistent with transparency guidelines would facilitate the case-by-case determinations that editors and reviewers inevitably need to make when assessing the reliability of scholarship.”
The problem with this statement is that the relationship between DA-RT and JETS has never been fully explicated by the DA-RT proponents. Specifically: How is DA-RT meant to relate to peer review? Why isn’t peer review sufficient? If it is not, what specific elements need improvement and why are DA-RT and, especially, JETS the proper responses? Most important, this statement fails to recognize that JETS puts an additional hurdle in place prior to peer review: “The editor shall have full discretion to follow their journal’s policy on restricted data, including declining to review the manuscript or granting an exemption with or without conditions. The editor shall inform the author of that decision prior to review.”
This JETS approach stigmatizes all those qualitative and interpretive scholars who will now be on the defensive as supplicants to editors, asking for “exemptions” to standards that emerged out of the natural sciences, passed through psychology’s “replication crisis,” and were then smuggled into the APSA Ethics Guide.
Second, as that last phrase intimates, and as my colleague Dvora Yanow and I document (Schwartz-Shea and Yanow 2016), the adoption of DA-RT within APSA is of questionable legitimacy. And the role of APSA in sponsoring the non-inclusive meeting that produced JETS is especially galling. I will not repeat those arguments here. Instead, my point is that the report presumes the legitimacy of both DA-RT and JETS when, instead, that legitimacy should be questioned.
Third, while I take at face value the report statement (p. 1) that “Our consultations with scholars in the discipline, combined with insights drawn from contributions to the QTD online forum as well as published materials, reveal broad support for the principle of transparency in social science research,” I would observe that the DA-RT language takes some long-standing practices and repackages them in ways that are pernicious. Here are some of the most worrisome elements of the DA-RT-articulated “transparency” project:
(a) The fetishization of “transparency” encourages a “laying bare” of the scholarly self, implying that our human role in knowledge production is meant to disappear; contrast transparency (and, also, “openness”) with “reflexivity”—a concept that acknowledges human embodiment in all of its complexity, see, e.g., Timothy Pachirat’s October 1st post on the QTD Working Group Report on Ethnography and Participant Observation;
(b) The instrumentalization of our academic work transforms our vocation, our calling, into the excessive documentation of our research “steps” in a linear fashion, such that the joy and “a ha” moments of the research process are now suspect rather than celebrated. Instead there is “production transparency,” as if we were workers on an assembly line who need to be “incentivized” (see page 14). That is the language not of the academy but of “new public management.” That “innovation” in governance practices imagines that workers are not intrinsically motivated, not agentic, but are, instead, homo economicus, responding to “incentives” dangled before them by oh-so-wise managers.
(c) The DA-RT articulation of “analytic transparency” asks the impossible: that researchers can (emphasis added) provide “a full account of how they draw their analytic conclusions from the data, i.e., clearly explicate the links connecting data to conclusions.” The notion of a full account flies in the face of understandings of the practice of experts, which involves tacit knowledge. [See Flyvbjerg’s (2001, 10-24) discussion of Dreyfus’ model of the development of expert knowledge.] As Margaret Keck observes in note 23 of the QTD Working Group II.1, Research with Text-Based Sources: “To analyze documents and interviews, I rely not just on language skills but on knowledge accumulated from 35 years of work in a region….” Standard peer review relies on experts of a similar caliber to assess her research claims—which is why we read and trust peer-reviewed articles. But DA-RT sets up the expectation that any reader can assess her claims based, say, on an active citation link to a bit of an interview or document. That is an expectation that disappears the tacit knowledge of expert researchers.
(d) And, finally, in the technical realm, in a period of astounding data insecurity, I have yet to see proponents of DA-RT seriously address the hacking problem for sensitive data stored in “trusted digital repositories.” Nor have they responded adequately to the “Boston College” case, in which the U.S. government subpoenaed oral history evidence from interviewees who had trusted researchers to keep their identities confidential—evidence that was meant to be sequestered for a period of time. (See Palys and Lowman 2012.)
I want to emphasize that my objections are grounded in the coercive, top-down aspects of DA-RT and JETS; I have no quarrel with the idea that scholars may voluntarily choose to archive data or to take up suggested research practices that make sense for their particular projects. I agree with the judgment of Dara Strolovitch (p. 11) that DA-RT rules seem “very unlikely to produce better knowledge or insights about the political world.” And I agree with Rudra Sil (p. 13) that “adding new layers of procedures and regulations requires uniform understandings of what constitutes ‘knowledge’ or ‘truth’….” These are deeply philosophical issues that cannot be settled through rules, and enforcing one standard will harm the very quest for truth and knowledge.
In my considered view, the DA-RT project and particularly its instantiation as JETS should be rejected as a threat to the scholarly value of methodological pluralism in the pursuit of truth and knowledge.
Flyvbjerg, Bent. 2001. Making Social Science Matter. Cambridge, UK: Cambridge University Press.
Palys, Ted, and John Lowman. 2012. “Defending Research Confidentiality ‘To the Extent the Law Allows’: Lessons from the Boston College Subpoenas.” Journal of Academic Ethics 10: 271-97.
Schwartz-Shea, Peregrine, and Dvora Yanow. 2016. “Legitimizing Political Science or Splitting the Discipline? Reflections on DA-RT and the Policy-making Role of a Professional Association.” Politics & Gender 12 (3), e11: 1-19. doi.org/10.1017/S1743923X16000428