I.1. Ontological/Epistemological Priors

Rudra Sil
University of Pennsylvania
Posts: 6
Joined: Thu Apr 07, 2016 4:50 pm

Transparency and Common Sense in the Face of Diverse Models of Social Science

Posted: Fri Dec 02, 2016 3:01 pm

Since the debates over DA-RT took off a few years ago, I have been having trouble putting my finger on the problem we are trying to solve. Yes, we want to ensure that researchers are honest and transparent, and we want to do this by making sure others can evaluate different aspects of their various knowledge claims. But isn't that what we already do? For most researchers, is it not simply a matter of common sense that convincing a skeptical audience of one's claims requires laying bare one's assumptions, explaining one's models, and demonstrating how empirical data constitute evidence of claims? For this purpose, if there is not enough space in the version of the article to be published, journals can certainly make space for online storage of appendices or data-sets, giving scholars the opportunity to provide as much documentation as they believe is necessary to convince skeptical readers that their research is innovative and compelling. But beyond this, current professional practices, from double-blind reviews to post-publication debates between authors and their critics, already do much to incentivize scholars to lay bare the basis for their knowledge claims and the principles informing different aspects of an argument (from the formulation of questions and the definition of concepts to the operationalization of variables and the claims that certain models or narratives are superior to alternative explanations).

Indeed, this is arguably a more holistic process for encouraging transparency, while also adding expectations of originality and logical coherence. COMMON SENSE already obliges researchers to do what they can to be convincing to audiences that they must assume to be skeptical, given the looming threat of rejection from reviewers and the possibility of critical appraisals of their published work. And some type of "policing" is already evident in common-sense understandings of different forms of social scientific research, whereas adding new layers of procedures and regulations requires uniform understandings of what constitutes "knowledge" or "truth." As for those inclined to knowingly circumvent ethical standards and obligations, adding new standardized procedures simply shifts the loci and techniques of evasion, as is amply evident from cases in other disciplines (psychology and economics, for example) where transparency procedures have long been in existence but where some scholars nonetheless sought to evade detection of dishonest or problematic practices (and succeeded in doing so for a period). For more intrinsically diverse disciplines like political science, the main point of vulnerability in a knowledge claim varies across different styles of research aimed at different intellectual objectives and even different kinds of audiences -- and for each of the resulting research products, scholars have long been aware of what they need to do to lay out convincing knowledge claims, as this is the entire point of their professional identities and commitments.

In purely statistical work relying on data-sets involving intrinsically quantitative data (e.g. GDP growth rates, migration statistics, voter turnout), it is in the interest of most researchers to tout their data-sets and code to gain recognition for their approach, and repeated refusals by scholars to share their data with colleagues interested in replicating their work carry mounting reputational costs. More problematic are data-sets that require researchers to quantify observations that do not originally appear as numeric quantities (e.g. assigning a value on a continuous variable to capture the extent of democratization or the strength of identification with an ethnic group), as here the most critical locus of the problem has to do with the initial CODING of historical events, institutional attributes, and social processes so that these can be treated as quantitative data. For qualitative researchers relying on archival research, the discussion in the text and footnotes allows authors to anticipate some of the challenges and to lay out their assumptions about contested historiography or their motivations for selecting from the range of available primary sources (as self-aware historians have sought to do to make their arguments more convincing). For interpretive work relying on close-range observations of human subjects, there are not only issues with ensuring the privacy and safety of the subjects themselves (a point repeatedly stressed by ethnographers) but also a more fundamental issue of capturing the moment of observation within a spatio-temporally bound context. For the latter, the issue becomes whether similar acts of observation and interpretation might yield alternative narratives within other contexts that appear to be similar -- and it is out of the resulting debates that we seek to establish the value of a particular narrative. Unless we assume that we live in a world of final truths that all of us define and approach in the same way, there is no reason to insist on a uniform definition of transparency or a uniform view of the purpose of replication.

Short of that kind of unified consensus, which has eluded us for centuries for good reason, the very act of convincing skeptics among one's primary audiences carries a built-in incentive to make the different steps and components of one's research clear to readers, and this obligation is carried forward in different ways in different disciplines and in different styles of research in our own discipline. Against this backdrop, an effort to create and impose uniform procedures across journals that have historically showcased diverse approaches and arguments for diverse audiences will create unevenness in submissions, acceptance rates, and costs in terms of time and resources. At least as importantly, these procedures will simply shift the locus of "dishonesty" for the small minority of researchers who consciously seek to conceal some form of "malpractice." In some ways, the DA-RT effort is reminiscent of the creation of long, elaborate, complicated tax codes that lead tax evaders simply to find different loopholes, while leaving the rest of us to pay the costs by having to wade through more complicated forms and expend more energy in filing our taxes.


Catherine Boone
London School of Economics
Posts: 6
Joined: Thu Apr 07, 2016 3:33 pm

Re: Transparency and Common Sense in the Face of Diverse Models of Social Science

Posted: Mon Dec 12, 2016 5:02 am

Yes, I agree, Rudy has framed the issue perfectly.


Jacques Hymans (University of Southern California)

Re: Transparency and Common Sense in the Face of Diverse Models of Social Science

Posted: Tue Dec 13, 2016 9:04 pm

I also strongly agree with Rudra Sil's comment. Let me take the discussion in a slightly different direction. Scott Sagan's idea of the "problem of redundancy problem" (elaborated in his tremendous book The Limits of Safety, his great article in the journal Risk Analysis, and other works) is apropos here. You have a potential problem; for instance, Sagan uses the example of the possibility that terrorists might attack a nuclear plant. So you create a solution: armed guards at the nuclear plant. But this creates its own problem: the armed guards might themselves be terrorists, and now they are inside the plant with weapons. In short, your efforts to improve the reliability of the system have actually decreased its reliability.

This example applies very readily to DA-RT, which is an attempt to use redundant systems to engineer further reliability into our peer review system, notably in order to avoid problems of intentional or unintentional misuse of our data sources (be they quantitative or qualitative). How might the imposition of a requirement to put all of your sources online misfire in practice? Well, for instance, Sagan talks about how "redundancy backfires through social shirking." In other words, the fact that people know that there are redundant systems in place that supposedly protect us from accidents means that they work less hard to guard against accidents, ultimately leading to more accidents. Thus, in the DA-RT case, if we have to put all of our original data sources online, then we may actually feel less pressure to process that data properly, because we expect that if we get anything wrong, someone else will find it and call it to our attention. Such social shirking is even more likely in the case of multiple-author work, which is increasing exponentially in our discipline.

That's just one possible pitfall. The more important point that Sagan makes is the general one that when you institute new redundancies in an attempt to improve safety, you are likely to have unintended consequences that cannot be predicted in advance. Therefore, it is important to focus one's attention on fixing high-probability and/or high-consequence problems that definitely need fixing, and to rely on tweaks to the existing system that are as minor as possible to get the job done, in the hope of avoiding an unintended catastrophic result. In sum, when it comes to DA-RT, less is more.
Jacques Hymans
University of Southern California


Guest

Re: Transparency and Common Sense in the Face of Diverse Models of Social Science

Posted: Sat Dec 31, 2016 11:32 am

Rudy's astute sense of bafflement as he considers the range and depth of the DA-RT process in the discipline captures effectively how this set of "solutions" does not address the "problems" that the APSA and many political science journal editors believe are endangering the discipline's credibility. The imposition of new procedural practices to regulate "data access," assure "analytical transparency," and legitimate "production transparency" in the generation of journal science in the discipline exceeds, as Rudy notes, "Common Sense."

On the one hand, his remarks in this posting about political science today point out how less and less of a shared intellectual project exists in common across the discipline, which has been developing in different ways on various planes since the late 1960s to early 1970s. The limited moves toward "methodological pluralism" in the 1990s and 2000s opened the doors for developing a more common sense of the discipline; yet it appears that too many embedded intellectual and institutional interests in the profession disliked what they beheld as these doors opened. The DA-RT process provides a complex regulatory apparatus for policing these "pluralist methodological" turns in a fashion that reaffirms the conventional norms of the hypothetico-deductive model of scientific reasoning at the core of a thousand "Scope and Methods in Political Science" courses, while at the same time affirming the aspirations of the network of journal editors who would like to believe they are becoming political, social, or governmental physicists.

On the other hand, however, Rudy's appeal to "Common Sense" also discloses how the DA-RT effort to affirm "good science," "transparent research," and/or "analytical rigor" might be likened to the public relations efforts of major corporations entangled in unsustainable, ecologically degrading, or environmentally destructive commodity production chains. With new seals of sustainability, ecological respect, or environmental awareness, these firms "green wash" or "green wrap" their product lines in a fashion that occludes the sloppy, irrational, and/or dangerous by-products of their work.

Anxious to protect their increasingly questioned intellectual and institutional sources of normative authority, professional control, and academic legitimacy, the DA-RT-supportive individuals and groups in the discipline import the regulatory practices of many biomedical, natural, or physical sciences to reinforce their professional power, position, and privilege. Yet such efforts to "good wash" or "good wrap" the social science research process, as they understand it, are a misstep. They are neither producing the best outcomes endorsed in their ethical manifestos nor are they widely celebrated by those they aspire to regulate. Instead, they are creating new roadblocks for many researchers, endangering the vocation of scholarship in many sub-fields, and resealing many of the once slightly open doors in the profession through which more critical, creative, and cutting-edge work could contribute to everyone's intellectual awareness. Meanwhile, the deeper flaws and darker failures of the discipline are neglected, ignored, or forgotten.
