
What problem does access to evidence and transparency address?

Posted: Tue Sep 13, 2016 5:08 pm
by ingorohlfing
The debate about access to data and transparency is anchored in the aim of solving specific problems (see the slide show at http://www.dartstatement.org/). What problems do access and transparency diminish, in your view? If you can think of such problems, can you imagine alternative instruments or policies working towards the same end?

Re: What problem does access to evidence and transparency address?

Posted: Wed Oct 05, 2016 10:14 am
by Marcus Kreuzer
Let me respond with respect to the problems that "analytical transparency" tries to solve; the problems that data access and production transparency address strike me as more self-explanatory. The DA-RT guidelines are particularly brief and generic about the goals of analytic transparency. They state: “to describe relevant aspects of the overall research process, detail the micro-connections between their data and claims (i.e., show how the specific evidence they cite supports those claims), and discuss how evidence was aggregated to support claims.”

I believe that causal process tracers, or qualitative scholars more broadly, have a much more expansive understanding of analytical transparency, one rooted in Bayesian analysis. Connecting "data and claims" is conditional on making the following elements explicit:
1. Explicating Priors: This involves clearly stating how much foreknowledge on a given subject is available (how much we know, how much we still don't know) and how such foreknowledge affects the confidence we can have in a test result. In short, analytical transparency requires a close and careful review of the literature.
2. Specificity: How many predictions does a particular theory make relative to competing theories? More predictions make a test riskier and thus generate more robust results. Analytical transparency therefore requires that we stop treating every test hypothesis as created equal and instead differentiate hypotheses in terms of their specificity.
3. Test Strength: How many competing theories does a particular test evaluate, and how different are the predictions of those competing theories? Tests become stronger to the extent that the competing hypotheses' predictions are more distinct and more alternative hypotheses are tested. Analytical transparency thus requires more information about which control variables were chosen and which ones were ignored.
4. Ontological Assumptions: What assumptions does a test make about the uniformity of evidence across cases, the independence of evidence across those cases, or the temporal structure of causation (Pierson)? And are the test results conditional on some geographic or historical boundary conditions?
Various process tracers are trying to make those four criteria key building blocks of analytical transparency. The criteria are well articulated but not yet widely used (see Ingo's work, Bennett & Checkel, Peter Hall's "Aligning Methodology and Ontology", and Beach & Pedersen, who have written on this). A minimal sketch of the Bayesian machinery behind them follows below.
So the question becomes: do those four criteria fit DA-RT's understanding of analytical transparency?
ingorohlfing wrote: What problems do access and transparency diminish, in your view?
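
To make the Bayesian machinery behind criteria 1-3 concrete, here is a minimal sketch of explicit updating over rival hypotheses. It is only an illustration: the hypothesis labels and every probability are hypothetical, not taken from any of the works cited above.

[code]
# Minimal sketch of explicit Bayesian updating over competing hypotheses.
# All numbers are hypothetical, purely for illustration.

priors = {"H1": 0.5, "H2": 0.3, "H3": 0.2}       # criterion 1: explicated priors
likelihoods = {"H1": 0.8, "H2": 0.4, "H3": 0.1}  # P(E | H): how expected the
                                                 # observed evidence is under
                                                 # each rival hypothesis

# Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E),
# with P(E) summed over all rival hypotheses.
p_evidence = sum(likelihoods[h] * priors[h] for h in priors)
posteriors = {h: likelihoods[h] * priors[h] / p_evidence for h in priors}

for h in priors:
    print(f"{h}: prior {priors[h]:.2f} -> posterior {posteriors[h]:.2f}")
[/code]

The more the rivals' likelihoods diverge, the more a single piece of evidence moves the posteriors; that is the sense in which riskier, more discriminating tests (criteria 2 and 3) produce more robust results.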

Re: What problem does access to evidence and transparency address?

Posted: Sun Oct 16, 2016 4:13 pm
by ingorohlfing
Thanks for the detailed and excellent post. I do not see why your criteria would not fit DA-RT. I believe some of the criteria do not necessarily imply a Bayesian perspective (priors do, of course). Does this mean qualitative research should become more Bayesian and transparent on Bayesian grounds? Is Bayesianism the only viable future for process tracing and comparative case studies?

Re: What problem does access to evidence and transparency address?

Posted: Wed Oct 19, 2016 5:48 am
by Macartan Humphreys
I’d suggest not tying transparency in process tracing to the use of Bayesianism. In my reading, the insights on process tracing in Bennett, Collier, and other writing come from the specification of (possibly multiple) within-case observable implications of claims. That puts the focus on likelihood functions, and so is compatible with either a frequentist or a Bayesian approach.

Regarding another thread on formalization/quantification opened by Tasha, I’d also suggest not mixing up the general issue of transparency with quantification. Quantification may or may not be possible or desirable in different settings, but I don’t see any argument that it is necessary or sufficient for analytic transparency.

I think the issue around analytic transparency for process tracing is more basic: **can researchers provide a mapping from the sort of evidence they might see (and seek) to the sorts of conclusions they might draw**, whether or not the conclusions are expressed as posterior probabilities?

Marcus gives a nice example of this for a strategy in which you want a Bayesian conclusion. I think that is a fruitful way to go. But it is not the only way. One could imagine something similar using a falsificationist approach without any specification of priors and with simpler conclusions (e.g. reject / fail to reject).
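
As a toy illustration of such a mapping, one could pre-specify how each anticipated class of evidence would be read under a simple reject / fail-to-reject scheme. Everything below is invented for illustration; it is not a scheme from any of the works discussed in this thread.

[code]
# Toy pre-specified mapping from anticipated evidence patterns to conclusions,
# in the spirit of a falsificationist scheme with no priors or posteriors.
# The patterns and verdicts are hypothetical.

from typing import Optional

evidence_map = {
    "mechanism visible in archival record": "fail to reject H",
    "mechanism absent where H requires it": "reject H",
    "mixed or ambiguous traces":            "suspend judgment",
}

def interpret(observation: str) -> Optional[str]:
    """Return the pre-registered conclusion for an observed pattern,
    or None if the pattern was not anticipated in advance."""
    return evidence_map.get(observation)

print(interpret("mechanism absent where H requires it"))  # -> reject H
print(interpret("something entirely unexpected"))         # -> None
[/code]

The None branch is precisely where the difficulty discussed next arises: qualitative evidence routinely produces patterns that were not, and perhaps could not have been, anticipated in the mapping.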

The hard challenge that any version of this will face I think is the difficulty of specifying in advance the sorts of patterns that might be seen, and how they would be interpreted, given the complexity of qualitative data and the possibility of unexpected, but still interpretable, patterns. I'd be interested in hearing thoughts on ways to address that challenge.

Re: What problem does access to evidence and transparency address?

Posted: Thu Oct 20, 2016 9:33 am
by hillelsoifer
Macartan Humphreys wrote:The hard challenge that any version of this will face I think is the difficulty of specifying in advance the sorts of patterns that might be seen, and how they would be interpreted, given the complexity of qualitative data and the possibility of unexpected, but still interpretable, patterns. I'd be interested in hearing thoughts on ways to address that challenge.


Thanks for this thoughtful post, Macartan. You have very nicely phrased the core challenge in making process-tracing research transparent. This is indeed the challenge we're grappling with, though there is some disagreement among qualitative scholars about whether transparency requires specifying observable implications in advance, or whether one can be transparent in one's research even if they are not all identified ahead of time. I just want to tease those two issues apart, and suggest that transparency about interpreting evidence faces the same difficulties you highlight (specifying patterns relevant to testing causal claims and setting standards for interpretation amid complex data and possibly complex findings) even if a researcher does not agree that all of this must be specified in advance.

Re: What problem does access to evidence and transparency address?

Posted: Fri Oct 21, 2016 3:28 pm
by ingorohlfing
I agree with Macartan and Hillel (as I understand him) that analytic transparency does not require Bayesianism. Process tracing is not intransparent if one does not specify a prior; it is simply not Bayesian. One might prefer Bayesian process tracing to any non-Bayesian variant, but superior transparency does not strike me as a good reason.
I also agree with Hillel that transparency does not require specifying expectations in advance, as this is impossible in exploratory research. However, it is a problem if one does not know how sources were selected and processed, how the case was chosen, and so on, because these issues determine what we infer about a case and, possibly, to what cases we generalize. This holds regardless of whether the study is exploratory or confirmatory.

Re: What problem does access to evidence and transparency address?

Posted: Wed Oct 26, 2016 11:11 am
by jane mansbridge
As a newcomer to this thread, I am not clear whether it covers only the specific methodology of "process tracing" or other comparative methods as well. I am not familiar with doing process tracing, although I have read articles using the method. For other comparative methods, however, I am not sure the approach that Martin sets out would always be the most fruitful. Is this section only on process tracing? Thanks.

Re: What problem does access to evidence and transparency address?

Posted: Wed Oct 26, 2016 2:07 pm
by Tasha Fairfield
Thank you, Jane.
We are aiming to discuss both process tracing and comparative analysis. On process tracing, you might take a look at the thread on Bayesianism and alternatives, where we've posted some links to current work on this topic. More generally, we would welcome any thoughts on specific problems that should be addressed in the current practice of process tracing and/or comparative analysis, as well as examples of excellent qualitative research that we can build on (see the thread on existing practices).

Re: What problem does access to evidence and transparency address?

Posted: Wed Oct 26, 2016 2:58 pm
by Guest
Macartan Humphreys writes: "...can researchers provide a mapping from the sort of evidence they might see (and seek) to the sorts of conclusions they might draw, whether or not the conclusions are expressed as posterior probabilities? ... The hard challenge that any version of this will face I think is the difficulty of specifying in advance the sorts of patterns that might be seen, and how they would be interpreted, given the complexity of qualitative data and the possibility of unexpected, but still interpretable, patterns. I'd be interested in hearing thoughts on ways to address that challenge."

This would seem to present not just a "hard" challenge but an impossible standard. In the context of qualitative research into complex, real-world socio-political phenomena, there is essentially no limit to the different pieces and sorts of evidence that might be encountered, and no way to exhaustively catalogue the infinite possibilities and pre-analyze them all in advance.

Fortunately, there is no need to do so. Any reasonable standard of scientific transparency need not demand that researchers provide ahead of time a full mapping between possible evidence and possible conclusions, but only require that they endeavor to supply an honest, rational, and critical assessment of the evidence actually obtained.

Whether the supplied evidence and its analysis are considered cogent is a question for the larger community of scholars, who may debate and ultimately accept or question the conclusions. But whether the original analysis was generated and pre-registered before the actual data were collected seems largely beside the point, logically speaking. Peer reviewers and other scholars should assess and critique the argument in the same way, using the same logical, statistical, or historical counter-arguments, in either case.

Before heading out on the Beagle, Charles Darwin had not even dreamed up the theory of natural selection, and certainly could not have predicted and analyzed ahead of time the forms or behaviors of all possible organisms he might observe. This did not make his eventual insights any less compelling.

Re: What problem does access to evidence and transparency address?

Posted: Wed Oct 26, 2016 3:10 pm
by Guest
[quote="ingorohlfing"]... analytic transparency does not require Bayesianism. Process tracing is not intransparent if one does not specify a prior, it is simply not Bayesian....[/quote]

I would argue that the Bayesian nature of process tracing does not rise or fall on whether priors are specified. Whenever probabilistic language or logic is used, implicitly or explicitly, and the probabilities cannot be naturally interpreted as long-run relative frequencies in some sequence of random trials, the analysis may be regarded as Bayesian, at least to some extent.

Frequentist critics of Bayesianism tend to focus on the latter's need for prior probabilities, but often this is not even the most salient contrast.

Re: What problem does access to evidence and transparency address?

Posted: Thu Oct 27, 2016 5:02 pm
by ingorohlfing
I guess there are different views on what the most salient contrast is, but I think it is the difference in what you are uncertain about (frequentism: the data; Bayesianism: the hypothesis, to put it briefly). However, it seems a little far-fetched to me to say that any non-frequentist use of probability is, to some extent, Bayesian, because there are so many understandings of chance and probability beyond this. Bayesianism is clearly defined: if something does not feature priors, likelihoods, and posteriors, it is not Bayesian. This does not make it worse research, but I believe we should be careful not to overstretch the meaning of Bayesianism.

Re: What problem does access to evidence and transparency address?

Posted: Thu Oct 27, 2016 5:10 pm
by ingorohlfing
I think Macartan was not entirely clear about the difference between expectations and observations. You cannot specify in advance what you will observe, because that is too specific. However, in confirmatory research you can specify in advance, at a theoretical level, what you expect and do not expect. Fortunately, I believe an increasing number of process tracing and comparative studies are doing this. For this you need to formulate priors, if you work in a Bayesian framework, and I agree with Macartan that this can be challenging.

I am not familiar with Darwin's discoveries in much detail, but there is a difference between exploratory and confirmatory research (after all, we speak of the Texas sharpshooter problem in confirmatory research for a reason). An assessment of arguments is valuable, but we cannot judge an empirical analysis only by discussing its conclusions. It may already figure in some other thread in this forum: if you do interviews, I want to know whom you interviewed and how you selected the interviewees, because you might get different evidence from different interviewees. A critical assessment of how the evidence was gathered and a critical assessment of how the evidence was interpreted need to go hand in hand, in my view.

Guest wrote: Any reasonable standard of scientific transparency need not demand that researchers provide ahead of time a full mapping between possible evidence and possible conclusions, but only require that they endeavor to supply an honest, rational, and critical assessment of the evidence actually obtained.

Whether the supplied evidence and its analysis are considered cogent is a question for the larger community of scholars, who may debate and ultimately accept or question the conclusions. But whether the original analysis was generated and pre-registered before the actual data were collected seems largely beside the point, logically speaking. Peer reviewers and other scholars should assess and critique the argument in the same way, using the same logical, statistical, or historical counter-arguments, in either case.

Before heading out on the Beagle, Charles Darwin had not even dreamed up the theory of natural selection, and certainly could not have predicted and analyzed ahead of time the forms or behaviors of all possible organisms he might observe. This did not make his eventual insights any less compelling.

Re: What problem does access to evidence and transparency address?

Posted: Sat Dec 03, 2016 5:46 pm
by Tasha Fairfield
I'm finally getting around to responding to Marcus's earlier post on the Bayesian approach. I agree that Bayesianism provides a very useful framework for analytical transparency (despite a number of challenges), and it helps us take background information seriously, as Marcus rightly notes. However, I do want to address a few misunderstandings about how Bayesian inference works, with regard to Marcus's second and third points (specificity and test strength).

On point 2: Specificity--as Marcus defines it in terms of the "number of predictions" that one theory makes compared to another--is not relevant. Bayesian probability is not about counting up predictions, or counting up anything, really. What matters is the likelihood ratio of the evidence: how probable would the evidence be if we imagine that one hypothesis is true, as compared to a rival hypothesis? In other words, which hypothesis makes what we see more plausible? This is the key inferential step that governs how we update the odds on one hypothesis being correct versus a rival hypothesis. In our paper on explicit Bayesian process tracing, A.E. Charman and I lay out guidelines for how to assess likelihood ratios by "mentally inhabiting the world" of each hypothesis, so to speak.
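
In odds form, this updating rule is simply: posterior odds = prior odds x likelihood ratio. Here is a minimal numerical sketch; the probabilities are hypothetical, not taken from the paper.

[code]
# Odds-form Bayes' rule: posterior odds = prior odds * likelihood ratio.
# All numbers are hypothetical.

prior_odds = 1.0      # start indifferent between H1 and its rival H2
p_e_given_h1 = 0.6    # how expected the evidence is if H1 is true
p_e_given_h2 = 0.2    # how expected the same evidence is if the rival H2 is true

likelihood_ratio = p_e_given_h1 / p_e_given_h2  # 3.0: the evidence favors H1
posterior_odds = prior_odds * likelihood_ratio

posterior_p_h1 = posterior_odds / (1 + posterior_odds)
print(f"posterior odds {posterior_odds:.1f}, P(H1 | E) = {posterior_p_h1:.2f}")
# -> posterior odds 3.0, P(H1 | E) = 0.75
[/code]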

On point 3: test strength does not have to do with the number of predictions a theory makes or the number of theories being tested. Instead, test strength relates to the probative value of the evidence--once again, the likelihood ratio under rival hypotheses. We argue in our paper that the notion of subjecting a hypothesis to a series of tests is too close in spirit to a frequentist perspective; we advocate simply evaluating priors and likelihoods and updating by direct application of Bayes' rule. However, we also offer a detailed discussion of how to incorporate Van Evera's process tracing tests into a Bayesian framework, building on Humphreys and Jacobs (2015) but arguing that likelihood ratios and/or the concept of relative entropy provide the most sensible classification.
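
As a rough sketch of that classification idea, the familiar Van Evera labels can be read off the two likelihoods P(E|H) and P(E|rival). The thresholds and labels below are a crude simplification for illustration only; they are not the definitions in our paper or in Humphreys and Jacobs (2015).

[code]
import math

def weight_of_evidence(p_e_h: float, p_e_rival: float) -> float:
    """Log likelihood ratio: how strongly observing E favors H over the rival."""
    return math.log(p_e_h / p_e_rival)

def rough_van_evera_label(p_e_h: float, p_e_rival: float) -> str:
    """Crude mapping of the two likelihoods onto Van Evera's test types.
    The thresholds are arbitrary illustrations, not anyone's definitions."""
    if p_e_h > 0.9 and p_e_rival < 0.1:
        return "doubly decisive"
    if p_e_h > 0.9:
        return "hoop test (failing is damning; passing is weak support)"
    if p_e_rival < 0.1:
        return "smoking gun (passing is strong support; failing is weak)"
    return "straw in the wind"

print(rough_van_evera_label(0.95, 0.60))        # -> hoop test ...
print(f"{weight_of_evidence(0.95, 0.60):.2f}")  # -> 0.46: modest support for H
[/code]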

You can read our discussion about process tracing tests here, in Section 5: http://tashafairfield.wixsite.com/home/research
We welcome comments on the paper.