Dear all,
As much as I appreciate the effort and the potential value of having a collective discussion on “research transparency” (a preoccupation I think we should all have as researchers, regardless of discipline and methodological toolbox), I must say that I am a bit reluctant to engage directly in a narrow/technical discussion of the QCA ‘analytical moment’, as Carsten suggests. The fundamental reason is that there are – I think – much broader issues at stake, not only in methodological terms but also in ‘political’ terms. So: after reading the discussion thread above, and after quickly consulting a few contributions around the whole DA-RT thing (impressive how much time and energy has been consumed thus far in this whole enterprise!), I feel the need to first paint a (much) broader picture, with attached notes of caution. I think we should never engage in any ‘technical’ discussion without prior consideration of the broader issues at stake. So I will structure my thoughts in 4 points:
1/ the broader ‘political’ picture
2/ the broader scientific picture: why ‘transparency’ as the priority? Why not other research qualities?
3/ broader than QCA: the so-called ‘qualitative’ territory (and why I object to that label)
4/ and finally QCA proper & how I would view the ‘transparency _and_ other qualities’ recommendations
NB: what follows below is a bit short/blunt & would probably deserve more nuance, elaboration etc., but… I simply have no time right now to write more & I’d like to post my comments before the late 2016 deadline.
1/ The broader ‘political’ picture

Having observed the evolution of debates around political science / social science methods over the last 20 years or so, I am clearly a skeptic (NB: I did not write ‘opponent’) of the whole DA-RT initiative.
To put it briefly, my simple/quick ‘political’ reading of the whole context that has led to this initiative is that political science (well, the social sciences more broadly) is being questioned in terms of its ‘scientificity’ (two keywords in the DA-RT framing docs: “credibility” and “legitimacy”), specifically in the U.S., vis-à-vis presumably ‘real’ sciences, e.g. the experimental & natural sciences, and, closer to home, some more ‘scientific’(?) disciplines such as economics. Hence the need to demonstrate, somewhat defensively, that there is indeed a ‘science of politics’ – in very concrete terms, in order to keep some legitimacy _and_ funding, academic positions and the like. And even more so on the ‘qualitative’ side of political science, which is fighting to gain scientific legitimacy vis-à-vis the ‘quantitative’ mainstream, and which has precisely created the APSA’s Organized Section for Qualitative and Multi-Method Research (QMMR). I see the whole QTD initiative (launched by that QMMR Section) as a logical continuation of those (mainly U.S.) strategic moves.
I dislike this context, and in fact, seen from Europe, I feel quite remote from it. But indeed the broader situation is that the social sciences (sociology even more so than political science, I’d say) are seriously in trouble in the ‘global scientific competition’ – and this goes way beyond issues of methods and transparency.
And one reason I really dislike this context is that, as a result, only a small segment of the sciences (the social sciences, narrowly defined) feels compelled to engage in this sort of self-justification (OK, this is a negative framing – in more favorable terms, one could also call it ‘consolidation’). The end result could be to create further ‘noise’ (masses of at least partly contradictory or complex positions), thereby further weakening political science/the social sciences in the ‘global race’.
2/ The broader scientific picture: why ‘transparency’ as the priority? Why not other research qualities?

Here I very much agree with the points Peter Hall makes in a recent – and very measured – contribution in the Comparative Methods Newsletter (Vol. 26, issue 1, 2016, pp. 28-31): “what social scientists should value most highly is not transparency, but the integrity of research, understood as research that reflects an honest and systematic search for truths about society, the economy, or politics. In its various dimensions, transparency is a means toward that end. But it is only one of several such means (…)” (p. 28).
I think the more useful discussion we should have should revolve not solely around “transparency”, but rather around: “what is good [social] science?” or “what is sound [social] scientific work?”
In my view, “sound scientific work” comprises at least three main dimensions – and these are (or should be) universal, I mean across all sciences:
• Formulate a clear ontological/epistemological position (there are many valid positions, but whichever one holds should be made explicit) [NB: in many ‘hard sciences’ publications, this is missing!]
• Develop and implement a sound protocol (i.e. practical operations) – here there are multiple sub-dimensions, and transparency is only one of perhaps 5 or 6 core sub-dimensions that are at least as important. Another, for instance, is the adequacy of the protocol to the object(s) and research question(s) at hand. This point, for me, is more fundamental than “transparency” (transparency already sits one level down, at the more technical/practical level)
• Keep a reflexive view as a ‘scientific crafts(wo)man’ – this should be true in all sciences. Nothing should ever be seen as obvious, as ‘definite truth’, as ‘the sole & best protocol’, etc. [again, in many ‘hard sciences’ publications, this is seldom done]
So: “transparency”, for me, is only one sub-point within a much broader set of three transversal points, and I find it extremely limiting to be constrained to that sub-point alone.
3/ Broader than QCA: the so-called ‘qualitative’ territory (and why I object to that label)

The more I do research, the less I find the ‘quantitative’ and ‘qualitative’ labels appropriate. Most researchers who conduct an in-depth case study (allegedly an ideal-type of “qualitative” research) frequently use numbers and statistics – descriptive ones at the very least. And most scholars who engage in survey research and produce numerically coded ‘data sets’ (matrix-type data) are in fact tapping and labelling ‘qualitative’ variations through the concepts they use.
Anyhow, I don’t think QCA broadly defined should be held hostage to debates in “qualitative” methods. QCA is case-oriented (at least in small- and intermediate-N designs), but it is also mathematical (formal logic and set operations) in its operations, etc. (see the numerous contributions in various textbooks and state-of-the-art pieces over the last 10 years).
So: _if_ the core of the matter is the transparency issue (which, again, I don’t think it is), then there should be 3 linked discussions:
- A. What are the core (universal) elements of ‘sound science’ [again, I stress: not limited to what some refer to as ‘qualitative’ science]?
- B. How do these core (universal) elements translate in practical terms for a scientist (NB not only a social scientist!) using QCA?
- C. How to do ‘sound science’ with QCA, more at the technical/applied level (as part of the protocol)?
In my next point below, I unpack the third and more focused question a tiny bit.
4/ And finally QCA proper & how I would view the ‘sound science’ (incl. transparency & other things) recommendations

I definitely think we should _not_ begin by listing the technical QCA ‘good practices’ (= only during the QCA ‘analytic moment’) on only one sub-dimension (= only the transparency thing); see above.
My own approach would be very different, and with a much broader/more systematic scope. If I were to start from a blank page, I would write down a series of 3 questions in a logical order [and NB, happily: we have already produced many answers to these questions over the last 3 decades!].
Question 1: what are the definitional elements of QCA? I think most of us could agree on a list of 7 to 10 elements – I myself tried to formulate these in a concise way in a piece published in the Swiss Political Science Review (2013); this could easily be cross-checked/enriched with bits & pieces from reference volumes such as the Rihoux/Ragin & Schneider/Wagemann textbooks.
On that basis, we could also list some ‘fundamental prerequisites’ for a researcher to take on board before (technically) engaging in QCA – I mean: more at the epistemological level. We already have this narrative in various recent pieces.
Question 2: what are the procedural (protocol-related) elements of QCA?

I.e.: if there are 5 main uses of QCA (in a nutshell: 1. typology building, 2. exploring data, 3. testing theories, 4. testing propositions and 5. expanding theories), then for each one of these 5 uses: what is the ‘standard & commonly agreed-upon protocol’? Possibly with a few alternatives / sub-protocols? For instance, if at some stage the researcher opts for either a csQCA or an fsQCA, this directs him/her to one of two partly different sub-protocols (see the small sketch below). Depending on the protocol, I guess this could boil down to between ca. 10 and ca. 25 core operations (steps), with iterations, too [by the way, the iterative nature of ‘sound science’ with QCA is for me at least as fundamental as the ‘transparency’ issue; see for instance the discussion in the Rihoux & Lobe chapter in the Byrne & Ragin handbook (2009)].
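To make this ‘protocol’ idea a bit more tangible: below is a minimal sketch (in Python, used simply as structured notation) of what one such protocol – say, for theory-testing – could look like, including the csQCA/fsQCA branch point just mentioned. NB: the step names and their ordering are purely illustrative on my part (loosely following common textbook presentations), not an agreed-upon standard.

    # Illustrative sketch only: step names & ordering are placeholders
    # (loosely after common textbook presentations), NOT an agreed standard.
    theory_testing_protocol = {
        "common steps": [
            "formulate theory-derived expectations (conditions & outcome)",
            "select the cases and build case knowledge",
            "choose the QCA variant (branch point below)",
        ],
        "variant-specific sub-protocols": {
            "csQCA": ["dichotomize conditions & outcome (justify thresholds)"],
            "fsQCA": ["calibrate fuzzy-set membership scores (justify anchors)"],
        },
        "later steps": [
            "build and inspect the truth table",
            "resolve contradictory configurations (iterating back if needed)",
            "minimize, with and without logical remainders",
            "interpret the solutions and return to the cases",
        ],
    }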
Question 3: considering these definitional and procedural elements, how to do “sound science” with QCA – throughout the whole research cycle, and not only during the ‘analytic moment’?

I would see this as a sort of matrix with, as rows, a quite detailed and ordered list of ‘typical research situations’ in which QCA comes into play at some stage. I guess this could boil down to around 15-25 research situations, depending on how we would aggregate things.
And then as columns I would see first some transversal points, and then the unfolding of the typical sequence of operations (the ‘protocol’, as defined above, i.e. with ca. 10 to 25 steps depending on the protocol).
And obviously I would fill in each one of the cells with the respective “sound science” criteria (“quality checks” bullet points) that should be met.
In a very simplified form, this could look like:
- Columns: Research situations // Transversal points / preoccupations // Step 1 of protocol // Step 2 of protocol // etc. (…)
- and then the rows, for instance:
(…)
Situation 13: csQCA for theory-testing in smaller-N design
(…)
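To make this a bit more concrete: here is a tiny sketch (again in Python, purely as notation) of how a single row of such a matrix could be stored. The situation label is taken from the example above; the column names and quality checks are hypothetical placeholders of mine, not agreed content.

    # Hypothetical placeholders: column names & quality checks are illustrative,
    # showing one row out of the ca. 15-25 research situations.
    sound_science_matrix = {
        "Situation 13: csQCA for theory-testing in smaller-N design": {
            "transversal preoccupations": [
                "state one's epistemological position",
                "document the iterations",
            ],
            "step 1 of protocol (case selection)": [
                "justify the scope conditions and the (smaller-N) case selection",
            ],
            "step 2 of protocol (dichotomization)": [
                "report the thresholds and their rationale",
            ],
            # ... and so on: ca. 10-25 steps, with ca. 5-6 criteria per cell
        },
        # ... the other research situations as further rows
    }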
For sure, concrete good practices of ‘transparency’ could find their place in many of these cells – but so could other, equally important good practices for ‘sound science with QCA’ [perhaps we could agree on 5-6 core criteria that would need to be met in each cell, in fact – and transparency should probably be one of those 5-6 criteria]. And here I have no problem with some of the core points highlighted by Carsten & the contributors to this discussion thread.
For instance: sure, calibration/threshold-setting is a core, focal point. NB: for this point as well as the others, I think we should always stress that “there are different [technical] good ways to do it” – I mean: keep things pluralistic. There are at least 4-5 meaningful ways to approach calibration (it’s not only about direct vs. indirect); it depends on the purpose, on the topic, on the theory, on the researcher’s own position, on the nature of the ‘raw’ data [well, there is never really ‘raw’ data…], etc.
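Just to illustrate one of those several ways – and only one! – here is a minimal sketch (in Python) of the so-called ‘direct method’ of calibration, i.e. a logistic transformation anchored at three qualitative thresholds (full non-membership, crossover point, full membership). The function name and the GDP anchors in the example are purely illustrative assumptions of mine.

    import math

    def direct_calibrate(x, full_non, crossover, full_mem):
        """Map a raw value onto a fuzzy-set membership score via a logistic
        function anchored at three qualitative thresholds (the anchors map
        to memberships of ca. 0.05, 0.5 and 0.95 respectively)."""
        if x >= crossover:
            # scale deviations above the crossover so that the full-membership
            # anchor corresponds to log-odds +3 (i.e. membership ~0.95)
            log_odds = 3.0 * (x - crossover) / (full_mem - crossover)
        else:
            # scale deviations below the crossover so that the full-non-membership
            # anchor corresponds to log-odds -3 (i.e. membership ~0.05)
            log_odds = 3.0 * (x - crossover) / (crossover - full_non)
        return 1.0 / (1.0 + math.exp(-log_odds))

    # e.g. calibrating GDP per capita into a 'wealthy country' fuzzy set
    for gdp in (2000, 10000, 20000):
        print(gdp, round(direct_calibrate(gdp, 2500, 8000, 20000), 2))

The point being: the three anchors themselves remain a substantive, theory-informed choice – which is exactly where the pluralism (and the need for justification) comes in.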
And once we have a matrix of this sort [NB: I am convinced that we could quite easily reach agreement on at least 90% of the content], it should be made accessible in a readable form (TBD; there are different technical options); COMPASSS would be the obvious place.
By the way, this brings me to another important point/preoccupation of mine: in the next few months & years, we should definitely do this job of clarifying what ‘sound science with QCA’ is (in a way, much of our collective discussion over the last few years, in Zurich & other places, has centered precisely on this), but the agenda should not be oriented/guided by other agendas (I mean: the DA-RT one); we should pursue our own agenda!
OK, I am aware that I haven’t followed Carsten’s injunctions & proposed structure, but I thought I’d be… transparent in expressing my broader perspective.
Best,
Benoît