Security Studies

ISSN: 0963-6412 (Print) 1556-1852 (Online) Journal homepage: http://www.tandfonline.com/loi/fsst20

Security Studies, Security Studies, and Recent Developments in Qualitative and Multi-Method Research

Andrew Bennett, Colin Elman & John M. Owen

To cite this article: Andrew Bennett, Colin Elman & John M. Owen (2014) Security Studies,
Security Studies, and Recent Developments in Qualitative and Multi-Method Research, Security
Studies, 23:4, 657-662, DOI: 10.1080/09636412.2014.970832

To link to this article: http://dx.doi.org/10.1080/09636412.2014.970832

Published online: 03 Dec 2014.



Security Studies, 23:657–662, 2014
Copyright © Taylor & Francis Group, LLC
ISSN: 0963-6412 print / 1556-1852 online
DOI: 10.1080/09636412.2014.970832

Security Studies, Security Studies, and Recent Developments in Qualitative and Multi-Method Research

ANDREW BENNETT, COLIN ELMAN, AND JOHN M. OWEN

Research traditions are essential to social science. While individuals make findings, scholarly communities make progress. When researchers use common methods and shared data to answer mutual questions, the whole is very
much more than the sum of the parts. Notwithstanding these indispensable
synergies, however, the very stability that makes meaningful intersubjective
discourse possible can also cause scholars to focus inward on their own tra-
dition and miss opportunities arising in other subfields and disciplines. Delib-
erate engagement between otherwise distinct networks can help overcome
this tendency and allow scholars to notice useful developments occurring in
other strands of social science. It was with this possibility in mind that we,
the Forum editors, decided to convene a workshop to connect two different
and only partially overlapping networks: the qualitative strand of the secu-
rity subfield and scholars associated with the qualitative and multi-method
research project.
The qualitative strand of the security subfield, most notably in the pages
of this journal and in International Security, typically follows traditional
forms of historical analysis. Scholars usually collect qualitative data, engage
in small-n comparisons and/or within-case analyses, and make descriptive
and causal inferences based on that combination of evidence and analysis.
Some minor updating of citations aside, this tradition has remained
unchanged for twenty or more years. By way of illustration, the issues of
Security Studies from 2012, 2013, and 2014 contained seventy-three articles,
of which thirty could be characterized as offering evidence-based analysis,
with a (sometimes implicit) research design, data, and conclusions. Of these

Andrew Bennett is Professor of Government at Georgetown University.
Colin Elman is Associate Professor of Political Science at the Maxwell School, Syracuse University.
John M. Owen is Taylor Professor of Politics at the University of Virginia.


thirty, the overwhelming majority, twenty-eight, were within-case and/or small-n comparisons.1
The qualitative and multi-method research project, meanwhile, encom-
passes a group of methodologists across the discipline of political science.
An earlier canon followed a highly inductive “best practices” model. Over
the last fifteen years, this older material has been displaced by a more prin-
cipled and prescriptive literature that focuses on epistemic foundations and
justifications. The more recent literature identifies why the mode of social
analysis being discussed is able to produce or warrant particular types of
knowledge claims. In particular, it explains why the method in question
helps scholars to arrive at valid inferences. In doing so, it engages recent
epistemological positions that have become part of modern mainstream so-
cial science—for example, the “potential outcomes” approach to causation.
A variety of methods are covered in this newer stream, including process
tracing, pattern matching, counterfactual analysis, and Bayesian and other
techniques for small- and medium-n analysis.
In convening the workshop, our hope was that a conversation be-
tween members of the security studies and qualitative and multi-method
research networks would be mutually enriching. One motivating question
was whether scholars in security studies would make their research more
rigorous by application of more recent methodologies, while preserving the
essential elements of the existing historiographical tradition. We were also
curious whether the newly formulated or reformulated techniques were al-
ready implicitly part of within-case and small-n analyses in the subfield. Put
another way, would the explicit adoption of the new methods simply amount
to a more deliberate and self-conscious employment of practices already be-
ing used? Another motivating issue was whether the particular features of the
data or the questions addressed in the security subfield posed obstacles to
the ready deployment of the newer qualitative and multi-method techniques.
We held a preliminary one-day workshop on 13 August 2013 at the
APSA Annual Meeting in Chicago. This was followed on 26 and 27 Septem-
ber 2013 by a conference on “History, Method, and the Future of Security
Studies,” at the University of Virginia, cohosted by Security Studies, the Miller
Center of Public Affairs, and Syracuse University’s Center for Qualitative and
Multi-Method Inquiry. The conference comprised four sessions, each orga-
nized around a cluster of questions addressed to a particular research issue

1 The three most cited works were Alexander L. George and Andrew Bennett, Case Studies and Theory Development in the Social Sciences (Cambridge, MA: MIT Press, 2005); Gary King, Robert Keohane,
and Sidney Verba, Designing Social Inquiry: Scientific Inference in Qualitative Research (Princeton, NJ:
Princeton University Press, 1994); Harry Eckstein, “Case Studies in Political Science,” in Handbook of
Political Science, Vol. 1, Political Science: Scope and Theory, ed. Fred Greenstein and Nelson W. Polsby
(Reading, MA: Addison-Wesley, 1975), 79–138. Among the thirty articles making evidence-based claims
and the subset using small-n and within-case methods, the median number of methodological books and
articles cited was 2.5. The modal number of methods citations for both sets was one. We are grateful to
Giles David Arceneaux for his research assistance in providing these data.

or approach: research transparency, process tracing, counterfactual analysis, and multi-method research. Participants presented memoranda that later became submissions for the Security Studies forums.2

INTRODUCTION TO THE FIRST FORUM: RESEARCH TRANSPARENCY

The first Miller Center workshop session, and the first forum in the series,
focuses on research transparency, and in particular on achieving openness in
qualitative scholarship. In some respects this is a natural place to begin a con-
versation about methods. Transparency is best considered a meta-standard:
all rule-based social inquiry is based on shared and stable beliefs that re-
search designed and conducted in particular ways—according to particular
rules—is warranted to produce knowledge with certain characteristics. Only
research that is designed and conducted in accordance with those rules can
generate knowledge of that type. Openness empowers authors to demonstrate that their inquiry is rigorous work of a certain type and hence can claim its
virtues.3 Transparency is widely considered a fundamental element of social
science, and the discipline of political science has been at the forefront in
the development of new practices.4

Achieving Transparency through Active Citation


Andrew Moravcsik’s “Trust, but Verify: The Transparency Revolution and
Qualitative International Relations” is the lead essay in this first forum. Al-
though most qualitative researchers would likely agree with the claimed
benefits of openness, doubtless many also believe that traditional citation
practices are adequate to the task. The difficulty with this view, as Moravcsik
has rightly noted, is that customary footnotes and endnotes provide “faux”
transparency.5 Accurate citations appear to allow readers to check whether
sources support the author’s inferences and interpretations, but quite often
the high transaction costs of actually checking those connections make do-
ing so impossible in practice. Moreover, opaque research designs mean that
readers are often left to reverse engineer the author’s inferential mechanics.

2 All papers submitted for publication were put through standard double-blind peer review and were subject to final editorial approval.


3 Colin Elman and Diana Kapiszewski, "Data Access and Research Transparency in the Qualitative Tradition," PS: Political Science & Politics 47, no. 1 (January 2014): 43–47.
4 Arthur Lupia and Colin Elman, "Openness in Political Science: Data Access and Research Transparency," PS: Political Science & Politics 47, no. 1 (January 2014): 19–42.
5 See Andrew Moravcsik, "Active Citation: A Precondition for Replicable Qualitative Research," PS: Political Science & Politics 43, no. 1 (January 2010): 29–35; Moravcsik, "Active Citation and Qualitative Political Science," Qualitative & Multi-Method Research 10, no. 1 (Spring 2012): 33–37; Moravcsik, "Transparency: The Revolution in Qualitative Political Science," PS: Political Science & Politics 47, no. 1 (January 2014): 48–53.

Moravcsik argues that qualitative researchers can increase openness by using active citation.6 This transparency technique entails backing evidence-based
claims with excerpted, annotated citations contained in a Transparency Ap-
pendix (TRAX).7 When scholars follow the best practice of providing digital
copies of the original sources (if they are available and shareable) or hy-
perlinks to permanent online versions of those sources, the original piece of
scholarship, the TRAX, and those sources together compose an active citation
compilation. Active citation operates as a digital exoskeleton, closely map-
ping the shape of traditional practices, but dramatically increasing the power
of those practices. As Moravcsik notes, active citation has been favorably
received by various stakeholders in the transparency conversation, including
domain repositories, journals, disciplinary associations, and of course indi-
vidual scholars. As one would expect from a relatively recent innovation,
however, there is still a great deal of work needed before active citation becomes a fully developed technique for representing transparent
qualitative scholarship. Some of the questions that those working to develop
active citation need to address are raised by the three essays that follow
Moravcsik’s contribution to this forum.
Elizabeth N. Saunders praises research transparency in principle but
worries that its advocates might be understating the pragmatic and logisti-
cal burdens for scholars in meeting the new standards. She suggests that
it would be a mistake to raise the bar to the point where scholars are de-
terred from making their work more open. Perhaps Saunders’s most striking
contribution to the Forum is her clear description of the work involved in
constructing a TRAX and her concern about which notes should be activated
and what information should be included in each activated note. Saunders
is not convinced of the usefulness of annotations to demonstrate inferential mechanics but is more enthusiastic about the technique's capacity to show
primary sources. Her essay helpfully points to the range of disagreements
among advocates of active citation and different preferences for when and
how the approach should be used.
In the Forum’s third article, Diana Kapiszewski and Dessislava Kirilova
draw attention to a range of unsettled questions and issues concerning
transparency standards, especially when they are met through active cita-
tion. These include developing mechanisms for determining which citations
should be activated and which of those require annotation; determining
which (and how much) original source material should be included; and determining how much context needs to be provided for readers to understand how
data were generated and analyzed. Kapiszewski and Kirilova also remind
readers that active citation is a technique to show the application of research
methods, not a method in and of itself. The technique’s value in any given

6 Moravcsik, "Active Citation"; Moravcsik, "Active Citation and Qualitative Political Science."
7 For more information on how to create a TRAX, see A Guide to Active Citation, available at https://qdr.syr.edu/guidance.

application is by definition limited by the potential of the method being applied in the particular research context and by the skill with which the
method is deployed. Active citation cannot put in what the author’s research
design leaves out. Kapiszewski and Kirilova also cover a range of challenges
to transparency, especially human subjects and copyright concerns.
Jack Snyder praises active citation as making a valuable contribution
but cautions that it not be misunderstood as a way to make inferences in
qualitative research. Active citation is a way of presenting data and analysis;
it is not a substitute for either. Snyder also notes that active citation is better
at connecting specific textual claims to evidence than at addressing larger
theoretical and conceptual arguments. Snyder reports that in his own work,
for example, these broader issues have been the main focus of his critics.
He also raises the concern that active citation might lead scholars to focus
too heavily on smoking-gun evidence, single pieces of evidence that are
sufficient to support a hypothesis.

CONCLUSION

We are grateful to Security Studies for hosting these Forums and to the
authors for their participation and contributions. With respect to this Forum
on transparency, we want to close with three points.
First, we agree that active citation appears to be a promising way to
present research, at least for designs that use within-case analyses and small-
n comparisons. It is likely to be one of the ways in which qualitative re-
searchers render their work more open. The technique is still in develop-
ment, however, and as the contributors to this Forum show, there are several
questions that have yet to be settled. Indeed, we suspect that even when the
technique is more advanced, there may not be one “right way” to activate.
Scholars may instead have to make choices among alternatives (for example,
whether to annotate or not) with known trade-offs.
Second, active citation is only one among a family of cognate techniques
that provide transparency in different research contexts. Not all of these prac-
tices will apply equally everywhere. Pre-registration of research designs, for
example, will have a much better fit with laboratory experiments than ob-
servational field studies. A risk that bears mentioning is that even though the
many methodologists working to develop transparency tools and techniques
are doing so in a somewhat coordinated manner, they may still be tempted
to redefine transparency terms of art to provide the closest possible fit with
their own research context. We think the sociological and methodological
payoffs from a coherent and unified disciplinary conversation outweigh any
possible advantages that might arise from location-specific redefinition, and
we hope scholars will resist that temptation.8

8 Terms and definitions designed to fit multiple research contexts and techniques have already been established for our discipline. The American Political Science Association's Guide to Professional Ethics

Finally, we agree with the authors that openness is an irreducible component of social inquiry. Transparency allows scholars to demonstrate that
knowledge statements were produced in accordance with prescriptive meth-
ods (that are in turn based on an underlying epistemic foundation) and
hence have added value. Both the nature of the statements’ truth content
and the degree to which they have any depend on those statements’ inti-
mate connection with method and the theory of knowledge that the method
invokes. If one reflects on long-standing conversations among security stud-
ies scholars, this claim may seem like old news. Scholars of security studies
and their interlocutors from other disciplines have long considered political
scientists to be obsessed with public displays of their methods. John Lewis
Gaddis delightfully characterized how historians view political scientists and
contrasted the historians' preference that form conceal function: "We recoil
from the notion that our writing should replicate, say, the design of the Pom-
pidou Center in Paris, which proudly places its escalators, plumbing, wiring,
and ductwork on the outside of the building, so that they are there for all to
see. We do not question the need for such structures, only the impulse to
exhibit them.”9
In our view, social inquiry where form conceals function is an oxy-
moron. But public display has to be more than performative. Methods only
expand understandings of the social world if they catalyze the underlying
theories of knowledge on which they rest. Accordingly, the only way to
make a strong inferential claim is to show how data were produced and
analyses were conducted to arrive at that conclusion and to show that the
methods have not been misunderstood or misapplied. To borrow Gaddis's
analogy, it is not enough to just put the pipes on the outside; we also have
to demonstrate that they are connected to something.

in Political Science (2012) states that “researchers have an ethical obligation to facilitate the evaluation
of their evidence-based knowledge claims through data access, production transparency, and analytic
transparency so that their work can be tested or replicated.” The three constitutive terms are defined as
follows:

6.1 Data access: Researchers making evidence-based knowledge claims should reference the data they used to make those claims. If these are data they themselves generated or collected, researchers should provide access to those data or explain why they cannot.
6.2 Production transparency: Researchers providing access to data they themselves generated or collected should offer a full account of the procedures used to collect or generate the data.
6.3 Analytic transparency: Researchers making evidence-based knowledge claims should provide a full account of how they draw their analytic conclusions from the data, i.e., clearly explicate the links connecting data to conclusions.

9 John Lewis Gaddis, “In Defense of Particular Generalization: Rewriting Cold War History, Rethinking

International Relations Theory,” in Bridges and Boundaries: Historians, Political Scientists, and the Study
of International Relations, ed. Colin Elman and Miriam Fendius Elman (Cambridge, MA: MIT Press, 2001),
301.
