

DANIEL STEIMAN
Department of Government, American University

ELIZABETH SUHAY
Department of Government, American University

Conducting Interview Projects in the US Congress: Analyzing the Methods of Experts in the Field
The field of political science is seeing renewed interest in studying the US
Congress via one-­on-­one interviews. Yet, the qualitative research methods litera-
ture on this topic has lagged behind, with few recent treatments available. The
result is uncertainty regarding how best to access and interview Congress. In this
study, we implement a novel study design, interviewing over 20 authors—­who
collectively represent nearly all Congressional qualitative interview studies from
the past several decades—­about their research practices. Whereas the existing lit-
erature focuses on lessons learned from one or two authors’ research experiences,
this approach allows us to synthesize a wide range of researchers’ practices and
perspectives, identifying areas of consensus and dispute and ultimately providing
comprehensive advice to qualitative researchers. As interviewing Congress be-
comes increasingly difficult amidst growing political polarization and distrust of
academics, this methodological advice comes at an opportune time for research-
ers studying the US Congress and beyond.

The US Congress is arguably the most powerful political body on earth. Thus, it represents an important focus of academic researchers' attention. While the study of Congress has in recent
decades been dominated by studies of publicly available data,
there has been a recent growth in studying Congress up close, in-
cluding via in-­person interviews (e.g., Crosson et al. 2021; Curry
and Lee 2020; Henderson et al. 2021). Surprisingly, this increasing
interest in qualitative interview projects has not been met with up-­
to-­date, comprehensive methodological advice, leaving scholars
who are new to the enterprise without adequate guidance.

Both authors declare that they have no conflicts of interest.

Legislative Studies Quarterly, 0, 0, September 2023


DOI: 10.1111/lsq.12436
© 2023 Washington University in St. Louis.

A number of high-quality publications on the topic of Congressional interviewing do exist, including a few relatively
recent treatments (e.g., Baker 2011; Beckmann and Hall 2013;
Peters Jr. 2016). These authors, all Congressional interviewers
themselves, provide practical advice with respect to designing and
implementing Congressional research studies. Yet, this literature
has three limitations. First, nearly all such publications focus on
an individual author—­or author team’s—­point of view. Thus, it is
difficult for a reader to know whether any given work represents
the best approach for their study. One can synthesize across the
literature; however, authors clearly disagree on some topics, and
these disagreements go unadjudicated. Further disagreements may
lurk beneath the surface, as not all authors address the same set
of topics. Second, most of the published advice on interviewing
Congress specifically is at least two decades old. This means that
existing scholarship misses numerous real-­world developments,
ranging from technological changes to rising polarization. Third,
researchers have devoted little attention to certain important top-
ics, such as how researchers’ social identities influence the inter-
view process or IRB oversight.
In this article, we seek to address these limitations by pro-
viding a systematic update of the methodological literature on
interviewing Congress. We do so by studying the practices of aca-
demic Congressional interviewers, as one can expect that social
scientists’ methods have been “developed and selected for their
success” (Kuhn (1962) 1996, 208). Borrowing from Congressional
researchers’ own toolkit, we interview at least one author of nearly
every contemporary interview-­based study of Congress. In prac-
tice, this means interviewing authors of studies published as far
back as 1975, as well as authors whose work was forthcoming at
the time of the interview. After an initial review of the literature,
we constructed a questionnaire that reflected important topics in
interview methodology, with a focus on topics that had been sub-
ject to debate, had been given insufficient attention, or for which
methodological advice may have changed in recent years. Our in-
terviews suggest the field has coalesced around a number of “rec-
ommended practices” in sampling, recruiting, and structuring the
interview. This said, interviewees did not always agree with one an-
other. For example, we discovered differences of opinion regard-
ing questionnaire design and rapport building. Scholars also had
many different approaches to working with, and perspectives on,
the IRB. And scholars used their interview data in a wide variety

of ways. Despite some disagreements, the perspectives of study participants provide a useful roadmap to researchers embarking on Congressional interviews. Although we focus on the US
Congress, much of the advice distilled herein will be relevant to
researchers interested in interviewing members of legislatures and
parliaments in other nations as well as other classes of political
“hyper-­elites”—­in the United States and around the world.
Our article proceeds as follows. We first provide a succinct
synthesis of the existing literature on conducting interviews in
Congress, noting agreements and disagreements among authors as
well as limitations. We then describe our study sample and ques-
tions. In the empirical section, we mainly analyze the data quali-
tatively, grouping responses by theme. In the Discussion section,
we compare our participants’ perspectives to one another and to
those in the existing methodological literature. We also evaluate in-
terviewees’ practices and recommendations, offering concrete ad-
vice. In the final section, we consider future directions for research.

Theoretical Framework

We define elite interviewing as interviewing members of “a group of individuals, who hold, or have held, a privileged position in society” (Richards 1996, 199). There is a large literature that advises researchers on how to conduct elite interviews
(Aberbach et al. 1975; Berry 2002; Tansey 2007; Woliver 2002;
Zuckerman 1972). As this diverse literature shows, different types
of elites possess unique characteristics. Congressional elites, as a
specific class of “hyper-­elites” (Baker 2011, 101), present unique
challenges for scholars seeking to interview them.
Recognizing the difficulties that Congress as an institution
poses to specialists in the field, a literature has emerged that gives
advice on how to best conduct interview projects within Congress.
While some of this literature focuses on Congress exclusively (e.g.,
Baker 2011; Jones 1959; Matthews 1960), most of it spreads its at-
tention across many types of political elites—­such as lobbyists, fed-
eral bureaucrats, members of the federal judiciary, and the White
House (e.g., Aberbach et al. 1975; Aberbach and Rockman 2002;
Beckmann and Hall 2013; Berry 2002; Peabody et al. 1990; Peters
Jr. 2016; Robinson 1960).1 In most cases, evidence is drawn from
the authors’ research experiences—­usually the design and execu-
tion of a past project or projects. Some articles provide an overview

of an author's entire academic career conducting interview projects (e.g., Baker 2011; Peters Jr. 2016).

Agreement and Disagreement in the Current Literature

Arguably the most prominent theme in the methodological literature on interviewing Congress is a preoccupation with
access. Several authors argue that Congressional Members and
staff, given their political importance and hectic schedules, are
a uniquely difficult category of elite for researchers to reach
(Aberbach and Rockman 2002; Baker 2011; Peters Jr. 2016).
Among these authors, there is some consensus on tactics to gain
access to Congressional respondents. Several recommend that, be-
fore beginning the interviewing phase, one should make connec-
tions with those who can put you into contact with your target
sample (Baker 2011; Jones 1959; Matthews 1960). Others recom-
mend “snowball sampling,” that is, asking a respondent for further
contacts at the end of the interview (Beckmann and Hall 2013;
Matthews 1960; Peabody et al. 1990).
The literature devotes considerable time to how to conduct
the interview itself. Several authors recommend building rapport
with small talk before the interview, as well as starting with “easy,”
broad questions, such as asking about the respondent’s personal
background (Aberbach and Rockman 2002; Beckmann and
Hall 2013; Peabody et al. 1990). Given that Members of Congress
have demanding schedules, several advise keeping the interview
brief (Kingdon 1989) and creating a flexible questionnaire so
that interviewers can focus on key questions if time is cut short
(Jones 1959; Peabody et al. 1990). Some authors propose specific
techniques to bolster the validity and reliability of responses from
interviewees (Baker 2011; Beckmann and Hall 2013; Berry 2002;
Kingdon 1989; Matthews 1960). Jeffrey Berry, for example, rec-
ommends getting respondents to indirectly critique their own
perspective as a means of getting the respondent to move beyond
their personal bias. For example, an interviewer could say to a
Republican member: “Why aren’t the Democrats buying this?”
(Berry 2002, 680).
In looking across the literature on Congressional interviewing,
one can also find disagreements. For example, there is tremendous
variation in sampling, with some researchers relying on conveni-
ence samples (e.g., Beckmann and Hall 2013, 201) or purposive

samples (e.g., Fenno 1978), and some relying on stratified random samples (e.g., Kingdon 1989, 299). There is surprisingly little discussion of the relative merits of these different approaches.
Scholars also disagree as to the best type of interview format,
including how structured the interview should be and whether one
should rely on closed-­ended questions. Beckmann and Hall (2013)
strongly recommend that interviewers include a closed-ended,
quasi-­survey approach in conjunction with an open-­ended com-
ponent. By contrast, Aberbach and Rockman favor a semistruc-
tured open-­ended approach (Aberbach and Rockman 2002). At
the other end of the spectrum, Baker argues for a completely
unstructured interview format, arguing that a flexible question-
naire is critical when interviewing in Congress, as “[M]embers
of Congress will talk about what Members of Congress will talk
about” (Baker 2011, 111).
A practical but important disagreement is whether to record
interviews. Fenno famously relied on note-­taking or memory and
abstained from recording due to its “possible adverse effect on
the interview,” namely that it may deter frankness and spontane-
ity (1978, 279). Peabody and coauthors argue that recording has
advantages and disadvantages. An obvious advantage is that the
researcher possesses a complete transcript of the conversation;
however, a drawback of recording is the reluctance of politicians
to go on the record (Peabody et al. 1990, 453). Aberbach and
Rockman (2002), by contrast, found little difficulty in getting per-
mission to record, noting that most respondents seemed to forget
the tape recorder was even there during the interview.

Limitations of the Current Literature

While one can glean a great deal of substantive, helpful advice from these authors, we note four key limitations of the existing literature.
The first is that most authors’ suggestions stem from their
personal experience conducting interviews. Syntheses of the lit-
erature, as we have provided here, are useful; yet they leave many
debates unresolved, as authors inevitably address different topics
and often do not engage with one another’s work. Interviewing
scholars directly allows us to overcome this problem. Whereas ex-
isting publications do not cover the same set of topics, an inter-
view study can ensure that all authors are asked the same set of

questions, laying the groundwork for systematic analysis. Further, especially where there exists disagreement among scholars, the interviewer can probe to better understand the reasons and rationales for those differences.
A second limitation of the literature on interviewing in
Congress is, simply, that it is out of date. Most of the advice lit-
erature dates before 2003, with only a handful written in the
2010s (Baker 2011; Beckmann and Hall 2013; Peters Jr. 2016).
Some anachronisms in the literature relate to technology, with nu-
merous works discussing the mailing of written cover letters to
Congressional offices before the widespread use of email or the
inconvenience of using bulky tape recorders in the years before
smartphone recording apps became ubiquitous.2 More importantly, the datedness of much of the advice literature has resulted
in an absence of discussion of recent changes that have occurred
within Congress as an institution. The literature, for example,
scarcely covers the acute hyper-­partisanship and polarization that
has increased in Congress in recent years (Lee 2016). Similarly,
while there are discussions in the extant literature regarding how
difficult it is to schedule interviews with Congress, recent research
on Congress suggests that both Members and staff are busier than
ever due to factors such as shrinking staff sizes and an increas-
ing emphasis on messaging (Gelman 2018; Reynolds et al. 2017).
Social media may also decrease the willingness of Members and
staff to participate in academic interview projects due to fear over
leaks and the risk of political embarrassment (Parker 2018).
A third limitation of the existing literature is that it has not
engaged with recommendations growing out of the large litera-
ture on qualitative methods or the debate between positivist and
interpretivist scholars found therein. Although interview data can
be coded and quantified (Gerring 2017, 19), interviewing is usu-
ally classified as a qualitative method (King et al. 1994, 3–­4). As
with quantitative scholars in the field, most political science inter-
viewers operate within a “positivist” epistemology: using interview
data to test falsifiable hypotheses (Mosley 2013, 9). Most of the
Congressional advice literature on our topic—­as well as most of
our respondents—­are positivist in orientation. However, an impor-
tant subset of scholars (and our interviewees) conduct interviews
within an interpretivist framework. For interpretivists, the goal of
their project is to understand how people perceive their reality and
create meaning. Further, interpretivists argue that it is impossible
to separate identity and power dynamics between researcher and

subject within the data-collecting process (9). While positivists believe objective data can be “extracted” from the interviewee, interpretivists instead assert that the interaction between interviewer
and interviewee is itself a form of data generation (Fujii 2018,
8). We see merit to both positivist and interpretivist approaches;
throughout the article, especially in our Discussion section, we in-
corporate insights from scholars working in both traditions.
A fourth, and final, limitation is that the current literature de-
votes little attention to ethics or the role of the Institutional Review
Board (IRB) as it pertains to Congressional interview projects.
While there is some discussion concerning ethical issues regard-
ing recruiting participants (Dexter 1970, 33–­34) and the impor-
tance of respecting their confidentiality (Baker 2011; Beckmann
and Hall 2013; Fenno 1978; Peters Jr. 2016; Polsby 2005), there is
almost no mention of the IRB. This is due in part to the datedness
of much of the literature. Before the 1990s, IRB oversight in the so-
cial sciences was not yet well established (Schrag 2009). Even after
IRBs extended their reach into the social sciences, interviews with
public officials—­both elected and appointed—­remained exempt
from IRB review. However, in 2019, the US government formally
removed the IRB’s public officials’ exemption (American Political
Science Association 2020). This change requires scholars to give
greater thought to the benefits and risks of their studies, as well
as ethical safeguards, and will likely curtail some Congressional
research.

Our Study

Interview Sample

Our target population was living authors of published research on the US Congress that included a substantial interview
component. We first worked to create a complete list of relevant
publications. We excluded studies based only on surveys—­that
is, standard questionnaires distributed widely to a random sample and normally self-administered3—as well as audit studies.
To reduce the likelihood that we had missed any works, we
consulted with several Congressional scholars well-­versed in the
field but not involved in the project. Ultimately, we added several

at-the-time unpublished interview studies by scholars identified as active researchers in the field.
We then sought to recruit one author of each work. In the
case of Congressional interview studies conducted by multiple authors, we sought to interview the first author; for author teams sharing first authorship, we emailed all authors on the team and allowed them to choose who would speak with us.4 With the exception of two retired scholars, every author or team we contacted
agreed to be interviewed. In all, we interviewed 22 scholars in 21
separate interviews representing 20 interview projects. Nineteen
participants were current or retired professors; the other three in-
cluded a senior fellow at a DC-­based think tank, a postdoctoral
fellow, and a current political science PhD student. Sixteen par-
ticipants were men, and six were women. Four of our interviewees
conducted interviews with only Members of Congress, eight with
only Congressional staff, and nine with both. The vast majority
had conducted studies in or after 2000. Approximately half had
conducted interviews in the 1980s and/or 1990s. Several had con-
ducted interview projects as early as the 1970s.5
Given that some authors had published more than one rel-
evant study, we asked interviewees to focus on just one project in
the interview. This allowed us to keep interviews to a reasonable
length and to cross-­check responses with authors’ works, some-
times filling in missing details. Occasionally, interviewees touched
on other projects, especially in response to questions on changes
over time or comparing Congressional interviews to other types of
interviews.

Interview and Questionnaire Design

Our semistructured questionnaire was short enough to be conducted in a 30-minute window, although in practice, interviews
usually lasted close to one hour. After asking participants for a
general description of their project, we asked about the follow-
ing topics: (1) why the interview method was chosen; (2) sampling
methodology; (3) recruitment strategies; (4) interview format and
types of questions; (5) rapport-­building strategies; (6) recording
and note-taking; (7) data analysis; (8) IRB and ethics; and (9) perceived changes in Congressional interviewing over time. The full
questionnaire can be found in the online supporting information.

We conducted interviews in the summer and fall of 2020. All interviews were conducted and recorded over Zoom. We used
Otter.ai transcription software to transcribe interview recordings
and then personally edited transcripts for accuracy. All interview-
ees were read a standard IRB consent script and agreed to par-
ticipate and be named in an acknowledgment section. Beyond this
general acknowledgment, we promised to keep researchers’ per-
spectives confidential.

Empirical Findings

Why Interview?

Nearly all (20) interviewees said that, given their project’s goals,
it was necessary to conduct personal interviews in Congress rather
than only rely on publicly available data. The most common expla-
nation, highlighted by 12 participants, was that interviews were nec-
essary because their projects required understanding the personal
perspectives of Members and staff. Six reported that they needed
interviews to gather data on behind-­the-­scenes processes occurring
in Congress. Interviewee #18 summarized these views succinctly:
“Interviews reveal the intentions and understanding of actors in a
way that may not come out in floor speeches. They also give insight
into ways of doing business that are not necessarily transparent.”
Five participants provided additional reasons why publicly avail-
able data would be inadequate or unworkable, such as the inherent
“small n” nature of a study or an interest in institutional culture.
Finally, it is worth noting that many of our study participants ap-
preciated the nuance that qualitative data provided researchers in
general. As Interviewee #10 said: “There’s a kind of richness…if
you do it right and you’re lucky, that you can…get people to open
up and kind of talk about things on a much more personal level
than you can just looking at a page of data or spreadsheet.”
How researchers ultimately used their interview data varied a
great deal. Thirteen participants used interview data to either com-
plement or supplement evidence from one or more primary data
sources, such as voting records and congressional transcripts. For
nine participants, the interview transcripts were the main or sole
sources of data for their project. Among all scholars, eight derived
quantitative data from the interviews by content coding them, while

12 analyzed the data in a qualitative manner. This included reviewing interview transcripts to discern patterns and meaning and using relevant quotes to illustrate their main points. Interviewee #5 described
utilizing a holistic, interpretive approach to interview data analysis:
“[I]t’s very qualitative. It’s very interpretive…I’m looking at this as a
whole body of text…And what is that whole body of text telling me?”
Such diverse uses of interview data illustrate the broad util-
ity of Congressional interviews. We also found it significant that
qualitative analysis of interview data served both “positivist” pro-
jects seeking to understand descriptive and causal patterns as well
as more “interpretivist” projects that sought to understand how
interviewees made sense of their institutional context.

Why It’s Hard to Interview Congress

Interviewees corroborated the extant literature's characterization of Congress as a hard-to-reach population. Of the 11 participants who had also interviewed lobbyists, seven said that they thought lobbyists were easier to access than Congressional actors (two found no difference and two didn't say). The four interviewees who had conducted interview projects with state legislatures all thought that
US Congressional elites were harder to access than those working
in state capitals. There were, however, mixed perspectives among
the four scholars who had also interviewed US bureaucrats; bu-
reaucrats may have more available time but, as one interviewee
noted, are unaccustomed to being solicited for interviews.
Notably, 18 participants agreed that conducting interview
projects in Congress had gotten harder over time for researchers.
Importantly, the eight interviewees who had conducted projects
pre-­ and post-­2000 all thought that it had become more difficult
to interview Congressional actors over time. “[I]t was a completely
different world than it is now,” noted Interviewee #4, who had
conducted Congressional interview projects in the 1980s.
Although participants often provided multiple answers to this
question, there were five main explanations for why it has gotten
harder to interview Congress. First, seventeen participants either
stated directly that political polarization was a cause or indirectly
implied this by noting that they found it especially difficult to re-
cruit Republicans. As Interviewee #11 noted: “[T]here’s a lot more
skepticism today about academics than I found when I started in
1989. And much of that skepticism is partisan in nature.”

Second, seven participants linked the theme of polarization to the perception that many offices have a deep fear of leaks. As
Interviewee #10 reported: “[A]s polarization has intensified, I
think there is probably more wariness, that anything could be a
trap…the woods are full of wolves.” Four participants attributed
the increasing fear of leaks to the proliferation of social media.
Interviewee #8, for example, noted changes between when they
conducted interviews in the mid-­2000s and today:

I think social media makes it way harder, because when I did my interviews, I think everyone had the sense that like, there is no probability that anybody learns about this…And there was no Twitter or
something where I could blast out like, “Hey, Senator So-­and-­So cut
a deal behind the scenes that, so that, there wasn’t a roll call vote on
this.” Now, that would be kind of salacious, and there would be an
audience for it.

Responses such as these suggest that polarization, fear of leaks, and social media are mutually reinforcing processes that
have made those working in Congress wary of academic interview
projects.
Third, a slight majority of study participants (13) said that
the behavior of researchers themselves has contributed to the
problem of accessing Congress. Some argued that academia has
become politicized to some degree. Six interviewees suspected that
“gotcha” studies—­especially projects, such as audit studies, which
use deception—­undermined Congressional trust in researchers.
Interviewee #8 also lamented the rise of the “partisan public in-
tellectual academic on Twitter”—­scholars who advocate political
positions on social media and other forums, and thereby increase
the perception on Capitol Hill that academics are biased.
Fourth, and on a different note, five participants cited the oversaturation of survey and interview requests as a reason for lower participation rates in Congress. Interviewee #7 summed this
up well:

I think there's a little bit of a common pool resource management problem. Right? Where everyone's individual incentives are to
try to conduct a survey or do a bunch of interviews and reach out to
these folks. But then they get sort of inundated and they’re like, “I
just did one of these, why do I have to talk to you too?”

This said, another interviewee, while agreeing that oversaturation was a problem, placed the blame on the many nonacademic
surveys from interest groups and lobbyists. Along these lines, a
third scholar mentioned that Congressional offices often cannot
tell the difference between academic studies and those conducted
by advocacy groups or groups that may later sell the data. Not
surprisingly, this person thought this deterred participation in aca-
demic research studies.
Fifth and finally, eight participants thought that institutional
changes in Congress had resulted in decreasing access to Members.
Most of these participants highlighted both staff shortages and in-
creased time spent fundraising. Four scholars observed that staff
were more of a barrier than in the past, with some citing the rising
importance of communications directors in Congressional offices,
who are concerned with keeping Members on message.
We conclude by noting that concerns about research studies
on the Hill—­whatever their precise nature—­have become so great
that many Congressional offices have adopted blanket bans on
participation in research studies and/or surveys (noted by eight of
our participants). Exceptions are sometimes made, but such bans
inevitably decrease researchers’ access to Congress.

Advice and Practices

The increasing difficulties in accessing Congress pose obstacles for nearly every aspect of Congressional interview projects,
from research design, to recruitment, to conducting the interview.
Responses from our study participants both illustrate some of
these difficulties in practice, as well as provide useful advice for
how to overcome them.

Sampling. Whether they took a more positivist or interpretivist approach, most participants sought to interview a group of individuals who were to some degree representative of a broader population.
Those we interviewed took a variety of approaches to
determining their interviewees. Four of the scholars we interviewed
successfully conducted total population sampling, where an
entire subpopulation of Congress was sought for interviews.
Twelve constructed a purposive sample balanced on important
characteristics like party affiliation and seniority; many of these
scholars relied on snowball sampling for recruitment, meaning the
sample was not constructed a priori. Finally, two participants only conducted snowball sampling because the subpopulation
in which they were interested was small enough that they could
achieve something approaching representativeness via networks.
Difficulties of access tended to influence researchers’ study
designs in another respect. Researchers must think carefully about
who they are likely to recruit during the timeframe of the study.
Four study participants found House Members and staffers easier
to access than those in the Senate, with the implication that one
should either focus on the House or be ready to employ additional
resources to access the Senate. Seven found Members harder to ac-
cess than staff, with similar implications. In fact, three participants
advised researchers to avoid interviewing Members and instead
focus on staff. Not only are staff easier to reach, they argued, but
staff also are often more knowledgeable regarding the details of
bills and are sometimes franker in their responses.
A popular strategy to improve access—used by 11 interviewees—was focusing on “formers,” that is, people who formerly served as Congressional Members or staff. The benefits of
interviewing such individuals go beyond just their greater availability: they are often more candid than current Members and staff
and, at least if relatively recently employed there, just as knowledgeable. Interviewee #13 described a focus on “formers” as especially useful given an interest in Congressional leadership: “[T]hey’re really hard to get to because they’re so busy, and they’re also very careful. And so in those circumstances, I think there is something to be said for talking to former leaders, people who’ve retired.”

Recruitment Strategies. Participants often mentioned the importance of contacts and referrals as a helpful strategy for recruitment. Fourteen participants said they used this strategy.
Researchers thought referrals boosted access for several
reasons: they made the researcher seem more trustworthy
to the potential respondent, they often enhanced the researcher’s
credibility as a scholar, and they simply helped the researcher stand
out amid a crowded field. As Interviewee #17 noted:

[T]he Hill very much operates on personal relationships. And the amount of demands on Members’ and staffers’ time is so extreme that reaching out without any kind of introduction, or…basis for
them to know who I am and what I might be interested in doing. Very difficult to get any response. So, it helps tremendously to have, effectively, people vouching for you, saying, “This person is worth talking
to, this person has no axe to grind, isn’t seeking to embarrass you
politically, just wants to understand how the legislative body works.”

Some participants recommended having referrals make the initial contact themselves, while others thought it was best if referrals forwarded the researcher’s cover letter/email. Some also recommended dropping the name of a referral into the cover letter/email itself.
Our interviewees recommended leveraging any social net-
work contacts a scholar may have to recruit respondents. Several
said that high-­ranking staff and Members of Congress were the
most helpful referrals to have in terms of gaining access. Others
worked through organizations such as the Association of Former
Members of Congress (FMC) or the Congressional Black
Associates Caucus. However, a wide range of contacts can be help-
ful, including a local representative, university colleagues, and, of
course, previous direct contacts within Congress. Eight of our
study participants leveraged contacts established when they previ-
ously worked or interned in Congress.
As already discussed, snowball sampling was common among
the scholars we interviewed. This method uses contacts in two dif-
ferent ways. An integral part of snowball sampling is to ask one’s
interviewees for additional contacts, and possibly a direct referral,
as the interview concludes. What is less often discussed is that this
technique frequently begins by requesting referrals from existing
contacts already known to the researcher, such as one’s local rep-
resentative (cited by two interviewees) or former staffer colleagues
(two interviewees).
Successful recruitment was not only about what one should
do, according to our interviewees, but also about what one should
not do. In response to a question about political neutrality, nearly
all participants said it was important to appear neutral and/or de-
scribed taking steps to appear more neutral. This included not only
the content of one’s communications with potential interviewees
but also one’s public image. Interviewee #14 noted: “I think that
the people who are on the Hill now and the level of polarization
that exists, before they agree to see you, they’re gonna do some
minimal research about you.” Several said they were especially
careful regarding what they posted online or said in interviews with the media. Two interviewees said they scrubbed their social
media accounts of partisan posts prior to beginning a new study
and another said that throughout their career they have been care-
ful to avoid doing anything partisan that could be traced to them,
including making donations, signing petitions, etc.
On a related note, Interviewee #12 said that a potential dis-
advantage to having a politically connected patron (e.g., a high-­
ranking House or Senate member) who will refer you to other
contacts is that it may help to recruit some but hurt access to oth-
ers, especially members of the opposite party: “In other words, the
easiest access comes from within the circle of your patron, so to
speak.” However, here there was some disagreement. Three other
participants who had high-­ranking Democratic contacts did not
find that this deterred access to Members on the other side of the
aisle. Interviewee #18, in fact, argued that their previous work with
Democrats “lent me some credibility, because they felt like I was
someone who would know what I was talking about.” Connecting
the dots, we note that contacts on both sides of the aisle are ideal
in most circumstances, but one-­sided contacts are certainly better
than none.

Recording or Transcribing. There was also some disagreement over whether to record interviews, with 12 interviewees favoring
always recording if they could and eight preferring only note-­
taking. Participants raised two fundamental issues concerning
recording interviews: whether people working in Congress were
receptive to being recorded, and whether recording is helpful
for getting good data from the interview. In practice, these two
things were often intertwined. With respect to the former issue,
seven participants said they found interviewees were receptive to
being recorded, and five said interviewees were not receptive; as
might be expected, these perspectives largely overlapped with the
choice to record or not. Most who chose not to record not only
wished to avoid making their research subjects uncomfortable but
also believed that doing so would lead to poor data. They believed
interviewees would be less candid, forthcoming, and engaged.
For their part, most of those who chose to record believed that
recording was intrinsically better with respect to accuracy as one
had a complete record of the conversation.
Sometimes the choice to record depended on the ease with
which a person could take notes during an interview. Seven
participants considered note-taking during the interview to be exceedingly difficult. Five noted the challenge of legibly writing
down ideas—­sometimes full quotes—­at a fast pace, while the other
two said it is hard to look engaged/keep eye contact while taking
notes. Echoing the extant advice literature, 12 participants said
that it was paramount for researchers relying on notes to write up
or transcribe them as soon as possible while memories were fresh.

Interview Content and Format. A considerable portion of each conversation with our study participants was spent discussing the
interview process itself. Most reported conducting interviews
in person. With respect to the focal projects discussed, 11
participants conducted interviews only in person, while an
additional eight conducted interviews over the phone and in
person. Two participants said they found phone interviews to be
advantageous because it saves on travel costs and makes it easier to
interview people who are not in Washington, D.C., such as former
Members or Members at home in their district when Congress is
out of session.
There was a great deal of variation among participants in
the format of their interviews. Only a few conducted “structured”
interviews, meaning they had a list of questions that they asked
everyone, in the same order, with no follow-­ups. Fifteen conducted
“semistructured” interviews, with a set list of questions for each
interviewee that allowed for probing, making minor changes to
question order, or eliminating questions to avoid redundancy.
Three participants conducted “unstructured” interviews, mean-
ing there might be a limited set of key questions, but interviews
were conducted in a conversational style. Interviewee #17 said: “I
always went into an interview with a list of questions. But I often
didn’t get all the way through the list…the interview would go in
other directions. So I allowed it to unfold organically.”
For the researchers, choice in question format—­either open-­
ended or closed-­ended—­was often associated with degree of struc-
ture, with the proportion of open-­ended questions increasing as
the degree of structure lessened. This said, overall, participants
leaned most heavily on open-ended questions. About three-quarters of the projects discussed consisted mostly or entirely
of open-­ended questions, with the other quarter including a mix
of open- and closed-ended questions. Of the projects that mixed open- and closed-ended questions roughly equally, three utilized a method
recommended by Beckman and Hall (2013) which combines a
typical semistructured open-ended interview with a closed-ended survey form filled out by the research subject during the interview.
Given the hectic schedules of Members and staff which can
drastically limit the time windows for conducting an interview
(sometimes just 10–­15 mins.), as well as the fact that a Member of
Congress may be pulled out of an interview at any moment, four
participants gave specific advice on how researchers can manage
time constraints during the interview. This advice included keep-
ing questionnaires short, getting to critical questions quickly, and
being prepared to cut less pertinent questions during the interview
to save time.
We also asked interviewees whether they could offer any in-
sight into certain types of questions to include, or exclude, to im-
prove data quality. Five participants recommended asking specific,
factual questions rather than general, abstract questions. These
scholars found Members of Congress to be poor at generalizing
their own behavior. For example, one should ask Congressional in-
terviewees to provide a real-­world, concrete example of a problem
and their actual reaction to it, rather than ask them to report how
they approach policymaking generally.
With respect to problematic question types, participants
avoided questions that might unnecessarily antagonize research
subjects. This might mean avoiding personal questions, questions
about scandals, or questions about partisan politics. Several of
our participants described being careful with their word choice by
keeping the interviewees’ perspective in mind. For example, one
person avoided describing Congress with negative terms such as
“polarization,” and instead used more neutral terms such as “grid-
lock.” Two additional interviewees discussed being careful about
how they framed the abortion debate, with one scholar taking
care to refer to abortion as “the life issue” when speaking with
Republicans, as that is often their preferred terminology.
A noteworthy disagreement among participants concerned
the question of whether they employed a rapport-­building strat-
egy with interviewees in Congress. Eighteen stated that they ei-
ther believed in the value of rapport-­building to build trust and/
or described using specific rapport-­ building techniques. Most
Congressional interviewers in this category employed generic
rapport-­building tactics like engaging in small talk before the in-
terview or beginning the interview with biographical or “softball”
questions to put the interviewee at ease. Several participants also
mentioned more subtle but consistent efforts to maintain rapport
throughout the interview, such as maintaining eye contact and giving brief responses to interviewees’ comments (“thank you,” “interesting,” etc.).
This said, three of our study participants explicitly argued
against using traditional rapport techniques at the start of the in-
terview. They raised concerns about wasting precious time, as well
as potentially irritating the interviewee. Interviewee #13 stated
bluntly:

I’m not a big rapport person. I think, you know, there’s a lot of
discussion about rapport and I think frankly, most of that is just bull-
shit. You’re dealing with…very savvy, elite-­level politicians at the
member-­level and really savvy, frankly elite-­level professionals at the
staff-­level as well. And you know, you’re not going to turn them into
your buddies in a half an hour interview and you shouldn’t try.

Instead, these individuals recommended employing techniques to build a researcher’s credibility as a scholar. They recommended that the researcher present themselves as knowledgeable (but not a know-it-all) on Congress, while also establishing that they were there to learn specific information—and not to receive a
general lecture about how Congress works. We discuss this debate
below in the Discussion section. Although we disagree that efforts
to establish rapport are unnecessary, we discuss a number of al-
ternative strategies to build relationships with participants besides
the traditional rapport-­building techniques noted above.
Regardless of whether they invested time in rapport building,
nearly all participants suggested that baseline amity was essential.
Those interviewing Members and staff in Congress should present
themselves as amiable, professional, respectful, and trustworthy.

Social Identity. An additional theme brought up by some participants was social identity, which they saw as both an enhancer and inhibitor of rapport building and recruitment.6
Two participants who conducted projects focused on specific
social groups within Congress (e.g., gender, race, sexuality)7 noted
that the focus of the project itself could serve as an effective re-
cruitment tool for potential interviewees who are members of the
social identity group on which the project is focused. At the same
time, these same researchers noted that the project topic could
also dissuade potential participants who don’t share that social
identity—either because they don’t see how their participation is relevant and/or because they view the topic as controversial.
Four participants noted that the social identity of the inter-
viewer can influence both recruitment and rapport. They found
that sharing the social identity of the potential respondent seemed
to ease recruitment and make interviewees more trusting during
the interview; of course, the flip side to this is that recruiting and
interviewing may be relatively more difficult if one does not share
the interviewee’s social identity. Interviewee #8 noted the general
advantage of being a White male in the Congressional arena as
well as how his surface resemblance to male conservatives helped
him recruit Republicans and improved rapport:8
“[A]t the time that I was doing a ton of this, I was…a [young],
[tall] White guy. And…I felt like…that helped me a ton, right?…
And especially on the Republican side…I just looked like a
Republican.”
Interviewee #21 concurred that White male scholars had ad-
vantages recruiting participants. This said, she added the caveat
that having characteristics some may perceive as nonthreatening—­
such as being young or female—­can sometimes put the interviewee
at ease and help with rapport.

Ethics and IRB

As noted above, the extant literature on interviewing Congress is largely missing any discussion of the role of the IRB in interview
projects, or the impact of major changes to IRB policy in recent
years. Therefore, we prioritized discussion of the IRB in our inter-
views with researchers.
Sixteen participants went through the IRB process, either ob-
taining approval from the IRB or an exemption. Six participants,
who conducted projects prior to 2000, did not interact at all with
the IRB. With respect to informed consent (oral or written) and attribution, 14 of the respondents said they both obtained informed consent and kept subjects’ identities confidential. Two participants used informed consent but kept interviews on the record,
while two others had no informed consent but did have a confi-
dentiality policy. Only one participant, Interviewee #10, both did
not acquire informed consent and was consistently on the record
with their interviews.

We asked study participants for their views on the removal of the IRB public officials’ exemption. We found participants to have differing views on the inevitable increase in IRB scrutiny of
Congressional interview projects. Eleven participants opposed the
removal of the exemption, four were in favor, and seven had mixed
views.
Those opposed to increased IRB review of Congressional re-
search projects cited several reasons. One was that Congressional
interview projects, in their view, pose a minor risk to their subjects
relative to studies of vulnerable populations and the general pub-
lic. Other participants argued that Congress, as a public, demo-
cratic body, should be completely transparent to researchers, with
Interviewee #7 stating: “I think if you’re [a] public official, you
have a particular responsibility to be more accessible and trans-
parent about what you’re doing on the public’s behalf…More bar-
riers to getting that kind of access are bad.” Finally, one person,
Interviewee #10, was not only opposed to any IRB oversight of academic Congressional research but also considered IRB regulation in general to be an infringement of academic freedom.
Several of those we interviewed were critical of IRB consent
procedures. Three scholars thought that the legalese found in con-
sent forms made respondents wary and deterred study participa-
tion. Interviewee #12 referred to signed consent forms as a “kiss
of death” for respondent participation. Interviewee #3 concurred:
“I struggle with the IRB details, particularly on consent language,
because it is intimidating to [potential research subjects], and I do
think it makes people and their staffs potentially not participate.”
This said, a vocal minority had positive things to say about
the impact of the IRB on Congressional research, with some favor-
ing increased IRB scrutiny. Three participants thought the exemp-
tion removal would lessen the likelihood that researchers would
harm research subjects by (perhaps inadvertently) leaking con-
fidential information. As Interviewee #4 stated: “[W]hen you’re
recording things, and so forth, there’s always the risk that the…re-
cords will, you know, get released somehow.” Others felt that while
House and Senate Members shouldn’t necessarily be protected by
IRB oversight, staffers were more at risk of harm and deserved
IRB protection. Finally, two interviewees noted that certain types
of Congressional projects merited IRB oversight, such as experi-
ments that employed deception, with Interviewee #21 mentioning
“some studies that maybe crossed lines.”

As time allowed in the interviews, we also probed participants’ views on the ethics of interviewing in Congress. Three participants emphasized the importance of not harming respondents
by revealing confidential, damaging, or embarrassing information.
Interviewee #4 cited Richard Fenno’s (1978) axiom of not poisoning the well for other Congressional researchers:

[W]hat you do as a researcher can affect future researchers’ ability to do their work, right? So if you [include] a bunch of outrageous, you know, questions on there, or presumptuous questions
or, you know, anything else that’s going to break norms, then that’s
going to make it that much harder…for the next person that comes
along. This by the way was…one of the most important principles to
Dick Fenno. You know, do no harm to future researchers.

We discuss this complex topic further in the next section.

Discussion and Conclusion

In this section, we discuss the advice that emerged from our interviews, adjudicate disagreements between interviewees, and
provide summary guidance for Congressional scholars.

Sampling and Recruitment

Perhaps the most significant point of agreement among participants was the view that it has become much more difficult in
recent years to recruit Congressional interview subjects. Most
thought this problem was linked in some way to increasing politi-
cal polarization in Congress. Republican offices seem to be espe-
cially resistant to study participation, perhaps because they view
academics as Democrats and/or liberals. However, Members and
staff in general seem concerned about “gotcha” culture, especially
in our social media age. Some study participants also mentioned
the problem of growing demands on Congressional offices, in-
cluding a proliferation of outsiders (including both academics and
nonacademics) wishing to study them. These difficulties serve as
a backdrop to recommended approaches to sampling and recruit-
ing, as scholars simply cannot conduct studies of Congress as they
would a study of the general population.

We found difficulties in access to have a marked effect on how our interviewees chose their sampling strategies. For researchers
seeking to draw generalizations about all—­or some subset of—­
Congress, random sampling is desirable in theory. Yet, the best
way to achieve representativeness—a random sample—is usually impractical for elite interviewing. Although exceptions exist,9
cold calling (or emailing) yields too low of a response rate for the
typical researcher. Instead, most participants created a purposive
sample. With specific representational goals in mind (e.g., party,
Chamber, sex, race), it is usually best to leverage contacts and
use snowball sampling to build a sample of interviewees. Given
particular difficulties enlisting Republicans, researchers will want
to take special care to build up contacts on that side of the aisle
and expect to spend more time and effort recruiting Republican
Members and staff.
Of course, nonrandom selection may be necessary not only
for practical reasons, but also for theoretical ones. This is true for
both “positivist” and “interpretivist” research designs. For example,
purposive sampling may be more appropriate than random sam-
pling for projects where the goal is process tracing (Martin 2013,
113). We also note that interpretivist projects are likely to favor a
more deliberate process of choosing research subjects given their
frequent interest in deep exploration of small subsets of a popu-
lation as well as their prioritization of understanding individual
perspectives over generalization to a larger population (Fujii 2018,
37–­38; Schwartz-­Shea and Yanow 2012, 87). We recommend that
researchers consider both practical concerns and the goals of the
project (Mosley 2013, 19) when determining the appropriate ap-
proach to choosing research participants for Congressional inter-
view projects.
Congressional interviewers employed various tactics to ad-
dress difficulties in accessing Congress. The tactic with the most
consensus—­recommended by nearly half of participants—­was to
seek out staff and/or former Members. Another important rec-
ommendation was to strive to appear politically neutral when re-
cruiting study participants, which might include refraining from
sharing political views and identifiers on social media.10 Lastly,
several participants argued that academics would benefit from
viewing Congress as a common-pool resource problem to be
solved. We believe that Congressional researchers should consider
some coordination to address this. At least one multi-­investigator
study has already been completed (LaPira et al. 2020), and one of
our study participants reported taking initial steps to organize a second effort.

Interview Content and Format

Most researchers conducted interviews they described as semistructured with mostly open-ended questions. As with sampling,
the researcher’s approach and goals are crucial for determining
which format is most useful. For most positivist studies that seek
to generalize about Congress, semistructured formats will be most
useful. Semistructured interviews ensure comparability across in-
terview subjects and keep interviewees focused on the topics of
interest to the researcher, while also allowing the researcher to fol-
low up on interesting ideas or shift gears in response to time pres-
sure. Semistructured interviews are also preferable to structured
interviews or surveys because they accommodate more loquacious
and discursive participants (Gallagher 2013, 119). Mixing closed-­
ended and open-­ended questions allows for both strict compari-
sons and exploration and understanding.
In some cases, an unstructured and completely open-­ended
approach is ideal. This is true for interpretivist and exploratory
“soak-­and-­poke” style projects, such as those seeking to under-
stand the unique perspectives of people occupying different
institutional roles. Qualitative scholars have argued that less struc-
tured, more in-­depth interviewing is particularly suitable for pro-
jects where the goal is understanding participant meaning making
(Soss 2013). Finally, for most projects, we do not recommend a
structured or completely closed-­ended approach; this is likely to
both miss opportunities and annoy interviewees (Rockman 2011, 1342).11
To elicit high-­quality and sincere responses, most ques-
tions should be concrete and specific, and all questions should be
worded so that they do not unnecessarily offend or raise concerns
about bias. Essential questions should be asked first in case an in-
terview is cut short.

Rapport

Most study participants agreed with the existing literature that one should employ some basic rapport-building techniques. However, a vocal minority opposed rapport-building techniques, arguing that they are unnecessary, a waste of precious time, and maybe even off-putting. On the one hand, it is true that time for
interviews is often extremely limited, and, for this reason, policy-
makers prefer focused meetings (Suhay et al. 2019). Past literature
also notes that mutual friendliness and trust do not necessarily
lead to a more forthcoming interview (Fujii 2018, 13, 15). This
said, we argue that the interviewer does need to take some steps to
build rapport with the interviewee.
Addressing the concerns of both supporters and critics of
rapport building among our study participants, we note that the
qualitative literature on rapport is evolving. Newer relationship-­
building strategies may be more successful than traditional pre-
interview chitchat or finding mutual commonalities. For example,
Lee Ann Fujii suggests researchers build “working relationships”
with subjects. Establishing working relationships requires a com-
mitment by the researcher to treat all interviewees equally with
attentiveness and respect (2018, 22). This entails several steps, in-
cluding: taking note of the positionality between interviewer and
interviewee (a method known as reflexivity; Glas 2021); seeking
to minimize harm to interviewees; having situational awareness
of the cultural norms in the environments where you are inter-
viewing; and displaying humility and respect for the interviewee’s
knowledge and expertise (Fujii 2018, 16–­28). We also recommend
that the researcher maintain the same level of respect and dignity
for the interviewee over the course of the whole project, from the
recruitment phase to publication, and afterwards as well if rele-
vant. As an example of what this looks like in practice, Interviewee
#14 said that they always send a hand-­written “thank you” card to
interviewees following the interview.

Social Identity

An additional dynamic discussed by study participants was how social identity affected both the recruitment and interview process. Some reported that their social identities and/or those at the center of their projects could make interviewees more—or less—amenable to participation and improve—or worsen—rapport.
While social identities provide both advantages and disadvantages,
given that the US Congress remains dominated by White men, re-
searchers with these identity characteristics likely are, on balance,
advantaged in conducting interview projects there.
One important aspect of social identity not mentioned by our participants is what we might call the “biasing” effects of social identity, meaning that the data elicited may differ substantively depending on the interviewer’s characteristics and how they are perceived
by the interviewee (and vice versa). The qualitative methodologi-
cal literatures from both the positivist and interpretivist traditions
discuss this fact; however, a key difference is that, while positiv-
ists argue there are ways to alleviate these “interviewer effects,”
interpretivists believe it is impossible to eliminate subjectivity and
achieve purely “objective” data (Mosley 2013, 12). Although we
agree with positivist scholars that steps can be taken to improve
the reliability of interview data—­such as employing semistruc-
tured interviews and avoiding conveying one’s own opinion—­
scholars should keep in mind that the interview data-­generation
process, and the resulting data, will differ at least somewhat from
researcher to researcher, even if the methodological protocol is the
same.

Recording Versus Note-­Taking

As in the extant literature, our interviewees disagreed over the


merits of recording the interview: while recording ensures a near-­
perfect transcript of the encounter, it can also make interviewees
uncomfortable and reticent. Ultimately, the choice to record is an
individual judgment call on the part of the researcher, depend-
ent in part on who is being interviewed and the topics discussed.
Members may be more comfortable being recorded than staff, and
all are likely more comfortable being recorded if a project’s subject
matter is self-­evidently uncontroversial. Where the interview topic
is sensitive, it may be better not to record. Unfortunately, note-­
taking presents its own difficulties; having a second person in the
room or on the call taking notes can mitigate some of them. Study
participants who took notes also emphasized that it was critical to
transcribe notes immediately following the interview before memo-
ries faded (see also Mosley 2013, 25).

IRB and Ethics

An important topic absent from the current literature is researchers’ interactions with the IRB, both before and after the recent removal of the IRB’s exemption for public officials. Most of those we interviewed opposed the removal of the exemption or at least expressed reservations about the way IRBs are now
involving themselves in studies of public officials. Those most
strongly opposed to the exemption removal felt that Congress, as
a democratic institution, should be transparent and, thus, open to
academics. They also argued that one need not be as concerned
about risks to public officials as one is when interacting with the
public or, especially, vulnerable populations. Although no par-
ticipants mentioned this explicitly, arguably the most significant
effect of eliminating the public officials’ exemption is that re-
search that threatens their livelihood and reputation, such as an
investigation into corruption, can conceivably be blocked by an
IRB (American Political Science Association 2020; Yanow and
Schwartz-­Shea 2016). Others had more practical concerns, such
as mandated consent-process language that is cumbersome or “scary.” This said, a minority welcomed IRB involvement, especially
given the involvement of staff and the increase in experimental
research.12
On the question of further IRB reform related to
Congressional research, we suggest a middle path incorporating
both sides’ concerns. First, reinstate the exemption for studies of
public officials, whether on the record or off, that do not involve
deception. Officials routinely interact with journalists in a manner
that includes far fewer safeguards than those now in place for aca-
demic studies. Officials can always protect themselves by declining
to answer a researcher’s questions. This said, to avoid “poisoning
the well” for future researchers as well as wasted public resources
or unreasonable harm, continue to require any studies of public
officials that use deception to undergo IRB review. Second, clarify
that Congressional staff (as well as career civil servants) are not
“appointed officials.” While an exemption may be appropriate for
interview studies limited to understanding staff members’ official
duties, an exemption is less appropriate for interview studies that
delve into their personal perspectives or backgrounds.
Regarding ethics in general, we also stress that researchers
must treat all subjects—­whether elite or nonelite, Congressional
member or staffer—­ethically, regardless of specific IRB regula-
tions (Yanow and Schwartz-­Shea 2016). As others have noted,
interviewees are not simply "data points" but are human beings
deserving of respect and dignity (Fujii 2018, 8). They are also vol-
untarily taking their time to assist the researcher. A high level of
ethical treatment should extend to anyone the researcher encounters over the course of the project, including any intermediaries or
referrals.

Future Directions for Research

The aim of this project has been to provide recommended practices for conducting Congressional interviews. Although we
focus on the US Congress, much of the advice discussed herein will
be relevant to qualitative interviewers more broadly, particularly
those wishing to interview political “hyper-­elites” who are difficult
to access due to scarce time and, perhaps, wariness of researchers.
This study is also a stepping-­stone to future research. While
many of the lessons provided in this article are widely applica-
ble, our systematic approach to generating methodological ad-
vice might be extended to the practice of interviewing other types
of political elites in the United States or those in other nations.
Variation in institutional structure, level of political polarization,
time burdens on members, and norms surrounding interacting
with scholars will likely influence recommended practices for in-
terviewing members of legislatures and parliaments around the
world.
Given changing dynamics within and beyond Congress
over time, the sum of what scholars consider best practices for
interviewing Congress will never be static. This said, the varied
phenomena making access to Congress increasingly difficult are
unlikely to abate anytime soon: political polarization, oversatu-
ration of study requests, fears of leaks and social media gotcha-­
ism, and overworked Members and staff. In addition, recent IRB
changes as well as shifts in how scholars study Congress have
newly elevated the topics of ethics and oversight of Congressional
research studies. It is our hope that the methodological advice and
considerations distilled herein will be useful to a variety of schol-
ars navigating this increasingly difficult terrain.

Acknowledgements. We thank the following scholars for their participation in our study: Joel Aberbach, Bert Rockman, Karen
Akerlof, Ross Baker, Matthew Beckmann, James Curry, Kelly Dittmar,
Lee Drutman, Lawrence Evans, Alexander Furnas, Richard Hall, Peter
Hanson, Paul Herrnson, James Jones, Frances Lee, Matto Mildenberger,
Kristina Miler, Mark Miller, Ronald Peters, Dakota Strode, Michele
Swers, and Jennifer Wessel. We also thank Andy Ballard, Tim LaPira,
and Gisela Sin and the three anonymous reviewers for their helpful advice.

Data Availability Statement. The data that support the findings of this study are available on request from the corresponding author. The
data are not publicly available due to privacy or ethical restrictions.

Daniel Steiman is a PhD candidate and instructor in the Department of Government, School of Public Affairs, American
University. He is also an adjunct instructor at George Mason
University’s Schar School of Policy and Government. His research
focuses on political violence and African politics, with an additional
interest in qualitative interview methods, including elite interviewing.
Elizabeth Suhay is Associate Professor and Graduate Program
Director in the Department of Government, School of Public Affairs,
American University. She studies U.S. politics with a focus on mass
and elite ideology, especially as they relate to economic and scientific
topics. She is the author or co-­author of numerous articles and co-­
editor of three edited volumes, most recently The Politics of Truth in
Polarized America (Oxford).

ENDNOTES

1. This literature also includes one symposium on interview methods in political science in PS: Political Science and Politics 35(4).
2. Richard Fenno, writing in the pre-­smart-­phone 1970s, cites the hassle of
using tape recorders (“mechanical devices that have to be started, reloaded, and
stopped” [1978, 279]) as a rationale for choosing to not record his interviews with
Members of Congress.
3. Our review of the literature suggests this is not a common way of gather-
ing data on Congress among academics in recent years. We discuss reasons why
in the empirical section.
4. For projects with more than one coauthor, we asked only one person from
the team to participate in our study, with two exceptions: in the case of one large
interview project with multiple components and publications, we interviewed two
authors separately; in another case, two scholars and frequent collaborators pre-
ferred to speak to us together in one interview.
5. All but one of our interviewees personally conducted interviews for their
project.
6. Note that we did not ask about identity directly; here, we only discuss
those participants who mentioned the topic spontaneously in response to general
questions about recruitment and rapport.
7. An example of a recent interview project focused on a specific identity
group in Congress is Dittmar et al. The authors interviewed three-­quarters of the
female members in the 114th Congress (2018, 223).
8. Some details have been removed from this quote to protect anonymity.
9. One scholar we interviewed successfully interviewed most members of a
large Congressional subpopulation. This scholar was likely aided by a number of
salient institutional affiliations, an assistant dedicated to recruiting and schedul-
ing, and an especially appealing topic.
10. Of course, it is impossible to achieve the appearance of perfect neutrality
in practice, as potential interviewees will inevitably associate stereotypes with the
researcher based on their identity characteristics.
11. We do not intend to dissuade scholars from using Beckmann and Hall’s
(2013) creative approach, which combines a short, self-­administered survey with
an open-­ended interview.
12. Given that IRBs vary from school to school, it is possible that idiosyncratic institutional contexts influenced participants’ views. For example, those
more welcoming of IRB oversight may have worked with less intrusive IRBs at
their home institutions, and vice versa. We recommend further research on polit-
ical scientists’ perspectives on the IRB.

REFERENCES

Aberbach, Joel D., and Bert A. Rockman. 2002. “Conducting and Coding Elite Interviews.” PS: Political Science and Politics 35(4): 673–676.
Aberbach, Joel D., James Chesney, and Bert A. Rockman. 1975. “Exploring Elite Political Attitudes: Some Methodological Lessons.” Political Methodology 2: 1–27.
American Political Science Association. 2020. Principles and Guidance for Human
Subjects Research. Washington, D.C.
Baker, Ross K. 2011. “Touching the Bones: Interviewing and Direct Observational
Studies of Congress.” In The Oxford Handbook of the American Congress,
ed. Eric Schickler and Frances E. Lee, 95–114. New York: Oxford
University Press.
Beckmann, Matthew N., and Richard L. Hall. 2013. “Elite Interviewing in
Washington, DC.” In Interview Research in Political Science, ed. Layna
Mosley, 196–208. Ithaca, NY: Cornell University Press.
Berry, Jeffrey M. 2002. “Validity and Reliability in Elite Interviewing.” PS: Political Science and Politics 35(4): 679–682.
Crosson, Jesse M., Alexander C. Furnas, Timothy M. LaPira, and Casey Burgat. 2021. “Partisan Competition and the Decline in Legislative Capacity among Congressional Offices.” Legislative Studies Quarterly 46(3): 745–789.
Curry, James M., and Frances E. Lee. 2020. “What Is Regular Order Worth?
Partisan Lawmaking and Congressional Processes.” The Journal of Politics
82(2): 627-641.
Dexter, Lewis Anthony. 1970. Elite and Specialized Interviewing. Evanston, IL:
Northwestern University Press.
Dittmar, Kelly, Kira Sanbonmatsu, and Susan J. Carroll. 2018. A Seat at the Table: Congresswomen’s Perspectives on Why Their Presence Matters. New York: Oxford University Press.
Fenno, Richard. 1978. Home Style: House Members in Their Districts. New York: HarperCollins.
Fujii, Lee Ann. 2018. Interviewing in Social Science Research: A Relational Approach. New York: Routledge.
Gallagher, Mary. 2013. “Capturing Meaning and Confronting Measurement.” In
Interview Research in Political Science, ed. Layna Mosley, 181-195. Ithaca,
NY: Cornell University Press.
Gelman, Jeremy. 2018. “If Congress Is So Dysfunctional, Why Is Its Staff So
Busy? A Congressional Fellow’s Perspective.” PS: Political Science and
Politics 51(2): 494-495.
Gerring, John. 2017. “Qualitative Methods.” Annual Review of Political Science 20: 15–36.
Glas, Aarie. 2021. “Positionality, Power, and Positions of Power: Reflexivity in Elite Interviewing.” PS: Political Science & Politics 54(3): 438–42.
Henderson, Geoffrey, Alexander Hertel-Fernandez, Matto Mildenberger, and Leah Stokes. 2021. “Conducting the Heavenly Chorus: Constituent Contact and Provoked Petitioning in Congress.” Perspectives on Politics: 1–18. https://doi.org/10.1017/S1537592721000980.
Jones, Charles O. 1959. “Notes on Interviewing Members of the House of
Representatives.” The Public Opinion Quarterly 23(3): 404–406.
King, Gary, Robert O. Keohane, and Sidney Verba. 1994. Designing Social
Inquiry: Scientific Inference in Qualitative Research. Princeton, NJ:
Princeton University Press.
Kingdon, John W. 1989. Congressmen’s Voting Decisions, 3rd ed. Ann Arbor: University of Michigan Press.
Kuhn, Thomas S. 1996 [1962]. The Structure of Scientific Revolutions, Third
Edition. Chicago: University of Chicago Press.
LaPira, Timothy M., Lee Drutman, and Kevin R. Kosar. 2020. Congress
Overwhelmed: The Decline in Congressional Capacity and Prospects for
Reform. Chicago, IL: University of Chicago Press.
Lee, Frances E. 2016. Insecure Majorities: Congress and the Perpetual Campaign.
Chicago: University of Chicago Press.
Martin, Cathie Jo. 2013. “Crafting Interviews to Capture Cause and Effect.” In
Interview Research in Political Science, ed. Layna Mosley, 109-124. Ithaca,
NY: Cornell University Press.
Matthews, Donald. 1960. U.S. Senators and Their World. Chapel Hill: University
of North Carolina Press.
Mosley, Layna. 2013. “‘Just Talk to People’? Interviews in Contemporary Political Science.” In Interview Research in Political Science, ed. Layna Mosley, 1–28. Ithaca, NY: Cornell University Press.
Parker, David C.W. 2018. “Following Fenno: Learning from Senate Candidates in the Age of Social Media and Party Polarization.” The Forum 16(2): 145–170.
Peabody, Robert L., Susan Hammond, Jean Torcom, Lynne Brown, Carolyn Thompson, and Robin Kolodny. 1990. “Interviewing Political Elites.” PS: Political Science and Politics 23(3): 451–55.
Peters, Ronald, Jr. 2016. “Sitting Around the Couch.” The Legislative Scholar: The Newsletter of the Legislative Studies Section of the American Political Science Association 1(2): 13–15.
Polsby, Nelson W. 2005. How Congress Evolves: Social Bases of Institutional Change. New York: Oxford University Press.
Reynolds, Molly, Thomas E. Mann, Norman J. Ornstein, Raffaela Wakeman, and Andrew Rugg. 2017. Vital Statistics on Congress. Washington, DC. https://www.brookings.edu/wp-content/uploads/2016/07/Vital-Statistics-Full-Data-Set.pdf.
Richards, David. 1996. “Elite Interviewing: Approaches and Pitfalls.” Politics
16(3): 199–204.
Robinson, James A. 1960. “Survey Interviewing among Members of Congress.”
Public Opinion Quarterly 24(1): 127–38.
Rockman, Bert A. 2011. “Interviews, Elite.” In International Encyclopedia of Political Science, eds. Bertrand Badie, Dirk Berg-Schlosser, and Leonardo Morlino, 1341–1344. Thousand Oaks: Sage.
Schrag, Zachary M. 2009. “How Talking Became Human Subjects Research:
The Federal Regulation of the Social Sciences, 1965-­1991.” The Journal of
Policy History 21: 3-37.
Schwartz-­Shea, Peregrine, and Dvora Yanow. 2012. Interpretive Research Design:
Concepts and Processes. New York: Routledge.
Soss, Joe. 2013. “Talking Our Way to Meaningful Explanations: A Practice-Centered View of Interviewing for Interpretive Research.” In Interpretation and Method: Empirical Research Methods and the Interpretive Turn, ed. Dvora Yanow and Peregrine Schwartz-Shea, 161–182.
Suhay, Elizabeth, Emily Cloyd, Erin Heath, and Erin Nash. 2019. “Recommended
Practices for Science Communication with Policymakers.” American
University/American Association for the Advancement of Science.
Tansey, Oisín. 2007. “Process Tracing and Elite Interviewing: A Case for Non-­
probability Sampling.” PS: Political Science and Politics 40(4): 765-772.
Woliver, Laura R. 2002. “Ethical Dilemmas in Personal Interviewing.” PS: Political Science and Politics 35(4): 677–78.
Yanow, Dvora, and Peregrine Schwartz-Shea. 2016. “Encountering Your IRB 2.0: What Political Scientists Need to Know.” PS: Political Science and Politics 49(2): 277–285.
Zuckerman, Harriet. 1972. “Interviewing an Ultra-­Elite.” The Public Opinion
Quarterly 36(2): 159–175.
Supporting Information

Additional supporting information may be found in the online version of this article at the publisher’s web site:

Appendix S1. Supporting Information.