
Research in Drama Education: The Journal of Applied Theatre and Performance

ISSN: 1356-9783 (Print) 1470-112X (Online) Journal homepage: https://www.tandfonline.com/loi/crde20

Gaps, silences and comfort zones: dominant paradigms in educational drama and applied theatre discourse

Matt Omasta & Dani Snyder-Young

To cite this article: Matt Omasta & Dani Snyder-Young (2014) Gaps, silences and comfort
zones: dominant paradigms in educational drama and applied theatre discourse, Research
in Drama Education: The Journal of Applied Theatre and Performance, 19:1, 7-22, DOI:
10.1080/13569783.2013.872432

To link to this article: https://doi.org/10.1080/13569783.2013.872432

Published online: 25 Feb 2014.


Gaps, silences and comfort zones: dominant paradigms in educational drama and applied theatre discourse

Matt Omastaᵃ* and Dani Snyder-Youngᵇ

ᵃDepartment of Theatre Arts, Utah State University, Logan, UT, USA; ᵇSchool of Theatre Arts, Illinois Wesleyan University, Bloomington, IL, USA

This article explores prevailing rhetoric in published scholarship in the field of educational drama and applied theatre, responding to O’Toole’s call to investigate
if researchers in the field are ‘missing something vital by staying in our comfort
zones’. He noted a ‘serious need for more usable, broad-based, and reliable base-
line data to use for policy change and as starters for our research’ after reviewing
the abstracts of the 86 sessions presented at the International Drama in Education
Research Institute (IDIERI) 2009. To explore if his findings were a microcosm of
broader trends in the field, we reviewed over 400 peer-reviewed, English-language
articles published over the last decade that addressed applied theatre or drama
education. We asked the following: (1) What research methods do authors utilise to
conduct their studies? (2) What types of results do these studies report? (3) Do
scholars based in certain geographic regions influence the field’s discourse
disproportionately? We share the data we collected from each article in light of
these questions through a series of charts. We then analyse the data and detail
how the field’s self-imposed research paradigms create comfort zones that
encourage certain types of research while creating conspicuous gaps and silences
by limiting the modes of inquiry we employ and regulating what data we report.
We conclude by discussing the difficulties that arise when we work within these
safe confines while simultaneously striving to impact the discourse in the broader
fields of education, theatre studies, performance studies and/or cultural studies, as
well as legislation and policy beyond academia. We argue that these disciplinary
boundaries are permeable, but that we must be able to translate our work into a
variety of scholarly and lay languages.

This article explores prevailing rhetoric in published scholarship in the field of educational drama and applied theatre, which we roughly define as the application
of dramatic activities (process and/or performance) for educational purposes,
community engagement, problem solving, conflict resolution and/or rehabilitation
(‘the field’). Such work takes place in a wide variety of settings and contexts including
classrooms, prisons, community centres, traditional theatre spaces and more.
We were inspired by O’Toole’s (2010) challenge to explore whether researchers in
the field are ‘missing something vital by staying in our comfort zones’ (p. 287). His call
stemmed from his review of abstracts describing the 86 sessions that were presented
at the 2009 International Drama in Education Research Institute (IDIERI). O’Toole
called on scholars to ‘look for the gaps and silences’ in our work, noting a ‘serious need for more usable, broad-based, and reliable base-line data to use for policy change and as starters for our research’ (p. 277).

*Corresponding author. Email: matt.omasta@usu.edu
© 2014 Taylor & Francis
Curious if O’Toole’s IDIERI findings were a microcosm of broader trends in twenty-
first century applied theatre/drama education research, we set out to identify
dominant paradigms in our field. Kuhn (1962) theorises that when members of a
scholarly community embrace a particular paradigm, they develop ‘criterion for
choosing problems that, while the paradigm is taken for granted, can be assumed to
have solutions. To a great extent these are the only problems that the community will
admit … or encourage its members to undertake’ (p. 37). We questioned whether our
own field had adopted a research paradigm that allows us – or even encourages us –
to neglect certain questions and approaches.
To explore this, we began collecting over 500 academic articles that addressed
educational and applied theatre. After finalising a set of articles for analysis (as
described below), we used this data-set to investigate three primary questions:
(1) What research methods do authors utilise to conduct their studies?
(2) What types of results do these studies report? Are peer-reviewed journal articles distinguishable from pieces written specifically to advocate for the field? Or are they similar to the IDIERI sessions, ‘more than 60% [of which] had a clear goal of advocating something’ (O’Toole 2010, 282)?
(3) Do scholars based in certain geographic regions influence the field’s discourse
disproportionately? Does ‘the documentation and discussion of applied
theatre in global settings remain in the hands of scholar-facilitators from the
UK, North America, and Europe, the majority of whom have university
associations’ as Kuftinec (2011, 170) argues in her critique of the Applied
Theatre Reader?

Methodology
Project scope and limitations
Our goal was to conduct a comprehensive and ambitious review of the major
features of scholarly articles in the field. However, we had to impose some limitations
to ensure the project remained manageable. We, therefore, limited the scope of this
study to peer-reviewed, English-language articles in scholarly journals published
2002–2012. We ultimately created a data-set of 428 such articles, but we did not
analyse book chapters, monographs, websites and other such publishing platforms or
articles in languages other than English. Had we included such sources, they may have influenced our findings; we hope a future study might review these materials.

Data collection
We populated a data-set with content identified via one of two methods. We first
searched for articles using two electronic databases: the Education Resources Informa-
tion Center database, ‘the world’s largest digital library of education literature’, and
Academic Search Premier. We combed both databases for articles containing any of
the following terms in any key field (e.g. title, abstract or descriptors):
. ‘theatre education’/‘theater education’
. ‘educational theater’/‘educational theatre’
. ‘drama education’/‘educational drama’
. ‘creative drama’/‘creative dramatics’
. ‘process drama’
. ‘applied theatre’

In addition to articles identified through these searches, we also reviewed all articles
published in Youth Theatre Journal and Applied Theatre Researcher from 2002–2012.
Because the databases above do not include these two important journals in the field
that publish peer-reviewed scholarship, we manually reviewed the abstracts of each
article published to identify additional articles for inclusion in the data-set.1

Exclusions
Our initial searches identified approximately 500 articles that might be relevant to our
study. We reviewed each of these articles and purged those that, while they included
the search terms above, did not address the field or did so only tangentially. We also
excluded articles outside the scope of our study, including the following:
. Articles exclusively or primarily dedicated to mainstream theatre, including
theatre for young audiences or professional theatre with political or social
themes that was not created in relation to a specific community.
. Articles exclusively or primarily covering the teaching of theatre content in
higher education (e.g. university courses in performance or theatre history),
though we did include articles addressing the teaching of applied theatre or
educational drama methods.
. Articles using the lens of performance to explore phenomena unrelated to
educational theatre/applied theatre.
. Book reviews, conference reviews, performance reviews, announcements,
obituaries, descriptions of collections/archives/resources.

After culling such articles, we identified 428 relevant articles from 87 journals. We
entered descriptive data for each article to data-sets in Excel and SPSS, which we
used to conduct the analyses described below.
Figure 1 illustrates the percentage of articles in our data-set that were published in various journals. The majority appeared in Research in Drama Education: The Journal of Applied Theatre and Performance, suggesting the journal plays a key role in influencing the field’s discourse.

Figure 1. Percentage of study articles published in various journals (N = 428).
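By way of illustration, the tally underlying Figure 1 can be sketched in a few lines. We conducted the actual analysis in Excel and SPSS; the Python below, the file name articles_coded.csv and the 'journal' column are illustrative assumptions rather than a record of our procedure.

```python
# Minimal sketch: tallying the share of articles per journal, as in Figure 1.
# Assumes a hypothetical CSV export of the coded data-set with one row per
# article and a 'journal' column; file and column names are illustrative only.
import csv
from collections import Counter

def journal_shares(path: str) -> dict:
    with open(path, newline='', encoding='utf-8') as f:
        rows = list(csv.DictReader(f))
    counts = Counter(row['journal'] for row in rows)
    total = sum(counts.values())  # would be 428 for the full data-set
    return {journal: 100 * n / total for journal, n in counts.most_common()}

if __name__ == '__main__':
    for journal, share in journal_shares('articles_coded.csv').items():
        print(f'{journal}: {share:.1f}%')
```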

Data coding and categorisation


We conducted content analysis (Bazerman 2006, 83–84) of each article in the data-set, categorising each by features related to our research questions, as detailed below. To reduce coding variance and the likelihood of coding bias, two coders independently analysed each article. If a discrepancy arose between the original coders regarding any particular datum, a third coder reviewed the article to resolve the discrepancy.
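A minimal sketch of this double-coding workflow follows. The article identifiers, category labels and adjudication function are hypothetical, and our own reconciliation was done by hand rather than in code.

```python
# Minimal sketch of the double-coding workflow: two coders categorise every
# article independently, and only disagreements are referred to a third coder.
# Article IDs, category labels and the adjudication function are hypothetical.
from typing import Callable, Dict

def resolve_codes(coder_a: Dict[str, str],
                  coder_b: Dict[str, str],
                  adjudicate: Callable[[str], str]) -> Dict[str, str]:
    """Return one agreed code per article ID."""
    final = {}
    for article_id, code_a in coder_a.items():
        code_b = coder_b[article_id]
        final[article_id] = code_a if code_a == code_b else adjudicate(article_id)
    return final

codes = resolve_codes(
    {'a1': 'qualitative', 'a2': 'mixed methods'},
    {'a1': 'qualitative', 'a2': 'documentation of practice'},
    adjudicate=lambda article_id: 'documentation of practice',  # third coder's call
)
```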
We first conducted provisional coding (Saldaña 2013, 144) using a set of pre-
established categories we expected most articles would fall within. We shared our
preliminary findings at the 2012 IDIERI and American Alliance for Theatre & Education
conferences, where we solicited comments from peers. Their feedback, coupled with our increasing familiarity with the texts, informed our revision of the codification schema. We then recoded each article employing elaborative coding (Saldaña 2013, 229), using new and more precisely defined aggregating categories.

Codification and (over)simplification


Any attempt to quantify complex data necessarily results in a degree of simplification
(some might argue oversimplification). We strove to avoid creating artificial dichotom-
ies by providing a range of categories, but admit that many of the studies we reviewed
do not fit neatly into the categories we assigned them to. As pragmatic researchers, we
opted to proceed by building on typologies generally accepted within the scholarly
community, despite their imperfections, in hope of illuminating themes and trends in
the literature. We do not present our findings as positivistic ‘truths’, but rather as the
results of a good-faith effort to explore a large set of data in a meaningful way.

Researcher subjectivity
No interpretation-based research project is truly ‘objective’; our analyses are informed
by our experiences. Although as coresearchers we have lived different lives and
pursued different research foci, we are both White, middle-class, US-based scholars in
our mid-30s. Similarities in our academic preparation, professional training and
sociopolitical backgrounds inevitably result in shared ideological perspectives on the
field. Other scholars with different backgrounds and training may have coded the
data differently than us, potentially rendering visible different themes and trends.

Findings
Question 1: research methods and modes of analysis
Through our provisional and elaborative coding processes, we generated nine
aggregating categories (detailed below) that describe the research methods and modes of analysis articles employed. We then used the criteria below to determine the single2 category of best fit for each piece.

Quantitative
These articles exclusively report quantifiable, statistically analysable data. Surveys
‘describe relevant characteristics of individuals, groups, or organisations’ (Berends
2006, 623), often by posing closed-ended questions to a sample of respondents
representing a broader population. Experimental studies ‘deliberately manipulate one
or more variables to assess their influence on an outcome’ (Cook and Sinha 2006,
551), such as the effect of a particular treatment (Saldaña and Wright 1996, 115). Like
experiments, quasi-experimental studies assess the impact of variables but do not fully
meet the criteria required of experiments (Shadish and Luellen 2006, 539).

Qualitative
These articles exclusively report data gathered using qualitative methods such as
interviews, focus groups and participant observation. Case studies analyse single cases
(e.g. particular performances or people), ‘concentrating on depth rather than breadth’
(Winston 2006, 41). While case studies often explore descriptive or explanatory
questions (Yin 2006, 112), ethnographies explore culture, broadly construed, to
explain how humans make meaning (Anderson-Levitt 2006, 279). Critical ethnogra-
phies function similarly, but ‘tak[e] account of the researcher’s impact on the context
under study’ (Gallagher 2006, 63–64). Unlike ethnographies composed as written
texts, performance ethnographies employ ‘recognisable dramatic techniques’ to share
data (Bacon 2006, 151). In Practice as Research, Reflective Practitioner Research, and
Action Research, authors dually position themselves as researchers and research
participants (Neelands 2006, 16), to reflect on their own praxis; they conduct ‘hybrid
enquiries combining creative doing with reflexive being’ (Kershaw et al. 2011, 64).
Phenomenological studies ‘describe and interpret the phenomena of personal lived
experiences’ (Edmiston and Wilhelm 1996, 90). Researchers who believe that
‘humans, individually and socially, live storied lives’ often conduct Narrative Inquiries,
presenting data in storied formats (Connelly and Clandinin 2006, 477; Zatzman 2006,
111). Because ‘bodies are often the means for understanding how performance
operates and makes meaning’ (Parker-Starbuck and Mock 2011, 210), some
researchers employ Body-Based Research Methods, using the human body as a tool
for collecting and sharing data. Other Arts-Based Research methods supplement
intellectual and analytical knowledge with intuitive, emotional and relational
knowledge.

Mixed methods
These articles deliberately blend quantitative and qualitative methods and data, usually
relying on the strengths of each research tradition to mitigate the weaknesses of the
other (Creswell and Plano-Clark 2007, 9). For example, while the findings of large-
scale quantitative studies may be generalisable, they often prohibit detailed analysis
of more nuanced data that resist quantification. Qualitative studies, on the other
hand, are often able to analyse specific, local and personal data collected from fewer
participants, but these small sample sizes usually preclude the generalisation of data.

Documentation of practice
These articles describe creative practice without employing qualitative or quantitative
methods. They share ‘practitioner knowledge’ (Pitches et al. 2011, 137) often in the
form of ‘how-to’ guides. Unlike reflective practitioner research, these studies do not
include significant reflective components, develop grounded theory or analyse
practice through theoretical lenses.

Conceptual, theoretical or philosophical inquiry


These articles advance philosophical arguments that are not grounded in specific case
studies or comprehensive literature reviews. They often consider the field through
critical lenses (see Bredo 2006, 23–24) such as feminism (see Grady 1996, 2006) or
post-structuralism (see McCormick 2006). This category also includes essays that
discuss research methodologies from theoretical perspectives.

Literature review
These articles systematically review literature to identify salient themes and trends.
They include meta-analyses, which ‘study and combine the findings of many
quantitative research studies’ (Glass 2006, 427), meta-syntheses, which analyse the findings of many qualitative studies, and typologies (see Doty and Glick 1994).

Archival/historical research and historiography


These articles document history and/or analyse how history is documented. Archival/
Historical studies examine primary and secondary sources to ‘reconstruct [historical events]
in order to determine [their] significance’ (Swortzell 1996, 99), and often adopt positivist
or post-positivist standpoints. Historiographies analyse how history has been recorded
and transmitted and ‘raise issues about the research methods we use in uncovering,
interpreting, and disseminating the past’ (Davis et al. 2011, 86); they frequently uncover
the ways ‘official’ historical narratives marginalise or omit certain groups.

Digital performance3
These articles address ‘theatre/performance events where computer technologies
play a key role in content, techniques, aesthetics, or forms of delivery’ (Dixon 2011,
41). Some analyse digital performance(s) while others discuss research methods in the
field.

Scenography4
These articles analyse visual and physical spaces, exploring how they impact
performances/activities conducted within them. They may also ‘draw attention to
the way stage space [can] be used as a dynamic and “kinesthetic contribution” to the
experience of performance’ (McKinney and Iball 2011, 111).

Analysis
Sixty-eight per cent of the articles in our data-set were qualitative or conceptual/
theoretical in nature.5 The paucity of articles categorised as digital performance or
scenography may be attributable in part to the relatively recent emergence of these
research traditions. A similar explanation for the relative scarcity of articles based in more established research traditions (e.g. quantitative or historical/archival research) would be insufficient, however, suggesting that other factors may be responsible for the limited number of such articles (Figure 2).

Figure 2. Primary research methods/modes of analysis (N = 428).

Question 2: study results


In order to explore how academic journals contribute to discussions of the field’s
perceived value, quality, impact(s) and efficacy, we next categorised articles based on
the results they reported. The scope of this particular analysis was narrow: we only
categorised articles that made explicit claims regarding the value of specific
educational drama and applied theatre projects; we did not analyse the findings of
studies that were not designed to engender empirical results of this nature.6
Our intention was to analyse the discourse surrounding the field’s value and not to
construct a model that evaluated its value. We, therefore, investigated how articles
represented the contributions of various projects, rather than the credibility of such
claims. As such, we categorised articles solely on the basis of claims study authors
made and outcomes they reported.7 Through our codification processes, we
developed six aggregating categories describing the results articles reported. We
categorised each article only once, determining the single category of best fit based
on the criteria described below.

Positive
Articles in this category reported that the programme(s) they investigated (e.g.
particular applied theatre experiences or educational drama interventions) benefited
participants and/or provided evidence affirming the value of the field. We included qualitative articles in the ‘positive’ category when their authors indicated that the programme(s) they studied improved participants’ lives in some way, at least temporarily.
We considered quantitative results positive when statistical tests recommended the
rejection of the null hypothesis due to contextually desirable statistically significant
differences. Regardless of studies’ designs, they were included in this category only
when they yielded exclusively positive results or a mix of positive and neutral results
(as defined below).

Neutral
Articles in this category reported that the programme(s) they investigated had no
discernible impact on participants. We considered the results of qualitative studies
‘neutral’ when researchers explicitly stated that there was no evidence that a
programme influenced participants in any way and included quantitative studies in
this category when statistical tests recommended retaining the null hypothesis
(i.e. there was no statistically significant difference between conditions).

Negative
Articles in this category reported that the programme(s) they investigated negatively
impacted participants and/or provided evidence disconfirming the value of the field.
We included qualitative articles in the ‘negative’ category when their authors
indicated that the programme(s) they studied harmed participants (physically, emotionally, spiritually or otherwise). We considered quantitative results
negative when statistical tests recommended the rejection of the null hypothesis
due to contextually undesirable statistically significant differences. As above, articles
were only included in this category when they yielded exclusively negative results or a
mix of negative and neutral results.

Mixed
Articles in this category reported both ‘positive’ and ‘negative’ results per the criteria
above. We included qualitative articles in the ‘mixed’ category when their authors reported that some programme participants’ lives were improved but that others were harmed. We included quantitative articles in this category when statistical tests recommended rejecting null hypotheses due to statistically significant differences that were both desirable and undesirable in the context of the study (e.g. conflicting results within a single study using multiple measures, or different results from different groups of participants). We included mixed methods studies in this category when they reported any combination of positive and negative results, regardless of which method(s) yielded the results.
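Expressed procedurally, the decision rule set out in these category definitions might look like the sketch below; the list-of-outcomes representation is an illustrative assumption, not a description of our coding instruments.

```python
# Minimal sketch of the decision rule for assigning one results category per
# study, given its reported outcomes coded as 'positive', 'neutral' or
# 'negative'. The list representation is illustrative only.
def categorise_results(outcomes: list) -> str:
    has_positive = 'positive' in outcomes
    has_negative = 'negative' in outcomes
    if has_positive and has_negative:
        return 'mixed'      # both desirable and undesirable findings reported
    if has_positive:
        return 'positive'   # exclusively positive, or positive plus neutral
    if has_negative:
        return 'negative'   # exclusively negative, or negative plus neutral
    return 'neutral'        # no discernible impact reported

assert categorise_results(['positive', 'neutral']) == 'positive'
assert categorise_results(['negative', 'positive']) == 'mixed'
assert categorise_results(['neutral']) == 'neutral'
```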

Missing
We included articles in this category if they discussed studies designed to produce
measurable outcomes that were never reported. This category includes articles that began by posing specific research questions that were never returned to later in the essay, as well as articles that state that statistical tests were performed but never disclose the results of those tests. Studies that posed research questions which the researchers later explained they could not answer based on available data were NOT included in this category; the ‘missing’ category only includes articles in which researchers implied they would provide data/results but failed to do so, without
explanation. Of the total articles analysed, only approximately 4% met the criteria
for this category.

Ambiguous
We included articles in this category if they reported results in extremely vague,
unclear or self-contradictory fashions. As indicated above, studies that yielded both positive and negative results were included in the ‘mixed’ category, NOT the
‘ambiguous’ category. Of the total articles analysed, only approximately 1% met the
criteria for this category.
Because so few articles were included in the ‘missing’ and ‘ambiguous’ categories
in our initial analysis, we performed a second analysis that excluded such articles.
Figure 3 summarises the results reported by the 203 articles that were included in our second analysis.

Figure 3. Types of results reported by articles that reported results (N = 203).
Ninety per cent of these result-oriented articles affirmed the power of theatre to
change lives for the better. While some may quickly embrace this finding, it merits
closer examination. The limited number of articles reporting neutral or negative
findings could be attributable to publication bias, a theory that suggests many
journals simply reject all articles that fail to report statistically significant results (see
Scargle 1999).
However, we spoke with the editors of two leading journals in the field, both of
whom indicated that they would welcome such submissions, but rarely or never
receive them. This anecdotal evidence alone cannot refute the possibility of
publication bias, but we should consider other explanations. Is it possible that these
‘gaps and silences’ stem from our collective reluctance to design research studies
that we anticipate might yield results that challenge dominant narratives in the field?
Might we hesitate to report such results even when we do find them?

Question 3: where are published scholars located?


Kuftinec (2011) contends that published research in the field disproportionately privileges the perspectives and practices of scholars from wealthy, Western nations. To investigate if our data supported her critique, we coded the likely geographical locations from which authors8 wrote based on their reported institutional affiliations or regions of residence. For this analysis, we counted each author’s estimated region once per article, based on their institutional affiliation/reported region at the time of each article’s publication (i.e. scholars were counted once per article, per region) (Figure 4).

Figure 4. Percentage of articles written by authors in various regions (N = 563).
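The counting procedure behind Figure 4 can be sketched as follows. The sample records and region labels are illustrative; the point is simply that each listed author contributes one count per article, so N = 563 reflects author–article pairs rather than unique scholars.

```python
# Minimal sketch of the tally behind Figure 4: each listed author contributes
# one count per article, so N reflects author-article pairs rather than unique
# scholars. The sample records and region labels are illustrative only.
from collections import Counter

articles = [
    {'id': 'a1', 'author_regions': ['North America', 'North America']},
    {'id': 'a2', 'author_regions': ['Oceania']},
    {'id': 'a3', 'author_regions': ['Europe', 'Africa']},
]

region_counts = Counter(
    region
    for article in articles
    for region in article['author_regions']  # one count per author, per article
)
total = sum(region_counts.values())
shares = {region: 100 * n / total for region, n in region_counts.items()}
```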
Eighty-five per cent of the scholars authoring articles in our data-set were
affiliated with institutions in Europe, North America and Oceania. Of these, 90% were
affiliated with institutions in just four countries: Australia, Canada, the UK, and the
USA. Our analysis supports Kuftinec’s assertion that our field’s research is composed
by a relatively narrow and homogenous group of individuals from wealthy, Western
nations.
However, our exclusion of articles published in languages other than English may
have significantly influenced our findings. We know of studies exploring praxis in
countries where the primary languages spoken range from Spanish to Cantonese.
Unfortunately, however, these studies are infrequently published in English-language
journals. Translation can be expensive, rendering research inaccessible to scholars
who are not fluent in the languages in which articles are composed. Had we included
articles written in non-English languages, our search may have revealed additional
publications by scholars in non-Western regions.
While it would have been ideal to reduce our study’s language bias, it is quite
possible that the inclusion of additional articles may have had a minimal impact on
our findings. Indeed, many of the regions under-represented in the literature we
analysed include countries where English is a primary language (e.g. Jamaica or
Nigeria). Collectively, these nations tend to be smaller and less affluent than the
Western countries over-represented in the literature, suggesting that regional
economic prosperity also plays a role in limiting the number of articles scholars
from these regions produce. Regardless, if some (or many) scholars who publish
primarily in English cannot access research published in other languages, they may
continue to privilege Western structures, philosophies and assumptions even as they
write for global audiences whose contexts might require very different approaches.9

Analysis: dominant paradigms


Advocacy, ‘bullshit,’ rigour, comfort zones, gaps and silences
Taylor’s (1996) Researching Drama and Arts Education begins with a warning: ‘as the
arts struggle to remain on the curriculum, as they continue to be relegated to the
marginalised periphery, their leaders seek salvage in the politics of fear … arts
education learns to play the game of scientism’ (p. 1). Taylor calls on researchers to
defy neo-positivist paradigms (p. 4) and offers a host of alternative approaches,
including action research, reflective practitioner research, critical and transformative
research, and theoretically informed approaches to historical or experimental
research.
The research landscape today looks quite different than it did in 1996. The
overwhelming majority of the 428 articles we reviewed were grounded in critical
theory, action research and other qualitative or performance-based approaches; only
4% were quantitative and directly tied to the ‘scientism’ Taylor described in the late
twentieth century. Perhaps this reformation was shaped by the strong influence that scholars like him and Maxine Greene exerted over the past two decades. But just as the
paradigm of ‘scientism’ limited discourse and failed to acknowledge the validity of
other important approaches, perhaps our current paradigm’s emphasis on qualitative
and theoretical research creates blind spots as well.
Our analysis suggests that, collectively, our field tends to stay within the safe
confines of what O’Toole terms ‘comfort zones’ (p. 273). We seem most comfortable
studying practice using qualitative research methods, and frequently engage in
philosophical inquiry. Most of our published research is conducted by scholars
affiliated with universities in (relatively) comfortable, wealthy, English-speaking
nations. When describing practice, we mostly report positive outcomes that align
comfortably with the existing advocacy-based discourse. What criticism we do offer
usually takes the form of theoretical essays that critique the field at large, perhaps
arguing that it unintentionally perpetuates hegemony or promotes inequality; we
rarely risk the potential discomfort that might accompany more specific critique.
Resting in these comfort zones necessarily entails the creation of gaps and
silences. Indeed, we believe our field has collectively implemented unspoken guide-
lines that govern how we conduct our work and, more significantly, dictate what
approaches and ideas we will avoid. We generally avoid quantitative approaches and,
sometimes, explicitly reject them. It is extremely rare for a research report to directly
criticise the work of an individual practitioner, programme or practice. Very few
articles cite evidence that contradicts a-priori assumptions about the supposed
benefits of drama education or applied theatre. While there are exceptions to these
norms, they are eclipsed by the sheer volume of scholarship conforming to them.
We did not adopt our field’s paradigm arbitrarily; there were (and are) credible rationales for the limitations it imposes. We might believe that asking certain questions
could trigger a ‘slippery slope’ that ultimately jeopardises the existence of our
programmes and, not inconsequentially, our livelihoods. As researchers, practitioners
and advocates for the field, we share a vested interest in its survival. However, we
must consider if adherence to the paradigm actually undermines our efforts.
Several earlier studies raise concerns about our field’s research that may stem
from paradigmatic issues. For example, Hughes (2005) reviews 400 arts programmes in the criminal justice sector and cites an array of concerns, such as:

lack of baseline information; lack of controls; problems finding appropriate measures; over-reliance on anecdote; assumptions made about links between outcome and intervention (and limited linking of evidence to findings); lack of use of research and literature; short-term views; small samples; difficulties accessing information relating to offending; and lack of information about how qualitative data has been analysed and interpreted. (p. 10)

We did not systematically seek out the issues Hughes identified when we reviewed
the articles in our data-set, but her findings resonated throughout the coding
process.
Belfiore (2009) argues that a discourse of ‘bullshit’ permeates rhetoric ‘around the
alleged transformative powers of the arts and their consequent (presumed) positive
social impacts’ (p. 343). She takes issue with scholarly reports ‘marred by a profound
confusion between genuine research and research for the sake of advocacy’ (p. 353).
Her critique mirrors Hughes’ complaint that much research makes questionable
assumptions regarding relationships between arts-based interventions and non-arts
outcomes. We find the perception that researchers in our field sometimes fail to
distinguish between research reports and advocacy-driven publications particularly
disconcerting. If we do not strive to address such concerns and inspire confidence
that our field’s researchers can assess programmes disinterestedly, some audiences
will likely reject the legitimacy of our field’s practices altogether.
As scholars, we could adapt Belfiore’s (2009) suggestions for arts education
researchers in general (see p. 353) to our field specifically. This might entail focusing
less on how our field’s programmes can be evaluated and improved and reinvesting
our collective energy into rigorous studies that ask whether educational drama and
applied theatre experiences actually make the impacts we claim they do, if these
impacts are consistently positive, and whether we can legitimately generalise the
diverse experiences of diverse people within and across the diverse practices that
constitute our field (p. 353). Adopting the rigorous approaches to research Belfiore
recommends may help address reservations that scholars and policy-makers outside
our field might hold. Conducting studies free of underlying, advocacy-related goals in
a disinterested manner may be the most effective way to advocate for the field in the
long term.
We must also become fluent in the bureaucratic languages of legislators and
sponsors, despite any reservations we may have about their agenda. If we truly hope
to influence these parties’ views, we must communicate in a language they
understand and respect. Such fluency is also critical in many countries if we wish
to secure funding for our research. In the USA, for example, federal funding for
education research is guided by the No Child Left Behind Act of 2001. This law
explicitly calls for researchers to employ ‘systematic, empirical methods’ that are
‘evaluated using experimental or quasi-experimental designs’ (1550–1551). Of the
studies in our data-set, fewer than 5% would potentially qualify as ‘scientifically based
research’ eligible for federal funding.
Many projects are sustained by grants from governmental agencies or corporate
foundations that privilege positivist frameworks. Grant-writing and programme
evaluation require us to speak languages donors understand, and increasingly, these
are languages of neoliberalism and quantification. We may feel this social trend is a
problem, but we also know it is a reality in many countries. We can refuse to consider the validity of quantitative research, or choose to engage only in discourse grounded in critical theory, but doing so would risk completely disconnecting ourselves from
programme managers and scholars in other disciplines (including education),
ultimately further marginalising our scholarship.
Some argue that allowing external institutional and political forces to influence
our research is analogous to falling into a trap: that accepting conditional funding
necessarily entails yielding ideological and practical control to outside entities. We
believe, however, that we can navigate terrain shaped by outside entities without
compromising our commitment to our core values. Partnerships need not require
blind capitulation to demands of funders or politicians; we should exercise due
caution and carefully consider the implications of any agreement with agencies
whose goals differ from our own.
Collectively, we must practice a wide variety of research methodologies and
report the results of our studies whether or not they align with our predetermined
outcomes. We need to clearly distinguish between research for its own sake and
research in service of advocacy. We would be wise to establish global publishing
initiatives, to dedicate space in our journals to highlighting the work of scholars
working permanently in less wealthy nations and to secure resources for the
translation of scholarship. Only by engaging in truly interdisciplinary, multilingual
and honest research can we truly become advocates for our field.

Keywords: critique; exclusion; research methods; paradigms; status quo; impact

Notes
1. We hoped to review additional journals not indexed in the databases we searched, such as
Drama Australia’s NJ. However, these closed, subscription-based journals were unavailable
through our institutions’ research libraries, and we were unable to secure access directly
from their publishers.
2. Though some articles met the criteria of multiple categories, for analytic purposes we
included each article in only one category. Certain criteria trumped others while categorising, as detailed in the next note.
3. All articles meeting the criteria described in these final categories were coded as ‘Digital
Performance’ and ‘Scenography’, respectively, even if they employed other research
methods/modes of analysis.
4. See previous note.
5. While our data suggest the field shies away from quantitative measures, Belfiore and
Bennett (2010) contend that research reports prepared for policy-makers and funders
favour them, noting that research for these audiences ‘has tended to privilege quantitative
approaches borrowed from the disciplines of economics and auditing’ (p. 124). However,
evaluative research reports prepared by organisations pursuing funding or advocacy goals
before policy-making bodies are rarely subject to the rigorous, double-blind peer review
process academic journal articles undergo. It is unsurprising that research in journals differs
significantly from research prepared for funders or policy-makers.
6. We do not believe that such articles lack merit; indeed, many advance important theories
or use empirical methods to explore pressing questions unrelated to assessment.
Ultimately, we excluded 215 articles from this particular analysis.
7. We did not assess the calibre of any study’s research design or independently confirm data
reported. Factors such as studies’ validity, reliability, rigour and robustness were outside
the scope of our investigation.
8. To maintain a manageable volume of data, we only coded the regions of the first three
authors of collaborative works.

9. We believe that the field would benefit from a study conducted by a team of multi-lingual
scholars able to thoroughly contrast the features of educational and applied theatre
literature in multiple languages.

Notes on contributors
Matt Omasta directs the Theatre Education programme at Utah State University, where he
researches educational drama and theatre for young audiences. He is coeditor of Play &
Performance (Routledge, forthcoming) and has published in Youth Theatre Journal; The
International Journal of Education & the Arts; Teaching Theatre; and Theatre for Young Audiences
Today. His current projects examine the state of theatre arts education in secondary schools,
playwriting for youth and methods of devising theatre with young people.
Dani Snyder-Young is Associate Professor of Theatre Arts at Illinois Wesleyan University, where
she researches applied theatre. She is the author of Theatre of Good Intentions: Challenges and
Hopes for Theatre and Social Change (Palgrave Macmillan 2013), and has published in Research
in Drama Education: The Journal of Applied Theatre and Performance; Theatre Research International,
Youth Theatre Journal, Qualitative Inquiry, Texas Theatre Journal, and the International Journal of
Learning. Her current project examines applied theatre spectatorship.

References
Anderson-Levitt, K. M. 2006. “Ethnography.” In Handbook of Complementary Methods in
Education Research, edited by J. Green, G. Camilli, and P. Elmore, 279–296. Washington,
DC: AERA.
Bacon, J. 2006. “The Feeling of Experience: A Methodology for Performance Ethnography.” In
Research Methodologies for Drama Education, edited by J. Ackroyd, 135–157. Sterling, VA:
Trentham.
Bazerman, C. 2006. “Analyzing the Multidimensionality of Texts in Education.” In Handbook of
Complementary Methods in Education Research, edited by J. Green, G. Camilli, and P. Elmore,
77–94. Washington, DC: AERA.
Belfiore, E. 2009. “On Bullshit in Cultural Policy Practice and Research: Notes from the British
Case.” International Journal of Cultural Policy 15 (3): 343–359. doi:10.1080/10286630902806080.
Belfiore, E., and O. Bennett. 2010. “Beyond the ‘Toolkit Approach’: Arts Impact Evaluation
Research and the Realities of Cultural Policy-making.” Journal for Cultural Research 14 (2):
121–142. doi:10.1080/14797580903481280.
Berends, M. 2006. “Survey Methods in Educational Research.” In Handbook of Complementary
Methods in Education Research, edited by J. Green, G. Camilli, and P. Elmore, 623–640.
Washington, DC: AERA.
Bredo, E. 2006. “Philosophies of Educational Research.” In Handbook of Complementary Methods
in Education Research, edited by J. Green, G. Camilli, and P. Elmore, 3–32. Washington,
DC: AERA.
Connelly, M. F., and D. J. Clandinin. 2006. “Narrative Inquiry.” In Handbook of Complementary
Methods in Education Research, edited by J. Green, G. Camilli, and P. Elmore, 477–488.
Washington, DC: AERA.
Cook, T. D., and V. Sinha. 2006. “Randomized Experiments in Educational Research.” In
Handbook of Complementary Methods in Education Research, edited by J. Green, G. Camilli,
and P. Elmore, 551–566. Washington, DC: AERA.
Creswell, J. W., and V. L. Plano-Clark. 2007. Designing and Conducting Mixed Methods Research.
Thousand Oaks, CA: Sage.
Davis, J., K. Normington, G. Bush-Bailey, and J. Bratton. 2011. “Researching Theatre History and
Historiography.” In Research Methods in Theatre and Performance, edited by B. Kershaw and
H. Nicholson, 86–110. Edinburgh: Edinburgh University Press.
Dixon, S. 2011. “Researching Digital Performance: Virtual Practices.” In Research Methods in
Theatre and Performance, edited by B. Kershaw and H. Nicholson, 41–62. Edinburgh:
Edinburgh University Press.

Doty, D. H., and W. H. Glick. 1994. “Typologies as a Unique Form of Theory Building: Toward
Improved Understanding and Modeling.” The Academy of Management Review 19 (2):
230–251.
Edmiston, B., and J. Wilhelm. 1996. “Playing in Different Keys: Research Notes for Action
Researchers and Reflective Drama Practitioners.” In Researching Drama and Arts Education:
Paradigms and Possibilities, edited by P. Taylor, 85–96. London: RoutledgeFalmer.
Gallagher, K. 2006. “(Post) Critical Ethnography in Drama Research.” In Research Methodologies
for Drama Education, edited by J. Ackroyd, 63–80. Sterling, VA: Trentham.
Glass, G. V. 2006. “Meta-analysis: The Quantitative Synthesis of Research Findings.” In Handbook
of Complementary Methods in Education Research, edited by J. Green, G. Camilli, and
P. Elmore, 427–438. Washington, DC: AERA.
Grady, S. 1996. “Toward the Practice of Theory in Practice.” In Researching Drama and Arts
Education: Paradigms and Possibilities, edited by P. Taylor, 59–71. London: RoutledgeFalmer.
Grady, S. 2006. “Feminist Methodology: Researching as if Gender and Social Power Really
Mattered.” In Research Methodologies for Drama Education, edited by J. Ackroyd, 81–109.
Sterling, VA: Trentham.
Hughes, J. 2005. Doing the Arts Justice: A Review of Research Literature, Practice and Theory.
Canterbury: The Unit for the Arts and Offenders, Centre for Applied Theatre Research.
Kershaw, B., L. Miller, J. B. Whalley, R. Lee, and N. Pollard. 2011. “Practice as Research:
Transdisciplinary Innovation in Action.” In Research Methods in Theatre and Performance,
edited by B. Kershaw and H. Nicholson, 63–85. Edinburgh: Edinburgh University Press.
Kuftinec, S. A. 2011. “The Applied Theatre Reader (Book Review).” TDR 55 (2): 169–171.
doi:10.1162/DRAM_r_00081.
Kuhn, T. 1962. The Structure of Scientific Revolutions. Chicago, IL: University of Chicago
Press, 1996.
McCormick, I. 2006. “Poststructuralist ‘Methodology.’” In Research Methodologies for Drama
Education, edited by J. Ackroyd, 159–179. Sterling, VA: Trentham.
McKinney, J., and H. Iball. 2011. “Researching Scenography.” In Research Methods in Theatre and
Performance, edited by B. Kershaw and H. Nicholson, 111–136. Edinburgh: Edinburgh
University Press.
Neelands, J. 2006. “Re-imaging the Reflective Practitioner: Toward a Philosophy of Critical
Praxis.” In Research Methodologies for Drama Education, edited by J. Ackroyd, 15–39. Sterling,
VA: Trentham.
O’Toole, J. 2010. “A Preflective Keynote: IDIERI 2009: I, Meta-fellow on the Stair.” RiDE: The
Journal of Applied Theatre and Performance 15 (2): 271–292.
Parker-Starbuck, J., and R. Mock. 2011. “Researching the Body in/as Performance.” In Research
Methods in Theatre and Performance, edited by B. Kershaw and H. Nicholson, 210–235.
Edinburgh: Edinburgh University Press.
Pitches, J., S. Murray, H. Poyner, L. Worth, D. Richmond, and J. D. Richmond. 2011. “Performer
Training: Researching Practice in the Theatre Laboratory.” In Research Methods in Theatre and
Performance, edited by B. Kershaw and H. Nicholson, 137–161. Edinburgh: Edinburgh
University Press.
Saldaña, J. 2013. The Coding Manual for Qualitative Researchers. 2nd ed. Los Angeles, CA: Sage.
Saldaña, J., and L. Wright. 1996. “An Overview of Experimental Research Principles for Studies in
Drama and Theatre for Youth.” In Researching Drama and Arts Education: Paradigms and
Possibilities, edited by P. Taylor, 115–131. London: RoutledgeFalmer.
Scargle, J. D. 1999. “Publication Bias: The ‘File-Drawer’ Problem in Scientific Inference.” Paper
presented at the Sturrock Symposium, Stanford University, March 20. http://arxiv.org/pdf/
physics/9909033v1.pdf.
Shadish, W. R., and J. K. Luellen. 2006. “Quasi-experimental Design.” In Handbook of
Complementary Methods in Education Research, edited by J. Green, G. Camilli, and P. Elmore,
539–550. Washington, DC: AERA.
Swortzell, L. 1996. “History as Drama/drama as History: The Case for Historical Reconstruction as
a Research Paradigm.” In Researching Drama and Arts Education: Paradigms and Possibilities,
edited by P. Taylor, 97–104. London: RoutledgeFalmer.
Taylor, P., ed. 1996. Researching Drama and Arts Education: Paradigms and Possibilities. London:
RoutledgeFalmer.

Winston, J. 2006. “Researching through Case Study.” In Research Methodologies for Drama
Education, edited by J. Ackroyd, 41–62. Sterling, VA: Trentham.
Yin, R. 2006. “Case Study Methods.” In Handbook of Complementary Methods in Education
Research, edited by J. Green, G. Camilli, and P. Elmore, 33–55. Washington, DC: AERA.
Zatzman, B. 2006. “Narrative Inquiry: Postcards from Northampton.” In Research Methodologies
for Drama Education, edited by J. Ackroyd, 111–133. Sterling, VA: Trentham.
