(2016) Training Public School Special Educators To Implement Two Functional Analysis Models
DOI 10.1007/s10864-016-9247-2
ORIGINAL PAPER
Emily Gregori
Abstract The purpose of this study was to investigate the efficacy and efficiency
of a training package to teach public school special educators to conduct functional
analyses of challenging behavior. Six public school educators were divided into two
cohorts of three and were taught two models of functional analysis of challenging
behavior: traditional and trial-based functional analysis. The effect of the training
package on functional analysis implementation was evaluated using multiple-
baseline designs across participants for each functional analysis model. The
sequence of functional analysis models taught was counterbalanced across cohorts.
Following the training package, all participants reached 100 % implementation
fidelity during role-plays and classroom sessions with students, and maintained high
fidelity at follow-up role-play sessions. Data on training duration, trials to criterion,
and social validity revealed that trial-based functional analysis training had a shorter
duration and was rated as the more favorable functional analysis model.
J Behav Educ
Introduction
Functional behavior assessment (FBA) methods include indirect and descriptive assessments (e.g., Ellingson et al. 2000; Symons et al. 1998) and functional analysis (e.g., Pence et al. 2014).
Functional analysis (FA) is a component of an FBA involving the systematic
manipulation of environmental variables to determine antecedents which occasion
challenging behavior and sources of reinforcement which maintain challenging
behavior (Hanley et al. 2003). In a traditional FA (TFA; cf. Iwata et al. 1982/1994), a variety of social conditions are simulated to examine potential antecedents
and common sources of reinforcement for challenging behavior (e.g., to obtain adult
attention, to escape task demands). While TFA is considered to be the most precise
method for determining functions of challenging behavior (Delfs and Campbell
2010) and has been the subject of peer-reviewed research for over 30 years (Beavers
et al. 2013), FAs are rarely implemented in public school settings (Gresham et al.
2004). The absence of FAs in schools may be related to the level of difficulty of the
procedures, required personnel and time needed to carry out the procedures, and the
potentially lengthy and intensive training required to implement the FA (Daly et al.
1997; Ducharme and Shector 2011). Recent modifications to TFA models may help
to address some of the barriers affecting the use of FA in classroom settings (Lydon
et al. 2012).
Trial-based functional analysis (TBFA) was introduced by Sigafoos and Saggers
(1995) as an FA model which can be embedded into classroom environments.
Rather than relying on “artificial” simulated social conditions (McCahill et al. 2014,
p. 480) to test specific antecedents and reinforcers for challenging behavior, the
TBFA capitalizes on social situations already occurring in the individual’s
environment. Like the TFA model, data are collected on the occurrence of
challenging behavior following the presentation of specific antecedents, such as
removal of teacher attention, presentation of a task demand, or removal of a
preferred activity or item. However, unlike TFA models, which require massed
presentation of antecedents and reinforcement for challenging behavior (Iwata et al.
1982/1994), the TBFA model consists of brief (i.e., 2 min) trials distributed
throughout the school day. It has been proposed that the TBFA model may require
fewer resources and have higher social validity with public school teachers than
TFA models (cf. Lloyd et al. 2014; Rispoli et al. 2015a, b). While studies have
trained teachers in either TBFA (e.g., Kunnavatana et al. 2013) or TFA (e.g., Erbas
et al. 2006), no study has trained teachers in multiple FA models in order to
compare teacher implementation acquisition, required resources, or measures of
social validity of these FA models.
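To make the contrast with the TFA's massed sessions concrete, the TBFA's distributed-trial structure can be pictured as a small record per trial, each with a brief control and test segment. This is an illustrative sketch only: the names are invented, and the decision rule shown (challenging behavior during the test but not the control segment marks a trial as positive) is a common convention in the TBFA literature, not a procedure taken from this study.

```python
from dataclasses import dataclass

@dataclass
class TBFATrial:
    condition: str              # e.g., "attention", "escape", "tangible"
    behavior_in_control: bool   # challenging behavior during the control segment
    behavior_in_test: bool      # challenging behavior during the test segment

    def is_positive(self) -> bool:
        # Behavior in the test but not the control segment suggests the
        # tested contingency occasions the challenging behavior.
        return self.behavior_in_test and not self.behavior_in_control

# Hypothetical trials distributed across a school day:
trials = [
    TBFATrial("escape", False, True),
    TBFATrial("escape", False, True),
    TBFATrial("attention", False, False),
]
positives = sum(t.is_positive() for t in trials if t.condition == "escape")
print(positives)  # 2
```

Because each trial is brief and self-contained, records like these can be collected opportunistically as the relevant situations arise in the classroom, rather than in a dedicated assessment session.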
The purpose of this study was to compare teacher acquisition of TFA and TBFA.
Specific research questions were: (a) What are the effects of a teacher training
package on public school special educators’ implementation of the TFA model and
the TBFA model? (b) Do teacher training effects generalize to teacher implementation of FAs in their classrooms with students with autism spectrum disorder
(ASD)? (c) Do teacher training effects maintain at follow-up probes? (d) Which FA
model requires fewer sessions for teachers to acquire? (e) How do teachers view the
feasibility and acceptability of conducting an FA in the classroom and which FA
model do these teachers prefer?
Method
Participants
Fiona was a special education teacher with 2 years of teaching experience. Fiona
had a bachelor’s degree in special education and no previous experience with FA.
Fiona taught a preschool self-contained special education classroom for children
with ASD located in the elementary school. Her students were between 3 and 5 years of age and received services both within the special education classroom and in
inclusive settings. Typically, there were between two and three students in the
classroom at a time and one to three paraprofessionals.
Each of the educators was paired with one of her students for evaluating FA
implementation in the classroom. Students’ parents provided informed consent for
their child to participate in this study. All participating students were between 5 and
13 years of age, received special education services related to an educational
diagnosis of autism, and were reported by their teachers to engage in challenging
behavior daily or weekly. As Debbie was a district administrator without a caseload
of students, she implemented the classroom FA with Cate’s student. To ensure their
data were independent, Debbie and Cate were not present during each other’s
sessions and did not discuss any aspects of the study with one another. Student
target behaviors included: verbal protesting (Angela’s student), whining (Brooke’s
student), aggression (Cate and Debbie’s student), repetitive verbalizations (Ellen’s
student), and body clenching or shirt biting (Fiona’s student).
Setting and Materials

Training, role-play, and maintenance sessions were conducted within the educators’
classrooms or in an empty conference room in the school. In situ sessions were
conducted in the educators’ classrooms. All sessions were video-recorded for data
collection purposes. During baseline, role-play and performance feedback, and
maintenance phases, only the educator and two researchers were present. The in situ
probes were conducted in the educator’s classroom during the instructional day
when students and staff were present. To minimize disruption to the classroom, only
one researcher was present during these sessions.
FA materials varied depending on the available classroom materials, but included
items related to leisure (e.g., computer, toys, books) and instruction (e.g.,
worksheets, math manipulatives). Training materials included a copy of the Iwata
et al. (1982/1994) and Bloom et al. (2013) articles, a laptop computer loaded with a
PowerPoint® presentation with corresponding printed handouts of the slides, and a
DVD with 2-min video clips depicting each assessment condition of the FA models.
Experimental Design
Dependent Measures
Data were collected in vivo using pencil-and-paper data sheets for all study
phases. The dependent variable was teacher FA implementation fidelity measured
as the percentage of steps implemented correctly during each FA session using
the point-by-point method (Gast 2010). Implementation fidelity was scored using
a researcher-developed checklist (see “Appendix 1”). The checklist included a
task analysis of four to six steps for each condition of the FA model. For a step
in the task analysis to be scored correct, the educator had to implement that step
accurately throughout the entire session. The number of steps implemented
correctly for a given session, including both control and test components for the
TBFA, was divided by the total number of steps in that condition and multiplied
by 100 to obtain a percentage. Data were also collected on the number of
sessions required to reach implementation fidelity criterion (100 % implementation fidelity on a single session). Finally, the duration of role-play and
performance feedback sessions for each participant to reach criterion was
assessed by calculating the sum of all minutes of role-play and performance
feedback sessions for each participant.
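As a minimal sketch, the point-by-point fidelity calculation described above amounts to the following; the function name and the boolean-checklist representation are illustrative, not taken from the study's materials.

```python
def implementation_fidelity(steps_correct):
    """Percentage of task-analysis steps implemented correctly, scored
    point-by-point: (steps correct / total steps) * 100."""
    if not steps_correct:
        raise ValueError("checklist must contain at least one step")
    return 100.0 * sum(steps_correct) / len(steps_correct)

# A hypothetical five-step condition with four steps implemented correctly:
print(implementation_fidelity([True, True, False, True, True]))  # 80.0
```

The criterion described in the text (100 % fidelity on a single session) would then correspond to a checklist scoring `True` on every step.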
TFA
The TFA procedures were adapted from those described by Iwata et al. (1982/1994)
to be 5 min in duration and to include a tangible condition. Participants were taught
to implement four TFA conditions: (a) attention, (b) tangible, (c) escape, and
(d) play. The attention condition was designed to assess whether the student’s
challenging behavior was maintained by access to attention. In the attention
condition, the educator sat near the student with leisure materials or toys present.
The educator directed the student to engage with the materials then turned her body
away. Contingent upon the target challenging behavior, the educator approached the
student and provided brief physical attention (e.g., pat on the back) and verbal
attention such as asking, ‘‘Are you ok?’’ until challenging behavior had ceased for
10 s.
The tangible condition was designed to assess whether the student’s challenging
behavior was maintained by access to an item or activity. In the tangible condition,
the educator provided the student with 10 s of access to a preferred tangible (i.e., toy
or activity). The educator then placed the tangible in sight, but out of the student’s
reach and blocked access to the item. Contingent upon challenging behavior, the
educator provided the tangible to the student for 10 s or until challenging behavior
ceased.
The escape condition was designed to assess whether the student’s challenging
behavior was maintained by escape from task demands. During the escape
condition, the educator verbally instructed the student to complete a task related to
the student’s individualized education plan (IEP). The educator provided least-to-
most prompting to assist students in completing the task. These prompts included
verbal, verbal plus model, and verbal plus physical prompts. The educator provided
verbal praise upon task completion and then presented a new task demand.
Contingent upon challenging behavior, the educator removed task-related materials
and turned away from the student for 10 s. Task demands and related materials were
presented after the target challenging behavior had ceased for 10 s.
The play condition was designed to serve as a control condition. During the play
condition, no task demands or task-related materials were presented and educators
gave students unrestricted access to leisure activities or toys. The educator provided
praise and physical contact (e.g., pats on back) at least once every 10 s and did not
respond to challenging behavior.
TBFA
During the test component of the tangible condition, the educator removed the preferred item from the student and placed it in sight but out of reach. The educator told the student
he or she could have the item back later and then blocked student access to the item.
Contingent upon target challenging behavior, the educator gave the item to the
student and the trial ended.
The escape condition was designed to assess whether the student’s challenging
behavior was maintained by escape from task demands. During the control
component, the educator informed the student that he or she could have a break. The
educator turned away and did not interact with the student for 60 s, and no task-
related materials or demands were presented to the student. During the test
component, the educator instructed the student to complete a task related to the
student’s IEP. The educator provided least-to-most prompting to assist the student in
completing the task in the same fashion as the TFA. The educator provided verbal
praise upon task completion and then presented a new task demand. Contingent
upon challenging behavior, the educator removed the task materials and told the
student he/she could have a break and the trial ended.
Baseline
Prior to baseline, educators were emailed a research article describing the FA model
being taught. Educators read the seminal Iwata et al. (1982/1994) article for the
TFA model and the recent Bloom et al. (2013) for the TBFA model. Once the
educator confirmed she had read the article, the baseline session began. The first
author asked the educator to role-play a specific FA condition with the second
author, who played the role of a student with challenging behavior. The order of FA
conditions was randomized within each FA model. The educator was told to gather
any materials she needed and to begin when she was ready. The session continued
until the educator stated that the FA condition was finished. No performance
feedback was provided.
Educator Training
The first author implemented all training procedures with each educator individually. The training lasted 45 min and included a 30-min PowerPoint® presentation
of either the TFA or TBFA model, video examples of the corresponding FA model,
and opportunities for the participant to ask questions. Participants were provided
with corresponding handouts of each presentation slide. The presentation contained
an overview of operant conditioning, social functions of behavior, rationale for
conducting an FA, and task analyses of each FA condition.
Immediately following the 45 min training, the educator and a second researcher
practiced each FA condition with the researcher playing the role of the child each
time. The order in which the FA conditions were role-played was randomized, and
each role-play was considered one session. A role-play script was created for each
condition of each FA model. The scripts contained four to eight specific behaviors
for the researcher to role-play. These scripts were designed prior to the study and
were used across all educators. Scripts were designed to simulate authentic
examples of FAs with children who have challenging behavior. Examples of role-
play behaviors included: “Engage in a non-target challenging behavior at least once during the session,” and “Refuse to give back the tangible by holding on tightly when the educator attempts to take it.”
Following each session, the first author provided written and verbal performance
feedback concerning the educator’s implementation of the target condition. The
educator was shown the data sheet (see “Appendix 1”) for implementation fidelity.
In addition, the structure of performance feedback followed a researcher-developed
checklist (see “Appendix 2”). The researcher provided a positive comment about
the educator’s implementation then reviewed any steps incorrectly implemented
using a neutral tone. Role-play and feedback continued until the educator reached the preset performance criterion of 100 % implementation fidelity for one session of each condition of the target FA model.
In situ Probes
After the educators had reached 100 % implementation fidelity for each role-played
condition of the targeted FA model, the educator conducted the FA conditions in her
classroom with her student. Following each in situ session, the researcher provided
performance feedback according to the same procedures used in the role-play
sessions. This phase continued until the educator reached performance criterion of
100 % implementation fidelity for each condition for the target FA model.
Maintenance

Maintenance probes were conducted for three of the four multiple-baseline designs.
Due to time restrictions, maintenance probes for Cohort 1 TFA were not conducted.
Maintenance of implementation fidelity was assessed at 1–6-week intervals
following the completion of the in situ phase. At the time of maintenance probes,
some of the children who had participated in the in situ probes were receiving
challenging behavior intervention. Their teachers requested that a functional
analysis not be implemented with these children so as to not provide contingent
reinforcement for challenging behavior. As a result, maintenance probes were
identical to the role-play and performance feedback condition with a second
researcher playing the role of a student in the FA. This phase continued until the
educator reached performance criterion of 100 % fidelity for each condition of the
FA model.
Procedural Integrity
Procedural integrity data were collected during each phase by a researcher uninvolved with that phase of the study. For example, the first author
collected procedural integrity data for the role-plays and the second author for the
initial training and performance feedback sessions. Procedural integrity was
calculated by dividing the number of target behaviors the researcher engaged in
by the total number of expected researcher behaviors and multiplying by 100.
IOA on procedural integrity was calculated by a trained research assistant from
video recordings for at least 20 % of sessions across each participant and was
100 %. Data were collected on the researcher’s engagement in three behaviors
during the initial training including delivering the 30-min presentation, showing the
videos of each FA condition, and responding to participant questions. Procedural
integrity data were collected for 100 % of the trainings and were 100 %. Mean
procedural integrity for role-plays across participants for TFA was 98 % (range
96–100 %) and for TBFA was 98 % (range 75–100 %). Data were collected on the
researcher’s adherence to the performance feedback procedures during the role-play
and performance feedback, in situ, and maintenance phases. Mean procedural
integrity for performance feedback across phases and participants was 99 % (range
80–100 %).
Social Validity
At the conclusion of the study, each educator completed a 16-item Likert scale
questionnaire modified from the Treatment Acceptability Rating Form—Revised
(TARF-R, Reimers et al. 1992). The Likert scale ranged from one to six with one
corresponding to “strongly disagree” and six corresponding to “strongly agree.”
Participants responded to each item for the TFA and the TBFA individually. There
were 96 possible points for each FA model with higher scores indicating higher
acceptability. Reverse-scoring procedures were used for negatively keyed items.
Items on the scale included statements such as “The amount of time needed to implement this assessment strategy is acceptable” and “I believe this assessment strategy may lead to effective intervention for the student’s behavior.”
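The TARF-R scoring just described (16 items rated 1–6, reverse-scoring for negatively keyed items, 96 possible points) reduces to a short calculation; which items are negatively keyed is a hypothetical choice here, purely for illustration.

```python
def tarf_r_total(ratings, reverse_keyed):
    """Total TARF-R score: reverse-keyed items are flipped (1<->6, 2<->5,
    3<->4) before summing, so higher totals always mean higher acceptability."""
    assert all(1 <= r <= 6 for r in ratings), "ratings must be on a 1-6 scale"
    return sum((7 - r) if i in reverse_keyed else r
               for i, r in enumerate(ratings))

# 16 items all rated "strongly agree" (6), with items 3 and 10
# (a hypothetical keying) scored in reverse:
ratings = [6] * 16
print(tarf_r_total(ratings, reverse_keyed={3, 10}))  # 86
```

With no reverse-keyed items, 16 items rated 6 would yield the 96-point maximum mentioned above.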
Results
Cohort 1
TBFA
Figure 1 displays the results for Cohort 1 TBFA implementation fidelity. During
baseline, Angela implemented less than 16.77 % of steps correctly for the attention
and escape conditions, and zero steps correctly for the tangible condition. Following
training, Angela implemented the tangible condition correctly without requiring
performance feedback. She required one session of performance feedback to reach
100 % fidelity in the attention condition and two sessions of performance feedback
to reach criterion for the escape condition. Angela implemented the in situ classroom probes for the attention and tangible conditions without requiring performance feedback.
Fig. 1 Cohort 1 trial-based functional analysis implementation fidelity. Phase change hash marks indicate didactic instruction [multiple-baseline graphs omitted; panels for Angela, Brooke (2-week probe), and Cate (1-week probe) showing the percentage of TBFA steps implemented correctly across sessions]
Fig. 2 Cohort 1 traditional functional analysis implementation fidelity. Phase change hash marks indicate didactic instruction [multiple-baseline graphs omitted; panels for Angela, Brooke, and Cate showing the percentage of TFA steps implemented correctly across sessions]
TFA
After teachers had completed in situ probes for the TBFA, they began baseline for
the TFA. The TFA results for Cohort 1 are presented in Fig. 2. Angela’s TFA
fidelity ranged from 0 to 50 % of steps correct. She reached 100 % fidelity for the
attention, play, and escape conditions following training. Angela required one
performance feedback session to reach criterion for the tangible condition. Angela
generalized her implementation to the in situ probes in her classroom without requiring performance feedback.
Fig. 3 Cohort 2 traditional functional analysis implementation fidelity. Phase change hash marks indicate didactic instruction [multiple-baseline graphs omitted; panels for Debbie (6-week probe), Ellen (5-week probe), and Fiona (3-week probe) showing the percentage of TFA steps implemented correctly across sessions]
Fig. 4 Cohort 2 trial-based functional analysis implementation fidelity. Phase change hash marks indicate didactic instruction [multiple-baseline graphs omitted; panels for Debbie, Ellen (3-week probe), and Fiona (2-week probe) showing the percentage of TBFA steps implemented correctly across sessions]
Cohort 2
TFA
TBFA
After Cohort 2 had completed in situ probes for the TFA, they began baseline for the
TBFA model. These results are depicted in Fig. 4. Debbie’s TBFA baseline
implementation fidelity ranged from 0 to 66 % correct. Following training, Debbie
implemented each condition with 100 % fidelity during the role-play, in situ, and
4-week maintenance probe without requiring performance feedback. Ellen’s
baseline data ranged from 16 to 66 % of steps correct with a decreasing trend for
tangible, no trend for escape, and an increasing trend for attention. Like Debbie,
Ellen implemented each condition with 100 % fidelity during the role-play, in situ,
and 3-week maintenance probe without requiring performance feedback. Fiona’s
baseline data ranged from 33 to 50 % of steps correct. Following training, she
implemented each condition with 100 % fidelity. She implemented the escape and
attention conditions with 100 % fidelity in the in situ sessions and required one
performance feedback session for the tangible condition. Fiona maintained 100 %
fidelity for all conditions at the 2-week maintenance probe.
Sessions to Criterion
Figure 5 presents the sessions to criterion for each condition of each of the FA
models. Cohort 1 required a mean of 2 sessions to reach criterion in the TBFA and
1.17 sessions to reach criterion in the TFA phase. Cohort 2 was taught the TFA first
and the TBFA second. They required on average 1.5 sessions to reach criterion in
the TFA and only 1 session to reach criterion in the TBFA. In the TFA model, the
tangible condition required the most sessions to reach criterion while in the TBFA
model the escape condition required the most sessions to criterion. The TBFA
escape and attention conditions required more sessions than the TFA demand and
attention conditions. However, the TFA tangible condition required more sessions than
the TBFA tangible condition.
Training Duration

The duration of the training sessions was held constant (45 min) across FA models.
The duration of the role-play and performance feedback phase was measured to
capture the time required for participants to reach criterion levels for fidelity of each
FA model. These data are presented in Table 1. Mean duration of TBFA role-play
and performance feedback for both cohorts was 24 min 54 s. Mean duration of TFA
role-play and performance feedback across both cohorts was 59 min 54 s. Training
to criterion in the TBFA model took approximately 35 min less than the TFA
model.
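The roughly 35-min saving reported above follows directly from the two mean durations; a quick arithmetic check:

```python
def to_seconds(minutes, seconds):
    """Convert a duration given as minutes and seconds to total seconds."""
    return minutes * 60 + seconds

tbfa = to_seconds(24, 54)   # mean TBFA role-play + feedback duration (24 min 54 s)
tfa = to_seconds(59, 54)    # mean TFA role-play + feedback duration (59 min 54 s)
diff = tfa - tbfa
print(divmod(diff, 60))     # (35, 0): TBFA criterion was reached 35 min sooner
```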
Social Validity
Social validity scores for each FA model are presented in Table 2. The mean score
for the TBFA was 84 points (range 77–91 points) while the mean score for the TFA
Fig. 5 Mean number of sessions to criterion for each functional analysis condition [bar graph omitted; compares the trial-based and traditional models across the escape, attention, and tangible conditions]
was 72.5 points (range 56–90 points). Five of the six educators provided higher
scores for the TBFA as compared to the TFA model. All participants indicated they
strongly agreed that identifying the function of challenging behavior was important
for intervention development. They all agreed or strongly agreed that both the TFA
and TBFA models could allow them to develop effective interventions and that they
would be willing to carry out both FA models. However, educators reported that
both FA models had undesirable side effects, such as classroom disruption.
Discussion
The purpose of this study was to compare public school special educators’
acquisition of traditional and trial-based FA models. Our first research question was
to determine the effects of a teacher training package on implementation of each FA
model. All participants reached 100 % implementation fidelity for both FA models
following training. In instances where educators did not reach criterion levels
immediately following the didactic training, one to two performance feedback
sessions were required. This finding speaks to the potential utility of performance feedback delivered via structured feedback checklists to improve teacher practices in
school settings.
To address potential sequence effects, we counterbalanced the instructional
sequence of each FA model across cohorts. Our results suggest that teacher training
in one FA model, regardless of which model it was, led to more rapid acquisition of
the second FA model. It is possible that the procedures of occasioning and reinforcing challenging behavior were sufficiently similar across the two models that skills acquired while learning the first model transferred to the second.
Limitations
There are several limitations to this study which should be discussed. First, while
teachers implemented each FA model in role-play sessions, the only demonstration of each FA model in the classroom occurred during the in situ phase. Related to
this, in situ probes did not occur during baseline, thereby limiting the demonstration
of generalization following the training. Future research is needed to further
evaluate the effects of the training package on teacher FA implementation in
classroom settings. Second, while this study included four multiple-baseline
designs, one of these designs (Cohort 1 TBFA) included only one staggered
replication of intervention effect. Third, we taught teachers to conduct FA
conditions for socially maintained behavior, but did not train teachers to conduct
conditions to test automatically maintained challenging behavior. Fourth, participants were highly educated and trained teachers, many of whom were pursuing
graduate degrees and behavior analyst certification. This may have contributed to
some increasing baseline trends for specific conditions with Ellen and Brooke. It is
unknown if the findings from this study would generalize to teachers without these
degrees and certifications. Fifth, the purpose of this study was to evaluate the
training package on teacher FA implementation, and data on child challenging
behavior were not obtained. Future research should evaluate not only proximal
outcomes of teacher FA training on teacher behavior, but also distal outcomes on
child behavior. Finally, in assessing social validity, teachers rated each FA model,
but did not directly compare the models. Future research on social validity of FA
models should include open-ended questions in which teachers describe advantages
and disadvantages of the models in relation to one another.
This study illustrates how educators can be meaningfully involved in the FBA
process as FA implementers. However, teachers were not taught to collect, analyze,
or make data-based decisions based on the FA data. This is an important area for
future research and discussion. There is much debate about the level of expertise
necessary to design an FA, to analyze results, and to plan function-based
interventions and whether teachers versus specialists should take on these
responsibilities (e.g., Pence et al. 2014). A compromise to this debate may be
found through the use of FBA teams. Some researchers (Scott et al. 2005) advocate
for a team approach for conducting FBAs in which at least one team member is an
expert in challenging behavior. Determining how a team approach could be used to
facilitate high-quality FAs in school settings is an area in which future research is
needed. This study supports previous research that teachers can be meaningfully
involved in the FBA process as FA implementers (Kunnavatana et al. 2013). In this
capacity, teachers may be able to address some of the barriers to quality FBAs by
reducing reliance on outside personnel and by integrating FA into the student’s
typical school environment.
Summary
Conflict of interest The authors report no conflict of interest with this study.
Ethical Approval All procedures performed in studies involving participants were in accordance with
the ethical standards of the institution and/or national research committee and the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.
Informed Consent Informed consent was obtained from all individual participants included in the
study.
Appendix 1: Implementation Fidelity Checklists

TFA conditions

Attention
1. Educator instructs the child to play with toys and then ignores
2. Educator turns away from child and ignores appropriate behavior of child
3. Educator ignores inappropriate behavior other than the target behavior emitted by the child
4. Educator provides brief attention (express concern and brief physical contact) when the child emits the target behavior

Tangible
1. Educator presents the tangible item to the student for 10 s
2. Educator removes the item from the student and places it out of reach but visible to the student
3. Educator ignores appropriate behavior of child
4. Educator ignores inappropriate behavior other than the target behavior emitted by the child
5. Educator blocks any attempts to access the tangible item
6. Contingent upon target behavior, the educator gives the student access to the tangible item for 10 s and then removes it

Demand
1. Educator provides continuous instruction using a least-to-most prompt hierarchy (verbal, verbal + model, verbal + physical)
2. Educator delivers praise upon successful completion of a trial/task (regardless of the prompt level necessary for task completion)
3. Educator does not deliver any interactions/praise outside of the task conditions
4. Contingent upon target behavior, the educator removes the task and turns away for 10 s
5. Educator re-introduces the trial/task following 10 s if the child has ceased target behavior
6. Educator ignores all other inappropriate and appropriate behavior during task instruction

Play
1. Educator directs the child toward preferred items/toys
2. Educator responds to all appropriate social initiations of child
3. Educator ignores target and all other inappropriate behavior
4. Educator delivers attention approximately every 10 s
5. Educator engages in parallel or cooperative play as appropriate

Percentage correct: ____
Condition | Attention | Tangible | Demand

Control
   Attention: Educator instructs the participant to engage in independent work or leisure items. Educator does not engage in continuous demands. Educator provides participant with attention at least once every 5 s for a total of 60 s, regardless of participant engagement in challenging behavior. Attention does not include demands. No task materials or task demands are presented to child.
   Tangible: Educator sits near participant and provides unrestricted access to preferred item for 60 s. Educator does not provide attention if participant engages in challenging behavior.
   Demand: Educator tells the student "You can have a break." Educator turns away from child and does not provide attention for 60 s.

Test
   Attention: Educator instructs the participant to engage in independent work or leisure items. Educator explains that he/she needs to complete some work and turns body away from participant.
   Tangible: Educator sits near participant and places preferred item in sight but out of participant's reach (more than 2'). Participant access to item is blocked. Educator tells participant, "You can have this later."
   Demand: Educator presents task demands once every 10 s using least-to-most prompting (verbal, verbal + model, verbal + physical). Educator delivers praise (commenting or compliment) upon successful completion of a trial/task (regardless of the prompt level necessary to complete the task).
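The control and test procedures above run on fixed interval schedules (attention at least once every 5 s for 60 s in the attention control; a task demand every 10 s in the demand test). As an illustrative sketch only, not part of the study's materials, the schedule parameters could be captured in a small data structure that tells a session script when the educator should act; all names below (`ConditionSchedule`, `SCHEDULES`, `prompt_times`) are hypothetical:

```python
# Hypothetical sketch: encoding the interval schedules from the condition
# table so a session script could cue the educator on time.
# Durations and intervals are taken from the table; everything else is
# illustrative and does not come from the study.
from dataclasses import dataclass


@dataclass
class ConditionSchedule:
    name: str          # condition label from the table
    action: str        # what the educator does at each interval
    interval_s: int    # seconds between educator actions
    duration_s: int    # total condition length in seconds


SCHEDULES = [
    ConditionSchedule("attention-control", "deliver attention", 5, 60),
    ConditionSchedule("demand-test", "present task demand", 10, 60),
]


def prompt_times(schedule: ConditionSchedule) -> list[int]:
    """Seconds (from condition start) at which the educator acts."""
    return list(range(schedule.interval_s,
                      schedule.duration_s + 1,
                      schedule.interval_s))


# Attention control: an action at least once every 5 s across 60 s.
print(prompt_times(SCHEDULES[0]))  # 12 cues: [5, 10, ..., 60]
```

A timer loop (or simple interval beeps, as are sometimes used in FA training) could then walk through these cue times during a session.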
Participant: _____   Session: _____
Reviewer: _____   Date: _____
Please circle: TFA or TBFA      Check if second observer: _____

Directions: In the "Code" column, mark a "+" if the behavior is observed and a "-" if the behavior is not observed.

Criteria | Training fidelity (Code)
- Researcher began the session by making a positive comment about the therapist's implementation
- If applicable, researcher discussed an example of incorrect implementation in a neutral voice
- If applicable, researcher described the correct implementation procedure
- If applicable, researcher modeled the correct implementation procedure
- If applicable, researcher asked the therapist to verbally describe how they would correctly implement the procedure
- If applicable, researcher provided praise contingent on the therapist's correct verbal behavior
- Researcher addressed all instances of incorrect implementation from the previous videotaped sessions

Total correct: _____
Percentage correct: _____
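The form ends with a total and a percentage correct. A minimal sketch of that scoring arithmetic, assuming "+" marks an observed step, "-" a missed step, and "NA" a not-applicable step excluded from the denominator (the NA convention is an assumption on my part, not stated on the form):

```python
def fidelity_percentage(codes):
    """Percentage correct from a list of '+', '-', and 'NA' codes.

    '+'  = behavior observed (counts as correct)
    '-'  = behavior not observed (counts as incorrect)
    'NA' = step not applicable; dropped from the total (assumption)
    """
    scored = [c for c in codes if c != "NA"]
    if not scored:
        return 0.0
    return 100.0 * scored.count("+") / len(scored)


# Example: 5 of 6 applicable steps observed; one step not applicable.
print(round(fidelity_percentage(["+", "+", "-", "+", "NA", "+", "+"]), 1))
# prints 83.3
```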
References
Beavers, G. A., Iwata, B. A., & Lerman, D. C. (2013). Thirty years of research on the functional analysis
of problem behavior. Journal of Applied Behavior Analysis, 46, 1–21. doi:10.1002/jaba.30.
Bechtel, N., McGee, H., Huitema, B., & Dickinson, A. (2015). The effects of the temporal placement of
feedback on performance. Psychological Record, 65, 425–434. doi:10.1007/s40732-015-0117-4.
Blood, E., & Neel, R. S. (2007). From FBA to implementation: A look at what is actually being delivered.
Education and Treatment of Children, 30, 67–80. Retrieved from: http://www.
educationandtreatmentofchildren.net/.
Bloom, S. E., Lambert, J. M., Dayton, E., & Samaha, A. L. (2013). Teacher-conducted trial-based
functional analyses as the basis for intervention. Journal of Applied Behavior Analysis, 46, 208–218.
doi:10.1002/jaba.21.
Catania, A. C. (2012). Learning (5th ed., pp. 1–11). New York: Sloan Publishing.
Chitiyo, M., & Wheeler, J. J. (2009). Challenges faced by school teachers in implementing positive
behavior support in their school systems. Remedial and Special Education, 30, 58–63. doi:10.1177/
0741932508315049.
Daly, E. J., Witt, J. C., Martens, B. K., & Dool, E. J. (1997). A model for conducting a functional analysis
of academic performance problems. School Psychology Review, 26, 554–574. Retrieved from:
http://www.nasponline.org/publications/spr/.
Delfs, C. H., & Campbell, J. M. (2010). A quantitative synthesis of developmental disability research:
The impact of functional assessment methodology on treatment effectiveness. Behavior Analyst
Today, 11, 4–19. doi:10.1037/h0100685.
Ducharme, J. M., & Shector, C. (2011). Bridging the gap between clinical and classroom intervention:
Keystone approaches for students with challenging behavior. School Psychology Review, 40,
257–274. Retrieved from: http://www.nasponline.org/publications/spr/.
Ellingson, S. A., Miltenberger, R. G., Stricker, J., Galensky, T. L., & Garlinghouse, M. (2000). Functional
assessment and intervention for challenging behaviors in the classroom by general classroom
teachers. Journal of Positive Behavior Intervention, 2, 85–97. doi:10.1177/109830070000200202.
Erbas, D., Tekin-Iftar, E., & Yucesoy, S. (2006). Teaching special education teachers how to conduct
functional analysis in natural settings. Education and Training in Developmental Disabilities, 41,
28–36. Retrieved from: http://daddcec.org/Publications/ETADDJournal.aspx.
Gast, D. L. (Ed.). (2010). Single subject research methodology in behavioral sciences. New York:
Routledge Publishers.
Gresham, F. M., McIntyre, L. L., Olson-Tinker, H., Dolstra, L., McLaughlin, V., & Van, M. (2004).
Relevance of functional behavioral assessment research for school-based interventions and positive
behavioral support. Research in Developmental Disabilities, 25, 19–37. doi:10.1016/j.ridd.2003.04.
003.
Hanley, G. P., Iwata, B. A., & McCord, B. E. (2003). Functional analysis of problem behavior: A review.
Journal of Applied Behavior Analysis, 36, 147–185. doi:10.1901/jaba.2003.36-147.
Hassiotis, A., Robotham, D., Canagasabey, A., Romeo, R., Langridge, D., Blizard, R., & King, M. (2009).
Randomized, single-blind controlled trial of a specialist behavior therapy team for challenging
behavior in adults with intellectual disabilities. The American Journal of Psychiatry, 166,
1278–1285. doi:10.1176/appi.ajp.2009.08111747.
Individuals with Disabilities Education Act of 2004 (IDEA). (2004). Retrieved from http://idea.ed.gov/.
Iwata, B. A., Dorsey, M. F., Slifer, K. J., Bauman, K. E., & Richman, G. S. (1994). Toward a functional
analysis of self-injury. Journal of Applied Behavior Analysis, 27, 197–209. doi:10.1901/jaba.1994.
27-197 (Reprinted from Analysis and Intervention in Developmental Disabilities, 2, 3–20, 1982).
Kennedy, C. H. (2005). Single-case designs for educational research. Boston, MA: Allyn & Bacon.
Kunnavatana, S. S., Bloom, S. E., Samaha, A. L., & Dayton, E. (2013). Training teachers to conduct trial-
based functional analyses. Behavior Modification, 37, 707–722.
Lloyd, B. P., Wehby, J. H., Weaver, E. S., Goldman, S. E., Harvey, M. N., & Sherlock, D. R. (2014).
Implementation and validation of trial-based functional analysis in public elementary school
settings. Journal of Behavioral Education. doi:10.1007/s10864-014-9217-5.
Loman, S. L., & Horner, R. H. (2014). Examining the efficacy of a basic functional behavioral assessment
training package for school personnel. Journal of Positive Behavior Interventions, 16, 18–30.
doi:10.1177/1098300712470724.
Lydon, S., Healy, O., O’Reilly, M. F., & Lang, R. (2012). Variations in functional analysis methodology:
A systematic review. Journal of Developmental and Physical Disabilities, 24, 301–326. doi:10.
1007/s10882-012-9267-3.
McCahill, J., Healy, O., Lydon, S., & Ramey, D. (2014). Training educational staff in functional
behavioral assessment: A systematic review. Journal of Developmental and Physical Disabilities,
26, 479–505. doi:10.1007/s10882-014-9378-0.
Noell, G. H., Witt, J. C., Slider, N. J., Connell, J. E., Gatti, S. L., Williams, K. L., & Duhon, G. J. (2005).
Treatment implementation following behavioral consultation in schools: A comparison of three
follow-up strategies. School Psychology Review, 34, 87–106.
Pence, S. T., St. Peter, C. C., & Giles, A. F. (2014). Teacher acquisition of functional analysis methods using
pyramidal training. Journal of Behavioral Education, 23, 132–149. doi:10.1007/s10864-013-9182-4.
Reimers, T., Wacker, D., Cooper, L. J., & de Raad, A. O. (1992). Acceptability of behavioral treatments
for children: Analog and naturalistic evaluations by parents. School Psychology Review, 21,
628–643. Retrieved from: http://www.nasponline.org/publications/spr/.
Rispoli, M., Burke, M., Hatton, H., Ninci, J., Zaini, S., & Rodriguez, L. (2015a). Training Head Start
teachers to conduct trial-based functional analyses of challenging behavior. Journal of Positive
Behavior Interventions, 17, 235–244. doi:10.1177/1098300715577428.
Rispoli, M., Ninci, J., Burke, M., Zaini, S., Hatton, H., & Sanchez, L. (2015b). Evaluating the accuracy of
results for teacher implemented trial-based functional analyses. Behavior Modification, 39, 627–653.
doi:10.1177/0145445515590456.
Scott, T. M., Liaupsin, C., Nelson, C. M., & McIntyre, J. (2005). Team-based functional behavior
assessment as a proactive public school process: A descriptive analysis of current barriers. Journal
of Behavioral Education, 14, 57–71. doi:10.1007/s10864-005-0961-4.
Sigafoos, J., & Saggers, E. (1995). A discrete-trial approach to the functional analysis of aggressive
behaviour in two boys with autism. Australia & New Zealand Journal of Developmental
Disabilities, 20, 287–297.
Sprague, J. R., Flannery, B., O’Neill, R., & Baker, D. J. (1996). Effective behavioural consultation:
Supporting the implementation of positive behaviour support plans for persons with severe
challenging behaviours. Eugene: Specialised Training Program.
Sugai, G., Horner, R. H., Dunlap, G., Hieneman, M., Lewis, T. J., Nelson, C. M., & Ruef, M. (2000).
Applying positive behavior support and functional behavioral assessment in schools. Journal of
Positive Behavior Interventions, 2, 131–143. doi:10.1177/109830070000200302.
Sugai, G., Lewis-Palmer, T., & Hagan, S. (1998). Using functional assessments to develop behavior
support plans. Preventing School Failure, 43, 6–13. doi:10.1080/10459889809603294.
Symons, F. J., McDonald, L. M., & Wehby, J. H. (1998). Functional assessment and teacher collected
data. Education and Treatment of Children, 21, 135–160. Retrieved from: http://www.
educationandtreatmentofchildren.net/.
Van Acker, R., Boreson, L., Gable, R. A., & Potterton, T. (2005). Are we on the right course? Lessons
learned about current FBA/BIP practices in schools. Journal of Behavioral Education, 14, 35–56.
doi:10.1007/s10864-005-0960-5.