Positive Behavioural Support 4204EDN: Intervention Implementation & Evaluation
4204EDN
Study Guide
Module 4
tailor this style of behavioural intervention to suit your own personal approach
to teaching, learning, and behavioural support.
Training the team across environments has long been a recommended feature of
behavioural intervention (Anderson, Albin, Mesaros, Dunlap, & Morelli-Robbins,
1993; Egel & Powers, 1989; LaVigna, Willis, Shaull, Abedi, & Sweitzer, 1994;
Presland, 1989). Training activities, however, continue to be sidelined in schools
(Knoff, 2017; Van Acker, Boreson, Gable, & Potterton, 2005). In many cases,
also, parents receive little coaching on how to implement support strategies at
home and in the community (Crone, Horner, & Hawken, 2004; Lucyshyn, Dunlap,
& Albin, 2002). As stated by LaVigna and Willis (1995), “no support plan,
regardless of its comprehensiveness or elegance, will produce the desired
outcomes unless it is fully and consistently implemented” (p. 14).
Reading the written intervention plan and its associated programs is rarely
sufficient preparation to ensure sound implementation. Some important reasons
for supplementing written material with structured training include:
Research across contexts (e.g., Bessette & Wills, 2007; Crone & Horner, 2003;
Kraemer, Cook, Browning-Wright, Mayer, & Wallace, 2008; Lewis et al., 2016)
suggests the following in relation to training and behavioural interventions.
Training should not be seen as a one-off activity. A longitudinal training
program is recommended prior to implementation, during implementation,
and following review periods.
Training programs should focus on the dissemination of practical knowledge
and the building of competence and confidence. Training therefore needs to
be concrete, specific, and meaningful to all stakeholders.
Training should use strategies such as modelling, coaching, monitoring, and
feedback.
Best outcomes are achieved when training sessions involve both parents and
staff. In such an environment, overall goals for the student and the team can
be discussed together with individual responsibilities. Other significant people
who interact with the student on a regular basis (e.g., respite care worker or
community volunteer at school) may also be included. The use of a “whole of
school” approach is clearly documented as recommended practice. Note that
it is important to involve the student in training activities, especially when
there is an emphasis on self-monitoring and data collection.
It is critical that staff and parents are trained to manage the reactive
strategies component of the plan effectively. Training should emphasise the need:
for self-control,
to manage the timing of action and nonaction according to the level of the
problem behaviour and the strategy base detailed in the support plan,
[Figure: implementation sequence] Put in place environmental & educational
changes → Put in place staff training → Put in place incident response & crisis
procedures → Ensure critical communications → Ensure “on-demand” problem
solving → Evaluate change in the student & the community.
Bambara and Knoster (1998, p. 31) suggest that data gathering within
intervention and evaluation phases should be guided by three main questions.
Dodd (1994, p. 38) provides a useful checklist for such occasions. She poses the
following questions for the team.
Have you overlooked a medical problem that may be contributing to the
behaviour problem?
Have you correctly identified the outcome that is motivating the problem?
Have you inadvertently changed the outcome?
Are other carers and family members also consistently carrying out the
program?
Are you encouraging alternative behaviour?
Owing to the difficulties associated with getting a large team together at one time,
reviews often involve:
a staff meeting where the outcomes of the core team meeting are shared,
and future action is discussed and agreed upon.
1 Review the vision for the student and long-term activity participation,
  relationship, and self-autonomy goals
    Have we lost sight of the long-term goals of the student’s support plan?
    Do in-place activities really address the long-term outcomes?
    Is the student developing relationships?
3 Review (and revise as necessary) the short-term prevention steps and
  procedures and the proposed adaptive alternatives
    Are the prevention steps and procedures difficult to implement? What could
    ease implementation?
    Are our hypotheses holding true?
    Are we working with the most relevant hypotheses?
    Are there more efficient ways to teach this student appropriate ways to
    meet personal needs?
Table 4.3. PBS Plan Monitoring Data Sheet (Jackson & Panyan, 2002, p. 352)
Young, West, and Macfarlane (1994, p. 77) provide the following sample
questions for analysing reinforcement.
Do you have evidence that you are using the most powerful reinforcer?
Do you vary reinforcers to avoid satiation?
Do you use the least amount of reinforcement possible?
Do or can you use a natural reinforcer?
Do you deliver the reinforcer immediately after a correct response?
Note that it is important during this period to keep monitoring the problem
behaviour and introduce changes in gradual systematic fashion. Careful
consideration is warranted regarding:
the point of commencement of generalisation and maintenance procedures,
the rate of change associated with these procedures,
the extent of change associated with these procedures.
Outcomes
Changes in behaviour (educational validity)
related to the targeted goal (short term or IEP)
speed & degree of effects
durability of effects
Generalisation of effects
Side effects of intervention
Change in overall quality of life
Social validity of plan & programs
Overall intervention effectiveness
Future recommendations
Evaluate outcomes
Evaluation should identify, first and foremost, the real outcomes of the
intervention. Meyer and Evans (1993) suggest that in the past, outcomes from
behavioural interventions have been described with such a narrow focus on the
problem behaviour that they failed to report many related positive effects.
Meyer and Janney (1989) list the following as potential socially valid outcomes
from any PBS intervention:
Outcome Item                        Psychologists  Persons with  Psychiatrists  Managers  Support  Parents  Nurses
                                                   Impairment                             Workers
1. Parents better understand why         28             3              4            3        3        5       2
   the behaviour occurs
2. Increased participation in            18            13              4            5        8       10       4
   community activities
4. Improved interpersonal                51             3              2            2        5        2       3
   environment in the home
Validity measures
Clinical or educational validity demonstrates the degree to which the problem
behaviour has changed as a function of the intervention. Evans and Meyer
(1985) describe this measure as requiring “at the very least, that we are able to
report baseline and acquisition performance levels that differ from one another in
a positive direction” (p. 150).
On the other hand, social validity refers to measures such as the feasibility and
acceptability of the intervention as perceived by others (e.g., family members,
school staff and, when relevant, the individual with problem behaviour).
Schloss and Smith (1998) provide the following description of social validity:
Social validity was introduced by Wolf (1978) who defined it as the social
significance of our goals, the social appropriateness of our procedures, and
the social importance of their effects. (p. 34)
Horner et al. (1990) put forward social validity and the role of dignity as the two
standards associated with the appropriateness of any intervention. Meaningful,
socially valid outcomes are now an established accountability measure within
PBS interventions (Dunlap, Fox, Vaughn, Bucy, & Clarke, 1997; Hieneman &
Dunlap, 2000; Jackson & Panyan, 2002).
Umbreit et al. (2007) provide a number of sample surveys to gauge level of social
validity for pre- and post- behavioural interventions in classrooms. Typical
questions used to gather this information (including some about program integrity
or validity) are:
Was the intervention a fair way to handle the student’s problem behaviour?
Was the intervention effective in changing the student’s problem behaviour?
Did the intervention achieve the targeted goal?
Were there any negative side-effects for the student?
Were there difficulties implementing the intervention in the classroom?
WWW reading
You are already familiar with the elements of analysis related to a phase (viz.,
trend, variability, and mean level). The new elements that compare performance
across phases are overlap and immediacy of effect. Both “within” and “across”
elements are detailed below.
a. Elements that describe performance within a phase (baseline or intervention)
trend
refers to the overall direction taken by the data path during a phase;
variability
refers to the extent to which the measures of behaviour, under the
same conditions, vary from one another during a phase;
signals the predictability or consistency of occurrence (i.e., how stable
the behaviour is).
mean level
is the average value of the behaviour during a phase;
is drawn as a line running parallel to the X axis.
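These within-phase elements can be computed directly from a series of session measures. The following minimal sketch in Python (illustrative only; the study guide itself does not include code, and the function name and data are invented for the example) estimates mean level, trend as a least-squares slope, and variability as the standard deviation for one phase:

```python
# Minimal sketch (illustrative only): within-phase elements for a series
# of behaviour counts recorded across sessions in one phase.

def phase_summary(counts):
    """Return mean level, trend (least-squares slope), and variability
    (standard deviation) for one phase of session data."""
    n = len(counts)
    mean_level = sum(counts) / n           # average value during the phase
    xs = range(n)                          # session indices 0..n-1
    x_bar = (n - 1) / 2                    # mean of the session indices
    # Least-squares slope: overall direction of the data path (trend).
    num = sum((x - x_bar) * (y - mean_level) for x, y in zip(xs, counts))
    den = sum((x - x_bar) ** 2 for x in xs)
    trend = num / den
    # Population standard deviation: spread of measures within the phase.
    variability = (sum((y - mean_level) ** 2 for y in counts) / n) ** 0.5
    return mean_level, trend, variability

# Example: counts of a problem behaviour over 5 baseline sessions.
baseline = [8, 9, 7, 10, 8]
mean_level, trend, variability = phase_summary(baseline)
```

A near-zero trend with low variability, as here, indicates a stable baseline from which intervention effects can be judged.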
A successful PBS intervention implementation is one that displays minimal
variability and a downward trend. The steeper the trend, the more effective the
PBS intervention. Effectiveness can also be argued on the basis of a substantial
decrease in mean level compared to that at baseline.
The graph includes trend and overlap lines. Mean levels could also be added to
assist team discussion because these lines would clearly show the substantial
reduction in problem behaviour.
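The across-phase comparison can be quantified in the same spirit. The sketch below (again illustrative; the percentage of non-overlapping data, PND, is one common way of quantifying overlap, not a statistic prescribed by the study guide) computes the drop in mean level from baseline to intervention and the degree of overlap for a behaviour targeted for reduction:

```python
# Minimal sketch (illustrative only): comparing baseline and intervention
# phases for a behaviour that the PBS plan aims to reduce.

def pnd_for_reduction(baseline, intervention):
    """Percentage of non-overlapping data (PND): for a behaviour being
    reduced, the share of intervention points that fall below the lowest
    baseline point. Higher PND means less overlap between phases."""
    floor = min(baseline)
    below = sum(1 for y in intervention if y < floor)
    return 100.0 * below / len(intervention)

def mean_level(counts):
    """Average value of the behaviour during a phase."""
    return sum(counts) / len(counts)

# Illustrative session counts of the problem behaviour.
baseline = [8, 9, 7, 10, 8]
intervention = [6, 5, 4, 3, 2, 2]

reduction = mean_level(baseline) - mean_level(intervention)  # drop in mean level
pnd = pnd_for_reduction(baseline, intervention)              # overlap measure
```

In this invented example every intervention point sits below the lowest baseline point, so overlap is minimal and the reduction in mean level is substantial, the pattern the team would hope to see in the graph.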
Evaluate effectiveness
Individual behavioural interventions should be evaluated not only in terms of
change in the behaviour (outcomes) but also in terms of effectiveness (input and
context). At least every semester, time needs to be put aside to complete a
cost/benefit analysis of each PBS plan and its associated programs and the
manner in which the intervention has proceeded over the implementation period.
Parents should be involved in this evaluative process (Cooper, Wacker, Sasso,
Reimers, & Donn, 1990; Lucyshyn, Horner, & Ben, 1996).
Time allocations, staffing constraints, and cooperation across the team are some
of the commonly identified factors that influence intervention outcomes.
Hieneman and Dunlap (2000, 2001) have extended this list to embrace 12 factor
categories. These are:
characteristics of the focus individual,
nature and history of the behaviour,
behavioural support plan design,
integrity of implementation,
nature of the physical environment,
buy-in to the intervention (e.g., commitment of significant others),
capacity of support providers,
relationships with the individual,
match with prevailing philosophy,
responsiveness of the system,
collaboration among providers,
community acceptance (i.e., social validity).
Level of PBS effectiveness should be shared at subsequent IEP reviews and any
individualised planning meetings. Such practice allows the key stakeholders to
form a realistic picture of the intervention and context, and to make judgements
accordingly. All too often, focus is on the student and intervention outcomes.
Conclusion
The implement-and-evaluate cycle for behavioural intervention is more intense
than the typical implement-and-evaluate cycle for learning. The importance of
staff training, implementation frequency, and the systematic collection and
interpretation of student data has been raised. These aspects are crucially
related to ensuring positive student outcomes (treatment validity). In addition,
attention needs to be paid to the social validity of the BIP and intervention
strategies together with the overall effectiveness of the PBS intervention. To this
end, the Rehabilitation Research & Training Center on Positive Behavior Support
provides some benchmarks for consideration across these final phases of the
PBS process.
______________________________________________________________
______________________________________________________________
______________________________________________________________
______________________________________________________________
______________________________________________________________
______________________________________________________________
______________________________________________________________
______________________________________________________________
______________________________________________________________
4. Please recommend material from the World Wide Web that you feel should
be included in this module.
______________________________________________________________
______________________________________________________________
______________________________________________________________
5. Please feel free to make any other comment about this module.
______________________________________________________________
______________________________________________________________
______________________________________________________________
Thank you
Please return to: Dr Wendi Beamish
School of Education and Professional Studies
Mt Gravatt campus Griffith University
176 Messines Ridge Road MT GRAVATT QLD 4122