
GRIFFITH UNIVERSITY

Arts, Education, & Law


School of Education and Professional Studies

POSITIVE BEHAVIOURAL SUPPORT

4204EDN

Study Guide
Module 4

Intervention Implementation & Evaluation

Convenor: Dr Wendi Beamish


Phone: 07 3735 5636
Fax: 07 3735 5910
Email: w.beamish@griffith.edu.au
Contents

Module 4: Intervention implementation and evaluation ............................................103

Introduction to the module .............................................................................. 103

What can you expect from this module?......................................................... 103

What will you need to do? .............................................................................. 104

How long should you spend on this module? ................................................. 104

Topic 1: Intervention implementation ..........................................................................105

Recruit resources and train the team ............................................................. 105

Implement, review progress, and modify intervention ..................................... 107

Topic 2: Intervention evaluation ...................................................................................113

Evaluate outcomes ........................................................................................ 113

Validity measures........................................................................................... 115

Evaluate effectiveness ................................................................................... 119

Conclusion ..................................................................................................... 120

Module summary ............................................................................................................121

Module 4 Evaluation....................................................................................... 122


Module 4: Intervention implementation and evaluation
Introduction to the module
This last module deals with both the implementation and evaluation of the
intervention plan and associated support programs (i.e., Phases 3 and 4).
Although this material is presented in a linear manner, it is essential that
implementation and evaluation be seen as interwoven activities, with one
informing the other. Figure 4.1 also shows the interconnectedness between these
phases and the design phase (Phase 2).

Figure 4.1. The Positive Behavioural Support Process model.

What can you expect from this module?


When you complete this module, you should have begun to:

 understand the theoretical and practical bases for a number of practices,
principles, and procedures related to behavioural implementation and
evaluation;

 display basic competence in behavioural intervention implementation and
evaluation for students with special needs.

4204EDN Module 4 Griffith University 103


What will you need to do?
As you work through this module you will need to:

 familiarise yourself with the literature related to this style of implementation
and evaluation;

 tailor this style of behavioural intervention to suit your own personal approach
to teaching, learning, and behavioural support.

How long should you spend on this module?


Suggested study time for this module = 2 hours.



Topic 1: Intervention implementation
Research suggests that achieving and sustaining high quality practice when
implementing behavioural interventions is related to a number of salient features.
These features form the basis of this topic and are dealt with under the following
sections:

 recruit resources and train the team,

 implement, review progress, and modify the intervention.

Recruit resources and train the team


To implement comprehensive behavioural support plans, additional resources
are frequently needed. Jackson and Panyan (2002, p. 152) advise considering
resourcing in four key areas:
 specialised training for staff and peers;
 people or teams of people who will be available to provide a plan’s learning,
prevention, and crisis supports in the classroom and other environments;
 materials such as communication and schedule boards, self-monitoring data
sheets, or video cameras;
 experienced personnel and appropriate materials for ongoing problem solving
and progress evaluation.

Training the team across environments has long been a recommended feature of
behavioural intervention (Anderson, Albin, Mesaros, Dunlap, & Morelli-Robbins,
1993; Egel & Powers, 1989; La Vigna, Willis, Shaull, Abedi, & Sweitzer, 1994;
Presland, 1989). Training activities, however, continue to be sidelined in schools
(Knoff, 2017; Van Acker, Boreson, Gable, & Potterton, 2005). In many cases,
also, parents receive little coaching on how to implement support strategies at
home and in the community (Crone, Horner, & Hawken, 2004; Lucyshyn, Dunlap,
& Albin, 2002). As stated by LaVigna and Willis (1995), “no support plan,
regardless of its comprehensiveness or elegance, will produce the desired
outcomes unless it is fully and consistently implemented” (p. 14).

Reading the written intervention plan and its associated programs is rarely
sufficient preparation to ensure sound implementation. Some important reasons
for supplementing written material with structured training include:

 Clear, written material can be misunderstood, especially when team members
are unfamiliar with specific techniques and/or use different management
practices. Training can provide not only explanation and discussion of
recommendations but also demonstrations of practice. Individual team
members’ responsibilities can be highlighted and clarified.



 Team members will more confidently implement the plan and its programs
when they are aware not only of the “what” and “how” but also the “why”.
Written material usually only outlines content (the “what”) and procedures (the
“how”). Training can provide opportunity for team discussion of the rationale
(the “why”) behind programming. This includes explaining the hypothesis and
sharing baseline data. Sessions also should aim to increase “shared
understandings” about the student and the implications of the impairment for
problem behaviour (e.g., making the team aware of the high degrees of stress
many individuals with ASD experience). LaVigna and Willis (1992) refer to this
as “putting ourselves in their shoes.”

 Greater consistency of implementation and data collection can be achieved
when training emphasises a uniform way of doing things rather than leaving
elements of programming open to interpretation. This practice not only
promotes efficiency and effectiveness across the team but decreases stress
for some individual team members.

Research across contexts (e.g., Bessette & Wills, 2007; Crone & Horner, 2003;
Kraemer, Cook, Browning-Wright, Mayer, & Wallace, 2008; Lewis et al., 2016)
suggests the following in relation to training and behavioural interventions.
 Training should not be seen as a one-off activity. A longitudinal training
program is recommended prior to implementation, during implementation,
and following review periods.
 Training programs should focus on the dissemination of practical knowledge
and the building of competence and confidence. Training therefore needs to
be concrete, specific, and meaningful to all stakeholders.
 Training should use strategies such as modelling, coaching, monitoring, and
feedback.
 Best outcomes are achieved when training sessions involve both parents and
staff. In such an environment, overall goals for the student and the team can
be discussed together with individual responsibilities. Other significant people
who interact with the student on a regular basis (e.g., respite care worker or
community volunteer at school) may also be included. The use of a “whole of
school” approach is clearly documented as recommended practice. Note that
it is important to involve the student in training activities, especially when there
is an emphasis on self-monitoring and data collection.

It is critical that staff and parents are trained to effectively manage the reactive
strategies component of the plan. Training should emphasise the need:
 for self-control,
 to analyse what’s happening,
 to manage the timing of action and nonaction according to the level of the
problem behaviour and the strategy base detailed in the support plan,
 for debriefing following the application of emergency procedures.

Managing serious behavioural episodes in an ongoing manner requires both
support from colleagues and self-management. LaVigna and colleagues (1994)
recommend a management system for the implementation and monitoring of
behavioural interventions, based on defining performance responsibilities and
expectations across the team, and the utilisation of a program status report
(PSR). These elements need to be introduced prior to implementation and are an
integral part of staff training.



If time permits, you may wish to examine these materials. The on-line PENT
training materials focus on functional training tips. Backus and CichoskiKelly
(2003) provide an extensive training manual for teaching assistants. La Vigna et
al. (1994) offer a thorough management system (including the setting of
performance standards by the team and a feedback loop) for operation across
the implementation and evaluation phases.

Key resources (Course Content, Module 4 folder)

PENT training materials


- Top 10 Training Tips
- Training Tips for New Trainers
- Principles of Adult Learning
- Training Planning Sheet: 6 Hour Sample
Additional handouts available at
http://www.pent.ca.gov/trn/training.html

Recommended PBS WWW material

OSEP Technical Assistance Center on PBIS


Training section including videos and online manuals
https://www.pbis.org/training

Illinois PBIS Network website with a range of valuable training tools


See Trainings from Former IL PBIS Network
Tier 3 resources especially for FBA-BIP training
http://www.istac.net/resources/illinois-pbis-network-resources

Backus & CichoskiKelly, 2003, Paraeducator’s Manual (407-page document)


http://www.eric.ed.gov/
search using CichoskiKelly.

Implement, review progress, and modify intervention


Jackson and Panyan (2002) provide a useful framework to follow for PBS
implementation. Table 4.1 presents this overall implementation process and
details the range of demanding components associated with implementing a
comprehensive PBS plan.

Table 4.1. PBS Implementation by Components (adapted from Jackson &
Panyan, 2002, p. 160)

Implement the PBS plan
 Put in place environmental & educational changes
 Put in place staff training
 Put in place incident response & crisis procedures
 Ensure critical communications
 Ensure “on-demand” problem solving
 Evaluate change in the student & the community

Revise the PBS plan



As with any educational intervention, implementation should be guided by
programming material that has been distributed to team members at, or following,
an initial training period. The plan and its programs should be initiated as
documented, with care being paid to specific strategies and data collection.

Checks across implementation environments should be conducted during the
first few weeks of implementation to see that the intervention is being carried out
as specified (i.e., has treatment integrity) and that data collection procedures are
yielding relevant information. The use of a home-to-school notebook (Hall, Wolfe,
& Bollig, 2003) ensures that data is shared across key environments and
between key stakeholders. Early troubleshooting ensures that unforeseen
problems are dealt with quickly. Extra assistance can also be provided to team
members on an individual needs basis. In this way, consistency of
implementation is achieved and team morale can be kept at a high level.

Bambara and Knoster (1998, p. 31) suggest that data gathering within
intervention and evaluation phases should be guided by three main questions.

 What type of information do you need to gather?

 How will you and your team collect this information?

 How will your team use the information to make decisions?

Monitoring behavioural changes in the home environment is important. Berg,
Wacker, Harding, and Asmus (1999) recommend the use of a brief behaviour
checklist. This checklist is completed weekly by parents and reports the overall
occurrences of the problem behaviour throughout that period. A home-to-school
notebook (see Hall, Wolfe, & Bollig, 2003) can serve a similar function.

There should be some decrease in the problem behaviour after an initial
implementation period of 4 weeks (or less, depending on the nature of the
problem and the context). If progress is not obvious to the team and not reflected
in the collected data, attention is warranted.

Dodd (1994, p. 38) provides a useful checklist for such occasions. She poses the
following questions for the team.
 Have you overlooked a medical problem that may be contributing to the
behaviour problem?
 Have you correctly identified the outcome that is motivating the problem?
 Have you inadvertently changed the outcome?
 Are other carers and family members also consistently carrying out the
program?
 Are you encouraging alternative behaviour?



 Have you arranged the situation so that the behaviour is less likely to occur
again?
 Have you taught the child some necessary skills to allow him alternate ways
of attaining what he wants?
 Have you restructured the child’s environment so that the need for the
behaviour problem is reduced?

Recommended practice suggests that this initial stocktake should be an integral
part of the first review of the plan and its associated programs. Throughout the
intervention, periodic reviews need to be consistently scheduled. Typically, these
should occur every 3-6 weeks.

Owing to the difficulties associated with getting a large team together at one time,
reviews often involve:

 a meeting of core team members (teacher, parent/s, student (where relevant),
principal, consultant and/or guidance personnel) where summarised data
across environments is presented and interpreted, and the effects of the
intervention discussed. Subsequent implementation will depend upon the
team’s decision to either
 continue to implement the plan and its programs, with no changes
occurring;
 modify the plan and/or its program/s, and implement;
 devise a new plan and programs, and implement.

 a staff meeting where the outcomes of the core team meeting are shared,
and future action is discussed and agreed upon.

As implementation proceeds, Wolery (2004) recommends that “intervention
strategies and practices should be monitored for frequent and accurate use” (p.
580) because “levels of correct implementation of intervention practices are
associated with children’s outcomes” (p. 578). Wolery provides guidelines for
undertaking checks on program integrity.
1. Monitor the general use of recommended intervention practices.
2. Examine the integrity of intervention practices when the student is not making
progress.
3. Develop a method of measuring intervention practice integrity.
4. Share the results of monitoring with all implementers.

Moreover, interventions “sometimes falter when a narrow focus on the problem
behaviors obscures the school’s failure to fully implement steps that promote the
support plan’s long-term goals” (Jackson & Panyan, 2002, p. 157). Jackson and
Panyan provide a recommended sequence for PBS intervention reviews in order
to avoid this pitfall. Table 4.2 details the recommended meeting sequence.

Continued implementation, therefore, should consist of sharing information on
the effects (positive and negative) of the intervention informally across the team
between reviews. Providing feedback and reinforcing efforts are important
considerations in team management. At the same time, the monitoring of
program integrity by team members should be viewed as an integral part of the
intervention process. Intermittent or ongoing training may also be necessary on
an individual and/or team basis.

Table 4.2. Recommended Sequence for PBS Review Meetings (adapted
from Jackson & Panyan, 2002, p. 157)

Step 1. Review the vision for the student and long-term activity participation,
relationship, and self-autonomy goals.
Important questions: Have we lost sight of the long-term goals of the student’s
support plan? Do in-place activities really address the long-term outcomes? Is
the student developing relationships?

Step 2. Review (and revise as necessary) the short-term expectations.
Important questions: Are the immediate behaviour change expectations for this
student unreasonable? Has the student made progress in certain areas that can
help us rethink our expectations at this juncture in the program?

Step 3. Review (and revise as necessary) the short-term prevention steps and
procedures and the proposed adaptive alternatives.
Important questions: Are the prevention steps and procedures difficult to
implement? What could ease implementation? Are our hypotheses holding true?
Are we working with the most relevant hypotheses? Are there more efficient
ways to teach this student appropriate ways to meet personal needs?

Step 4. Review (and revise as necessary) individual team member responses to
the problem behaviour.
Important questions: Are we having difficulties implementing responses to
problem behaviours? If so, what could help? What is the student’s present
response to these steps and procedures?

As implementation continues, programming modifications will need to be made
in an ongoing manner and documented. Carr, Levin, McConnachie, Carlson,
Kemp, Smith, and McLaughlin (1999) emphasise the need for behavioural
support plans to be dynamic, not static. Hence, progress on all aspects of the
PBS plan requires consistent monitoring. Jackson and Panyan (2002) have
designed a simple data collection form to assist with this task (see Table 4.3).

Table 4.3. PBS Plan Monitoring Data Sheet (Jackson & Panyan, 2002, p.
352)

PBS Plan Item | Implementation Rating | Contribution Rating | Comments
Implementation Rating: 0 = not at all; 1 = occasionally; 2 = consistently
Contribution Rating: 0 = not at all; 1 = helpful; 2 = very helpful


If progress is not evident on aspects of the PBS plan, the strategy base across
the multidimensional plan must be examined and adapted. Group problem
solving is recommended in these instances (Jackson & Panyan, 2002; Ruef et
al., 1999; Sugai & Horner, 1994).

If progress is evident on aspects of the PBS plan, changes related to
generalisation and maintenance should receive attention.

Action may include:

 mediating the generalisation and maintenance of behaviour change/s across
situations (places, people, and activities) and over time;

 introducing natural maintaining contingencies such as
 thinning reinforcement from continuous to intermittent schedules,
 shifting from contrived to naturally occurring reinforcement and
consequences,
 increasing variation in all procedures,
 fading out structured conditions and contingencies,
 fading in realistic conditions and situations that provoked the behaviour
and were abridged during initial treatment;

 introducing or extending self-management strategies.

It is important to remember that self-management includes self-recording of
data. Self-monitoring, cueing, and recording are increasingly being
recommended within PBS planning and intervention (Hutchinson, Murdock,
Williamson, & Cronin, 2000; McConnell, 1999).

As strong reinforcement is a crucial part of this nonaversive approach to
behavioural support, a review of reinforcement procedures in particular is
essential at this stage.

Young, West, and Macfarlane (1994, p. 77) provide the following sample
questions for analysing reinforcement.
 Do you have evidence that you are using the most powerful reinforcer?
 Do you vary reinforcers to avoid satiation?
 Do you use the least amount of reinforcement possible?
 Do or can you use a natural reinforcer?
 Do you deliver the reinforcer immediately after a correct response?



 Do you deliver the reinforcer only after a correct response?
 Can the learner see the reinforcer?
 Can the learner choose between different reinforcers?
 Do you use a denser rate of reinforcement initially?
 Do you need to thin the rate of reinforcement?
 Do you need to use a primary reinforcer?
 Do you need to use a secondary reinforcer?
 Do you avoid the use of negative reinforcement?

When incidents of problem behaviour occur, both quantitative and qualitative
data need to be recorded. Janzen (2003, pp. 455-457; Forms 24.2 and 24.3)
provides good formats. As behaviour change occurs, generalisation will need to
be followed by some maintenance procedures. Lapses in maintenance,
especially those concerning adherence to prescribed support strategies, are
common in classrooms once the problem behaviour reduces to an “acceptable”
level. As Singer (2000) reports, “From my work with families and in classrooms,
I’m a little suspicious of our maintenance data” (p. 123).

Maintaining behaviour change/s should not be left to chance and should be
specifically integrated into everyday management. Recommendations include:
 Continue to give praise and pleasant attention for appropriate behaviours as
often as possible. Try to develop and maintain a general attitude of looking for
behaviours to be pleased about.
 Continue to give occasional “surprise rewards” for specified behaviours, while
not actually programming for them.
 Tell the student how pleased you are about improvements in behaviour.

Note that it is important during this period to keep monitoring the problem
behaviour and to introduce changes in a gradual, systematic fashion. Careful
consideration is warranted regarding:
 the point of commencement of generalisation and maintenance procedures,
 the rate of change associated with these procedures,
 the extent of change associated with these procedures.



Topic 2: Intervention evaluation
Regular periodic reviews have already been mentioned as the means for
monitoring whether the intervention plan and its associated programs are
working. As such they form the basis for evaluating both the change in problem
behaviour and the effectiveness of the intervention itself. Their importance to the
total process is highlighted by Zarkowska and Clements (1988) who assert that
“regular meetings and reviews are an essential part of quality control” (p. 182).

Outcomes, social validity, overall intervention effectiveness, and
recommendations about further intervention should be carefully considered when
evaluating a behavioural intervention. Table 4.4 presents the specific breakdown
of elements that require attention.

Table 4.4. PBS Intervention Evaluation by Elements

Phase 5: Intervention Evaluation

 Outcomes
 Changes in behaviour (educational validity)
 related to the targeted goal (short term or IEP)
 speed & degree of effects
 durability of effects
 Generalisation of effects
 Side effects of intervention
 Change in overall quality of life
 Social validity of plan & programs
 Overall intervention effectiveness
 Future recommendations

Evaluate outcomes
Evaluation should identify, first and foremost, the real outcomes of the
intervention. Meyer and Evans (1993) suggest that in the past, outcomes from
behavioural interventions have been described with such a narrow focus on the
problem behaviour that they failed to report many related positive effects.

Meyer and Janney (1989) list the following as potential socially valid outcomes
from any PBS intervention:

 improvement in problem behaviour (reduction);

 acquisition of alternative skills & positive behaviours;

 acquisition of general self-control strategies to support behaviour change;

 positive collateral effects and absence of negative side effects;

 reduced need for medical/crisis management for student and/or others;

 less restrictive placements and greater participation in integrated school
experiences;

 subjective quality-of-life improvement: happiness, satisfaction, choices, and
control;



 perceptions of improvement by teachers/family/significant others;

 expanded social relationships and informal support networks.

Research by Fox and Emerson (2001), moreover, indicates that different
stakeholders hold different views on what are socially valid outcomes of PBS
interventions. In this research, 150 respondents from 7 stakeholder groups (28
young adults with intellectual impairment, 9 parents of individuals with intellectual
impairment, 22 psychologists, 7 psychiatrists, 31 nurses, 33 service managers,
and 20 direct support workers) rated 73 items concerning potential outcomes of
PBS interventions. Table 4.5 presents data signifying that many outcomes (e.g.,
reduction in problem behaviour and learning alternative skills and positive
behaviours) are ranked very differently in terms of importance across stakeholder
groups. Hence, when gathering information about outcomes across the team,
expect varying viewpoints and try to gather objective (not subjective) data.

Table 4.5. Group Ranking of Importance for 12 Most Important Outcomes
(Fox & Emerson, 2001, p. 185)

Rankings are listed by stakeholder group in the order: Persons with Impairment /
Parents / Psychologists / Psychiatrists / Nurses / Managers / Support Workers.

1. Parents better understand why the behaviour occurs: 28 / 3 / 4 / 3 / 3 / 5 / 2
2. Increased participation in community activities: 18 / 13 / 4 / 5 / 8 / 10 / 4
3. Increased engagement within the home: 21 / 13 / 4 / 7 / 3 / 4 / 10
4. Improved interpersonal environment in the home: 51 / 3 / 2 / 2 / 5 / 2 / 3
5. Person learns alternative way of getting needs met: 28 / 13 / 3 / 8 / 1 / 10 / 7
6. Reduction in problem behaviour: 60 / 1 / 13 / 1 / 2 / 1 / 1
7. Increased friendships and relationships: 1 / 7 / 4 / 14 / 19 / 18 / 14
8. Parent learns effective coping strategies: 28 / 7 / 13 / 4 / 6 / 8 / 11
9. Improved relationships between family members: 44 / 5 / 4 / 11 / 12 / 25 / 5
10. Person is able to stay living with family: 24 / 33 / 11 / 5 / 15 / 13 / 8
11. Person has greater control, is more empowered: 38 / 2 / 18 / 11 / 10 / 25 / 9
12. Person has more frequent social contact: 38 / 19 / 15 / 8 / 8 / 10 / 14

Note. Respondents were asked to rate a total of 73 outcomes.



Reporting on behaviour changes (i.e., data on the problem behaviour and skill
acquisition) is usually the focus when addressing intervention outcomes. Janney
and Snell (2008) recommend that behaviour changes should be graphed and
analysed prior to being presented to the team (see next section on visual
analysis of data). Reporting on student quality-of-life and family lifestyle
improvements presents a substantial challenge when addressing PBS outcomes.
Outcomes in these two areas are typically less apparent, and families are the key
to obtaining such data. Time needs to be invested in gathering this information in
subtle ways over time.

LaVigna and Willis (1995) provide an evaluation framework that is somewhat
narrower in terms of outcomes compared to that developed by Meyer and Janney
(1989). As shown in Figure 4.2, the LaVigna and Willis framework highlights only
the first 4 outcomes in the Meyer and Janney listing (see above).

Speed & Degree of Effects | Durability of Effects | Generalisation of Effects |
Side Effects | Social Validity | Clinical/Educational Validity

Figure 4.2. The framework for intervention evaluation.

The above framework, however, does incorporate some additional elements of
recommended practice. First, it emphasises the need to describe the change in
problem behaviour in terms of (a) speed and degree of effects, (b) durability of
effects, and (c) generalisation of effects. Within this set of descriptions, it is
important to indicate if the goal (short term or IEP) has been achieved.
Second, it involves the employment of a number of validity measures (viz.,
clinical, social, and program validity). You may recall that clinical and social
validity were introduced as integral to the process in Video 1 (A model for
nonaversive behavior management). Program validity or integrity has been
introduced earlier (under Implement, review progress, and modify intervention).

Validity measures
Clinical or educational validity demonstrates the degree to which the problem
behaviour has changed as a function of the intervention. Evans and Meyer
(1985) describe this measure as requiring “at the very least, that we are able to
report baseline and acquisition performance levels that differ from one another in
a positive direction” (p. 150).

On the other hand, social validity refers to measures such as the feasibility and
acceptability of the intervention as perceived by others (e.g., family members or
school staff and, when relevant, the individual with the problem behaviour).
Schloss and Smith (1998) provide the following description of social validity:
Social validity was introduced by Wolf (1978) who defined it as the social
significance of our goals, the social appropriateness of our procedures, and
the social importance of their effects. (p. 34)

Horner et al. (1990) put forward social validity and the role of dignity as the two
standards associated with the appropriateness of any intervention. Meaningful,
socially valid outcomes are now an established accountability measure within
PBS interventions (Dunlap, Fox, Vaughn, Bucy, & Clarke, 1997; Hieneman &
Dunlap, 2000; Jackson & Panyan, 2002).



A variety of formal (e.g., surveys) and informal (e.g., discussion at a review
meeting) methods can be used to gauge the social validity of the intervention.
Typically, social acceptability should be reviewed in relation to:
 documented procedures in the plan and programs,
 actual practices used in implementing the plan and programs,
 perceived outcomes of the intervention.

Umbreit et al. (2007) provide a number of sample surveys to gauge the level of
social validity before and after behavioural interventions in classrooms. Typical
questions used to gather this information (including some about program integrity
or validity) are:
 Was the intervention a fair way to handle the student’s problem behaviour?
 Was the intervention effective in changing the student’s problem behaviour?
 Did the intervention achieve the targeted goal?
 Were there any negative side-effects for the student?
 Were there difficulties implementing the intervention in the classroom?

Therefore, when evaluating PBS interventions, it is critical to determine whether:

 the overall intervention had positive outcomes (for the student and others)
and achieved targeted goals;
 key stakeholders agreed with (a) the actual practices used in implementing
the plan and support programs and (b) the perceived intervention outcomes;
 the intervention was carried out as specified in the plan and support
programs.
The questions detailed in Table 4.6 may assist in completing evaluative tasks.

Table 4.6. Outcomes Checklist (Bambara & Knoster, 1998, p. 34)

Specific Outcomes
 Have there been gains in new skills to enable the individual to meet personal
needs in a socially acceptable manner?
 Have there been reductions in the individual’s problem behaviour? Are these
reductions satisfactory?
 If there has been little or no decrease in problem behaviour, are there
sufficient increases in new skills? What can be done to better enable the
individual to decrease problem behaviour?
 If there has been little development of new skills and/or decreases in problem
behaviour, has your team (a) implemented the support plan in a consistent
manner, and (b) reevaluated your hypotheses and support strategies?

Broad Outcomes
 If there have been increases in new skills in combination with decreases in
problem behaviour, has your team (a) adequately planned for maintenance &
generalisation, and (b) considered broadening quality of life goals at the
discretion of the individual and family (e.g., increased contacts with peers via
events such as birthday parties)?
 In general, has the number of positive relationships that the individual has
with others increased, decreased, or remained the same?
 In general, has the number of times the individual participated in activities of
his choice in the school and in the community increased, decreased, or
remained the same?
 In general, has the number of times the individual, along with her family,
participated in activities of their choice at both school and in the community
increased, decreased, or remained the same?
 In general, has the individual’s general health/well-being improved,
decreased, or remained the same?
 What is the individual’s, and the family’s, satisfaction concerning personal
growth and development as a result of the support?

4204EDN Module 4 Griffith University 116


It may be timely to revisit the earlier reading by Clarke et al. (2003). In this
research, evaluation measures include not only those related to problem
behaviour, engagement, and transitioning but also perceptions of levels of
happiness and aspects of quality of life. The on-line readings below are also
recommended. In addition to providing suggestions for assessing parent, teacher,
and student satisfaction, these works collectively provide information on how to
use graphed data for evaluating and modifying behavioural plans.

WWW reading

Hieneman et al. (1999). Facilitator’s guide on positive behavioral support
(Implementing the plan, pp. 69-81)
http://www.apbs.org/files/PBSwhole.pdf

University of Florida’s Positive Behavior Support project
Range of evaluation materials
http://flpbs.fmhi.usf.edu/resources_indstudents.asp

Tools available @ Learning@Griffith under Resources
Positive Behavior Support Outcomes & Indicators (LRE Project).

Visual analysis of graphed data

Graphing and visual analysis of data provide feedback about programming so
that effective interventions can be distinguished from ineffective ones. When
interventions are deemed to be effective (i.e., the problem behaviour is reducing
and alternative skills are being learnt), no program modifications are necessary.
When interventions are found to be ineffective, however, program modifications
(changes to strategies, methods, or goals) are necessary so that time is not
wasted. As explained previously, data should be graphed and analysed across
baseline and intervention phases. By comparing the intervention data with the
baseline data, and by looking at the data trends and levels within each phase,
decisions regarding levels of effectiveness can be made.

You are already familiar with the elements of analysis related to a phase (viz.,
trend, variability, and mean level). The new elements that compare performance
across phases are overlap and immediacy of effect. Both “within” and “across”
elements are detailed below.
a. Elements that describe performance within a phase (baseline or intervention)
 trend
 refers to the overall direction taken by the data path during a phase;
 variability
 refers to the extent to which the measures of behaviour, under the
same conditions, vary from one another during a phase;
 signals the predictability and consistency of occurrence (i.e., how stable
the behaviour is).
 mean level
 is the average value of the behaviour during a phase;
 is drawn as a line running parallel to the X axis.
A successful PBS intervention implementation is one that displays minimal
variability and a downward trend. The steeper the trend, the more effective the
PBS intervention. An effective intervention can also be argued in terms of a
substantial decrease in mean level compared to that at baseline.
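For teams who record session data electronically, the three within-phase descriptors can also be computed rather than judged by eye. The sketch below is illustrative only and not part of the course materials; it uses Ben's baseline hit counts later reported in Table 4.7, and a least-squares slope is just one simple way of quantifying trend.

```python
# Illustrative sketch only: computing the three within-phase descriptors
# for a series of behaviour counts (one value per observation session).

def mean_level(counts):
    """Average value of the behaviour during a phase."""
    return sum(counts) / len(counts)

def trend(counts):
    """Slope of a least-squares line through the data path.
    Positive = upward trend, negative = downward trend."""
    n = len(counts)
    x_bar = (n - 1) / 2
    y_bar = mean_level(counts)
    num = sum((x - x_bar) * (y - y_bar) for x, y in enumerate(counts))
    den = sum((x - x_bar) ** 2 for x in range(n))
    return num / den

def variability(counts):
    """Range of values within the phase (a simple stability signal)."""
    return max(counts) - min(counts)

baseline = [8, 6, 7, 8, 8]          # Ben's baseline hit counts (Table 4.7)
print(mean_level(baseline))          # 7.4
print(round(trend(baseline), 2))     # 0.2 (slightly upward trend)
print(variability(baseline))         # 2
```

The small positive slope and modest range agree with the "slightly upward trend" and "some variability" descriptors reported for Ben's baseline.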



b. Elements that compare performance across phases
 overlap
 summarises the band of commonality in data values across baseline
and intervention; the band of commonality is established by drawing a
set of lines parallel to the X axis, one through the lowest value in the
baseline phase and the other through the highest value in the
intervention phase;
 reported as no/minimal/some/substantial/significant overlap;
 shows the effectiveness of the intervention (i.e., the smaller the
overlap, the more effective the intervention).
 immediacy of effect
 described by comparing the last 3 to 4 data points in the baseline phase
with the first 3 to 4 data points in the intervention phase;
 reported as no/small/moderate/significant immediacy of effect;
 shows the effectiveness of the intervention (i.e., the greater the
immediacy of effect, the more effective the intervention).
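These two across-phase elements can likewise be operationalised in a few lines. The sketch below is illustrative only; the data values are hypothetical, and counting intervention points at or above the lowest baseline value is just one way of quantifying overlap for a target behaviour that should decrease.

```python
# Illustrative sketch only (hypothetical data, decreasing target behaviour).

def overlap_count(baseline, intervention):
    """Number of intervention data points lying at or above the lowest
    baseline value, i.e., inside the band of commonality."""
    floor = min(baseline)
    return sum(1 for y in intervention if y >= floor)

def immediacy_of_effect(baseline, intervention, k=3):
    """Drop from the mean of the last k baseline points to the mean of
    the first k intervention points; a larger drop = greater immediacy."""
    last, first = baseline[-k:], intervention[:k]
    return sum(last) / len(last) - sum(first) / len(first)

baseline = [9, 7, 8, 9]              # hypothetical hit counts
intervention = [6, 7, 5, 4, 3, 2]
print(overlap_count(baseline, intervention))        # 1 overlapping point
print(immediacy_of_effect(baseline, intervention))  # 2.0
```

Here only one intervention point falls inside the band of commonality (minimal overlap), and the mean drops by 2 hits across the phase change (a moderate immediacy of effect for these hypothetical values).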
Figure 4.3 displays time-series data for the baseline and intervention conducted
in relation to Ben, a student who has been featured throughout the modules (see
Figure 2.3, Graphed baseline data).

Figure 4.3. Graphed baseline and intervention data for Ben G
(4 August – 21 October, 2008).

The graph includes trend and overlap lines. Mean levels could also be added to
assist team discussion because these lines would clearly show the substantial
reduction in problem behaviour.



Table 4.7 follows with a visual analysis of the graphed data. Read this summary
and then spend time reviewing Figure 4.3 to pinpoint the elements that point to an
effective intervention. First, the change in mean level across the two phases
provides strong evidence of the reduction in hitting under intervention. Second,
the minimal overlap (only 1 data point) shows that the intervention was effective
from the time of commencement. Third, the decreasing trend throughout October
shows that effectiveness was sustained across the month of intervention.

Table 4.7. Visual Analysis of Ben's Graphed Data across Phases

Baseline
 slightly upward trend
 some variability in occurrence of problem behaviour
 mean level of problem behaviour = 7.4 hits [(8+6+7+8+8)/5]

Intervention
 decreasing trend
 some variability in occurrence of problem behaviour
 mean level of problem behaviour = 2.6 hits [(4+4+3+3+2+1+1)/7]

Comparison: Baseline & Intervention
 the decrease in mean level shows a substantial reduction in problem behaviour
 the decreasing trend across intervention reflects this reduction
 the minimal overlap across phases points to the effectiveness of the intervention
 the intervention shows a small immediacy of effect
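As a check on the comparison row, the reduction in mean level can be recomputed from Ben's raw hit counts. This sketch is illustrative only; the percentage figure is derived here from the counts in Table 4.7 and is not stated in the guide.

```python
# Reproducing the mean-level comparison from Table 4.7 (Ben's hit counts).
baseline = [8, 6, 7, 8, 8]
intervention = [4, 4, 3, 3, 2, 1, 1]

mean = lambda xs: sum(xs) / len(xs)
reduction = mean(baseline) - mean(intervention)
percent = 100 * reduction / mean(baseline)

print(round(mean(baseline), 1), round(mean(intervention), 1))  # 7.4 2.6
print(round(percent))   # 65, i.e., roughly a 65% drop in mean hits
```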

To consolidate these concepts concerning the visual analysis of time-series data,
refer to these readings. You are required to include a visual analysis of data in
both assignments for this course.

Evaluate effectiveness
Individual behavioural interventions should be evaluated not only in terms of
change in the behaviour (outcomes) but also in terms of effectiveness (input and
context). At least every semester, time needs to be put aside to complete a
cost/benefit analysis of each PBS plan and its associated programs and the
manner in which the intervention has proceeded over the implementation period.
Parents should be involved in this evaluative process (Cooper, Wacker, Sasso,
Reimers, & Donn, 1990; Lucyshyn, Horner, & Ben, 1996).

Time allocations, staffing constraints, and cooperation across the team are some
of the commonly identified factors that influence intervention outcomes.
Hieneman and Dunlap (2000, 2001) have extended this list to embrace 12 factor
categories. These are:
 characteristics of the focus individual,
 nature and history of the behaviour,
 behavioural support plan design,
 integrity of implementation,
 nature of the physical environment,
 buy-in to the intervention (e.g., commitment of significant others),
 capacity of support providers,
 relationships with the individual,
 match with prevailing philosophy,
 responsiveness of the system,
 collaboration among providers,
 community acceptance (i.e., social validity).

The level of PBS effectiveness should be shared at subsequent IEP reviews and
any individualised planning meetings. Such practice allows the key stakeholders
to form a realistic picture of the intervention and its context, and to make
judgements accordingly. All too often, the focus rests solely on the student and
intervention outcomes.

Conclusion
The implement-and-evaluate cycle for behavioural intervention is more intense
than the typical implement-and-evaluate cycle for learning. The importance of
staff training, implementation frequency, and the systematic collection and
interpretation of student data has been raised. These aspects are crucially
related to ensuring positive student outcomes (treatment validity). In addition,
attention needs to be paid to the social validity of the BIP and intervention
strategies together with the overall effectiveness of the PBS intervention. To this
end, the Rehabilitation Research & Training Center on Positive Behavior Support
provides some benchmarks for consideration across these final phases of the
PBS process.

Table 4.8. Benchmarks for Effective Practice (Rehabilitation Research &
Training Center on PBS)

Feature: Implementation, Monitoring, and Evaluation

Benchmarks of Quality
 Training & resources: training and resources needed to insure implementation
of the behavioral intervention plan are made available to the team
 Action planning: an action plan for implementation is developed, including
specific objectives and activities, persons responsible, and time lines
 Monitoring (implementation): plan implementation is monitored (through reports
& observations) to insure that strategies are used consistently across
intervention settings
 Monitoring (outcomes): objective information is collected to evaluate the
effectiveness of the behavioral intervention plan; this information includes
a) decreases in problem behavior, b) increases in replacement skills and/or
alternative behaviors, c) achievement of broader goals, and d) durability of
behavior change
 Team communication & plan modification: team communicates consistently
(based on time lines determined by the team) to review the individual’s
progress and make necessary adjustments to the behavioral intervention plan



Module summary
This module has provided an overview of behavioural intervention
implementation and evaluation. We will now recapitulate the major points.

 The process for providing intensive behavioural intervention incorporates:
  functional behavioural assessment (FBA),
  behavioural intervention planning (BIP),
  intervention implementation and review,
  intervention evaluation.
 Intervention implementation and evaluation need to be an integral part of the
intensive behavioural intervention.
 Behavioural interventions demand collaboration, consistency, and frequent
monitoring and review across positive, structured, and data-based learning
environments.
 Evaluation of behavioural interventions should report on a wide array of
outcomes, validity measures, and effectiveness factors.

It would be appreciated if you could complete and return the Evaluation Sheet
for Module 4.



4204EDN, Positive Behavioural Support
Module 4 Evaluation
1. What were the most useful concepts and readings associated with this module?

______________________________________________________________

______________________________________________________________

______________________________________________________________

2. What aspects and readings were least useful in this module?

______________________________________________________________

______________________________________________________________

______________________________________________________________

3. What improvements would you suggest to this module?

______________________________________________________________

______________________________________________________________

______________________________________________________________

4. Please recommend material from the World Wide Web that you feel should
be included in this module.

______________________________________________________________

______________________________________________________________

______________________________________________________________

5. Please feel free to make any other comment about this module.

______________________________________________________________

______________________________________________________________

______________________________________________________________

Thank you
Please return to: Dr Wendi Beamish
School of Education and Professional Studies
Mt Gravatt campus, Griffith University
176 Messines Ridge Road MT GRAVATT QLD 4122
