Course Overview

This course provides a foundation for exercise evaluation concepts as identified in the Homeland Security Exercise and Evaluation Program (HSEEP).

Objectives: After completing this course, you will be able to:

+ Define the roles and responsibilities of an exercise evaluator.
+ Discover the tools necessary to support the exercise evaluator for a successful exercise evaluation.
+ Identify the necessary tasks in conducting an exercise evaluation.
+ Recognize methods of analyzing exercise data.

Welcome to IS-0130.a, How to Be an Exercise Evaluator

In 2002, the Department of Homeland Security (DHS) developed the Homeland Security Exercise and Evaluation Program (HSEEP), which provides a common approach to exercise program management, design and development, conduct, evaluation, and improvement planning. HSEEP aligns local preparedness efforts with the National Preparedness Goal and the National Preparedness System. As a key component of national preparedness, exercises provide officials and stakeholders from across the whole community with the opportunity to shape planning, assess and validate capabilities, and address areas for improvement.

IS-0130.a How to Be an Exercise Evaluator provides learners with the basics of serving as an exercise evaluator. The course also builds a foundation for exercise evaluation concepts as identified in the HSEEP. The course was designed for anyone who has accepted the responsibility of being an evaluator.

Lesson Synopses

This course contains six lessons.

+ The remainder of Lesson 1, Introduction to Exercise Evaluation, provides an overview of the exercise evaluation process and the improvement planning process as described in the Homeland Security Exercise and Evaluation Program (HSEEP).
+ Lesson 2, Roles and Responsibilities of the Exercise Evaluator, explains the responsibilities of a lead evaluator, the criteria used to select evaluators, and the components of an evaluator briefing.
+ Lesson 3, Tools of the Evaluator, discusses the purpose and content of an Exercise Evaluation Guide (EEG), the Controller and Evaluator Handbook (C/E Handbook), and the Master Scenario Events List (MSEL).
+ Lesson 4, Exercise Observation and Data Collection, discusses how to collect data during an evaluation, the challenges of evaluation, and the purpose of the player hot wash.
+ Lesson 5, Evaluation Data Analysis, describes the components of the post-exercise Controller/Evaluator (C/E) debriefing and how to develop effective recommendations for improvement.
+ Lesson 6, Exercise Wrap-up Activities, discusses the purpose and content of both the After-Action Report/Improvement Plan (AAR/IP) and the After-Action Meeting (AAM).

Lesson 1 Overview

This lesson provides an understanding of the exercise cycle, with an emphasis on evaluation and the qualifications needed to be an effective evaluator.

Objectives: At the end of this lesson, you will be able to:

+ Distinguish between discussion-based and operations-based exercise evaluations.
+ Explain the purpose of exercise evaluation.

What is Evaluation?

Evaluation is the act of observing and recording exercise activity or conduct, assessing behaviors or activities against exercise objectives, and noting strengths, weaknesses, deficiencies, or other observations.

Evaluation methods can differ between discussion-based exercises and operations-based exercises. For example:

+ Discussion-based exercises: Evaluators or note-takers observe and record participant discussion.
+ Operations-based exercises: Evaluators observe and record participant actions.

The data recorded by evaluators forms the analytical basis for determining if an exercise meets its objectives. Evaluation should not be a single event in the exercise cycle; instead, it should be carefully integrated into overall exercise design. The output of exercise evaluation is information used to improve performance. For this reason, exercise evaluation is part of an ongoing process of improvements to preparedness.

Evaluator Characteristics

The capabilities and objectives tested in an exercise play a critical role in recruiting evaluators. Evaluators should have experience, preferably subject matter expertise, in their assigned functional area. They should have functional knowledge, including familiarity with the plans, policies, procedures, and agreements between local agencies and jurisdictions.

In addition to subject matter expertise, evaluators must be able to provide objective evaluations. Members of a participating agency may feel pressured to favor outcomes for their agency. For this reason, it is sometimes best to recruit evaluators from non-participating agencies, either within or from outside of the jurisdiction.

The Exercise Evaluation and Improvement Planning Process

The Exercise Evaluation and Improvement Planning Process has eight steps. In your role as an exercise evaluator, you will participate in activities that fall within Step 2 - Observe the exercise and collect data, and Step 3 - Analyze data.

The first four steps of the process, Exercise Evaluation, address evaluation planning, observation, and analysis. The last four steps of the process, Improvement Planning, focus on using information gained from exercises to implement improvements to a jurisdiction's capabilities.

Lesson 1 Summary

In this lesson, you learned about the exercise cycle, evaluation and improvement planning, and the qualifications needed to be an effective evaluator.

Objectives: Having completed this lesson, you are able to:

+ Distinguish between discussion-based and operations-based exercise evaluations.
+ Explain the purpose of exercise evaluation.

The next lesson presents information on Roles and Responsibilities of the Exercise Evaluator.

Lesson 2 Overview

This lesson provides an understanding of the responsibilities of evaluation team members, including the time commitments required.

Objectives: At the end of this lesson, you will be able to:

+ Identify the criteria used to select evaluators.
+ Describe the components of an evaluator briefing.

As an Evaluator, You Are a Member of a Team...

The evaluation team is led by a lead evaluator. Appointed by the exercise planning team leader, the lead evaluator oversees all facets of the evaluation. The lead evaluator is charged with developing evaluation requirements, data collection methods, and corresponding tools and documentation to be used during the exercise. In addition, the lead evaluator selects, assigns, and trains his/her evaluation team members. Your lead evaluator is your manager or boss for the duration of the exercise/evaluation/improvement planning.

The goal of the evaluation process is to obtain objective evaluations.
An evaluator should be familiar with the mission areas, core capabilities, plans, policies, and procedures that will be examined during the exercise. Subject matter expertise and experience in the assigned functional area for evaluation are beneficial in maintaining objectivity.

A pre-exercise evaluator briefing is held for all evaluation team members to ensure a shared understanding of what key data to collect and how that data will contribute to the evaluation of the exercise. The evaluator briefing provides an opportunity to resolve unanswered questions, clarify roles and responsibilities, and distribute any last-minute changes or updates.

It is important to note that, depending on the complexity of the exercise, the evaluator briefing may be combined with the controller briefing into a Controller/Evaluator (C/E) briefing, and could require briefings at more than one site, depending on the organization and exercise layout.

Evaluator Responsibility

As an evaluation team member who participated in an exercise, keep in mind that you may be contacted to answer questions and provide additional information concerning your evaluation results.

Lesson 2 Summary

In this lesson, you learned about the responsibilities of the evaluation team.

Objectives: Having completed this lesson, you are able to:

+ Identify the criteria used to select evaluators.
+ Define the challenges of evaluation.
+ Describe the components of an evaluator briefing.

The next lesson presents information on the Tools of the Evaluator.

Lesson 3 Overview

This lesson provides an understanding of the documentation tools used by the evaluation team to observe and collect data on player/participant reactions to prompts.

Objectives: At the end of this lesson, you will be able to:

+ Identify the purpose and content of the Controller and Evaluator Handbook (C/E Handbook).
+ Identify the purpose and content of an Exercise Evaluation Guide (EEG).
+ Identify the purpose and content of the Master Scenario Events List (MSEL).

Exercise Evaluation Tools

Discussion-based Exercises

+ Situation Manual (SitMan): The primary reference material provided to all participants. It is the textual background for the facilitated exercise and discussion. The SitMan structure includes an overview (scope, objectives, core capabilities, rules, and exercise agenda), the scenario broken into modules, and discussion questions at the end of each module.

Operations-based Exercises

+ Controller/Evaluator (C/E) Handbook: The primary evaluation documentation for the exercise.
It is used as an instructional manual for controllers and evaluators and is sometimes called the Evaluation Plan. Its contents are detailed later in this lesson.
+ Master Scenario Events List (MSEL): A chronological listing of events, in spreadsheet format, that drives exercise play. The MSEL is used during operations-based exercises or complex discussion-based exercises. MSEL events include contextual injects, expected action events (milestones), and contingency injects.

Situation Manual (SitMan)

A Situation Manual (SitMan) is the core documentation provided to all participants in a discussion-based exercise. It is the textual background for a facilitated exercise. It supports the scenario narrative and is the primary reference material.

Typically, the SitMan includes the following:

+ Exercise scope, objectives, and core capabilities
+ Exercise assumptions and artificialities
+ Instructions for exercise participants
+ Exercise structure
+ Exercise scenario background
+ Discussion questions and key issues
+ Schedule of events

Reference materials may include:

+ Material Safety Data Sheets (MSDS)
+ Relevant documentation, plans, SOPs, etc.
+ Jurisdiction-specific threat information
+ A list of reference terms

Exercise Evaluation Guide (EEG) Goals and Components

EEGs are designed to accomplish several goals:

+ Streamline data collection
+ Enable thorough assessments of the participating organizations' capability targets
+ Support development of the After-Action Report (AAR)
+ Provide a consistent process for assessing preparedness through exercises
+ Help organizations map exercise results to exercise objectives, core capabilities, capability targets, and critical tasks for further analysis and assessment

Each EEG consists of the following components:

+ The EEG: asks evaluators to record the completion or non-completion of tasks and performance measures.
+ The EEG Analysis Sheets: the Observations Summary section of the EEG Analysis Sheets asks evaluators to record the general flow of their observations. The Evaluator Observations section of the EEG Analysis Sheets prompts evaluators to list their major observations, including strengths and areas for improvement.

Advance to the next screen to review an image of the Exercise Evaluation Guide (EEG).

Example Exercise Evaluation Guide (EEG)

(Image: an example EEG form.)

EEG Format

The EEG format presents the following evaluation requirements to exercise evaluators:

+ Core capabilities: The distinct critical elements necessary to achieve a specific mission area (e.g., prevention).
To assess both capacity and gaps, each core capability includes capability targets.
+ Capability target(s): The performance thresholds for each core capability; they state the exact amount of capability that players aim to achieve. Capability targets are typically written as quantitative or qualitative statements.
+ Critical tasks: The distinct elements required to perform a core capability; they describe how the capability target will be met. Critical tasks generally include the activities, resources, and responsibilities required to fulfill capability targets. Capability targets and critical tasks are based on the operational plans, policies, and procedures to be exercised and tested during the exercise.
+ Performance ratings: The summary description of performance against target levels. Performance ratings include both Target Ratings, describing how exercise participants performed relative to each capability target, and Core Capability Ratings, describing overall performance relative to the entire core capability.

For each EEG, evaluators provide a target rating, observation notes including an explanation of the target rating, and a final core capability rating. In order to efficiently complete these sections of the EEG, evaluators focus their observations on the capability targets and critical tasks listed in the EEG. The sketch below illustrates how these pieces fit together.
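To make the relationships among core capabilities, capability targets, critical tasks, and ratings concrete, here is a minimal data-structure sketch in Python. The class and field names are illustrative assumptions, not an official HSEEP schema or FEMA tool.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative sketch only: names are assumptions, not an official HSEEP schema.

@dataclass
class CriticalTask:
    description: str                  # distinct element required to perform the capability
    completed: Optional[bool] = None  # recorded by the evaluator during exercise play

@dataclass
class CapabilityTarget:
    statement: str                    # e.g., "Within 4 hours of the incident, activate the EOC"
    critical_tasks: List[CriticalTask] = field(default_factory=list)
    target_rating: Optional[str] = None   # P, S, M, or U (rating scale covered in Lesson 4)
    observation_notes: str = ""           # explanation supporting the target rating

@dataclass
class ExerciseEvaluationGuide:
    core_capability: str              # e.g., "Operational Coordination"
    targets: List[CapabilityTarget] = field(default_factory=list)
    core_capability_rating: Optional[str] = None  # final rating across all targets
```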
The Controller/Evaluator (C/E) Handbook

The Controller/Evaluator Handbook (C/E Handbook) is the primary evaluation documentation for the exercise. It is used as an instructional manual for controllers and evaluators and is sometimes called the Evaluation Plan. It typically contains the following information:

+ Exercise-Specific Details: Exercise scenario, schedule of events, and evaluation schedule
+ Evaluator Team Organization, Assignments, and Locations: A list of evaluator locations, shift assignments, a map of the exercise site(s), the evaluation team organizational chart, and evaluation team contact information
+ Evaluator Instructions: Step-by-step instructions for evaluators for activities before, during, and following the exercise
+ Evaluation Tools: EEGs, the MSEL or a list of venue-specific injects, electronic or manual evaluation logs or data collection forms, relevant plans and procedures, Participant Feedback Forms, and Hot Wash templates

The C/E Handbook may be a standalone document or a supplement to the Exercise Plan (ExPlan). It may also be broken into separate controller and evaluator versions.

Master Scenario Events List (MSEL)

In more complex exercises (operations-based or complex discussion-based exercises), a Master Scenario Events List (MSEL) provides a timeline and location for all expected exercise events and injects (actions that push the scenario forward). It is important that all evaluators have a copy of the MSEL. It keeps evaluators on track, in their expected locations and observing actions, as assigned.

Evaluators can refer to the MSEL to help determine the times at which specific evaluators should be at certain locations.

For discussion-based exercises, the assignment of evaluators depends on the number of players, the organization of the players and discussion, and the exercise objectives.

Components of the MSEL

The MSEL should contain:

+ A chronological listing that supplements the exercise scenario
+ Scenario time
+ Event synopses
+ All injects, including the controller responsible for each inject, as well as any evaluator special instructions
+ Intended player (agency or individual player)
+ Expected participant responses (when developing storyboards, add a note clarifying that evaluators evaluate expected responses against actual responses)
+ Objectives and core capabilities
+ A notes section with special instructions

Advance to the next screen to review an image of the Master Scenario Events List (MSEL).

MSEL Example

(Image: an example MSEL spreadsheet; the row-level detail is not legible in this printable version. A structural sketch follows.)
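Because the MSEL is a time-ordered spreadsheet, its rows are straightforward to sketch as data. The following minimal sketch uses field names that mirror the components listed above; the layout is an assumption for illustration, not a prescribed HSEEP format.

```python
from dataclasses import dataclass
from typing import List, Optional

# Illustrative sketch of MSEL rows; field names mirror the MSEL components above.

@dataclass
class MselInject:
    scenario_time: str           # sortable timestamp, e.g., "2024-05-01T09:15"
    event_synopsis: str          # short description of the event or inject
    controller: str              # controller responsible for delivering the inject
    intended_player: str         # agency or individual player receiving the inject
    expected_response: str       # what evaluators compare actual actions against
    core_capability: str         # objective/core capability the inject exercises
    notes: Optional[str] = None  # special instructions, e.g., for evaluators

def chronological(injects: List[MselInject]) -> List[MselInject]:
    # Keep injects in scenario-time order so evaluators can see when they
    # need to be at their assigned locations (assumes sortable timestamps).
    return sorted(injects, key=lambda inject: inject.scenario_time)
```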
Lesson 3 Summary

In this lesson, you learned about the Exercise Evaluation Guide (EEG), Controller/Evaluator (C/E) Handbook, and the Master Scenario Events List (MSEL).

Objectives: Having completed this lesson, you are able to:

+ Identify the purpose and content of an Exercise Evaluation Guide (EEG).
+ Identify the purpose and content of the Controller and Evaluator Handbook (C/E Handbook).
+ Identify the purpose and content of the Master Scenario Events List (MSEL).

The next lesson presents information on Exercise Observation and Data Collection.

Lesson 4 Overview

This lesson provides an overview of the observation, data collection, and hot wash processes.

Objectives: At the end of this lesson, you will be able to:

+ Describe the exercise-related data that evaluators should collect.
+ Define the challenges of evaluation.
+ Identify the purpose and content of the player hot wash.

Exercise Observations and Data Collection

Exercise evaluators should observe exercise activity in a non-attribution environment, in accordance with the evaluation training and EEGs. Evaluators will generally be able to collect information about the following topics related to execution of capabilities and tasks examined during the exercise:

+ Utilization of plans, policies, and procedures related to capabilities.
+ Implementation of legal authorities.
+ Understanding and assignment of roles and responsibilities of participating organizations and players.
+ Decision-making processes used.
+ Activation and implementation of processes and procedures.
+ How and what information is shared among participating agencies/organizations and the public.

Discussion-based exercises usually focus on issues involving plans, policies, and procedures; consequently, observations of these exercises may consist of an evaluator or note-taker recording data from participant discussions on EEGs. On the other hand, operations-based exercises focus on issues affecting the operational execution of capabilities and critical tasks. During these exercises, evaluators collect and record participant actions, which form the analytical basis for determining if critical tasks were successfully demonstrated and capability targets were met.

Using EEGs in Exercise Observation

As you learned in Lesson 3, Exercise Evaluation Guides (EEGs) identify the activities, tasks, and performance measures that the evaluator should observe during the exercise.

Evaluators should complete the EEG so that:

+ Events can be reconstructed at a later time (such as during summary sessions).
+ Evaluators can conduct root cause analyses of problems.

To ensure EEGs are fully complete, evaluators should:

+ Synchronize their timekeeping with other evaluators before the exercise.
+ Record the name and time of the exercise (as applicable).
+ Log times and locations accurately.
+ Take notes on whether exercise simulations affect the observed tasks.

Complete EEGs are essential to the development of the After-Action Report/Improvement Plan (AAR/IP).

The EEG Observations Section

The EEG Observations Section allows exercise evaluators to record general exercise events, specific actions deserving special recognition, particular challenges or concerns, and areas needing improvement. The information recorded in the EEGs is used to develop the AAR/IP.

The standard sources, such as EEGs, are not the only sources of information. Evaluators should make all attempts to gather as much information as possible. Other sources of evaluation information are:

+ Event logs.
+ Video or audio recordings.
+ Evaluator notes.
+ Photographs.

For operations-based exercises, evaluators should be given a format that suits the environment.

Types of Reporting

During an exercise, each evaluator performs three types of reporting:

+ Descriptive reporting is the direct observation and documentation of actions listed on evaluation forms. For example, consider a checklist item that asks whether the outgoing Operations Section Chief briefed his or her replacement. This item requires little subjective judgment on the part of the evaluator. For that reason, it prompts descriptive reporting. Descriptive reporting typically yields reliable data.
+ Inferential reporting requires an evaluator to arrive at a conclusion before recording information. For example, consider a checklist item that asks whether a capability is "adequate." In judging whether the capability is "adequate," the evaluator must first make an assumption about what "adequate" means. Since no two evaluators will make the exact same assumption, inferential reporting yields inconsistent data.
+ Evaluative reporting requires evaluators to assess performance on a scale of success. For example, consider an evaluation item that asks evaluators to rate the success of the Incident Commander's communications strategy. This item requires the evaluator to make an evaluative judgment. Reliable evaluative data is difficult to collect.

For the most part, evaluators will perform descriptive reporting. Post-exercise activities ask the evaluator to assess data in relationship to exercise objectives. These assessments require inferential and evaluative judgments.

Note Taking and Data Collection

Evaluators should retain their notes and records of the exercise to support the development of the AAR. As necessary, the lead evaluator may assign other evaluators to collect supplemental data during or immediately after the exercise. Such data is critical to fill in gaps identified during exercise evaluation. For example, sources of supplemental evaluation data might include records produced by automated systems or communication networks, and written records, such as duty logs and message forms.

Additional data collection methods include Participant Feedback Forms and Facilitator/Evaluator (F/E) or C/E briefings. Hot washes also provide times when data can be collected and/or validated.
During these times, occurring immediately after the exercise, players can voice both positives and negatives of the exercise. Hot washes are attended by the players, evaluators, and the controllers or facilitators.

Note Taking Skills

Observation notes should include if and how quantitative or qualitative targets were met. For example, a capability target might state, "Within 4 hours of the incident..." Observation notes on that target should include the actual time required for exercise players to complete the critical task(s).

Additionally, observations should include:

+ The actual time required for exercise players to complete the critical task(s).
+ How the target was or was not met.
+ Decisions made and the information gathered to make those decisions. This includes following up with other evaluators who may have been on the other end of the conversations, so all "sides of the story" are represented.
+ Requests made and how requests were handled.
+ Resources utilized.
+ Plans, policies, procedures, or legislative authorities used or implemented.
+ Any other factors that contributed to the outcomes.

Based on their observations, evaluators assign a target rating for each capability target listed on the EEG. Evaluators then consider all target ratings for the core capability and assign an overall core capability rating. The rating scale includes four ratings (a small sketch of one way to roll these up follows the list):

+ Performed without Challenges (P).
+ Performed with Some Challenges (S).
+ Performed with Major Challenges (M).
+ Unable to be Performed (U).
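Rolling target ratings up into a core capability rating is an evaluator judgment call, not a formula. Still, a minimal sketch can show one conservative heuristic: assume the capability rates no better than its weakest target. Both the heuristic and the function name are illustrative, not HSEEP doctrine.

```python
# Scale ordered from most to least successful; the order drives the rollup.
RATING_ORDER = ["P", "S", "M", "U"]

def rollup_core_capability(target_ratings):
    """One conservative heuristic: the core capability rating is the worst
    (rightmost in RATING_ORDER) of its target ratings. In practice the
    evaluator weighs the targets and assigns the rating by judgment."""
    if not target_ratings:
        raise ValueError("no target ratings recorded")
    return max(target_ratings, key=RATING_ORDER.index)

# Example: two targets performed cleanly, one with major challenges.
assert rollup_core_capability(["P", "P", "M"]) == "M"
```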
Common Pitfalls of Evaluation

Evaluations are only effective if evaluators perform systematic observation and generate unbiased records. To ensure unbiased records, evaluators should avoid seven pitfalls of exercise evaluation:

1. Observer Drift occurs when evaluators lose interest or a common frame of reference during an exercise. It is usually the result of fatigue or lack of motivation. Observer drift can be minimized by feedback from the lead evaluator; beverages and snacks; breaks; and rotational shifts of exercise observation.
2. Errors of Leniency occur when evaluators have a tendency to rate all actions positively. They can be minimized by pre-exercise training.
3. Errors of Central Tendency occur when evaluators describe all activities as average in order to avoid making difficult decisions. They can be minimized by pre-exercise training.
4. Halo Effect occurs when evaluators form a positive impression of a person or group early in the exercise and permit this impression to influence their observations. It can be minimized by pre-exercise training.
5. Hypercritical Effect occurs when evaluators believe it is their job to find something wrong, regardless of the players' performance. It can be minimized by pre-exercise training.
6. Contamination occurs when evaluators know how an activity was performed in earlier exercises and permit this knowledge to affect their expectations. It can be minimized by pre-exercise training.
7. Evaluator Bias refers to errors that are traceable to characteristics of the evaluator. Evaluator bias can be minimized by careful selection of evaluators, or by employing multiple evaluators to observe the same functions.

Exercise Hot Wash

After the exercise, one or more player hot washes are held. They are attended by the Exercise Planning Team, players, evaluators, and facilitators or controllers. The player hot wash is an opportunity for players to describe their immediate impressions of demonstrated capabilities and the exercise itself. For this reason, it affords a valuable opportunity for evaluators to fill in gaps in their notes.

Player hot washes allow time for players to address key topics, cross-disciplinary issues, or conflicting recommendations that were identified in earlier discussions. They are also an opportunity for players to comment on how well the exercise was planned and conducted.

Player hot washes should be held as soon as possible after the exercise is complete, while player observations are still fresh. They are most effective when led by an experienced facilitator who can keep the discussion constructive and focused.

During the hot wash, evaluators, controllers, and/or facilitators should distribute Participant Feedback Forms for players to submit.

Hot Wash Goals for Evaluators

For evaluators, a hot wash is an opportunity to:

+ Collect thoughts and observations about what occurred and how the participants thought it went.
+ Clarify points and collect information that they may have missed.

Although evaluators may be assigned to record a particular group discussion, they should capture information on cross-cutting issues.

Exercise Hot Wash Questions

Some suggested questions are:

+ What happened?
+ What was supposed to happen?
+ Why is there a difference?
+ What is the effect of the difference?
+ What should be learned from this?
+ What improvements need to be implemented?
+ Were the organizational roles and responsibilities clearly identified?

Positive Aspects of Evaluation

Strengths are identified and can be carried forward.

Lesson 4 Summary

In this lesson, you learned about the observation and data collection processes, including good note taking skills and types of reporting. In addition, the content of the hot wash meeting was explained.

Objectives: Having completed this lesson, you are able to:

+ Describe the exercise-related data that evaluators should collect.
+ Define the challenges of evaluation.
+ Identify the purpose and content of the player hot wash.

The next lesson presents information on Evaluation Data Analysis.

Lesson 5 Overview

This lesson provides an understanding of data analysis, issue identification, and root-cause analysis.

Objectives: At the end of this lesson, you will be able to:

+ Describe the components of the post-exercise Controller/Evaluator (C/E) debriefing.
+ Describe how to collect data to ensure a proper Root Cause Analysis.

Controller/Evaluator (C/E) Debriefing

Data analysis starts when evaluators and controllers meet in the C/E Debriefing after the exercise. This meeting includes controllers because they are frequently teamed with evaluators, and because they can provide insights and observations based on the Master Scenario Events List (MSEL). The C/E Debriefing allows evaluators to review results of the hot wash and participant feedback forms. It also enables evaluators to:

+ Compare notes with other evaluators and controllers. This helps all evaluators fill in information gaps. It also enhances continuity: consider an evaluator who has notes about a situation that involved follow-up in another situation. If the second situation related to the assigned objectives of another evaluator, the two evaluators must compare notes. Comparing notes may also help evaluators resolve discrepancies within their own notes.
+ Refine evaluation documents.
+ Develop an overall capability summary.

Reviewing Exercise Objectives

Data analysis is the time when evaluators assess player performance against exercise objectives. For this reason, evaluators should start by re-reading exercise objectives. These objectives provide the foundation for all data analysis. If the exercise was complex, evaluators may only need to re-read the objectives related to their assignments.

When reviewing the exercise objectives, consider the following points:

+ What was the intent of the objective?
+ What would demonstrate the successful performance of the objective?
+ If the objective was not met, was it the result of poor exercise design or the decisions of players?

Data Analysis

The goal of data analysis is to evaluate the ability of exercise participants to perform core capabilities and to determine if exercise objectives were met.

During data analysis, the evaluation team first consolidates the data collected during the exercise and determines whether participants performed critical tasks and met capability targets. Evaluators consider participant performance against all targets to determine the overall ability to perform core capabilities. Additionally, the evaluation team takes notes on the course of exercise play, demonstrated strengths, and areas for improvement. This provides the evaluators with not only what happened, but why events happened.

After this initial data analysis, evaluators examine each critical task not completed as expected and each target not met, with the aim of identifying a root cause. A root cause is the source of or underlying reason behind an identified issue toward which the evaluator can direct an improvement. When conducting a root-cause analysis, the evaluator should attempt to trace the origin of each event back to earlier events and their respective causes.

Identifying Issues

In both discussion-based and operations-based exercises, evaluators identify issues by comparing exercise objectives to actual performance. Through this comparison, evaluators identify which capabilities (and their associated activities, performance measures, and tasks) were successfully demonstrated in the exercise. They also identify which capabilities need improvement.

+ During operations-based exercises, evaluators identify issues by answering the following questions:
  - What happened?
  - What did evaluators see?
  - What was supposed to happen based on plans and procedures?
  - Was there a difference? Why or why not?
  - What was the impact? Were the consequences of the action (or inaction or decision) positive, negative, or neutral?
  - What are the strengths and areas of improvement to remedy deficiencies?
+ In discussion-based exercises, evaluators seek to identify the following issues:
  - In an incident, how would response personnel perform the activities and associated tasks?
  - What decisions would need to be made, and who would make them?
  - Are personnel trained to perform the activities and associated tasks?
  - Are other resources needed? If so, how will they be obtained?
  - Do plans, policies, and procedures support the performance of the activities and associated tasks? Are players familiar with these documents?
  - Do personnel from multiple agencies or jurisdictions need to work together to perform the activities? If so, are agreements or relationships in place to support this?
  - What are the strengths and areas of improvement to remedy deficiencies?
Determining Root Causes

After evaluators identify discrepancies between what happened and what was supposed to happen (the issues), they explore the source of these issues. This step is called root-cause analysis. When conducting root-cause analysis, evaluators ask why each event happened or did not happen. When evaluators have identified the root cause of a problem, they can be sure that corrective actions will actually address the problem, and not just a symptom of it.

Root Cause Analysis (The "Why Staircase")

Root cause is defined as the source of, or underlying reason behind, an identified issue toward which the evaluator can direct improvement. The Root Cause Analysis graphic identifies a series of questions that can help drill down to determine the cause(s) of the issue. First, a one-sentence description of the event to be analyzed is recorded as the Problem Statement.

Consider asking the following questions to narrow down the issue of concern:

+ What happened? (What was observed?)
+ Were capability targets met? If they were not met, what were the contributing factors?
+ Did discussion or activities suggest that critical tasks were executed to meet capability targets? If not, what were the impacts or consequences?
+ Do current plans, policies, and procedures support critical tasks and capability targets?
+ Were players familiar with them?

There is room next to the questions to record the "Why" results. Also, include suggested corrective action(s). Finally, using a list format, identify the root causes identified through this analysis.

(Image: the "Why Staircase" Root Cause Analysis worksheet, with the Problem Statement at the top, the questions above alongside space for "Why" results and suggested corrective actions, and the identified root causes listed at the bottom. A small sketch of the drill-down follows.)
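At heart, the "Why Staircase" is an iterative drill-down: keep asking why until the answer is something a corrective action can address. The sketch below records such a chain of whys; the function, its callable interface, and the example answers are hypothetical illustrations, not part of the HSEEP worksheet.

```python
def why_staircase(problem_statement, ask_why):
    """Drill down from an observed issue toward its root cause by repeatedly
    asking "why?". `ask_why` is any callable returning the next underlying
    reason, or None once no deeper cause can be identified. Illustrative
    only: the real analysis is an evaluator discussion, not a function."""
    chain = [problem_statement]
    while (cause := ask_why(chain[-1])) is not None:
        chain.append(cause)
    return chain  # chain[-1] is the candidate root cause

# Hypothetical chain for an unmet notification target:
answers = iter([
    "The alerting system operator could not be reached.",
    "No backup operator is designated in the notification plan.",  # root cause
    None,
])
chain = why_staircase(
    "Public notification took 6 hours instead of the 4-hour target.",
    lambda _: next(answers),
)
print(chain[-1])  # the planning gap a corrective action can address
```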
Developing Recommendations for Improvement

After identifying issues and their root causes, evaluators develop recommendations for enhancing preparedness, which are compiled into a draft After-Action Report (AAR) and submitted to the exercise sponsor. Once distributed, elected and appointed officials, or designees, review and accept observations and recommendations and formalize the AAR. The recommendations in the AAR will be the basis for corrective actions identified in the After-Action Meeting (AAM).

Honesty is key when writing recommendations. If you have a criticism, record it. Exercises will only improve preparedness if they are followed by accurate and useful feedback.

Recommendations for improvement should:

+ Identify areas to sustain or improve
+ Address both short-term and long-term solutions
+ Be consistent with other recommendations
+ Identify references for implementation

To the extent possible, evaluators should detail how to implement improvements. They can even recommend who will implement them and provide suggested timeframes for completion.

Recommended Improvements

When developing recommendations for discussion-based exercises, evaluators should guide their discussion with the following questions:

+ What changes need to be made to plans to improve performance?
+ What changes need to be made to organizational structures to improve performance?
+ What changes need to be made to leadership and management processes to improve performance?
+ What training is needed to improve performance?
+ What changes to resources are needed to improve performance?
+ What practices should be shared with other communities?

When developing recommendations for operations-based exercises, evaluators should guide their discussion with the following questions:

+ What changes need to be made to plans or procedures to improve performance?
+ What changes need to be made to organizational structures to improve performance?
+ What changes need to be made to leadership and management processes to improve performance?
+ What training is needed to improve performance?
+ What changes to equipment are needed to improve performance?
+ What are lessons learned for approaching a similar problem in the future?

Lesson 5 Summary

In this lesson, you learned about data analysis, issue identification, root-cause analysis, and improvement recommendations.

Objectives: Having completed this lesson, you are able to:

+ Describe the components of the post-exercise Controller/Evaluator (C/E) debriefing.
+ Describe how to collect data to ensure a proper Root Cause Analysis.

Lesson 6 Overview

This lesson provides an understanding of the after-action reporting process, determining corrective actions, finalizing the After-Action Report (AAR), and improvement implementation.

Objectives: At the end of this lesson, you will be able to:

+ Identify the purpose and content of the After-Action Report (AAR)/Improvement Plan (IP).
+ Describe the After-Action Meeting (AAM).

Exercise Evaluation Guide (EEG) and the Controller/Evaluator (C/E) Debrief

The Exercise Evaluation Guide (EEG) is the basis of the exercise evaluation. The evaluator's notes and after-exercise questions are recorded in the observations section. These questions are the first step in the debrief process. The EEGs are intended to guide an evaluator's observations so that the evaluator focuses on capabilities and tasks relevant to exercise objectives.

The Controller/Evaluator (C/E) debrief follows the exercise and is where data is compiled and discussed among the team of evaluators. As a result, areas that require follow-on are identified and documented. The information gathered at this debrief is used to build the AAR/IP document, which is presented to the exercise sponsor.

What is an After-Action Report (AAR)?

All discussion-based and operations-based exercises result in the development of an AAR. The AAR is the document that summarizes key information related to evaluation. The Homeland Security Exercise and Evaluation Program (HSEEP) has defined a standard format for the development of an AAR. By using this format, jurisdictions ensure that the style and the level of detail in their AAR are consistent with other jurisdictions. Consistency across jurisdictions allows the nationwide emergency preparedness community to gain a broad view of capabilities.

The length, format, and development timeframe of the AAR depend on the exercise type and scope. These parameters should be determined by the exercise planning team based on the expectations of elected and appointed officials as they develop the evaluation requirements in the design and development process.

The main focus of the AAR is the analysis of core capabilities.
Generally, AARs also include basic exercise information, such as the exercise name, type of exercise, dates, location, participating organizations, mission area(s), specific threat or hazard, a brief scenario description, and the name of the exercise sponsor and POC.

The AAR should include an overview of performance related to each exercise objective and associated core capabilities, while highlighting strengths and areas for improvement. Therefore, evaluators should review their evaluation notes and documentation to identify the strengths and areas for improvement relevant to the participating organizations' ability to meet exercise objectives and demonstrate core capabilities.

Information in the AAR

The main focus of the AAR is the analysis of core capabilities. The suggested AAR format specifically includes:

+ Basic exercise information:
  - Exercise name
  - Exercise scope
  - Dates and location
  - Participating organizations
  - Mission area(s) and core capabilities
  - Specific threat or hazard
  - Brief scenario description
  - Exercise sponsor name and point of contact (POC)
+ Executive summary
+ Exercise goals and objectives
+ Analysis of capabilities demonstrated
+ Conclusion
+ Improvement Plan (IP) matrix

Additional appendices may include:

+ Lessons learned
+ A participant feedback summary
+ An exercise events summary table
+ Performance ratings
+ An acronym list

AAR Review

Upon completion, the evaluation team provides the draft AAR to the exercise sponsor, who distributes it to participating organizations. Elected and appointed officials, or their designees, review and confirm observations identified in the formal AAR, and determine which areas for improvement require further action. Areas for improvement that require action are those that will continue to seriously impede capability performance if left unresolved.

As part of the improvement planning process, elected and appointed officials identify corrective actions to bring areas for improvement to resolution and determine the organization with responsibility for those actions.

Corrective Actions

After evaluation concludes, organizations should perform an additional qualitative assessment of the analyzed data to identify potential corrective actions. Corrective actions are concrete, actionable steps that are intended to resolve capability shortcomings identified in exercises or real-world events.

In developing corrective actions, elected and appointed officials or their designees should first review and revise the draft AAR, as needed, prior to the AAM to confirm that the issues identified by evaluators are valid and require resolution. The reviewer then identifies which issues fall within their organization's authority, and assumes responsibility for taking action on those issues. Finally, they determine an initial list of appropriate corrective actions to resolve identified issues. Each corrective action should identify what will be done to address the recommendation; who (person or agency) should be responsible; and a timeframe for implementation.
A corrective action should contain enough detail to make it useful.

The organization's reviewer should use the following questions to guide their discussion when developing corrective actions:

+ What changes need to be made to plans and procedures to improve performance?
+ What changes need to be made to organizational structures to improve performance?
+ What changes need to be made to management processes to improve performance?
+ What changes to equipment or resources are needed to improve performance?
+ What training is needed to improve performance?
+ What are the lessons learned for approaching similar problems in the future?

Benchmarking Corrective Actions

Corrective actions must include attainable benchmarks that will allow the jurisdiction to measure progress toward implementation. Examples of benchmarks include the following:

+ The number of personnel trained in a task
+ The percentage of equipment that is up-to-date
+ The finalization of an interagency agreement within a given amount of time

These benchmarks should be defined against concrete deadlines so the jurisdiction can track gradual progress toward implementation, as the sketch below illustrates.
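The following minimal sketch shows how a jurisdiction might track a corrective action against its benchmarks and deadlines. The record layout and the progress measure are assumptions for illustration; HSEEP does not prescribe a data format for corrective action tracking.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

# Illustrative tracking record; not a prescribed HSEEP format.

@dataclass
class Benchmark:
    description: str       # e.g., "25 dispatchers trained on the revised SOP"
    due: date              # concrete deadline for this benchmark
    complete: bool = False

@dataclass
class CorrectiveAction:
    what: str              # what will be done to address the recommendation
    owner: str             # person or agency responsible
    benchmarks: List[Benchmark] = field(default_factory=list)

    def progress(self) -> float:
        """Fraction of benchmarks completed: the 'gradual progress' signal."""
        if not self.benchmarks:
            return 0.0
        return sum(b.complete for b in self.benchmarks) / len(self.benchmarks)

    def overdue(self, today: date) -> List[Benchmark]:
        """Benchmarks past their deadline and still incomplete."""
        return [b for b in self.benchmarks if not b.complete and b.due < today]
```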
After-Action Meeting (AAM)

Once the organization's reviewer has confirmed the draft areas for improvement and identified initial corrective actions, a draft Improvement Plan (IP) is developed for review at an AAM. The purpose of the AAM is to review and refine the draft AAR. As part of the AAM, attendees develop an IP that articulates specific corrective actions addressing issues identified in the AAR. The refined AAR and IP are then finalized as a combined AAR/IP.

Prior to the AAM, as appropriate, the exercise sponsor will distribute the revised AAR, which incorporates feedback on the strengths and areas for improvement, and the draft IP to participants. Distributing these documents for review prior to the meeting helps to ensure that all attendees are familiar with the content and are prepared to discuss exercise results, identified areas for improvement, and corrective actions.

To answer any questions or provide necessary details on the exercise itself, the organization's elected and appointed officials, or their designees, should attend the AAM along with any other stakeholders, the Exercise Planning Team, and the Evaluation Team. During the AAM, participants should seek to reach final consensus on strengths and areas for improvement, as well as revise and gain consensus on draft corrective actions. Additionally, as appropriate, AAM participants should develop concrete deadlines for implementation of corrective actions and identify specific corrective action owners/assignees.

Participant organizations are responsible for developing implementation processes and timelines, and keeping their elected and appointed officials informed of the implementation status.

Characteristics of an Ideal AAM

The meeting should be:

+ Scheduled for a full day, within several weeks of the end of the exercise
+ Held at a convenient location or at the site where the exercise took place

Participants should:

+ Validate observations and recommendations and provide insight into activities that may have been overlooked or misinterpreted by evaluators
+ Participate in a facilitated discussion of ways in which participating organizations can build upon the strengths identified in the jurisdiction

After-Action Report/Improvement Plan Finalization

Once all corrective actions have been consolidated in the final IP, the IP may be included as an appendix to the AAR. The AAR/IP is then considered final, and may be distributed to exercise planners, participants, and other preparedness stakeholders as appropriate.

Corrective Action Tracking and Implementation

Corrective actions captured in the AAR/IP should be tracked and continually reported on until completion. Organizations should assign points of contact responsible for tracking and reporting on their progress in implementing corrective actions. By tracking corrective actions to completion, preparedness stakeholders are able to demonstrate that exercises have yielded tangible improvements in preparedness. Stakeholders should also ensure there is a system in place to validate previous corrective actions that have been successfully implemented. These efforts should be considered part of a wider continuous improvement process that applies prior to, during, and after an exercise is completed.

Using IPs to Support Continuous Improvement

Conducting exercises and documenting the strengths, areas for improvement, and associated corrective actions is an important part of the National Preparedness System, and contributes to the strengthening of preparedness across the Whole Community and achievement of the National Preparedness Goal. Over time, exercises should yield observable improvements in preparedness for future exercises and real-world events.

The identification of strengths, areas for improvement, and corrective actions that result from exercises helps organizations build capabilities as part of a larger continuous improvement process. The principles of continuous improvement are:

+ Consistent Approach. Organizations should employ a consistent approach for continuous improvement-related activities across applicable mission areas: prevention, protection, mitigation, response, and recovery. This consistent approach enables a shared understanding of key terminology, functions, processes, and tools. This approach also fosters continuous improvement-related interoperability and collaboration across an organization's components.
+ Support National Preparedness. By conducting continuous improvement activities, organizations support the development and sustainment of core capabilities across the whole community. Continuous improvement activities also ensure that organizations are able to support assessments of national preparedness in a timely, actionable, and meaningful way.
+ Effective Issue Resolution and Information Sharing.
Through improvement planning, organizations complete continuous improvement action items at the lowest level possible while facilitating the sharing of strengths and areas for improvement.
+ Application across Operational Phases. The functions, processes, and tools apply to all operational phases, including:
  - Near-real-time collection and analysis during real-world events or exercises
  - Post-event/exercise analysis
  - Trend analysis across multiple events/exercises over time

Lesson 6 Summary

In this lesson, you learned about the after-action reporting process, determining corrective actions, finalizing the After-Action Report (AAR), and improvement implementation.

Objectives: Having completed this lesson, you are able to:

+ Identify the purpose and content of the After-Action Report (AAR)/Improvement Plan (IP).
+ Describe the After-Action Meeting (AAM).

Course Summary

The course objectives include:

+ Define the roles and responsibilities of an exercise evaluator.
+ Discover the tools necessary to support the exercise evaluator for a successful exercise evaluation.
+ Identify the necessary tasks in conducting an exercise evaluation.
+ Recognize methods of analyzing exercise data.

This course discussed the following topics:

+ Introduction to exercise evaluation. This lesson distinguished between discussion-based and operations-based exercise evaluation, described how capabilities and objectives impact evaluation, explained the purpose of exercises, and defined the improvement planning process.
+ Explain an exercise. This topic discussed the different types of exercises, including discussion-based and operations-based, evaluator qualifications, evaluator characteristics, and evaluator sources.
+ Explain why exercises are important. This topic discussed the importance of exercises in national preparedness.
+ Explain the exercise evaluation steps. This topic described the exercise evaluation steps, including planning; observing, collecting, and analyzing data; developing the draft after-action report; identifying improvements; conducting the after-action meeting; finalizing the After-Action Report (AAR); and tracking corrective actions.
+ Roles and responsibilities of the exercise evaluator. This topic identified the skillsets needed to be a lead evaluator and the criteria used to select evaluators. In this chapter you also identified who should serve as an evaluator, defined the challenges of evaluation, and described an evaluator briefing.
+ Exercise observation and data collection. This topic discussed the tools available for data collection, how to use the EEG as a guide, and how to document exercise logs and use them as evidence of a successful or unsuccessful exercise.
+ Evaluation data analysis. This topic described the components of the post-exercise Controller/Evaluator debriefing and explained how to develop effective recommendations for improvement.
+ Exercise wrap-up activities. This final topic listed the components of an AAR/IP, explained how to write an analysis of a capability, described the purpose of the improvement plan and the improvement plan matrix, and explained how participating organizations generate corrective actions.

Preparedness Resource Library

We encourage you to review and become familiar with the following documents that support this course:

+ Presidential Policy Directive 8
+ National Preparedness Goal
+ National Preparedness System
+ Overview of the National Planning Frameworks
+ Threat and Hazard Identification and Risk Assessment Guide: Comprehensive Preparedness Guide (CPG) 201
+ FEMA Preparedness Toolkit/HSEEP Policy and Guidance
+ Homeland Security Digital Library
+ FEMA Home Page
