
NATO UNCLASSIFIED

NORTH ATLANTIC TREATY ORGANIZATION

Supreme Allied Commander, Europe
B-7010 SHAPE
Belgium
Tel: +32-(0)65-44-7111 (SHAPE)
Fax: +32-(0)65-44-3545 (SHAPE)

Supreme Allied Commander, Transformation
Norfolk, Virginia 23551-2490
United States of America
Tel: +1-(757)-747-3400 (SACT)
Fax: +1-(757)-747-3242 (SACT)

SHAPE: SH/PLANS/J7/PLL/FA/15-309195 1086
SACT: 5000/TSC FET 0100/TT-150341/Ser: NU1086
Date: 13 November 2015

Bi-STRATEGIC COMMAND DIRECTIVE 080-091

JOINT ANALYSIS REQUIREMENTS AND REPORTS

REFERENCES: A. MCM-0021-2011, NATO Lessons Learned Policy, dated 18 May 2011.
B. Bi-SC Directive 080-006, Lessons Learned, dated 10 July 2015.

1. Status. This directive details the specific provisions from the NATO Lessons Learned
Policy (Reference A) concerning Joint Analysis Requirements and Joint Analysis Reports,
and complements Bi-Strategic Command Directive 080-006, Lessons Learned (Reference B).

2. Purpose. This directive provides guidance to NATO Commanders on how to prepare
and submit Joint Analysis Requirements and how to staff Joint Analysis Reports.

3. Applicability. This directive is applicable to NATO Strategic Commands and
subordinate organizations and constitutes a guide for other organizations which submit Joint
Analysis Requirements or produce Joint Analysis Reports.

4. Publication Updates. This document will be reviewed as required by the Bi-SC
Lessons Learned Steering Group. Updates are authorized when approved jointly by COS HQ
SACT and COS SHAPE.

5. Proponent. The lead proponent for this directive is HQ SACT, ACOS Capability
Engineering and Innovation (CEI), Innovation, Doctrine and Lessons Learned Branch (IDLL).

FOR THE SUPREME ALLIED COMMANDERS, EUROPE AND TRANSFORMATION:

Michel Yakovleff
Lieutenant General, FRA A
Vice Chief of Staff

Phil Jones CB CBE
Lieutenant General, GBR A
Chief of Staff

DISTRIBUTION:

External:

Action:

List XV
List X
List VIII
List XIV

Internal:

Action:

List I
List II

Information:

External:

List III
List VII
List XI
List XII
List XIII
SHAPE J7


TABLE OF CONTENTS

CHAPTER 1 – GENERALITIES ON JOINT ANALYSIS                               PARAGRAPH
Analysis in NATO                                                         1-1
NATO Joint Analysis Requirements                                         1-2
Output of Joint Analysis                                                 1-3

CHAPTER 2 – DEVELOPING JOINT ANALYSIS REQUIREMENTS
Nature of Joint Analysis Requirements                                    2-1
Drafting of Joint Analysis Requirements                                  2-2
Submission of Joint Analysis Requirements                                2-3

CHAPTER 3 – SELECTING JOINT ANALYSIS REQUIREMENTS
General                                                                  3-1
Selection Process                                                        3-2
Flowchart of the JAR Process                                             3-3
Timeline                                                                 3-4
Submission of Emergent Joint Analysis Requirements                       3-5
Cancellation of Joint Analysis Requirements                              3-6

CHAPTER 4 – PRODUCING JOINT ANALYSIS REPORTS
Joint Analysis Report Structure                                          4-1
Progress Reporting                                                       4-2
Distribution of Joint Analysis Reports                                   4-3

CHAPTER 5 – HANDLING OF JOINT ANALYSIS REPORTS
Reviewing Draft Joint Analysis Reports                                   5-1
Staffing Final Analysis Reports                                          5-2
Process                                                                  5-3
Implementing Remedial Actions                                            5-4

ANNEXES:

A. Guidance for writing Joint Analysis Requirements.


B. Example of Joint Analysis Requirements.
C. Clarification questions to refine Joint Analysis Requirements.
D. Comment sheet for draft reports.
E. Glossary of Terms.


CHAPTER 1 – GENERALITIES ON JOINT ANALYSIS

1-1. Analysis in NATO

a. AAP-06 defines analysis as “the study of a whole by examining its parts and their
interactions”. For the NATO lessons learned process, analysis is used to thoroughly
understand areas and issues identified for which there is potential for improvement.
Analysis supports decision makers by providing impartial advice derived from rigorous
methods.

b. Joint Analysis is the act of determining the root cause(s) of an observed issue
and identifying the Remedial Action(s) (RA) that will address those root causes to
correct the problem or sustain the success. Joint Analysis is generally conducted at the
operational and strategic level leading to Lessons Identified (LI) and/or Best Practices
(BP).

c. Subordinate to HQ SACT, the Joint Analysis and Lessons Learned Centre
(JALLC) is NATO’s lead agent for Joint Analysis. Consequently, this directive mainly
addresses Joint Analysis as conducted by JALLC.

d. The NATO Lessons Learned Policy notes that the NATO accredited Centres of
Excellence (COE) also have the potential to conduct joint analysis projects of interest to
NATO. Such a role may be carried out in conjunction with other analysis entities such
as the JALLC.

e. It is recognized that other entities such as NATO agencies, NATO bodies and
nations may conduct or contribute to Joint Analysis.

1-2. NATO Joint Analysis Requirements

a. A NATO Joint Analysis Requirement (JAR) identifies an observed complex
NATO-wide issue of an enduring nature.

b. NATO Strategic Commands receive JARs from a variety of sources including:
NATO HQ International Staff and International Military Staff, NATO nations, NATO
Centres of Excellence, NATO Agencies, and subordinate commands.

1-3. Output of Joint Analysis

The output of a JAR is a comprehensive Lesson Identified, which usually takes the form of a
Joint Analysis Report, as detailed in paragraph 4-1.


CHAPTER 2 – DEVELOPING JOINT ANALYSIS REQUIREMENTS

2-1. Nature of Joint Analysis Requirements. A JAR should stem from a requirement to
thoroughly understand an identified area or issue for which there is potential for improvement,
that is of great importance to NATO or the Nations, and that is of an enduring nature. In
addition, JARs should:

a. Be derived from observations of trends, patterns or risks rather than from isolated
events in respect of which findings from joint analysis will rapidly become obsolete.

b. Identify an issue/problem or a good practice for analysis of an enduring nature
that will still be of relevance to NATO (i.e. not overtaken by events) after Joint Analysis
has been conducted (see Annex A).

c. Primarily concern the strategic-operational sphere with strategic-operational
impact.

d. Focus on NATO as a whole, e.g. have an impact on NATO doctrine.

e. In principle, be approved through the respective chain of command.

2-2. Drafting of Joint Analysis Requirements. It is vital that JARs coherently articulate
the identified problem or issue and have applicability/utility to NATO in order to be accepted.
Annexes A, B and C provide guidance for drafting a JAR, including suggestions for what can
strengthen, as well as weaken, a JAR.

2-3. Submission of Joint Analysis Requirements. Proposals for JARs are to be
submitted to HQ SACT or SHAPE biannually using the format in Annex A. A proposal for a
JAR should be submitted by the entity that is faced with an issue as described in paragraph
2-1, that stands to benefit from the findings of the joint analysis, and that will be the customer
of the joint analysis project to be undertaken. If more than one JAR is submitted by the same
originator, they should be listed in priority order. When considering whether a JAR should be
submitted, the originator should take into consideration that the Joint Analysis process takes
9-12 months from start to finish (i.e. three months for the selection and tasking process and
then six to nine months to conduct Joint Analysis and produce a Joint Analysis Product,
usually a report). Additionally, the time required to implement the LI as a LL has to be
considered.


CHAPTER 3 – SELECTING JOINT ANALYSIS REQUIREMENTS

3-1. General. The Strategic Commands are responsible for selecting JARs for analysis by
JALLC and other entities. The selection is performed by the Bi-SC Lessons Learned Steering
Group (LLSG) in accordance with Reference B., and should be the result of a process driven
by SACEUR and SACT priorities to ensure they are relevant to current situations and future
capability development in NATO.

3-2. Selection Process. Since JALLC is NATO’s lead agent for joint analysis, the selection
process is aligned with the timeline for issuing the JALLC Programme of Work (POW). As
part of the JALLC POW, the Prioritized Analysis Requirement List (PARL) delineates Analysis
Requirements (AR) to be carried out by JALLC. The PARL is formally issued twice a year by
HQ SACT and is based on the prioritized list of JARs developed by the Bi-SC LLSG. The
PARL includes JARs assigned to JALLC, COEs, and other entities.

3-3. Flowchart of the JAR Process. The semi-annual process to collect and select JARs is
visualized in the flowchart below.


Flowchart of the JAR process: Collection and Selection of JARs.

a. HQ SACT issues a biannual calling letter to NATO Commands, NATO Forces,
NATO Agencies, NATO COEs and NATO Nations.

b. SHAPE PLANS J7 collects, staffs and prioritizes JARs from SHAPE, subordinate
ACO Commands, and NFS HQs. A consolidated list of JARs is forwarded to HQ SACT
with copy to JALLC.


c. HQ SACT, CEI, IDLL Branch collects, staffs and prioritizes JARs from HQ SACT
and subordinate commands and forwards a consolidated list of JARs to SHAPE with
copy to JALLC.

d. HQ SACT CEI, IDLL Branch collects, staffs and prioritizes JARs from NATO HQ,
NATO COEs, NATO Agencies, NATO Nations and others and forwards a list to SHAPE
with copy to JALLC.

e. SCs, supported by the JALLC, will staff the JARs for final wording and final
selection and prioritization. Achievability and usefulness (in terms of time) are key
criteria for selecting a JAR.

f. The JARs will be prioritized by SHAPE and HQ SACT in coordination with JALLC
and approved by the LLSG in accordance with Reference B, resulting in the Prioritized
Analysis Requirement List (PARL). The LLSG will decide which JARs from the PARL
will be incorporated in the JALLC POW and issued to JALLC by HQ SACT. The
remaining JARs from the PARL will be considered by SCs for execution by NATO COEs
or other suitable entities. HQ SACT will forward those JARs to the respective NATO
COEs and/or entities.

3-4. Timeline. The calendar of actions for JAR submission, selection, and tasking (JALLC
POW and other tasking) is indicated in the table below.

WHEN (1st / 2nd semester)   WHO                                           WHAT

Jan / Jul                   HQ SACT (in coordination with SHAPE)          Sends calling letter to request JARs.

Jan / Jul                   NCS, NFS, COEs, NATO nations, NATO Agencies   Initiate internal JAR development and staff-level coordination with the JALLC as needed.

March / Sep                 NCS, NFS, COEs, NATO nations, NATO Agencies   Submit draft JARs to SHAPE J7 / HQ SACT IDLL through the chain of command.

May / Nov                   SHAPE, HQ SACT and JALLC through the LLSG     Coordinate and prioritize JARs in order to develop the PARL, the JALLC POW, and other analysis tasking.

Jun / Dec                   HQ SACT                                       Releases the JALLC POW or amendments to the JALLC POW; other tasking as needed.

3-5. Submission of Emergent Joint Analysis Requirements. It is recognised that JARs
may be submitted in respect of which analysis capabilities need to be engaged at shorter
notice than is possible through the formal PARL/POW process. An Emergent Joint Analysis
Requirement (EJAR) provides a mechanism to accelerate requests


for analysis that were not known or could not be anticipated in time to be included in the
PARL/POW. Although the EJAR process permits acceleration of critical analyses, emergent
submissions follow the same overall procedure as PARL/POW submissions. A revised
PARL/POW will be made when the EJAR has an impact on the PARL/POW.

3-6. Cancellation of JARs. A formal request for cancellation of a JAR can be forwarded
through the chain of command for Bi-SC coordination, and reprioritization of the PARL/POW,
by the SCs, the Customer, or the entity that conducts the Joint Analysis. When approved, HQ
SACT sends out a formal cancellation and a revised PARL/POW.


CHAPTER 4 – PRODUCING JOINT ANALYSIS REPORTS

4-1. Joint Analysis Report Structure

a. Analysis reports should in principle include the following:

(1). Executive Summary

(2). JAR and Objectives.

(3). Factors affecting the analysis.

(4). Analysis Techniques applied/chosen.

(5). Analysis of the root causes/discussion.

(6). Conclusions.

(7). Recommendations.

(8). If appropriate, an Annex with all recommendations in the ODCR format
(Observation, Discussion, Conclusion and Recommendation). As a minimum,
each recommendation should clearly identify:

(a). A proposed solution.

(b). A single proposed Tasking Authority (see Glossary of Terms, Annex E).

(c). One or more proposed Action Bodies (see Glossary of Terms, Annex E).

b. Where appropriate, categorization of the recommendations can be made in the
DOTMLPFI lines of capability development (Doctrine, Organization, Training, Materiel,
Leadership, Personnel, Facilities, and Interoperability).

c. Classification of reports should be kept at the lowest possible level in accordance
with NATO security policy in order to facilitate distribution and sharing.

4-2. Progress Reporting. While executing a joint analysis project, the analysis team will
keep close contact with the Customer. Additionally, HQ SACT (IDLL Branch) is to be kept
informed on the progress status of all analysis projects. Monthly updates by the originator of
the Joint Analysis project, Lessons Learned Working Group meetings, and identified Points of
Contact (POC) at staff level will facilitate the exchange of information.


4-3. Distribution of Joint Analysis Reports

a. Experience has demonstrated that the distribution of a draft analysis report for
comment prior to the production of the final analysis report improves the quality of the
product.

b. Draft reports will generally be sent to the Customer, HQ SACT (always including
IDLL Branch), SHAPE (always including SHAPE J7 PLL Branch) and stakeholders for
comments. SCs will indicate any possible limitations in the initial publication and
distribution of the final report.

c. Final reports can be published by the originator prior to the formal approval
process (as described in Chapter 5), but with a disclaimer stating that the content of the
report is the independent opinion of the originator and is not approved by the
appropriate NATO authority.


CHAPTER 5 – HANDLING OF JOINT ANALYSIS REPORTS

5-1. Reviewing Draft Joint Analysis Reports

a. The draft analysis report review process facilitates quality, usefulness, approval,
and publication of the final report by:

(1). Providing comments and assessment of the analysis report and the
recommendations.

(2). Reviewing the proposed Tasking Authority and Action Bodies for the
recommendations of the final report.

(3). Determining the initial distribution of the final report.

(4). Ensuring that subject matter experts have early access to the content of
the report, and particularly to proposed Remedial Actions and Action Bodies.

(5). Providing recommendations on the structure of the report to allow better
tasking and tracking of the Remedial Actions.

b. Comments are to be submitted to the report originator using the comment sheet
in Annex D. The report originator must include this important process in their timeline
for the production of the Joint Analysis Report.

c. The originator of the report incorporates comments and provides rationale for
rejected comments using the comment sheet in Annex D.

5-2. Staffing Final Analysis Reports. Staffing of a final analysis report is focused on the
notation, endorsement or approval of the analysis report and specific recommendations in the
report. (Note: Bi-SC Directive 080-006 will be revised accordingly in order to align this
terminology as it is used by NATO.)

a. Notation reflects the receipt of information on an issue. Notation requires no
further action nor does it imply agreement. Implicit in this definition is that it is not
possible to refuse Notation.

b. Endorsement represents formal agreement, but where the matter requires
subsequent consideration and approval by another authority and/or at a higher level.

c. Approval constitutes final and formal agreement on matters which are within the
authority’s remit without reference to other authority. Such agreement will normally
result in approval for follow-on action or activity.

5-3. Process. The handling of an Analysis Report is described below:


a. The originator submits the Joint Analysis Report to HQ SACT, always including
IDLL Branch. A copy thereof is sent to SHAPE, always including SHAPE J7 PLL Branch.

b. HQ SACT in coordination with SHAPE will confirm the Tasking Authority for each
recommendation/Remedial Action.

c. If the Tasking Authority is HQ SACT, HQ SACT will note or approve the analysis
report’s recommendations/Remedial Actions.

d. If the Tasking Authority is SHAPE, SHAPE will note or approve the
recommendations/Remedial Actions of the analysis report and inform HQ SACT.

e. If the Tasking Authority is outside the SC structure, for example MC or NAC, HQ
SACT submits the report to that authority with a cover letter. HQ SACT may note or
endorse the report or specific recommendations/Remedial Actions in this cover letter.

f. If there are recommendations/Remedial Actions for both SCs and some
authorities outside the SC structure, the Tasking Authority will be decided on a case by
case basis. In such a case there may be a number of Tasking Authorities and a
Coordinating Tasking Authority may be designated.

g. The Tasking Authority decides whether the recommendations/Remedial Actions
are noted or approved and informs HQ SACT.

h. HQ SACT (IDLL Branch) collects all decisions made by SCs and other Tasking
Authorities outside the SC structure on the final analysis report and issues a cover letter
to the report originator, summarizing the notation, endorsement and approval of
recommendations or each Remedial Action. HQ SACT may also distribute this cover
letter to relevant stakeholders in NATO and the nations for information and future
action. The HQ SACT cover letter may also include issues such as publication instructions.

i. The final analysis report with the HQ SACT cover letter will be published by the
originator in the NATO Lessons Learned Portal.

j. HQ SACT will track the approval, endorsement and notation of Joint Analysis
Reports and a periodic update will be provided to SHAPE, JALLC and other entities.

5-4. Implementing Remedial Actions

a. Once approved, the Tasking Authority tasks the Action Body to implement the
Remedial Action. If there is more than one Tasking Authority identified, a Coordinating
Tasking Authority may be selected to coordinate the implementation.


b. Tasking and implementation of Remedial Actions should begin immediately after
the decision on approval.

c. Action Bodies are responsible for implementing the Remedial Actions and for
reporting progress to the Tasking Authority. Based on this reporting, the Tasking
Authority is responsible for tracking the implementation and for updating the status of
each approved recommendation in the NATO LL Portal.


ANNEX A TO
Bi-SC DIR 080-091
DATED: 13 NOV 15

GUIDANCE FOR WRITING JOINT ANALYSIS REQUIREMENTS.

1. Before Submitting a Joint Analysis Requirement. If the issue can be analysed and
remedied within one's own command/body, it is not an appropriate submission for external
analysis. However, if it cannot be remedied by one's own command/body, it may be
appropriate to forward an analysis request upwards, through the chain of command, for
external analysis.

2. What is a JAR? The Joint Analysis Requirement (JAR) is the single most critical
building block of any analysis. The JAR delineates the issues and problems that need to be
better understood and these, in turn, will drive the analysis that will be conducted. If the JAR is
not clear, the resulting findings will be less useful. The JAR is the primary tool that the
Originator and/or Customer has to influence, direct and guide the analysis. When drafting an
analysis requirement, make sure that key direction and guidance for the analysis is captured in
the JAR itself, not just as part of the background or other information provided. Seldom will an
initial draft JAR be perfect and most require some adjusting and rewording. One of the initial
tasks for every project is to discuss the JAR in detail with the Customer to ensure it is focused
appropriately, within the available capabilities and resources, and understood and agreed by
all parties. A JAR should capture the need to identify an issue/problem or best practice of an
enduring nature that is of importance to the Alliance and its transformation. When considering
whether a JAR should be submitted, take into consideration that the Joint Analysis process
takes 9-12 months from start to finish (i.e. three months for the selection and tasking process
and then six to nine months to conduct Joint Analysis).

3. What to avoid in a Joint Analysis Requirement. Some inclusions, inconsistencies
and omissions may reflect negatively on a JAR, such as:

a. Pre-Determined Conclusions and Directional Bias such as: "…the lack of
information sharing…", "…the inefficiencies of…", "…the difficulties created by…". One
of the main problems with these types of phrases is that they skew the analysis from the
start. Generally they require focusing the analysis in just one direction and ignoring all
the other factors that may be relevant but that just do not happen to lie in the dictated
direction. These tend to direct the analyst to "tell me why this is bad", rather than stating
what to analyse in "this" and letting the data and the analysis capture the entire set of
relevant factors, whether good or bad. In addition, focusing only on the negatives may
lose sight of what is working well and needs to be preserved as changes are made to
address the problems.

b. Agendas and Politically Sensitive Issues. These are similar to the above, but
are less easy to spot as they are usually not explicitly stated. One of the more frequent
cases is where the JAR states or implies that another NATO entity (rather than something
they do) is to be evaluated. For example, a JAR to "analyse Training Centre X…" puts the


analyst in the position of passing judgement on the particular training centre. In contrast,
“analyse the effectiveness of the scenarios developed for deployable force exercises [by
Training Centre X]…” is a realistic task since a positive contribution is possible by
providing Training Centre X insights on a product they have developed and/or
processes they have employed.

c. Broad, Sweeping, Vague, and/or Ambiguous Phrasing such as: "Analyse the
information needs in ISAF" (problem: information needs are extremely broad and the
statement needs to provide more focus, such as specifying who/what/where and/or
when); "…determine how C2 can be improved" (problem: again, more specificity is
needed as to what C2, at which level, in what context, which parts of it, etc.); "…look at
the HQ" (problem: again, which parts, where, in what context?). In these particular
examples, "C2", "HQ" and "information needs" are all huge areas and, without greater
clarity, it will be nearly impossible to direct and focus the analysis efforts adequately.

d. Inconsistencies/omissions between the JAR wording and materials covered in
the various background information included with the JAR package. For example, the
background stresses a certain problem or requirement but these items are either
worded inconsistently or even omitted from the actual JAR.

e. Extremely narrow or specific wording that constrains any findings solely to the
particular HQ requesting the analysis. Often, slight changes in the wording can result in
broader, potentially NATO-wide applicability of the findings.

f. Unachievable JARs. A JAR can fall under the category of being unachievable
for a variety of reasons. Spotting flaws of this nature is not always easy, but factors can
be identified by carefully examining the JAR. The following highlights some specific
contexts that can contribute to a JAR being unachievable:

(1) Lack of Scientific or Logical Soundness. It is important that the requests
be logically sound. In other words, the "flow" of the requirement from a starting
point to a clear end state must be traceable. Indicators that there may be some
logical/scientific flaws include dead ends and circular logic (internal loops). With
dead ends, there will be aspects of the requirement that lead in directions that
have no clear relationship to what seems to be the main intent underlying the
JAR. With loops, internal dependencies that cannot be clearly resolved are
found, for example if two separate sections of the JAR each depend on successful
prior analysis of the other.

(2) Unrealistic Expectations. The JAR may be scientifically/logically sound,
but the expectations of the customer and the level of analysis are unrealistic in
the context of the resources and time available to conduct the analysis. Some
requests stipulate topics, requirements and/or conditions that just cannot be met
within the resource and time constraints. When these types of JARs are allowed
to go forward, the results are nearly always unsatisfactory.


(3) Lack of Access to Data. Success depends on being able to collect the right
data required by the analysis. If the analyst cannot get access to the data (or it
just does not exist), analysis will be unachievable. Be concerned if the needed
data is very closely held, if there are questions as to whether the data even
exists, or if it is uncertain whether the individuals who have the knowledge will be
available and willing. Sometimes the data exists but the individuals who have
knowledge of this data are no longer available or are now scattered all around
the world. If so, finding these individuals can be very difficult (they may no longer
be in NATO either) and, if they are found, the cost/resources/time involved in
collecting the required data may be impractical. The challenge is to determine
whether there will be access problems early on rather than well into the project.

4. Template. The form below must be used for submission of Joint Analysis
Requirements. Each field is explained in the form.


JOINT ANALYSIS REQUIREMENT TEMPLATE

IDENTIFICATION: The identification convention for JARs includes the Command followed by a
serial number, indicating the priority. For example: HQ ISAF 01, JFCNP 01, C-IED COE 01,
SHAPE 01, etc.
TITLE: The title should be short and descriptive.
CUSTOMER: The Customer is the owner of the problem: a decision-making entity
(preferably an individual at one-star level or higher) that stands to benefit from the analysis product
and will initiate the onward use of the analysis product in NATO.
CUSTOMER POC: The Customer representative is an individual nominated by the customer to
provide day-to-day advice and support to the analysis team. List contact details.
STAKEHOLDERS: Parties who are likely to be affected by or have an interest in the outcome or
conduct of the analysis. The stakeholders should be clearly stated by the JAR originator.
BACKGROUND / AREA OF OBSERVATION: The purpose for this section is to set the scene
and include background on the issue. It should state why the analysis topic is important for the
submitting HQ and why it is important for NATO. This section is designed to allow the requester to
provide insights into all of the different considerations and factors that led to the submission. It is a
means by which a deeper understanding of the entire issue/problem can be gained.
ITEMS FOR ANALYSIS: This will determine the scope of the analysis. This section describes
exactly what has to be analysed. This can usually be identified by expressions that include
processes or issues such as: ”the generation of …”, ”the execution of…”, ”the information
exchanges amongst…”, ”the barriers that hinder…”, ”the processes used to…”, etc.. It should
ensure that the analysis efforts are directed properly. It is advisable to include areas for analysis as
well as areas that the study should not consider. Specify constraints, limits, specific areas/subjects
of focus, or special instructions such as: ”within the Joint HQ…”, ”between the MAIN, Forward
Element and the Components…”, ”between ISAF HQ, the IJC and the RC Commands…”, ”Taking
into account…”, ”Particularly emphasizing the need for…”, etc.

ANALYSIS DELIVERABLE: This section has different purposes. The first is a clearly
articulated list of the products to be delivered based on the analysis, such as: "Recommendations
about…", "Point Paper…", "summary of lessons identified…", "inputs for the Final Exercise
Report…", "a briefing for…", etc. Note that on average, an analysis takes approximately six
months to complete. If analysis findings are required earlier, then this needs to be specified here.
The second purpose is to provide a description of what the analysis is intended to achieve, i.e.,
what decisions, judgements, actions or doctrine the findings may influence, and how it is envisaged
the findings will influence these. It is a description of what the analysis needs to cover in order for it
to be considered successful. It should also give indications of the customer's "Desired End States"
and "Strategic Objectives"; these can usually be identified by expressions such as: "In order to
optimize information flow…", "To enhance Unity of Command…", "to enable the Operational and
Component Commands to make the most effective and efficient use of available resources…", etc.

FORMULATED JOINT ANALYSIS REQUIREMENT: Description of the JAR. Bring together
the problem, scope and desired outcomes.
TIMELINE: What is the requested target date for the report? List critical factors.
IMPACT STATEMENT: Describe the impact if the analysis is not performed.


ANNEX B TO
Bi-SC DIR 080-091
DATED: 13 NOV 15
EXAMPLE OF JOINT ANALYSIS REQUIREMENTS

IDENTIFICATION: IMS-01
TITLE: A Decade of Operations
CUSTOMER: DG IMS
CUSTOMER POC: LTC John Doe, NCN xxx-yyyy, john.doe@nato.mil
STAKEHOLDERS: ISAF, SHAPE, HQ SACT, JFCBS
BACKGROUND / AREA OF OBSERVATION: Since NATO assumed command of
military operations in Kabul in August 2003, NATO’s ISAF Operations in Afghanistan have
identified a multitude of Lessons. Many of those Lessons have either led or are leading to
significant transformations in how NATO is structured and how it functions. Some significant
Lessons, though, have still not been set on a proper course to properly benefit future NATO-led
operations.
ITEMS FOR ANALYSIS: A study is needed to identify important Lessons and best practices
from ISAF Operations that have already been identified and that, if Learned, could yield significant
benefit to future NATO-led operations.
ANALYSIS DELIVERABLE: The deliverable of this study should be a summary-level report
identifying functional areas where best practices and Lessons Identified during ISAF Operations
indicate that, while significant improvements to NATO policies, doctrine and capabilities have been
realized, further transformation could yield significant benefit to future NATO-led operations.
FORMULATED JOINT ANALYSIS REQUIREMENT: Conduct a comprehensive study of
ISAF, beginning with NATO assuming command in August 2003, portraying how the collective
experience has contributed to major evolutions in NATO policy, doctrine and capabilities, in order
to identify the enduring lessons for future NATO-led operations.
TIMELINE: Must be completed before end of 2014 due to end of ISAF mission.
IMPACT STATEMENT: Valuable lessons from many years of conflict may be lost and may not
be incorporated in a much needed update of doctrines.


ANNEX C TO
Bi-SC DIR 080-091
DATED: 13 NOV 15

CLARIFICATION QUESTIONS TO REFINE JARS

1. Purpose. This Annex is provided to assist staff in developing JARs, and should be
seen as a supplement to Annex A.

2. Check list. When drafting a Joint Analysis Requirement, answering the following
questions for each field of the format will help refine the JAR:

a. JAR identification and title. Straightforward.

b. The customer

(1) Should your HQ/Command be the Primary Customer?

(2) Generally, the Command/HQ that submits an analysis request will be
deemed the appropriate level to be designated the Primary Customer for the
analysis.

(3) Occasionally, the analysis will have further reaching implications well
beyond the initiating Command. It may be deemed best for the Primary Customer
to be elevated to a higher HQ/Command.

c. Stakeholders

(1) Who could be affected by this analysis?

(2) Who could contribute to the analysis?

(3) What specialized expertise will be required to successfully conduct the
analysis?

d. Background/area of observation. Describe the issue to be looked into.

(1) What has been observed/seen that brought this to your attention?

(2) What facts/concrete points can you provide regarding this observation or
situation?

(3) When/In what context was this observed?

(4) What is your opinion as to the impact of the issue/problem?

(5) At what level of interest will these findings be to NATO (Polmil, Bi-SC
Strat, Joint, Tact, etc.)?


e. Items for analysis (scope). What you need to know as a result of the analysis.

(1) What are the specific questions you would like to have answered?

(2) What will you use the findings for (e.g., support decisions, change
procedures/policy, organizational/structural change, etc.)?

(3) Are you aware of any aspects of this analysis that will be "sensitive",
politically or otherwise that may cause the findings, no matter how accurate, to
be ignored?

(4) Are there any special constraints, limits, or specific areas/subjects to focus
upon during the analysis?

(5) Are you aware of any other analysis/work being done on this topic? If so,
what and by whom?

f. Analysis deliverable (outcomes). What should be the expected outcome of the
analysis?

(1) What is your desired end-state of the analysis?

(2) How will this deliverable help you? What do you hope it will help you do?

(3) What do you need the deliverable to address/contain?

(4) How broad (number of areas, topics, etc.) do you need it to cover?

(5) When do you need the final deliverable?

(6) Are there any incremental deliverables you need (interim or initial
impression reports/summaries, briefings, etc.)?

(7) When do you need these incremental deliverables?

(8) What do you NOT need answered/looked at (e.g., we have no concern
about the strategic level, etc.)?

(9) Will the findings only have an impact within a certain time frame?

(10) If required, can you provide subject matter expertise (SME) in support of
the analysis?

(11) Can you, or do you wish to, provide support for the data collection and/or
analysis (i.e., do you want to have a Staff Officer as part of the actual Project
Team)?


g. Formulated JAR (what). Writing a JAR.

(1) Attempt to write out, in fairly straightforward language, the initial draft of
what you think/feel the request for analysis should say.

(2) List any questions or areas where there is uncertainty about what the JAR
should say or areas it should/should not cover.

Examples of formulated Joint Analysis Requirements:

(3) Redeployment from operations. Joint Analysis Requirement: Collect,
collate and summarize lessons and best practices from NATO, EU, UN, the
International Committee of the Red Cross (ICRC), and national redeployments in
order to inform the nations preparing to redeploy from the International Security
Assistance Force (ISAF).

(4) Conducting and resourcing combined training events and exercises. Joint
Analysis Requirement: Based on Training Event 12-1 and Unified Endeavor 12-2
(TE 12-1 and UE 12-2), identify key factors affecting the outcome of Combined
Training Events and Exercises in order to improve future Preparations for
Operations.

(5) Exercise STEADFAST JUNO 10. Joint Analysis Requirement: Conduct
analysis of the Command and Control (C2) and Information flow issues among
the Joint Headquarters (JHQ) MAIN, JHQ Forward Element (FE) and
Subordinate Components in order to support further development of NATO’s
deployable forces concept(s).

(6) ISAF Pre-Deployment Training for the Police Operational Mentoring and
Liaison Teams. Joint Analysis Requirement: As part of the on-going ISAF
PDT, examine the POMLT PDT at the Joint Multinational Readiness Centre
(JMRC) and the Centre National d’Entraînement des Forces de Gendarmerie
(CNEFG), analyse the individual and collective training content, suitability and
resources, with regard to mission deployment, in order to recommend
improvements to the training provided to individuals and units deployed to ISAF
in the POMLT role.

(7) Shortfalls in the crisis response operations urgent requirement process.
Joint Analysis Requirement: Analyse the NATO-wide CUR approval process,
with emphasis on meeting ISAF operational needs, in order to make
recommendations to improve the performance of the CUR approval process.

(8) ISAF Command and Control. Joint Analysis Requirement: Examine the
functionality of the recently implemented ISAF C2 structure (HQ ISAF, IJC and
NTM-A) in order to identify recommendations to enhance the unity of command.


h. Timeline. Describes the customer's requirement for timely completion of the Joint
Analysis.

(1) Is there an urgent requirement for having the analysis completed earlier
than in six months?

(2) Will it be sufficient to get the preliminary findings early?

i. Impact Statement. What happens if the analysis is not conducted?

(1) Will it have operational or other consequences?

(2) What other projects depend on this analysis?


ANNEX D TO
Bi-SC DIR 080-091
DATED: 13 NOV 15
COMMENTS MATRIX TO DRAFT JOINT ANALYSIS REPORTS

The matrix shown below will be used to record comments during the staffing of draft Analysis Reports. The column headings depicted in the
matrix are self-explanatory; however, the following guidelines apply. All comments will be numbered sequentially and arranged in
chronological order. The comments will be categorized in the following manner: C – Critical (contentious issue that will cause non-concurrence
with publication), S – Substantive (factually incorrect, misleading, etc.). The Originator, Para, Sub-Para and Line columns are self-explanatory.
Comments should be placed in the Comment column. General observations without proposed solutions should not be submitted.
Rationale will be submitted for all comments. The Adjudication column is used by the report OPR to record the adjudication of the comment. The
responses are Accepted (A), Accepted w/ Amendment (AA), Withdrawn (W), or Not Accepted (NA). All amendments to a comment are recorded
on the matrix. The matrix becomes the record of decisions for the publication review.

COMMENTS AND CHANGE PROPOSALS FOR ANALYSIS REPORT “XXX XXX XXX”
Serial   C/S   Originator   Para   Sub-Para   Line   Comment   Rationale   Adjudication
[1]
[2]
[3]
[4]
[5]
[6]
[7]
[8]
[9]
[10]
[11]
[12]
[13]
[14]
[15]

ANNEX E TO
Bi-SC DIR 080-091
DATED: 13 NOV 15

GLOSSARY OF TERMS

Action Body (AB). The organisation or staff tasked with the implementation of assigned
remedial action (RA) or recommendations in association with a Lesson Identified (LI) and/or a
Joint Analysis Report. The AB develops an Action Plan to guide the remedial action activities.

Action Plan. The written plan of action and milestones developed by an Action Body to
implement assigned remedial actions and/or recommendations for a lesson identified and/or a
Joint Analysis Report.

Analysis. NATO defines analysis as “the study of a whole by thoroughly examining its parts
and their interactions”. In the LL process, analysis should allow discovery of the root cause of a
problem and identification of the appropriate RA to correct the problem and the appropriate AB
to achieve the correction. Analysis is initiated by an observation originator or a LL Staff Officer
creating an Analysis Requirement.

Analysis Objective (AO). A clear, demonstrable and achievable analysis tasking that identifies
what the analysis will accomplish. AOs derive from the analysis requirement and have a
tangible output.

Approval. Approval constitutes final and formal agreement on matters which are within the
authority’s remit without reference to other authority. Such agreement will normally result in
approval for follow-on action or activity (such as tasking the appropriate Action Body, committing
resources to implement one or more of the remedial actions from a recommended lesson
identified).

Coordinating Tasking Authority (CTA). The Coordinating Tasking Authority leads and
coordinates the handling of all recommendations and remedial actions. The CTA is established
when a report has many recommendations and actions with more than one Tasking Authority
involved. Normally, the CTA is the highest level of the involved Tasking Authorities.

Customer. The Customer is the Head of a HQ, body or entity (normally a Flag Officer) who
normally is the Originator of the AR and takes ownership of, and benefits from, the AR. The Joint
Analysis Project Team will engage with the Customer throughout the analysis work as required
to ensure delivery of a usable report. The Customer may also be the Tasking Authority (TA).
However, the TA is finally defined when the recommendations or Remedial Actions are made.

Endorsement. Endorsement represents formal agreement, but where the matter requires
subsequent consideration and approval by another authority and/or at a higher level.

Implementation. For the LL process, implementation is “the work of the action body to
complete the tasked remedial action in accordance with the action plan”. Implementation may
include one or more action bodies completing a wide variety of actions across the DOTMLPF-I
spectrum.

Lesson Identified (LI). A LI is a mature observation with a determined root cause of the
observed issue and a recommended Remedial Action and Action Body, which has been
developed and proposed to the appropriate Tasking Authority. An Analysis Report is considered
as a comprehensive Lesson Identified.

Notation. Notation reflects the receipt of information on an issue. Notation requires no further
action nor does it imply agreement. Implicit in this definition is that it is not possible to refuse
Notation.

Originator of an AR. The originator is the Head of a HQ, body or entity (normally a Flag
Officer) who creates the Joint Analysis Requirement (JAR). The Originator is normally
considered to be the Customer. However, the Customer may change to ensure ownership of the
JAR at the appropriate level.

Remedial Action (RA). An activity or set of activities that corrects an issue identified for
improvement or facilitates the implementation of a best practice.

Stakeholder. An organization which is involved in, affected by, has a special interest in, or can
benefit from an Analysis Report.

Tasking. The act of formally directing an Action Body to execute the Remedial Action from a
Lesson Identified to correct an issue or implement a Best Practice. Tasking is directed by an
appropriate, authoritative NATO organisation (Tasking Authority) and usually includes a request
for an action plan.

Tasking Authority (TA). The Tasking Authority can decide on recommendations and Remedial
Actions (note or approve), commit resources and appoint and task one or more Action Bodies.
The TA is responsible for the implementation and the tracking from a LI to a LL.

