Integration Readiness Levels

Jennifer M. Long
Naval Air Systems Command
Patuxent River, MD 20670
301-757-7448
jennifer.m.long@navy.mil

U.S. Government work not protected by U.S. copyright.
IEEEAC Paper #1214, Version 4, Updated 2011:01:11

Abstract: Virtually all successful systems experience evolution of their technical baseline over time. The Technology Readiness Assessment (TRA) framework is structured to assess the maturity of technology elements via the established metrics of Technology Readiness Levels (TRLs), which address the risk associated with developing and transitioning new technologies in the context of system development.

The TRL methodology may need to be supplemented when a program is introducing into a fielded product baseline a new configuration item that doesn't leverage a new technology. The cost and schedule risk associated with developing the configuration item and subsequently integrating it into the product baseline are often underestimated at the start of the program. The consequences of this are usually cost and schedule growth that may threaten the health and executability of the overarching program.

Systems engineers and program managers need a metric to help evaluate the cost and schedule risk of integration efforts prior to committing resources via contract award. This paper presents a list of questions that establishes a qualitative program risk assessment framework. The paper also introduces the Integration Readiness Level metric, which can be used to evaluate system integration risk based on integration characteristics.

TABLE OF CONTENTS

1. INTRODUCTION
2. DISCUSSION
3. CONCLUSIONS
4. REFERENCES
5. BIOGRAPHY

1. INTRODUCTION

Virtually all successful systems experience evolution of the technical baseline for a variety of reasons, including obsolescence, technology upgrades, user requirements changes, planned incremental or spiral development, etc. This may involve software upgrades, or introduction of new hardware subsystems, which typically involve software unique to the item's functionality. These changes are often introduced to a fielded product baseline via the Engineering Change Proposal (ECP) process, and are managed as mini-programs embedded within a larger production or sustainment effort. One of the challenges facing the systems engineer is understanding the cost and schedule risk associated with introduction of new subsystems to a mature technical baseline. Often, there are established cost and schedule figures allocated for the integration effort, which may be based on considerations that have nothing to do with its complexity. Examples might include meeting a deadline by which the new capability has to be available, or fitting within a total budget that is considered reasonable or affordable within the context of the program. However, there are many factors that can influence the difficulty of an integration effort, and generating high-fidelity cost and schedule estimates before the effort begins is challenging. As a result, it is often difficult to predict whether the planned upgrade is executable within the allocated schedule and budget.

The purpose of this paper is to present a methodology for predicting the cost and schedule risk associated with integrating new configuration items into established product baselines. An Integration Readiness Level (IRL) metric is proposed that allows a risk assessment based on the integration characteristics of the configuration item of interest. The methodology is based on the Technology Readiness Level (TRL) metric [1] in use in the defense industry to assess the risk associated with introducing new technologies in the context of development programs. The TRL metric is presented for background information. The proposed IRL metric is then discussed, with an example of how it could be used to assess integration cost and schedule risk.

2. DISCUSSION

Experience within Department of Defense acquisition programs has demonstrated the risk associated with introducing new technologies into development programs. For example, history shows that developing a new aircraft that incorporates evolutionary technology can result in an effort that grows well beyond the promised cost and schedule, and an end item whose unit cost far exceeds what was originally planned. In some cases there may be performance gaps due to the new technology not being able to provide anticipated capabilities. This situation typically triggers a range of unpleasant consequences for the acquisition organization. Users may be left with a capability
gap if they need the new system to replace a legacy system that is obsolete or scheduled for disposal, or to provide a new capability to meet evolving requirements.

The Technology Readiness Assessment (TRA) framework [1] was developed to assess the maturity of technology elements via the established metrics of Technology Readiness Levels (TRLs). The TRL metric assigns a ranking to the maturity of the technology element using specific evaluation criteria; the scale can be used to assess the risk of developing and transitioning new technologies in the context of a larger system acquisition. Table 1 contains a description of the Technology Readiness Levels. DoD policy requires that technologies be at TRL 6 prior to entering System Design and Development.

The TRL methodology outlined in Table 1 is designed to assess the maturity of a new technology, i.e., is it far enough along in development that it can reasonably be included in a system? This guidance may need to be supplemented when a program is integrating a configuration item that incorporates an established technology. The maturity of the technology is not in question in that case; the risk assessment should focus on the maturity of the hardware configuration item and on the difficulty of integrating it into the overarching system. If the configuration item is not ready for integration (i.e., in a prototype state or under development), then there is some level of risk inherent in trying to run an integration effort concurrently. If the configuration item is ready for integration, but was developed for a different application that is not similar to the current system, then the integration effort may be difficult.

Qualitative Risk Assessment: There are some basic questions that can be considered to establish a qualitative level of risk for the integration of configuration items (CIs) into a technical baseline. The questions are general, and some may not apply to any given integration effort; for any given effort, other questions are possible. This list is intended to illustrate the basic concept of qualitative risk assessment.

1. Are the configuration item's performance requirements established prior to starting integration, and does a process exist to place them under configuration control by the governing program?
Discussion: Performance requirements should be documented in a specification under the control of the acquisition organization, to ensure that the contractor prices a program that includes the necessary work to develop and integrate a system that performs as intended. Addition of requirements after contract award, or failure to maintain configuration control over the governing requirements document, will result in cost and schedule growth. The magnitude of this is difficult to predict in advance.

2. Are the basic development milestones (hardware and software) for the configuration item understood, and is the infrastructure (contractual and process) in place to track them at a sufficient level of fidelity to be linked into the overarching system's integrated master schedule/integrated master plan at contract award?
Discussion: The overarching system will have dependencies on the configuration item under development. If the development milestones are not linked into the overarching system's program planning tools and control processes, then the effect of delays in configuration item development will not be visible at the system level, resulting in unpleasant surprises when the configuration item fails to meet its development and delivery milestones.

3. Does the configuration item contain embedded software? If so, is that software already developed, or is maturation still pending to accomplish software development and integration?
Discussion: If the configuration item contains software that is a development item in and of itself, the cost and schedule risks are significant unless substantial amounts of time and funding are allocated for that element of the program. Risk may actually increase if software is being extensively modified from a previous application, rather than developed from scratch for the new application.

4. If the answer to question 3 is yes, does the configuration item's software have to interact with the overarching system's hardware/software? If so, is there an interface control document (ICD) already in work, and what is the maturity of the ICD?
Discussion: Adding a new interface to any system typically results in an iterative integration process that can be time consuming. If the ICD is still in flux at contract award, additional time will be required to mature the interface.

5. If the answers to questions 3 and 4 are yes, is there a reasonable expectation that the (sub)contractor will perform at CMMI Level 3 for the execution of this effort?
Discussion: Lack of process discipline in software development, particularly as it relates to interface design, will result in excessive re-work.

1. Basic principles observed and reported.
   Lowest level of technology readiness. Scientific research begins to be translated into applied research and development. Examples might include paper studies of a technology's basic properties.

2. Technology concept and/or application formulated.
   Invention begins. Once basic principles are observed, practical applications can be invented. Applications are speculative and there may be no proof or detailed analysis to support the assumptions. Examples are limited to analytic studies.

3. Analytical and experimental critical function and/or characteristic proof of concept.
   Active research and development is initiated. This includes analytical studies and laboratory studies to physically validate analytical predictions of separate elements of the technology. Examples include components that are not yet integrated or representative.

4. Component and/or breadboard validation in laboratory environment.
   Basic technological components are integrated to establish that they will work together. This is relatively "low fidelity" compared to the eventual system. Examples include integration of "ad hoc" hardware in the laboratory.

5. Component and/or breadboard validation in relevant environment.
   Fidelity of breadboard technology increases significantly. The basic technological components are integrated with reasonably realistic supporting elements so the technology can be tested in a simulated environment. Examples include "high fidelity" laboratory integration of components.

6. System/subsystem model or prototype demonstration in a relevant environment.
   Representative model or prototype system, well beyond that of TRL 5, is tested in a relevant environment. Represents a major step up in a technology's demonstrated readiness. Examples include testing a prototype in a high-fidelity laboratory environment or in a simulated operational environment.

7. System prototype demonstration in an operational environment.
   Prototype near, or at, planned operational system. Represents a major step up from TRL 6, requiring demonstration of an actual system prototype in an operational environment such as an aircraft, vehicle, or space. Examples include testing the prototype in a test bed aircraft.

8. Actual system completed and qualified through test and demonstration.
   Technology has been proven to work in its final form and under expected conditions. In almost all cases, this TRL represents the end of true system development. Examples include developmental test and evaluation of the system in its intended weapon system to determine if it meets design specifications.

9. Actual system proven through successful mission operations.
   Actual application of the technology in its final form and under mission conditions, such as those encountered in operational test and evaluation. Examples include using the system under operational mission conditions.

Table 1: Technology Readiness Levels (Ref. 1)
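Where a program office wants to automate this check, Table 1 and the TRL 6 entry criterion noted earlier lend themselves to a simple lookup. The sketch below is illustrative only; the names (`TRL_SUMMARY`, `ready_for_sdd`) are hypothetical and not part of any DoD tool.

```python
# Illustrative sketch: encode Table 1 as a lookup and check the
# DoD criterion of TRL 6 prior to entering System Design and
# Development. Names here are hypothetical.

TRL_SUMMARY = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Analytical and experimental critical function proof of concept",
    4: "Component and/or breadboard validation in laboratory environment",
    5: "Component and/or breadboard validation in relevant environment",
    6: "System/subsystem model or prototype demo in relevant environment",
    7: "System prototype demonstration in an operational environment",
    8: "Actual system completed and qualified through test and demo",
    9: "Actual system proven through successful mission operations",
}

SDD_ENTRY_TRL = 6  # policy threshold cited in the text

def ready_for_sdd(trl: int) -> bool:
    """Return True if the technology meets the TRL 6 entry criterion."""
    if trl not in TRL_SUMMARY:
        raise ValueError(f"TRL must be 1-9, got {trl}")
    return trl >= SDD_ENTRY_TRL

print(ready_for_sdd(6))  # True
print(ready_for_sdd(4))  # False
```

The lookup simply mirrors the table; the value of writing it down is that the gate check becomes explicit and auditable.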
6. Is the final form factor of the configuration item significantly different from existing applications or prototypes? If so, is miniaturization part of the development effort?
Discussion: Often, a prototype item will be put forth to demonstrate that the contractor knows how to do this, and the development effort is assumed to be low risk. This may be a dangerous assumption, particularly if the item needs to be miniaturized for use in the intended application. Thermal control (if electronics are involved) and tolerance management become critical in this case.

7. Does the contractor have experience with developing items that are ruggedized to withstand real world electromagnetic and mechanical environments?
Discussion: Depending on the application, real world environments can be tremendously stressing, and the design techniques to ensure robustness require specialized knowledge and experience. A contractor whose main business is designing electronics that normally sit in a living room may not have the necessary expertise to design electronics for military applications. Qualification test failures and costly re-work late in the program will occur.

8. If the system is carried on and launched from an aircraft, does incorporation of the configuration item involve mass properties or external mold line changes that would invalidate previous analysis and flight test results?
Discussion: Seemingly minor changes in external mold line or mass properties can trigger extensive analysis and test efforts. Specifics of the changes need to be socialized with certification agencies prior to starting the program, so that the necessary analysis and test can be planned.

9. Is the configuration item under development slated for multiple programs? If so, is there a joint set of requirements that is overly conservative based on "skylining" requirements from the individual programs?
Discussion: "Skylining" requirements is easy; designing items to work in environments that could never realistically occur may not be. It takes some effort up front to analyze the individual requirements sets and break them out by condition, but it may help to avoid design difficulties and qualification test failures downstream.

10. Does the incorporation of the configuration item into the system trigger any new regulatory processes involving certification agencies (for example, the FAA)?
Discussion: Certification agencies tend to have a high demand for their services, with a finite amount of manpower to accomplish the work. This can pose a schedule risk if not properly accounted for in the planning. Assuming that certification activities can happen more quickly than normal is usually risky.

The preceding discussion seems fairly intuitive. The questions are all good ones to ask when starting an integration effort, and careful consideration of the answers establishes a risk framework for program planning. However, when integration efforts are being put on contract, systems engineers may be asked to assess the realism of the schedule. Contractors are often over-aggressive in their proposed price and schedule in a highly competitive environment. Program managers generally want contracted efforts to be structured as moderate risk to keep the team working hard to meet the promised cost and schedule. As a result, there is generally pressure to make the program fit within pre-established cost and schedule boundaries.

Integration Risk Assessment: The systems engineer needs to be able to provide a realistic estimate of the cost and schedule required to complete the effort, and given that the answer may not be what the contractor and program manager want to hear, a qualitative assessment will probably not be sufficiently convincing. Quantitative criteria based on metrics are needed to substantiate the argument. An Integration Readiness Level (IRL) metric is proposed that relates the integration characteristics of configuration items to integration risk levels. The metric is structured along the same lines as the TRL metric, but is intended to allow assessment of the risk associated with software and hardware development of configuration items (CIs), rather than technologies. The discussion herein focuses on cost and schedule risk. Systems engineers need to be concerned about performance risk as well, but in most integration efforts performance can be achieved given enough time and money, since the basic technologies are well-established. In that case, the underlying risk actually reaches back to the resources available to conduct the integration effort.
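The ten qualitative questions above can be captured as a lightweight checklist. The sketch below is a hypothetical illustration: the condition wordings paraphrase the questions, and the flag tally is not part of the paper's framework, which treats the answers qualitatively rather than as a score.

```python
# Hypothetical sketch: record answers to the ten qualitative questions
# and list the risk flags. For each question, True means the "risky"
# condition holds (e.g., requirements NOT baselined, ICD NOT mature).

RISK_CONDITIONS = {
    1: "Performance requirements not under configuration control",
    2: "Development milestones not linked into the IMS/IMP",
    3: "Embedded software still requires development",
    4: "Software interface to host system with immature ICD",
    5: "No expectation of CMMI Level 3 performance",
    6: "Final form factor differs significantly (miniaturization)",
    7: "Supplier lacks ruggedization experience",
    8: "Mass properties / mold line changes invalidate prior analysis",
    9: "Multi-program 'skylined' requirements",
    10: "New certification-agency processes triggered",
}

def tally_risk_flags(answers: dict[int, bool]) -> list[str]:
    """Return the descriptions of all risky conditions answered True."""
    return [RISK_CONDITIONS[q] for q, risky in sorted(answers.items()) if risky]

flags = tally_risk_flags({1: True, 2: False, 3: True, 4: True,
                          5: False, 6: True, 7: False, 8: False,
                          9: True, 10: True})
print(len(flags))  # 6 of the 10 risky conditions apply
```

A structure like this makes it easy to record the rationale for each answer alongside the flag, which is the part that actually informs the qualitative assessment.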
IRL 1 (High risk): Configuration item (CI) is a software-intensive system with no previous baseline to build on; the interface with the host system is not defined.
   Characteristics: The CI is a software-intensive system, no re-use of code is planned, the supplier has little or no experience with the current application, and interfaces with the host system are not defined.

IRL 2 (High risk): CI is a software-intensive system with some re-use possible, and interface requirements are not defined.
   Characteristics: The CI is a software-intensive system, some re-use of code is planned, the supplier has experience with a similar application, and interfaces with the host system are not defined.

IRL 3 (High risk): CI is a software-intensive system with a high degree of re-use possible, but interface requirements are incomplete.
   Characteristics: The CI is a software-intensive system, a high degree of re-use of code is planned, the supplier has experience with a similar application, and interface documentation is in work.

IRL 4 (High risk): CI is a software-intensive system with a high degree of re-use possible, and interface requirements are defined.
   Characteristics: The CI is a software-intensive system, a high degree of re-use of code is planned, the supplier has experience with a similar application, and interface documentation is baselined.

IRL 5 (High risk): CI software is mature, but needs to be ported to a new hardware architecture.
   Characteristics: The CI is being updated to comply with a new hardware architecture, but the basic functionality of the software is unchanged. A high degree of integration testing will be required, and the interface with the host system needs to be re-designed.

IRL 6 (High risk): CI software is mature, but needs to work with a modified hardware architecture.
   Characteristics: The CI is being updated to comply with a modified hardware architecture, and a significant degree of integration testing will be required. The interface with the host system needs significant changes.

IRL 7 (Med risk): CI software is mature, but changes are needed for the modified hardware architecture.
   Characteristics: The CI is being updated to a slightly modified hardware architecture, and a moderate degree of integration testing will be required. The interface with the host system needs minor changes.

IRL 8 (Med risk): CI software is mature and compatible with the modified hardware architecture.
   Characteristics: The CI is being updated to a slightly modified hardware architecture, and only minor integration testing will be required. The interface with the host system will work as is.

IRL 9 (Low risk): CI software and hardware work as is in a very similar system.
   Characteristics: The CI software does not require any changes, and the interface with the host system will work as is.

IRL 10 (Low risk): CI software is embedded, and the system is fielded.
   Characteristics: The CI contains only embedded software that is already integrated. There is no software interface with the host system.

Table 2: Software IRL Descriptions
IRL 1 (High risk): Supplier has access to the necessary technology.
   Characteristics: The CI supplier owns or has ready access to the necessary technology, but no experience with building or testing items.

IRL 2 (High risk): Supplier has some experience with the technology for different applications.
   Characteristics: The CI supplier has developed items that are not similar to the current CI, for programs with different applications.

IRL 3 (High risk): Supplier has developed similar items for different applications.
   Characteristics: The CI supplier has developed similar CIs for programs with different applications.

IRL 4 (High risk): Supplier has developed a similar item for a related application.
   Characteristics: The CI supplier has developed a similar item that has been used in a program with a related application, but major changes are needed. (Example: all circuit cards need to be re-spun.)

IRL 5 (Med risk): Supplier has developed a very similar item for a closely related application.
   Characteristics: The CI supplier has developed a very similar item that has been used in a program with a closely related application, such that only moderate changes are needed. (Example: multiple components, other than CCAs, need to be changed.)

IRL 6 (Med risk): Supplier has developed a very similar item for a very similar application.
   Characteristics: The CI supplier has developed a very similar item that has been used in a program with a very similar application, and only minor changes are needed. (Example: only a few components need to be changed, mostly for obsolescence rather than performance.)

IRL 7 (Med risk): Supplier has developed a representative prototype.
   Characteristics: The CI supplier has already developed a prototype of the CI, but it hasn't undergone any electromagnetic or environmental testing.

IRL 8 (Low risk): Supplier has developed a representative prototype with some testing.
   Characteristics: The CI supplier has already developed a prototype of the CI, and it has passed testing to the environments that are expected to cause the highest performance risk.

IRL 9 (Low risk): Supplier is using the CI in a similar system.
   Characteristics: The CI supplier has already integrated the item into another system with similar requirements, and that system is in test with some positive results.

IRL 10 (Low risk): Supplier is providing the CI to a fielded system.
   Characteristics: The item is integrated into a mature system with similar requirements.

Table 3: Hardware IRL Descriptions
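Tables 2 and 3 can be encoded as simple band lookups (software IRL 1-6 high, 7-8 medium, 9-10 low; hardware IRL 1-4 high, 5-7 medium, 8-10 low). The sketch below is illustrative only: the names are hypothetical, and combining the two bands by taking the worse one is a simplification of the planning guidance the paper discusses.

```python
# Illustrative sketch: map IRL numbers to the risk bands shown in
# Tables 2 and 3, and combine them pessimistically (worse band wins).

SOFTWARE_BANDS = {range(1, 7): "High", range(7, 9): "Med", range(9, 11): "Low"}
HARDWARE_BANDS = {range(1, 5): "High", range(5, 8): "Med", range(8, 11): "Low"}

SEVERITY = {"High": 2, "Med": 1, "Low": 0}

def band(irl: int, bands: dict) -> str:
    """Look up the risk band for an IRL number."""
    for r, level in bands.items():
        if irl in r:
            return level
    raise ValueError(f"IRL must be 1-10, got {irl}")

def combined_risk(sw_irl: int, hw_irl: int) -> str:
    """Worse of the software and hardware bands (a simplification)."""
    sw, hw = band(sw_irl, SOFTWARE_BANDS), band(hw_irl, HARDWARE_BANDS)
    return max(sw, hw, key=SEVERITY.get)

# The case study later in the paper assigns software IRL 2 and
# hardware IRL 4:
print(band(2, SOFTWARE_BANDS), band(4, HARDWARE_BANDS))  # High High
print(combined_risk(2, 4))  # High
```

Taking the worse band matches the paper's observation that a high-medium combination carries risk similar to high-high; finer-grained schemes would need program-specific judgment.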
The software IRLs merit some discussion. The risk level associated with software does not drop to low until a significant degree of maturity is present in the existing item, and the degree of compatibility with the host application is well established. Stated another way, any new software development, software re-work, or new software interface involves some degree of risk for the program. Software is the area most consistently underestimated from a risk/difficulty perspective at program inception. One reason for this is that it's difficult to evaluate both the quality of existing software and the capability of the personnel assigned to the development effort to perform the necessary work. Hardware architecture changes are also tricky; porting code to a new architecture can be extremely difficult, particularly with some of the older languages, like Ada, that are still in use in some sectors. If the electronics architecture is being re-worked, software engineers should be involved in the design process so that the updated architecture is feasible given the framework of the existing software. If the hardware and software don't readily fit together, the resulting test-fix-test iteration will be prolonged and difficult.

Hardware IRLs are somewhat more straightforward. Since you can see and touch hardware, it's easier to assess how close it is to the desired end state. Items that involve only hardware, i.e., no embedded software, may encounter development difficulties, but as long as the established TRL guidance for technology readiness is observed, the difficulties should be manageable with easily quantified levels of risk. This is why a distinction is made in Table 3 between efforts involving changes to circuit cards (high risk) and changes to other types of components (medium risk). Circuit cards are a more complex type of hardware that usually interacts with software, and hence carry a higher level of risk than purely mechanical or electrical devices. In general, hardware is more predictable, and better understood, than software. Hardware obeys the laws of physics, and there are only so many ways that it can fail. The engineering profession has a long history of solving these types of problems. However, the readiness for integration should be assessed to ensure the necessary cost/schedule resources are allocated for a successful effort.

For planning purposes, the software and hardware risk levels should be evaluated at contract award to assess cost and schedule risk for the program. If both software and hardware fall into the high risk region, it may not be feasible to introduce the configuration item unless sufficient schedule margin and funding resources are available to accommodate a prolonged development, integration and test effort. A combination of high and medium would carry a similar risk, with the risk dropping somewhat for a medium-medium combination. If both hardware and software fall into the low risk region, the configuration item is probably admissible with low-to-moderate schedule and cost risk. The various permutations of hardware and software risk levels can be assessed in the context of the specific program as the first step towards quantifying overall program risk.

Case Study: To illustrate how IRLs could be used, let's consider a program that is integrating a new piece of electronics that adds a new functionality to the system. The system-level requirements for the new functionality are documented in a performance specification. The new box is based on an earlier version that's already in use in a related, fielded application, but it needs to be miniaturized to fit into the space available in the current system. The legacy box is being advertised as a prototype for the new item, even though it bears little resemblance to the configuration that will be needed for the current system. The prototype box has functional software for its current application, but the software needs to be re-written to work with the miniaturized version of the hardware. At contract award, box-level requirements are documented in the item specification, which is only in draft form, and the interface control document does not exist. The system specification is also in draft form.

In this example, there are hardware and software development efforts associated with maturation of the box to its final form, and with modification of the system hardware and software to host the box, followed by an integration and test effort to incorporate the box and verify that the new functionality works per the system specification. The box in this example is essentially the configuration item, or CI, of the IRL tables. Based on the description, the software effort would be assigned an IRL of 2 in Table 2. Not a lot of specific information was provided about the status of the hardware, but based on the description it's probably at an IRL of 4 in Table 3. With both software and hardware IRLs in the high risk region, the effort cannot be expected to go smoothly, either from an item development perspective or from a follow-on integration perspective. Based on an assumed cost/schedule profile, the impacts are discussed below.

This is a very typical example of the type of integration effort that is often underestimated from a complexity perspective. All that's being proposed is to integrate a new subsystem into a mature system, and the box supplier already has a similar item working in a different fielded system. The box supplier is persuasive in its arguments that the modification of the box for the new application is low risk, and integrating it into the system is assumed to be moderate risk. Based on this assessment, the development/integration effort is put on contract for 29 months, at a price of $92M. One of the assumptions embedded in this schedule is that fully-functional engineering models of the box will be delivered to the system integrator 6 months after contract award. The box supplier contractually commits to a delivery schedule that meets this timeline.

The program plays out very differently in reality. Box hardware and software are not as applicable to the new
configuration as was initially assumed, and a significant development effort is required to produce boxes for integration. The first boxes are delivered to the system integrator 16 months after contract award (10 months behind the baseline), and they are early prototypes with partial software functionality. So, in addition to cost and schedule growth at the box supplier due to the unplanned difficulty of the development effort, there is cost and schedule growth at the system integrator due to not receiving boxes when they were expected. Since the ICD was not in place at contract award, it is written in conjunction with the development effort, and the system integrator doesn't have a stable baseline against which to design and implement their side of the interface. So, when the first boxes are delivered and connected to the system, even the minimal functionality available in the box software doesn't work as intended. The system integrator spends considerable time and money getting the system working to some degree to support integration testing.

Meanwhile, at the box supplier, system design is continuing to evolve, as is the host interface. As newer versions of the box come online they are delivered to the system integrator, who continues to iteratively refine the interface and mature system-level functionality. An additional 9 months are consumed before a box that is suitable for system-level test in real world environments is delivered. The subsequent box-to-system integration iteration consumes another ten months, at the end of which the first end-to-end system test is completed. By this time, production-representative boxes are being delivered to the system integrator, and another iteration is required to integrate the final configuration. System-level development, qualification and regression testing takes another 27 months to complete, and the program finally finishes over 3 years behind schedule at roughly double the original contract price.

This example probably sounds extreme, but it's based on a real integration effort that played out as described above. In the real situation, integration of the box was not the only change made to the system, so not all of the cost and schedule growth is necessarily attributed to difficulties with box development and integration. However, it's a fact that box development and integration turned out to be a major effort that drove total program cost and schedule far beyond what was originally planned, to the extent that the program had to be re-baselined. Obtaining the necessary funding to complete the effort was extremely difficult, and depleted resources originally planned for other projects. Only the high priority (at the resource sponsor level) placed on the capability being introduced saved the program from […] sponsor level, and it was terminated after Milestone B. It should be noted that in the context of the other program, the technology readiness of the box was evaluated using the TRL methodology as part of a formal Technology Readiness Assessment (TRA), and the box was not assessed to be a Critical Technology Element, since the underlying technology was well-established. This is an example of how the IRL methodology could be used to supplement the TRL methodology for integration applications, since it assesses the maturity and suitability of the item rather than the technology.

Applications: How could the list of questions and IRLs discussed in this paper have prevented these problems? Questions 1-6, 9 and 10 all have relevance to this case study. The box performance requirements and ICD were not fully established at contract award, nor were the system performance requirements finalized. The prototype box put forth initially to demonstrate the low-risk nature of the effort was nothing at all like the final version of the box; major hardware and software development efforts were required. The box was intended for use by more than one program, and the "skylined" requirements caused some difficulties with design and qualification, leading to further requirements iterations. Finally, approvals from external agencies were required; however, box development and integration were so time consuming that obtaining these approvals did not prove to be a major schedule driver. It's worth mentioning that considerable touch labor was dedicated to this aspect of the program to ensure the positive outcome in that regard. This occurred because the risks associated with external approvals were well-known and documented in the risk management database at the start of the program, triggering the dedication of program resources to the mitigation activities.

Concerning the IRLs, it was stated at the beginning of the discussion that software would have been assessed at IRL 2 and hardware at IRL 4. In the tables, both of these fall into the high risk area, which is an indicator that the effort probably has a significant degree of complexity. This assessment alone does not indicate how long the effort will take, or how much it will cost, but it can help to evaluate the realism of any estimates that are developed prior to committing to contract award.

In the example discussed herein, if the IRL tables had existed and been used by program management/systems engineering, the cost and schedule proposed for the effort might not have been accepted. Realistic estimates of the cost and schedule might have been considered unaffordable at the time, but perhaps would have sparked more discussion
termination. If the program had been terminated, a of capability and requirements. This might have helped to
significant near-term capability gap for the user would have mature the concept prior to starting development, and
resulted. Worthy of note is that the boxes had also been put allowed resources to be applied to fleshing out and
on contract to be delivered to another program with a documenting a more finalized set of requirements at both
different system integrator, and that program was unable to the system and box levels. Also, a better understanding of
absorb the cost and schedule growth of the box development the resources required for this effort might have prevented
effort. That program had less support at the resource the termination of the other program that was supposed to

8
integrate the box; perhaps an alternate materiel or other solution could have been found to meet that program's requirements within its allotted cost and schedule. Clearly, a thorough and realistic risk assessment up front is critical to overall program success.

3. CONCLUSIONS

There are numerous examples in the defense industry of programs that have experienced major cost and schedule growth for efforts that were assumed to be relatively straightforward upgrades to fielded product baselines. This situation occurs because the programs are structured to fit within available resources and desired capability delivery dates, without careful consideration of the integration characteristics discussed herein.

The Integration Readiness Level metric is proposed to provide systems engineers with a tool to evaluate cost and schedule risk based on characteristics of the configuration item being considered for integration. Obviously, there is no way to attach cost and schedule estimates to the various Integration Readiness Levels, since those figures will vary depending on the specifics of the proposed effort. The IRLs simply provide a means to translate characteristics of the integration item into risk levels for system integration. The systems engineer will need to draw on experience and judgment to arrive at meaningful conclusions concerning cost and schedule estimates for the integration effort. The intent is to provide a framework for estimating risk prior to contract award, so that the program can be set up with an executable baseline right from the start. It may turn out that the cost and schedule resources needed to construct an executable baseline are not available, and it's good to know that as well. Perhaps the new system isn't the only solution, or the best solution given available resources, for the needed capability. It's up to the systems engineer to provide those recommendations to program management to facilitate the decision process.

The ultimate goal is to dress the program for success by putting an executable baseline on contract at the start of the effort. This facilitates efficient use of available dollars, and helps to prevent capability gaps within the user community. As with any tool, it needs to be used with judgment based on experience to be effective.

The discussion herein is based entirely on experience with DoD development programs, and as such the list of questions and integration characteristics is tailored to those types of engineering applications. It seems likely that this methodology could be adapted to virtually any industry that relies on engineered products; however, the list of questions and specifics of the IRL table would probably need to be adjusted to account for differences in the type of end product being developed.

Future work in this area could proceed in various ways. First, the IRL tables need refinement even for DoD applications; they were developed by the author based solely on experience with a few specific efforts, and inputs from other systems engineers and program managers would be beneficial to increase the robustness of the metric. The current tables may have more levels than are needed; it seems that similar integration characteristics could be consolidated to increase the separation between the levels and simplify the descriptions. It's emphasized that the IRL methodology is not intended to be a direct read-across from TRLs, and the levels are not a one-to-one mapping of the rationale expressed in the TRL scale. Second, it may be of interest to apply this methodology to other technology sectors. Referring again to the TRL methodology, it is used by NASA as well as the DoD, and although the levels and descriptions are similar for the two industries, they are not identical. If the IRL methodology ever becomes well established and agreed upon for DoD, it could be extended to use in other industries.

4. REFERENCES

[1] Defense Acquisition Guidebook, Defense Acquisition University, www.dag.dau.mil

5. BIOGRAPHY

Jennifer Long is currently employed by the Department of Defense as a systems engineer in the field of weapon systems acquisition. She has 20 years of experience in the defense industry in various technical areas, including structural engineering for aircraft and spacecraft, military aircraft flight test, and weapons system acquisition. In 2005 she received the David Packard Award for Acquisition Excellence. Jennifer holds a doctoral degree in Civil Engineering, and she is DAU Level III certified in the Acquisition Professional Fields of Systems Engineering, Test and Evaluation, and Program Management.