
Amity Institute of Space Science & Technology

Avionics Software Engineering : LN 5 :
DO 178 B/C : A Brief Note

Prof M S Prasad

This brief note on the DO 178 B/C standards is for Grad/Post Grad students of Avionics. This LN has been prepared based on available open literature for instructional purposes only.

AVIONICS SOFTWARE ENGINEERING : LN 5
DO 178 B/C : Aviation Software Standard
[This is a brief introduction to the DO 178 B/C standards for Grad/Post Grad students of Avionics. Software developers are supposed to go through the original standards manual.]

DO-178B has become a de facto standard. Produced by the Radio Technical Commission for Aeronautics, Inc. (RTCA), DO-178B was established by the FAA's Advisory Circular AC 20-115B as the accepted means of certifying all new aviation software; it is sometimes also referred to as RTCA DO-178B. RTCA created DO-178 in 1980, while EUROCAE worked on ED-35. The merged result was DO-178/ED-12: the first common certification criteria for the production of avionics software.

DO-178B is primarily concerned with development processes. As a result, certification to DO-178B requires delivery of multiple supporting documents and records. The quantity of items needed for DO-178B certification, and the amount of information that they must contain, is determined by the level of certification being sought.

The targeted DO-178B certification level is either A, B, C, D, or E. Correspondingly, these DO-178B levels describe the consequences of a potential failure of the software: Catastrophic (A), Hazardous-Severe (B), Major (C), Minor (D), or No Effect (E).

The DO-178B certification process is most demanding at higher levels. A product certified to DO-178B Level A would have the largest potential market, but it would require thorough, labor-intensive preparation of most of the items on the DO-178B support list.

Conversely, a product certifying to DO-178B Level E would require fewer support items and be less taxing on company resources. Unfortunately, the product would have a smaller range of applicability than if certified at a higher level.

The failure condition categories (software levels) as defined in DO-178B, with the number of associated objectives:

Level | Failure condition | Description | Objectives
A | Catastrophic | Failure may cause a crash. | 66
B | Hazardous | Failure has a large negative impact on safety or performance, reduces the ability of the crew to operate the plane due to physical distress or a higher workload, or causes serious or fatal injuries among the passengers. | 65
C | Major | Failure is significant, but has a lesser impact than a Hazardous failure (for example, leads to passenger discomfort rather than injuries). | 57
D | Minor | Failure is noticeable, but has a lesser impact than a Major failure (for example, causing passenger inconvenience or a routine flight plan change). | 28
E | No effect | Failure has no impact on safety, aircraft operation, or crew workload. | 0

DO-178B establishes processes that are intended to support the objectives, according to the software level. In general, there are integral and development processes, as shown in Fig 2 below.

Fig 2: Association of processes in DO-178

DO-178B translates these software levels into software-specific objectives that must be satisfied during the development process. An assigned software level determines the level of effort required to show compliance with certification requirements, which varies with the failure condition category. This means that the effort and expense of producing a system critical to the continued safe operation of an aircraft (e.g. a fly-by-wire system) is necessarily higher than that required to produce a system with only a minor impact on the aircraft in the case of a failure (e.g. the in-flight entertainment system).

Intended to provide confidence in the safety of avionic systems by ensuring that they comply with airworthiness objectives, DO-178B covers the complete software life cycle: planning, development and integral processes to ensure correctness, control and confidence in the software. These integral processes include requirements traceability, software design, coding and software validation and verification.
DO-178B Process Objectives

DO-178B recognizes that software safety and security must be addressed in a systematic way throughout the software development life cycle (SDLC). This includes the requirements traceability, software design, coding, validation and verification processes used to ensure correctness, control and confidence in the software.

Key elements of the DO-178B SDLC are the practices of traceability and
coverage analysis. Traceability (or Requirements Traceability) refers to the
ability to link system requirements to software high-level requirements, from
software high-level requirements to low-level requirements, and then from
low-level requirements to source code and the associated test cases.
Coverage (or code coverage analysis) refers to measuring the degree to
which the source code of a system has been tested. Through the use of these
practices it is possible to ensure that code has been implemented to address
every system requirement and that the implemented code has been tested to
completeness.
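
As an illustration of how these links can be made mechanically checkable, many projects embed requirement identifiers in code and test headers. The tag scheme and requirement numbers below are invented for this sketch; DO-178B requires the links to exist but does not mandate any particular notation:

    /* wow_sensor.c -- illustrative sketch only.
     * Traces to: SYS-REQ-9 -> HLR-031 -> LLR-112 (hypothetical tags)
     */
    #include <stdbool.h>

    /* LLR-112 (hypothetical): weight-on-wheels shall be reported TRUE
     * only when both main-gear sensors indicate compression. */
    bool wow_is_on_ground(bool left_gear_compressed, bool right_gear_compressed)
    {
        return left_gear_compressed && right_gear_compressed;
    }

A traceability tool (or a review checklist) can then confirm that every low-level requirement tag appears in exactly one code unit and in at least one test case.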

The planning process is where all the software development plans are made. We have gathered system-level requirements and we want to define an approach towards effectively and safely satisfying those requirements.

The Software Development Plan is meant to define the software life cycle and the development environment.

The Software Verification Plan is the proposed method of verification that will
satisfy all objectives of the software verification process.

The Software Configuration Management Plan is similarly the proposed method of configuration management.

The Software Quality Assurance Plan is the proposed plan for satisfying the
objectives of the software quality assurance process.

So again, these documents are intended to outline what will be done to achieve the objectives of the following processes (refer to the figure below):

There are two sections of the DO-178B guidance document where software testing tools are significantly used:
Section 5.0 - Software Development Processes

Section 6.0 - Software Verification Process

Software Development Processes (Section 5.0)

Fig: DO-178 Document Structure

Software Life Cycle Processes

Process: Planning
Objective: Produce the software plans and standards that direct the software development processes and the integral processes.
Outputs: PSAC (Plan for SW Aspects of Certification), SDP (SW Development Plan), SVP (SW Verification Plan), SCMP (SW Configuration Management Plan), SQAP (SW Quality Assurance Plan), SRS (SW Requirements Standards), SDS (SW Design Standards), SCS (SW Code Standards).

Process: Requirements
Objective: Develop the high-level requirements.
Outputs: SRD (SW Requirements Data).

Process: Design
Objective: Develop the software architecture and low-level requirements from the high-level requirements.
Outputs: SDD (SW Design Description), SECI (SW Environment Configuration Index).

Process: Coding
Objective: Implement the source code from the software architecture and the low-level requirements.
Outputs: SRC (Source Code).

Process: Integration
Objective: Build the executable object code and load it into the target hardware; detect and report errors/bugs that may have been introduced during the software development processes by creating test cases, test procedures and analyses.
Outputs: EOC (Executable Object Code), SVCP (SW Verification Cases & Procedures), SVR (SW Verification Results), SCA (Structural Coverage Analysis), SCI (SW Configuration Index), SAS (SW Accomplishment Summary).

Process: Configuration Management
Objective: Provide a defined and controlled configuration of the software throughout the software life cycle and provide the ability to consistently replicate the Executable Object Code.
Outputs: SCM Records, Problem Reports.

Process: Verification
Objective: Detect and report errors/bugs that may have been introduced during the development processes.
Outputs: Review Records.

Process: Quality Assurance
Objective: Provide confidence that the software life cycle processes produce software that conforms to its requirements, by assuring that these processes are performed in compliance with the approved software plans and standards.
Outputs: SW Conformity Review, SQA Records.

Process: Certification Liaison
Objective: Establish communication and understanding between the applicant and the certification authority throughout the software life cycle to assist the certification process.
Outputs: PSAC, SAS, SCI.

Outputs are in the form of conformance documents which are to be certified.

Four high-level activities are identified in the DO-178B Software Development Processes section: the software requirements process, the software design process, the software coding process and the integration process. In addition, there is a section on requirements traceability (Section 5.5) that embodies the evolution and traceability of requirements from system-level requirements to source code.

As part of Section 5.3, the software coding process, DO-178B specifies that software must meet certain software coding process requirements. These include adherence to a set of software coding standards and traceability from low-level design requirements to the source code and object code.

Further definition of the software coding standards is provided in Section 11.8 of DO-178B:

• Programming language(s) to be used and/or defined subset(s). For a programming language, reference the data that unambiguously defines the syntax, the control behaviour, the data behaviour and side-effects of the language. This may require limiting the use of some features of a language.

• Source code presentation standards (for example, line length restriction, indentation, and blank line usage) and source code documentation standards (for example, name of author, revision history, inputs and outputs, and affected global data).

• Naming conventions for components, subprograms, variables and constants.

• Conditions and constraints imposed on permitted coding conventions, such as the degree of coupling between software components and the complexity of logical or numerical expressions, and rationale for their use. (A short illustrative sketch follows this list.)
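
To make the above concrete, the fragment below sketches what conformance to a typical project coding standard might look like: a restricted language subset (fixed-width types, no dynamic memory, no recursion), naming conventions with a module prefix, and a documented file header. The specific rules are hypothetical examples of the Section 11.8 items, not DO-178B requirements:

    /*
     * File: fq_fuel.c    Author: <name>    Rev: 1.2
     * Inputs: raw sensor counts.  Outputs: fuel quantity in kg.
     * Affected global data: none.
     */
    #include <stdint.h>

    #define FQ_MAX_COUNTS 4095u          /* named constant, no magic numbers */

    /* Naming convention: module prefix "fq_", lower_snake_case. */
    uint32_t fq_counts_to_kg(uint32_t raw_counts)
    {
        /* Subset rules: fixed-width types only, no malloc, no recursion,
         * bounded expression complexity, single return point. */
        uint32_t clamped = (raw_counts > FQ_MAX_COUNTS) ? FQ_MAX_COUNTS
                                                        : raw_counts;
        return (clamped * 10u) / 41u;    /* documented scale factor */
    }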

Software Verification Process (Section 6.0)

Per Section 6.1 of DO-178B, the objectives of the Software Verification Process are “to detect and report errors that may have been introduced during the software development processes.” Per Section 6.2 of DO-178B, these objectives are satisfied “through a combination of reviews, analyses, the development of test cases and procedures, and the subsequent execution of those test procedures. Reviews and analyses provide an assessment of the accuracy, completeness and verifiability of the software requirements, software architecture and source code.”
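
A minimal sketch of a requirements-based test procedure is shown below, reusing the hypothetical wow_is_on_ground() from the earlier traceability sketch. DO-178B prescribes verification objectives, not a test framework, so the harness style and identifiers here are assumptions:

    #include <assert.h>
    #include <stdbool.h>

    extern bool wow_is_on_ground(bool left_gear_compressed,
                                 bool right_gear_compressed);

    /* Test procedure TP-07 (hypothetical), traces to HLR-031:
     * "On-ground shall be declared only when both main-gear
     *  sensors indicate compression." */
    static void tp_07_wow_requirements_test(void)
    {
        assert(wow_is_on_ground(true,  true)  == true);
        assert(wow_is_on_ground(true,  false) == false);
        assert(wow_is_on_ground(false, true)  == false);
        assert(wow_is_on_ground(false, false) == false);
    }

    int main(void)
    {
        tp_07_wow_requirements_test();
        return 0;    /* results would be recorded in the SVR */
    }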

DO-178B Structural Coverage Analysis Objectives

Structural Coverage Analysis (SCA) is used to analyse the effectiveness of the requirements-based test procedures. All testing must be performed at the system level and be driven from the software requirements – that is, using requirements-based functional tests. Following that, SCA is applied to measure the effectiveness of this testing, i.e. measuring how much of the code has been exercised as a result of the requirements-based functional tests. The feedback gained as a result of this analysis helps to validate that the code base has been tested to completeness, and also to ensure that there is no unintended code in the system software by validating that all of the code is traceable to a specific system requirement or set of requirements. SCA is one of the closing steps in the requirements coverage/traceability process. SCA is an empirical measurement of requirements test effectiveness, helping to link the source code implementation and associated test procedures back to the system requirements while ensuring that the system code has been tested to the level of completeness required by the system software level.
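
Conceptually, SCA works by instrumenting the code so that each structural element records whether the requirements-based tests executed it; a coverage tool inserts and reports such probes automatically. The hand-written sketch below only illustrates the idea:

    #include <stdbool.h>
    #include <stdio.h>

    static bool probe[4];     /* one flag per decision outcome */

    int clamp(int v, int lo, int hi)
    {
        if (v < lo) { probe[0] = true; return lo; } else { probe[1] = true; }
        if (v > hi) { probe[2] = true; return hi; } else { probe[3] = true; }
        return v;
    }

    int main(void)
    {
        clamp(5, 0, 10);      /* a single test hits probes 1 and 3 only */
        printf("decision outcomes hit: %d %d %d %d\n",
               probe[0], probe[1], probe[2], probe[3]);
        return 0;             /* outcomes never hit reveal untested structure */
    }

Any outcome that no requirements-based test reaches points either to a missing test, a missing requirement, or unintended code.
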
Following is the pertinent extract from DO-178B with respect to SCA:

Structural Coverage Analysis

The objective of this analysis is to determine which code structure was not
exercised by the requirements-based test procedures. The requirements
based test cases may not have completely exercised the code structure, so
structural coverage analysis is performed and additional verification produced
to structural coverage. Guidance includes:

a. The analysis should confirm the degree of structural coverage appropriate to the software level.

b. The structural coverage analysis may be performed on the source code, unless the software is Level A and the compiler generates object code that is not directly traceable to source code statements. Then, additional verification should be performed on the object code to establish the correctness of such generated code sequences. A compiler-generated array bound check in the object code is an example of object code that is not directly traceable to the source code.

c. The analysis should confirm data coupling and control coupling between the code components.

The SCA objectives for each software level are summarized in the following table:

Item | Description | DO-178B Reference | Level A | Level B | Level C | Level D
5 | Test coverage of software structure (MC/DC is achieved) | 6.4.4.2 | R | NR | NR | NR
6 | Test coverage of software structure (decision coverage is satisfied) | 6.4.4.2.a and 6.4.4.2.b | R | R | NR | NR
7 | Statement coverage is satisfied | 6.4.4.2.a and 6.4.4.2.b | R | R | R | NR
8 | Data coupling and control coupling is satisfied | 6.4.4.2.c | R | R | R | NR

(R = required, NR = not required.) Items 1-4 are manual procedures.
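
The difference between items 5, 6 and 7 can be seen on a single decision. For (a && b) || c, statement coverage needs only one test vector that reaches the line, decision coverage needs the expression to evaluate both true and false, and MC/DC additionally requires each condition to be shown to independently affect the outcome. The function name below is invented for illustration:

    #include <stdbool.h>

    /* Decision under test: (a && b) || c */
    bool deploy_spoilers(bool a, bool b, bool c)
    {
        return (a && b) || c;
    }

    /* A minimal MC/DC test set: N+1 = 4 vectors for 3 conditions.
     *   a b c -> result
     *   T T F -> T    pair with F T F: a independently flips the outcome
     *   F T F -> F
     *   T F F -> F    pair with T T F: b independently flips the outcome
     *   T F T -> T    pair with T F F: c independently flips the outcome
     * Statement coverage: any one vector suffices.
     * Decision coverage: one true and one false outcome suffice. */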

Control Coupling

Defined by DO-178B to be “The manner or degree by which one software component influences the execution of another software component.”

Data Coupling

Defined by DO-178B to be “The dependence of a software component on data not exclusively under the control of that software component.”
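
A small sketch of both kinds of coupling, with component names invented for the example; the item 8 analyses confirm that such interfaces are exercised by the requirements-based tests:

    #include <stdbool.h>

    /* Data coupling: the navigation component depends on a value it
     * does not exclusively control -- it is written by the sensor
     * component. */
    static int shared_altitude_ft;                /* producer: sensors */

    void sensor_update(int new_altitude_ft)
    {
        shared_altitude_ft = new_altitude_ft;
    }

    bool nav_below_transition(void)
    {
        return shared_altitude_ft < 18000;        /* consumer: nav */
    }

    /* Control coupling: the mode component decides whether the
     * navigation check executes at all. */
    void mode_step(bool airborne)
    {
        if (airborne) {
            (void)nav_below_transition();
        }
    }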

Software Configuration Management

Verification of the various outputs discussed in DO-178B is only credible when there is a clear definition of what has been verified. This definition, or configuration, is the intent of the DO-178B objectives for configuration management. The six objectives in this area are unique in that they must be met for all software levels. This includes identification of what is to be configured, how baselines and traceability are established, how problem reports are dealt with, how the software is archived and loaded, and how the development environment is controlled.

While configuration management is a fairly well-understood concept within the software engineering community (as well as the aviation industry as a whole), DO-178B does introduce some unique terminology that has proven to be problematic. The concept of control categories is often misunderstood in a way that increases overall development costs, sometimes dramatically. DO-178B defines two control categories (CC1 and CC2) for data items produced throughout the development.

The authors of DO-178B intended the two levels as a way of controlling the overhead costs of creating and maintaining the various data items. Items controlled as CC2 have fewer requirements to meet in the areas of problem reporting, baselining, change control, and storage. The easiest way to understand this is through an example. Problem reports are treated as a CC2 item. If problem reports were a CC1 item and a problem was found with one of the entries on the problem report itself, a second problem report would need to be written to correct the first one.
A second nuance of control categories is that the user of DO-178B may define what
CC1 and CC2 are within their own CM system as long as the DO-178B objectives are
met. One example of how this might be beneficial is in defining different retention
periods for the two levels of data. Given the long life of airborne systems, these costs
can be quite sizeable. Another consideration for archival systems selected for data
retention is technology obsolescence of the archival medium as well as means of
retrieval.

Software Quality Assurance

Software quality assurance (SQA) objectives provide oversight of the entire DO-178B
process and require independence at all levels. It is recognized that it is prudent to have
an independent assessment of quality.

SQA is active from the beginning of the development process. SQA assures that any
deviations during the development process from plans and standards are detected,
recorded, evaluated, tracked, and resolved. For levels A and B, SQA is required to
assure transition criteria are adhered to throughout the development process.

SQA works with the CM process to assure that proper controls are in place and applied to life cycle data. This last task culminates in the conduct of a software conformity review. SQA is responsible for assuring that the as-delivered product matches the as-built and as-verified product. The common term used for this conformity review in the commercial aviation industry is “First Article Inspection.”

The Software Testing Process in general is shown below.


Certification Liaison Process

As stated earlier, the certification liaison process is designed to streamline the certification process by ensuring that issues are identified early in the process. While DO-178B outlines twenty distinct data items to be produced in a compliant process, three of these are specific to this process and must be provided to the certifying authority. They are:

• Plan for Software Aspects of Certification (PSAC)

• Software Configuration Index

• Software Accomplishment Summary

Other data items may be requested by the certification authority, if deemed necessary.
As mentioned earlier, applicants are encouraged to start a dialogue with certification
authorities as early in the process as possible to reach a common understanding of a
means of achieving compliance with DO-178B. This is especially important as new
technology is applied to avionics and as new personnel enter the field.

Good planning up front, captured in the PSAC, should minimize surprises later in the development process, thus minimizing cost. Just as the PSAC states what you intend to do, the accomplishment summary captures what you did. It is used to gauge the overall completeness of the development process and to ensure that all objectives of DO-178B have been satisfied.

Finally, the configuration index serves as an overall accounting of the content of the
final product as well as the environment needed to recreate it.

Previously Developed Software

PDS is software that falls in one of the following categories:

• Commercial off-the-shelf software (e.g., shrink-wrap)

• Airborne software developed to other standards (e.g., MIL-STD-498)

• Airborne software that predates DO-178B (e.g., developed to the original DO-178 or
DO-178A)

• Airborne software previously developed at a lower software level

The use of one or more of these types of software should be planned for and discussed in the PSAC.

In every case, some form of gap analysis must be performed to determine where specific objectives of DO-178B have not been met. It is the applicant's responsibility to perform this gap analysis and propose to the regulatory authority a means for closing any gaps. Alternate sources of development data, service history, additional testing, reverse engineering, and wrappers* are all ways of ensuring the use of PDS is safe in the new application.

In all cases, usage of PDS must be considered in the safety assessment process and may require that the process be repeated if the decision to use a PDS component occurs after the approval of the PSAC. A special instance of PDS usage occurs when
software is used in a system to be installed on an aircraft other than the one for which it
was originally designed. Although the function may be the same, interfaces with other
aircraft systems may behave differently. As before, the system safety assessment
process must be repeated to assure that the new installation operates and behaves as
intended.

If service history is employed in making the argument that a PDS component is safe for
use, the relevance and sufficiency of the service history must be assessed. Two tests
must be satisfied for the service history approach to work. First, the application for
which history exists must be shown to be similar to the intended new use of the PDS.
Second, there should be data, typically problem reports, showing how the software has
performed over the period for which credit is sought. The authors of DO-178B intended
that any use of PDS be shown to meet the same objectives required of newly developed
code.

Prior to identifying PDS as part of a new system, it is prudent to investigate and truly
understand the costs of proving that the PDS satisfies the DO-178B objectives.
Sometimes, it is easier and cheaper to develop the code again!

*“Wrappers” is a generic term used to refer to hardware or software components that isolate and filter inputs to and from the PDS for the purposes of protecting the system from erroneous PDS behavior.
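
A minimal sketch of a software wrapper, assuming a hypothetical PDS entry point pds_compute() whose inputs and outputs must be range-checked before the rest of the system relies on them:

    #include <stdint.h>

    extern int32_t pds_compute(int32_t input);    /* PDS entry point (hypothetical) */

    #define IN_MIN   (-1000)
    #define IN_MAX   ( 1000)
    #define OUT_MIN  (-5000)
    #define OUT_MAX  ( 5000)
    #define SAFE_OUT (0)

    /* Wrapper: isolates and filters traffic to and from the PDS so
     * that erroneous PDS behavior cannot propagate into the system. */
    int32_t pds_compute_wrapped(int32_t input)
    {
        if (input < IN_MIN || input > IN_MAX) {
            return SAFE_OUT;                      /* reject bad input  */
        }
        int32_t out = pds_compute(input);
        if (out < OUT_MIN || out > OUT_MAX) {
            return SAFE_OUT;                      /* filter bad output */
        }
        return out;
    }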

DO 178 Documents

 Plan for Software Aspects of Certification (PSAC)
 Software Development Plan (SDP)
 Software Verification Plan (SVP)
 Software Configuration Management Plan (SCMP)
 Software Quality Assurance Plan (SQAP)
 Software Requirements Standards (SRS)
 Software Design Standards (SDS)
 Software Code Standards (SCS)
 Software Requirements Data (SRD)
 Software Design Description (SDD)
 Software Verification Cases and Procedures (SVCP)
 Software Life Cycle Environment Configuration Index (SECI)
 Software Configuration Index (SCI)
 Software Accomplishment Summary (SAS), which also includes supplementary documents such as:
 Software Design Document
 Software Requirements Document
 Traceability
 Test Cases and Procedures
 Verification Results
 Quality Assurance Records
 Configuration Management Records
 Problem Reports Records
 Software Verification Results (SVR)
 Problem Reports
 Software Configuration Management Records
 Software Quality Assurance Records

Tool Selection

The use of traceability and analysis tools for an avionics project that must meet the DO-
178B certification requirements offers significant productivity and cost benefits. Tools
make compliance checking easier, less error prone and more cost effective. In addition,
they make the creation, management, maintenance and documentation of requirements
traceability straightforward and cost effective. When selecting a tool to assist in
achieving DO-178B acceptance the following criteria should be considered:

• Does the tool provide a complete ‘end-to-end’ Requirements Traceability capability to enable linkage and documentation from all levels to the source code and associated test cases?

• Does the tool enable analysis for all Structural Coverage Analysis requirements as laid
out in section 6.4.4.2 of the standard?

• Can the tool perform MC/DC analysis in assignment statements as well as conditional statements? (See the short sketch after this list.)

• Is there tool availability for all the languages required in your project?

• Has the tool been utilised in this manner successfully already?

• Will the tool vendor assist in tool qualification?

• Is tool support both flexible and extensive enough to meet changing requirements?
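
The MC/DC criterion above matters because decisions also occur outside branching statements; a tool that instruments only if/while branches would miss the decision in the assignment below (the example is illustrative):

    #include <stdbool.h>

    bool gear_locked(bool left, bool right, bool override)
    {
        /* The decision (left && right) || override lives in an
         * assignment, not a branch; MC/DC applies to it all the same. */
        bool locked = (left && right) || override;
        return locked;
    }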

Notes

The difficulties that have been identified are the DO-178 requirements for evidence and rigorous verification. Systematic records of accomplishing each of the objectives and guidance are necessary. A documentation trail must exist demonstrating that the development processes not only were carried out, but also were corrected and updated as necessary during the program life cycle. Each document, review, analysis, and test must have evidence of critique for accuracy and completeness, with criteria that establish consistency and expected results.

This is usually accomplished by a checklist which is archived as part of the program certification records. The degree of this evidence varies only by the safety criticality of the system and its software.

References

[1] RTCA Inc., RTCA/DO-178B: Software Considerations in Airborne Systems and Equipment Certification, 1992.

[2] V. Hilderman and T. Baghai, Avionics Certification: A Complete Guide to DO-178 (Software), DO-254 (Hardware), Avionics Communications Inc., 2007.

[3] RTCA Inc., RTCA/DO-248B: Final Report for Clarification of DO-178B “Software Considerations in Airborne Systems and Equipment Certification”, 2001.

Appendix A

Guidelines for Software Development

(a) A detailed description of how the software satisfies the specified software high-level
requirements, including algorithms, data-structures and how software requirements are
allocated to processors and tasks.

(b) The description of the software architecture defining the software structure to
implement the requirements.

(c) The input/output description, for example, a data dictionary, both internally and
externally throughout the software architecture.

(d) The data flow and control flow of the design.

(e) Resource limitations, the strategy for managing each resource and its limitations, the
margins and the method for measuring those margins, for example timing and memory.

(f) Scheduling procedures and inter-processor/inter-task communication mechanisms, including time-rigid sequencing, pre-emptive scheduling, interrupts.

(g) Design methods and details for their implementation, for example, software data
loading, user modifiable software, or multiple-version dissimilar software.

(h) Partitioning methods and means of preventing partitioning breaches.

(i) Descriptions of the software components, whether they are new or previously
developed, with reference to the baseline from which they were taken.

(j) Derived requirements from the software design process.

(k) If the system contains deactivated code, a description of the means to ensure that the code cannot be enabled in the target computer. (A minimal sketch follows this list.)

(l) Rationale for those design decisions that are traceable to safety-related system
requirements.
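
For item (k), one common means of ensuring deactivated code cannot be enabled in the target computer is compile-time exclusion, so that no call site exists in the flight build. A minimal sketch, with the configuration symbol invented for illustration:

    /* FLIGHT_BUILD would be set by the build environment recorded in
     * the SECI: 1 for flight loads, 0 for ground-support loads. */
    #define FLIGHT_BUILD 1

    #if !FLIGHT_BUILD
    static void maintenance_dump(void)
    {
        /* ground-support diagnostics only */
    }
    #endif

    void periodic_task(void)
    {
    #if !FLIGHT_BUILD
        maintenance_dump();   /* no call site exists in the flight image */
    #endif
    }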

---------------------------------------------------------------------------------------------------------------------
