
Safety Through History

In order to know where we are going, it is useful to look back at how safety has evolved over time. For the purposes of this exercise, let's go back to the early days of the industrial revolution. At that time, especially in the U.S., the term "revolution" was no stretch. The industrialization of America was a radical shift from an agrarian economy to one that increasingly relied on industrial processes such as manufacturing, mining and textile milling. Machines entered people's lives in completely new ways and, as a result, new opportunities for productivity emerged. Concurrent with changing production methods was an increasing diversity of on-the-job injuries and accidents. Safety in this paradigm was primarily a matter of luck more than it was a product of intentional design. It was common for workers to be blamed for their own, sometimes grave, injuries regardless of the impact training or poorly designed equipment may have had. In fact, a worker who was injured or involved in an equipment failure often was punished or even fired if he wasn't killed in the accident. Luck, essentially, was viewed as the dividing line between things going as planned and a worker being hurt or killed. As society adjusted to the new reality of mechanized work, cultural norms shifted, and workers began to demand not only better employment conditions, but some basic safety standards as well.

A man named Frederick Winslow Taylor also refused to adhere to old views of work. Among other improvements – you may remember his study of factory lighting and its effects on efficiency – Taylor was interested in understanding work as a process in order to maximize productivity. As a student of engineering, Taylor approached work as a scientific undertaking, breaking work practices down to their simplest components in what he termed scientific management. While it may not seem directly related to safety, many of Taylor's ideas persist today in the methods we use for safety management. Scientific management relied on a reductionist view of work, and often, of the organizations engaged in work. Reductionism is simply the view that a system can be understood by reducing it to smaller, simpler parts.

In the early days of manufacturing, as in aviation, systems were not terribly complex. They were often complicated, but there is a crucial distinction between these two constructs. When something is complicated, it may have a number of component parts, but a recipe essentially can be derived that will deliver a high probability of success. Designing and flying an airplane is a complicated process. Solving the problem was difficult, but once visionaries such as the Wright brothers and Glenn Curtiss unlocked the basic principles of flight, a number of aircraft were created very quickly following the same general "recipe." In contrast, complex systems cannot be understood through their component parts. Systems that are complex are typically synergistic, in that the output of the system cannot be derived by simply adding up individual pieces. Complex systems generally exhibit some self-organizing properties, and there is no "recipe" that assures the system will behave as it has before.

When Frederick Taylor introduced the principles of scientific management, the aviation industry was not yet complex, but over time, international air transportation has become an increasingly complex, adaptive system, and the reductionist approach of the turn of the 20th century no longer broadly applies.

In the last 100 years or so, safety has adapted to these changes, though sometimes reluctantly, as described below.

Assuming that safety was a matter of luck or skill naturally pushed technical issues to the forefront in the early days of aviation. This wasn't an improper approach, because as our industry made tremendous technical advances, opportunities to address mechanical failure abounded. The fatal crash that killed Knute Rockne, the Notre Dame football coach, in Kansas in 1931 is an example of the kind of technical failures that occurred with relative regularity in commercial aviation's formative years. The publicity that followed Rockne's death played an important role in advancing regulation of the U.S. air transportation industry, notably leading to the public release of aviation accident investigation results and contributing to the formation of what would later become the National Transportation Safety Board, or NTSB.

In the 1950s and 1960s, as the aerospace industry began to advance rapidly, the "fly-fix-fly" approach worked well enough, and it helped to identify a number of obvious safety shortcomings that, once addressed, dramatically improved safety. The pressurization issues that plagued the de Havilland Comet aircraft, and that manifested in the May 2, 1953, crash of BOAC Flight 783 and subsequent Comet losses, are a well-known example of this type of failure. However, as system complexity began to increase, waiting for failures to materialize became increasingly untenable.

Starting in the early 1970s, the aviation industry began to realize the benefits of improved aircraft, airport, and ATC technology and capabilities; however, accidents continued to occur – but for reasons outside the technical ones that plagued early commercial aviation. This required a shift in investigations from a mechanical focus to one based on human factors. United Airlines Flight 173 crashed in Portland in December 1978 after the crew failed to monitor the fuel quantity while troubleshooting a landing gear problem on approach. The crash killed 10 people and marked the first time the NTSB directly addressed resource management as a key causal factor in an accident. United Airlines developed Crew Resource Management (CRM) training as a direct result of the accident, and over the years that followed, the concept was extended to all areas of operations, including airports. Until the early to mid-1990s, most accident investigations focused on individual human performance issues rather than examining them in the context of an organization and the broader industry. That focus began to shift toward the operational environment as the industry realized that organizations were complex systems with multiple interacting influences on human performance.

CRM was a terrific step forward, as was the increasing focus on the interface between people and technology, in learning more about how to create more error-resistant systems. The individual-error approach had many limitations, though, and safety professionals began to realize during the 1990s that examining how normal work was completed was often more valuable than attempting to deconstruct individual failures. This change introduced new methods of monitoring routine operational data and applying proactive and predictive methodologies to identify underlying safety risks. The concept of the organizational accident was introduced by Dr. James Reason during this time, and the concepts of resilience and normal variations in human performance that emerged from the research of safety professionals such as Erik Hollnagel, Jens Rasmussen and Sidney Dekker gained increasing acceptance. The failure of the Space Shuttle Columbia as it re-entered Earth's atmosphere on February 1, 2003, is widely regarded as an example of an organizational accident. In that case, multiple layers of organizational influence allowed the probability of heat-resistant tile failure to be downplayed without effective communication of the risk.

As a result of accidents like the Columbia breakup, the art and science of safety began to consider broader influences on risk controls, including organizational culture and the interface between environment, humans and technology. In addition, regulators and operators began to work more collaboratively to identify not only proactive methodologies, but also predictive tools that rely on the collection and analysis of routine operational data to allow powerful inferential analysis, as opposed to the heavily descriptive tools of the past. These shifts set the stage for moving toward a holistic, management approach to safety.

The SHELL Model is a conceptual model used to illustrate the dynamic interface between many diverse system components. This model is a direct reflection of the evolution of our understanding of safety as a complex-system problem, and it shows four discrete components: Liveware (L; humans in the organization), Software (S; procedures, training, etc.), Environment (E; the working environment in which the system operates), and Hardware (H; machinery and equipment).

Although the SHELL Model is relatively simple, the interactions between each subsystem are an excellent way to address potential interface mismatches and provide a sound assessment foundation for system design and evaluation. Most importantly, the model does not focus on a single system or component, but rather on the complex, adaptive interactions inherent to the aviation industry.
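The structure of the SHELL Model can be sketched in code. The snippet below is purely illustrative and not part of any safety standard: it assumes the conventional drawing of the model, in which Liveware sits at the center and each interface pairs Liveware with one of the four components (L-L, L-S, L-H, L-E), which are the interfaces an assessment would examine for mismatches.

```python
from enum import Enum

class Component(Enum):
    """The four SHELL components."""
    LIVEWARE = "L"     # humans in the organization
    SOFTWARE = "S"     # procedures, training, documentation
    HARDWARE = "H"     # machinery and equipment
    ENVIRONMENT = "E"  # the working environment the system operates in

def shell_interfaces():
    """Return the Liveware-centered interfaces the model examines.

    Each interface pairs Liveware with one component, including
    Liveware itself (human-to-human interaction).
    """
    center = Component.LIVEWARE
    return [(center, other) for other in Component]

# List the interfaces a SHELL-based assessment would walk through.
for a, b in shell_interfaces():
    print(f"{a.value}-{b.value}")
```

Modeling the interfaces rather than the components alone mirrors the point made above: the value of the model lies in the interactions between subsystems, not in any single component.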
