
UNIVERSITY MALAYSIA PAHANG

BTM 3912
ENGINEERING ETHICS
SECTION 01

LECTURER: DR. MUNIRA BINTI MOHD ALI

PROJECT REPORT
TITLE: BOEING 737 MAX CRASH INCIDENTS

PREPARED BY:
1. MUHAMMAD NAKAEI BIN RIDZWAN TB19070
2. NUR MAWADDAH TASHAHIRAH BINTI HASBULLAH TB19076
3. NUR IRMA NAZASYIDA BINTI ALIAS TB19089
4. WAN MOHD SYAFIQ BIN WAN MOHD TA19161
1.0 INTRODUCTION

THE HISTORY OF B737MAX

The original 737 took its first flight in 1967, back when jet fuel was cheap. At the time,
people didn’t understand the implications of pollution, so environmental concerns were an
afterthought. Most airports were small and rural. They lacked infrastructure, such as jet gates and
fancy luggage-loading machinery. In response, airlines asked manufacturers for low-to-the-ground
airplanes with easy-to-reach engines, which reduced operating costs. And that’s exactly what
Boeing gave the airlines.

The 737 Max’s structure resembles the original 737. The big difference is the engines are
larger, the fuselage is bigger, and “winglets” were added to the tip of the wings to improve fuel
efficiency. By all accounts, the 737 fleet has been a smashing success. In 2005, more than 25% of
all large commercial airliners were Boeing 737s. However, the recent crashes demonstrate the
challenges of modernizing the Boeing 737 fleet.

The Boeing 737 is a victim of its own success. The airplane thrived for more than half a
century during a period when airplanes were safer and more automated. The 737 brand was so
trusted that when aircraft upgrades were needed, Boeing re-designed the 737 instead of creating a
new fleet of airplanes. To compete with the Airbus A320-NEO, the Max had larger engines than
previous 737 models. They were designed for greater range and fuel efficiency but came with a
tradeoff. Since the 737 sits so low to the ground, Boeing had to change the position of the engines
on the wing to give the plane ground clearance and account for the extended length of the fuselage,
but by solving an old problem, Boeing created a new one. The new engines were too big to fit in
their traditional spot under the wings. To combat the problem, Boeing mounted them forward on
the wings. Moving the engine position forward shifted the plane’s center of gravity, which altered
the aerodynamics of the aircraft. The position of the new engines pulled the 737 tail down, pushed
its nose up, and put it at risk of stalling.

Boeing installed extra software to make the updated 737 fly like traditional ones. It was
designed to prevent stalls, compensate for the position of the engine on the wing, and force the
aircraft’s nose down automatically when the sensors determined the airplane was flying at a
dangerous angle.
The stall-prevention system known as MCAS was poorly designed and implemented. Since
it was intended to work in the background, Boeing didn’t brief pilots about the software or train
them in simulators. The software did not activate when the flaps were down or the autopilot was on,
and when the MCAS system went haywire, pilots could deactivate it with a switch on the center
pedestal of the 737 cockpit. As pilots pulled back on the yoke to raise the nose, the software
automatically pushed the aircraft's nose back down. This behavior led to the crash of the two
Boeing airplanes.
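The activation logic described above can be sketched as a simplified, hypothetical model. All names and the threshold value below are illustrative assumptions, not Boeing's actual implementation:

```python
# Simplified, hypothetical model of MCAS-style activation logic.
# The threshold and parameter names are illustrative only; this is
# NOT Boeing's actual implementation.

AOA_THRESHOLD_DEG = 15.0  # illustrative stall-warning angle of attack


def mcas_active(flaps_down: bool, autopilot_on: bool,
                aoa_deg: float, pilot_cutout: bool) -> bool:
    """Return True if the automated nose-down trim would engage."""
    if flaps_down or autopilot_on:  # system only ran in manual, flaps-up flight
        return False
    if pilot_cutout:                # pilots could disable it via a cutout switch
        return False
    return aoa_deg > AOA_THRESHOLD_DEG


# The original design read a single AOA sensor, so one faulty high
# reading could repeatedly trigger nose-down trim in normal flight.
print(mcas_active(flaps_down=False, autopilot_on=False,
                  aoa_deg=22.0, pilot_cutout=False))  # True: trim engages
print(mcas_active(flaps_down=True, autopilot_on=False,
                  aoa_deg=22.0, pilot_cutout=False))  # False: flaps down
```

The sketch makes the report's point concrete: a single erroneous `aoa_deg` input is enough to satisfy the activation condition, and the pilots' only defense was a cutout switch for a system they had not been told existed.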

Instead of dreaming up new ideas, Boeing pours its resources into incremental designs.
Boeing executives knew the Boeing 737 design wouldn’t work with the larger engines. But instead
of swallowing some short-term risk for long-term gain and building a new airplane from scratch,
Boeing did the safe thing and iterated upon the existing 737 line. The airplane industry suffers from
a lack of innovation. The basic design of airplanes hasn’t changed in more than 70 years. When
innovation disappears, companies are incentivized to engage in exactly the kind of behavior that
led to the 737 Max crash.

2.0 DISCUSSION
2.1 ETHICAL ISSUES

The 737 MAX case is still unfolding and may continue to do so for some time. Yet important
lessons can already be learned (or relearned) from the case. Some of those lessons
are straightforward, and others are more subtle. A key and clear lesson is that engineers may
need reminders about prioritizing the public good, and more specifically, the
public’s safety. A more subtle lesson pertains to the ways in which the problem of
many hands may or may not apply here. Other lessons involve the need for
companies, engineering societies, and engineering educators to rise to the challenge of
nurturing and supporting ethical behavior on the part of engineers, especially in light
of the difficulties revealed in this case.
All contemporary codes of ethics promulgated by major engineering societies state that
an engineer’s paramount responsibility is to protect the “safety, health, and welfare” of
the public. The American Institute of Aeronautics and Astronautics Code of
Ethics indicates that engineers must “[H]old paramount the safety, health, and welfare
of the public in the performance of their duties” (AIAA 2013). The Institute
of Electrical and Electronics Engineers (IEEE) Code of Ethics goes further, pledging
its members: “…to hold paramount the safety, health, and welfare of the
public, to strive to comply with ethical design and sustainable development practices, and to
disclose promptly factors that might endanger the public or the environment”
(IEEE 2017). The IEEE Computer Society (CS) cooperated with the Association for
Computing Machinery (ACM) in developing a Software Engineering Code of Ethics
(1997) which holds that software engineers shall: “Approve software only if they
have a well-founded belief that it is safe, meets specifications, passes appropriate tests,
and does not diminish quality of life, diminish privacy or harm the environment….”
According to Gotterbarn and Miller (2009), the latter code is a useful guide
when examining cases involving software design and underscores the fact that in
design, as in all engineering practice, the well-being of the public should
be the overriding concern. While engineering codes of ethics are plentiful in number,
they differ in their source of ethical authority (i.e., organizational codes vs. professional
codes), are often unenforceable through the law, and formally apply to different groups
of engineers (e.g., based on discipline or organizational membership). However, the
codes are generally recognized as a statement of the values inherent to engineering
and its ethical commitments (Davis 2015).

An engineer’s ethical responsibility does not preclude consideration of factors like cost
and schedule (Pinkus et al. 1997). Engineers always need to grapple with constraints,
including time and resource limitations. The engineers working at Boeing did have
legitimate concerns about their company losing contracts to its competitor Airbus. But
being an engineer means public safety and welfare must be the highest priority (Davis
1991). The aforementioned software and other design errors in the development of
the 737 MAX, which resulted in many deaths, would thus seem to be clear violations
of engineering codes of ethics. In addition to pointing to engineering codes, Peterson
(2019) argues that Boeing engineers and managers violated widely accepted ethical
norms such as informed consent and the precautionary principle.

From an engineering perspective, the central ethical issue in the MAX case
arguably revolves around the decision to use software (i.e., MCAS) to “mask” a
questionable hardware design—the repositioning of the engines that disrupted the
aerodynamics of the airframe (Travis 2019). As Johnston and Harris (2019) argue: “To
meet the design goals and avoid an expensive hardware change, Boeing created the
MCAS as a software Band-Aid.” Though a reliance on software fixes often happens
in this manner, it places a high burden of safety on such fixes that they may not be
able to handle, as is illustrated by the case of the Therac-25 radiotherapy machine.
In the Therac-25 case, hardware safety interlocks employed in earlier models of the
machine were replaced by software safety controls. In addition, information about how
the software might malfunction was missing from the user manual for the Therac
machine. Thus, when certain kinds of errors appeared on its interface, the machine’s
operators did not know how to respond. Software flaws, among other factors, contributed to six
patients being given massive radiation overdoses, resulting in deaths and serious injuries
(Leveson and Turner 1993). A more recent case involves problems with the embedded
software guiding the electronic throttle in Toyota vehicles. In 2013, “…a jury found
Toyota liable for two unintended acceleration deaths, with expert witnesses citing bugs
in the software and throttle fail safe defects” (Cummings and Britton 2020).

Boeing’s use of MCAS to mask the significant change in the hardware configuration of the MAX
was compounded by not providing redundancy for components susceptible to failure
(i.e., the AOA sensors) (Campbell 2019), and by failing to notify pilots about the new
software. In such cases, it is especially crucial that pilots receive clear documentation
and relevant training so that they know how to manage the hand-off with an automated
system properly (Johnston and Harris 2019). Part of the need for such training is related
to trust calibration (Borenstein et al. 2020; Borenstein et al. 2018), a factor that has
contributed to previous airplane accidents (e.g., Carr 2014). For example, if pilots do not
place enough trust in an automated system, they may add risk by intervening in system
operation. Conversely, if pilots trust an automated system too much, they may
lack sufficient time to act once they identify a problem. This is further complicated
in the MAX case because pilots were not fully aware, if at all, of MCAS’s existence
and how the system functioned.

In addition to engineering decision-making that did not prioritize public safety,
questionable management decisions were also made at both Boeing and the
FAA. As noted earlier, Boeing’s managerial leadership ignored numerous warning signs
that the 737 MAX was not safe. Also, the FAA’s shift to greater reliance on self-regulation
by Boeing was ill-advised; that lesson appears to have been learned at the expense
of many lives (Duncan and Aratani 2019).

Rules of Practice
Engineers, in the fulfillment of their professional duties, shall hold paramount the
safety, health, and welfare of the public. In particular:
1. If engineers’ judgment is overruled under circumstances that endanger life or
property, they shall notify their employer or client and such other authority as may
be appropriate.
2. Engineers shall approve only those engineering documents that are in conformity
with applicable standards.
3. Engineers shall not reveal facts, data, or information without the prior consent of
the client or employer except as authorized or required by law or this Code.
4. Engineers shall not permit the use of their name or associate in business ventures
with any person or firm that they believe is engaged in fraudulent or dishonest
enterprise.
5. Engineers shall not aid or abet the unlawful practice of engineering by a person or
firm.
6. Engineers having knowledge of any alleged violation of this Code shall report
thereon to appropriate professional bodies and, when relevant, also to public
authorities, and cooperate with the proper authorities in furnishing such
information or assistance as may be required.

2.2 THE BOEING 737 MAX CRASH INCIDENTS

Lion Air Flight 610 took off from Jakarta, Indonesia on Monday, October 29th,
2018, at 6:20AM local time. Its destination was Pangkal Pinang, the largest city of
Indonesia’s Bangka Belitung Islands. Twelve minutes after takeoff, the plane crashed
into the Java Sea, killing all 189 passengers and crew.

Nearly five months later, Ethiopian Airlines Flight 302 took off from Addis Ababa,
Ethiopia on Sunday, March 10th, 2019, at 8:38AM local time. Its destination was
Nairobi, Kenya. Six minutes after takeoff, the plane crashed near the town of Bishoftu,
Ethiopia, killing all 157 people aboard.

Both crashed jets were Boeing 737 Max 8s, a variant of the best-selling aircraft in
history. When Airbus announced in 2010 it would make a new fuel-efficient and cost-
effective plane, Boeing rushed to get out its own version. That version was the 737 Max
airplanes. The Air Current has a great (if slightly insider-y) retelling of the Max jets’
origins.
2.3 WHY DID IT FAIL?

• Mistakes began nearly a decade ago when Boeing was caught flat-footed after its archrival
Airbus announced a new fuel-efficient plane that threatened the company’s core business.
It rushed the competing 737 Max to market as quickly as possible.
• In developing the Max, Boeing not only cut corners, but it touted them as selling points for
airlines. Since the 737 Max was the same plane type as its predecessors, pilots would only
need a 2.5-hour iPad training to fly its newest iteration.
• MCAS is the new software system blamed for the deadly Lion Air and Ethiopian Airlines
crashes. But its failure in both crashes was the result of Boeing and the Federal Aviation
Administration’s reluctance to properly inform pilots of its existence or to regulate it for
safety.
• The FAA has admitted to being incompetent at regulating software, and, as a policy, it
allows plane manufacturers to police themselves for safety. Nowhere in its amended type
certification of the 737 Max is MCAS mentioned.
• Even so, Boeing only recommends a 30-minute self-study course for pilots on MCAS,
rather than additional simulator or classroom instruction.
• Despite the two crashes, neither Boeing nor the FAA believes they’ve done anything wrong.
A Boeing spokesperson said the company believes the system is still “a robust and effective
way for the FAA to execute its oversight of safety.”
2.4 HOW THE FAILURE COULD HAVE BEEN PREVENTED
The Federal Aviation Administration (FAA) has issued a notice of proposed rulemaking
(NPRM) whose goal is to replace the previous Airworthiness Directive (AD) for Boeing
737 MAX aircraft. Since issuing the 2018 AD, the agency has determined the final
remedial measures required before the aircraft can resume flying. These steps are
required to resolve the MAX's unsafe condition.

Possible corrective actions for the Boeing 737 MAX:


1. Setup new software for the Flight Control Computer (FCC).
2. Integrate and improve flight crew procedures, modify the existing Airplane Flight
Manual (AFM).
3. Install the latest version of the Max Display System (MDS) software.
4. Change the horizontal stabilizer trim wire routing installations.
5. Conduct an angle-of-attack (AOA) sensor system test.
6. Conduct an operational readiness flight.
7. Make changes to the MCAS, such as having the system rely on data from both AOA
sensors rather than just one.
8. The bigger engines had to be installed higher and further forward on the wings than on
earlier 737 variants.
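Corrective action 7 above — relying on both AOA sensors rather than one — can be sketched as a simple cross-check. The disagreement threshold and all names here are assumptions for illustration; the certified fix involves additional conditions beyond this sketch:

```python
# Illustrative cross-check of two angle-of-attack (AOA) sensors.
# The threshold and structure are assumptions for illustration only,
# not the certified flight-control logic.

DISAGREE_LIMIT_DEG = 5.5  # assumed limit: beyond this, inhibit automatic trim


def aoa_cross_check(aoa_left_deg: float, aoa_right_deg: float):
    """Return (mcas_enabled, aoa_used).

    If the two sensors disagree by more than the limit, one of them is
    probably faulty, so the automated trim system is inhibited entirely.
    """
    if abs(aoa_left_deg - aoa_right_deg) > DISAGREE_LIMIT_DEG:
        return False, None  # sensors disagree: disable automatic trim
    return True, (aoa_left_deg + aoa_right_deg) / 2.0  # use averaged value


print(aoa_cross_check(10.0, 11.0))  # (True, 10.5): readings agree
print(aoa_cross_check(10.0, 40.0))  # (False, None): likely sensor fault
```

The design choice is the key lesson: with a single sensor there is no way to distinguish a real stall from a sensor fault, whereas a second independent reading allows the system to fail safe instead of acting on bad data.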

2.5 WHO IS AT FAULT? WHY?

The Boeing 737 MAX disasters, which killed all 346 passengers and crew members
on board, were the "horrific climax" of mistakes by both the plane maker and the Federal
Aviation Administration (FAA). According to the House Transportation and
Infrastructure Committee’s Democratic majority, the crashes were not caused by a
single breakdown, technological error, or poorly handled incident. They were the tragic
result of a series of incorrect technical assumptions made by Boeing engineers, a lack
of accountability on the part of Boeing management, and completely inadequate
oversight by the FAA. The FAA failed to oversee Boeing's design and development
of the MAX, and it failed to adequately supervise the certification of
the aircraft. The certification procedure for Boeing's MAX planes was hurried and may
have been compromised. FAA officials pressed the agency's engineers to transfer safety
evaluations to Boeing and to accept the resulting analyses as quickly as possible. Under
pressure to get its new MAX aircraft approved quickly so it could catch up to Airbus,
Boeing submitted a safety evaluation that was filled with mistakes.

Boeing has also been hesitant to admit to a design defect in its aircraft, instead
blaming pilot error, as it did in prior tragedies. In the 737 MAX case, the corporation
blamed the pilots' apparent inability to manage the planes in stall situations.
According to Langewiesche, the mishaps were caused by the airlines' cost-cutting
policies and the loose regulatory regimes in which they operated; the pilots lacked
adequate training. During the development of the MAX, officials judged that
pilots could fly the jets without considerable retraining since they were nearly identical
to earlier versions. This allowed Boeing to save money on further training,
which assisted the company in its fight with Airbus to launch newer and more fuel-
efficient planes. Even after the disaster of Lion Air 610, the FAA did not amend the
requirements. Rather than spending hours learning about the 737's new features in
massive, multimillion-dollar simulators, many pilots learned about them on an iPad.

2.6 A LACK OF "PROFESSIONALISM" OR "CONSCIENTIOUSNESS"

Two Boeing 737 MAX 8 airplanes crashed shortly after takeoff: on October 29, 2018,
near Jakarta, Indonesia, and on March 10, 2019, near Addis Ababa, Ethiopia. The disasters
cost the lives of 346 passengers and crew. Flight recorder data recovered from the two planes
indicate that bad engineering practices and surprisingly simple design errors contributed
to both calamities. The Boeing 737 MAX 8 had only recently gone into service, in May
2017.

i) The Fundamental Canon


According to the first “fundamental canon” of the National Society of Professional
Engineers (NSPE) Code of Ethics, engineers “shall hold paramount the safety, health
and welfare of the public.” According to preliminary findings from the
ongoing investigations leaked to the New York Times, both disasters were caused by a single
faulty sensor, which triggered a new automatic anti-stall system to repeatedly
push the plane’s nose down. Several newspapers have reported that Boeing until
recently charged extra for relatively simple and cheap warning displays in the
cockpit that alert pilots to divergent sensor readings. If such displays had been installed
on the two 737 MAX 8s that crashed, it is likely (but not certain) that the pilots would
have been able to diagnose the malfunctioning anti-stall system. An aircraft manufacturer
that attempts to increase its profit by charging extra for relatively simple but vital safety
devices does not “hold paramount the safety, health, and welfare of the
public.” Does it matter that the choice to charge extra for the displays was presumably
made by managers in the sales division rather than by engineers? This is likely
to depend on what opinions engineers expressed as the decision was made. The
NSPE Code clearly states: “If engineers’ judgment is overruled under circumstances
that endanger life or property, they shall notify their employer or client and such other
authority as may be appropriate.” There is, of course, a limit to how much
money aircraft manufacturers can be asked to spend on making their products safe,
but that does not seem to have been a relevant consideration in this case.
Compare, for example, the car industry. Consumers are permitted to buy cars that
are less safe than the safest models on the market, but regulators do not permit
manufacturers to offer cheap and simple safety systems as optional upgrades.
Seatbelts, ABS brakes, and airbags are mandatory equipment in all new cars sold
in most countries. The plausible principle that engineers shall “hold paramount the safety
… of the public” explains why this is so.

ii) Informed Consent


Pilots were never informed that the new 737 MAX 8 model had been equipped
with the new automatic anti-stall system, nor that it could be activated by a faulty
reading from a single sensor. Because pilots did not know that the automated anti-stall system
existed, they were unable to understand why the onboard computers repeatedly pushed the
nose of the jet down. This can be construed as a violation of the principle of consent.
Just as doctors are obliged to ask patients for consent before any medical intervention,
aircraft manufacturers arguably have a similar obligation to make sure that the pilots
responsible for the safe operation of their products are properly informed about all critical
systems, and consent to using systems that take control away from the pilots ultimately
responsible for the safety of the passengers. The principle of consent is widely accepted in
medical ethics but arguably deserves more attention from engineering ethicists. It is, for
instance, uncontroversial to demand that phone manufacturers have to ask
customers for consent before their gadgets share the phone’s position with third
parties. This moral requirement can be understood as an application of the principle
of consent. That said, the principle of consent is often not as easy to apply in engineering
contexts as in medical ethics. The doctor-patient relationship is more direct and
predictable than the engineer-user relationship. Engineers seldom interact directly with
the user, and technological devices are sometimes (mis)used in ways that cannot be
reasonably foreseen by engineers.

iii) The Precautionary Principle


The third ethical principle violated by Boeing is the precautionary principle.
Several days after the 737 MAX 8 was grounded by aviation authorities around the
world, Boeing CEO Dennis Muilenburg called President Trump to assure him that there
was no need to ground the model in the United States. It was still unclear what had caused
the crashes, Muilenburg claimed. From this epistemic premise, he inferred that it was
too early to take action. For several days, the Federal Aviation Administration
agreed with this policy. The regulators claimed that foreign civil-aviation authorities
had not “provided data to us that would warrant action.” According to a plausible
formulation of the precautionary principle defended by Peterson in The Ethics of Technology,
“reasonable precautionary measures” should be taken by engineers and others “to safeguard
against uncertain but nonnegligible threats.” Few would dispute that it would have been a
reasonable precautionary measure to ground the 737 MAX 8 immediately after the
second crash. If two brand-new airplanes of the same model crash shortly after one
another under what appear to be similar circumstances, regulators do not need to wait
until they know for certain what caused the crashes before they take action. The
second crash changed the epistemic situation enough to warrant action, even if it did not
prove that the anti-stall system was responsible. To avoid some of the objections
associated with the precautionary principle, it is appropriate to think of it as an epistemic
principle rather than as a principle that should directly guide our actions. In essence,
it is better (from a moral point of view) to believe that something is unsafe when it is
not, than to believe that something is safe when it is not. If construed as a belief-guiding
principle grounded in moral considerations, the precautionary principle is compatible
with the principle of maximizing expected value. We should first adjust our
beliefs about the world by applying the precautionary principle and then maximize
expected value relative to those modified beliefs.
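This belief-guiding reading of the principle can be illustrated with a toy decision model. All probabilities and payoffs below are invented for illustration: first the probability of the unsafe hypothesis is shifted upward in light of new evidence, and only then is the action with the highest expected value chosen.

```python
# Toy illustration of the precautionary principle as a belief-guiding rule.
# All probabilities and payoffs are invented for illustration only.


def expected_value(p_unsafe: float, payoffs: dict) -> float:
    """Expected value of an action given the probability the plane is unsafe."""
    return p_unsafe * payoffs["if_unsafe"] + (1 - p_unsafe) * payoffs["if_safe"]


def choose(p_unsafe: float) -> str:
    """Pick the action maximizing expected value under the given belief."""
    actions = {
        "keep_flying": {"if_safe": 10, "if_unsafe": -1000},  # catastrophe if unsafe
        "ground_fleet": {"if_safe": -50, "if_unsafe": -50},  # costly but safe
    }
    return max(actions, key=lambda a: expected_value(p_unsafe, actions[a]))


# The raw evidence alone might suggest a low probability that the model
# is unsafe, under which continued flying maximizes expected value...
print(choose(0.01))  # 'keep_flying'
# ...but on the belief-guiding reading, a second similar crash obliges us
# to revise the probability upward before maximizing expected value.
print(choose(0.20))  # 'ground_fleet'
```

The point of the sketch is that the precautionary principle does not replace expected-value reasoning; it changes which beliefs that reasoning is applied to.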

3.0 CONCLUSION

In conclusion, an article written by Joseph Herkert, Jason Borenstein, and Keith Miller on the
Boeing 737 MAX gives us valuable lessons, as engineers and educators, about the ethical
responsibilities of the profession. We must prioritize safety in engineering design rather than
carelessly minimizing costs and rushing to meet delivery schedules. Under almost any standard
ethical analysis or framework, Boeing's actions regarding 737 MAX safety, especially its
decisions on MCAS, fail. Boeing failed to meet its responsibilities to protect the public. At the
very least, the company should have notified airlines and pilots of significant design changes,
particularly the role of MCAS in compensating for the engine repositioning on the MAX relative
to the previous 737 version. That is a "significant" change because it had a direct, and
unfortunately tragic, impact on public safety. The interaction of Boeing and the FAA underscores
the fact that conflicts of interest are a serious concern in regulatory action in the aviation
industry. Furthermore, internal and external organizational factors can interfere with the
fulfilment of the professional responsibilities of Boeing and FAA engineers; this is a common
problem that requires serious attention from industry leaders, regulators, the professional
community, and educators. The lessons to be learned in this case are nothing new. After large-
scale tragedies involving engineering decision-making, calls for change often emerge. But such
lessons seem to have to be learned anew by every generation of engineers.
REFERENCES
• https://www.businessinsider.com/boeing-737-max-timeline-history-full-details-2019-9
• https://www.theverge.com/2019/5/2/18518176/boeing-737-max-crash-problems-human-error-mcas-faa
• https://www.youtube.com/watch?v=PdYcJldzOdw
• https://en.wikipedia.org/wiki/Boeing_737_MAX_groundings
