The Boeing 737 MAX: Lessons for Engineering Ethics

  • Original Research/Scholarship
  • Published: 10 July 2020
  • Volume 26, pages 2957–2974 (2020)


  • Joseph Herkert 1 ,
  • Jason Borenstein 2 &
  • Keith Miller 3  


Abstract

The crash of two 737 MAX passenger aircraft in late 2018 and early 2019, and subsequent grounding of the entire fleet of 737 MAX jets, turned a global spotlight on Boeing’s practices and culture. Explanations for the crashes include: design flaws within the MAX’s new flight control software system designed to prevent stalls; internal pressure to keep pace with Boeing’s chief competitor, Airbus; Boeing’s lack of transparency about the new software; and the lack of adequate monitoring of Boeing by the FAA, especially during the certification of the MAX and following the first crash. While these and other factors have been the subject of numerous government reports and investigative journalism articles, little to date has been written on the ethical significance of the accidents, in particular the ethical responsibilities of the engineers at Boeing and the FAA involved in designing and certifying the MAX. Lessons learned from this case include the need to strengthen the voice of engineers within large organizations. There is also the need for greater involvement of professional engineering societies in ethics-related activities and for broader focus on moral courage in engineering ethics education.


Introduction

In October 2018 and March 2019, Boeing 737 MAX passenger jets crashed minutes after takeoff; these two accidents claimed nearly 350 lives. After the second incident, all 737 MAX planes were grounded worldwide. The 737 MAX was an updated version of the 737 workhorse that first began flying in the 1960s. The crashes were precipitated by a failure of an Angle of Attack (AOA) sensor and the subsequent activation of new flight control software, the Maneuvering Characteristics Augmentation System (MCAS). The MCAS software was intended to compensate for changes in the size and placement of the engines on the MAX as compared to prior versions of the 737. The existence of the software, designed to prevent a stall due to the reconfiguration of the engines, was not disclosed to pilots until after the first crash. Even after that tragic incident, pilots were not required to undergo simulation training on the 737 MAX.

In this paper, we examine several aspects of the case, including technical and other factors that led up to the crashes, especially Boeing’s design choices and organizational tensions internal to the company, and between Boeing and the U.S. Federal Aviation Administration (FAA). While the case is ongoing and, at this writing, the 737 MAX has yet to be recertified for flight, our analysis is based on the numerous government reports and detailed news accounts currently available. We conclude with a discussion of specific lessons for engineers and engineering educators regarding engineering ethics.

Overview of 737 MAX History and Crashes

In December 2010, Boeing’s primary competitor Airbus announced the A320neo family of jetliners, an update of its successful A320 narrow-body aircraft. The A320neo featured larger, more fuel-efficient engines. Boeing had been planning to introduce an entirely new aircraft to replace its successful, but dated, 737 line of jets; yet to remain competitive with Airbus, Boeing instead announced in August 2011 the 737 MAX family, an update of the 737NG with engine upgrades similar to the A320neo’s and other improvements (Gelles et al. 2019). The 737 MAX, which entered service in May 2017, became Boeing’s fastest-selling airliner of all time, with 5000 orders from over 100 airlines worldwide (Boeing n.d. a) (see Fig. 1 for a timeline of key 737 MAX events).

Fig. 1 737 MAX timeline showing key events from 2010 to 2019

The 737 MAX had been in operation for over a year when, on October 29, 2018, Lion Air flight JT610 crashed into the Java Sea 13 minutes after takeoff from Jakarta, Indonesia; all 189 passengers and crew on board died. Data from the flight data recorder recovered from the wreckage indicated that MCAS, the software specifically designed for the MAX, forced the nose of the aircraft down 26 times in 10 minutes (Gates 2018). In October 2019, the Final Report of Indonesia’s Lion Air accident investigation was issued. The Report placed some of the blame on the pilots and maintenance crews but concluded that Boeing and the FAA were primarily responsible for the crash (Republic of Indonesia 2019).

MCAS was not identified in the original documentation or training for 737 MAX pilots (Glanz et al. 2019). But after the Lion Air crash, Boeing (2018) issued a Flight Crew Operations Manual Bulletin on November 6, 2018, containing procedures for responding to flight control problems due to possible erroneous AOA inputs. The next day the FAA (2018a) issued an Emergency Airworthiness Directive on the same subject; however, the FAA did not ground the 737 MAX at that time. According to published reports, these notices were the first time that airline pilots learned of the existence of MCAS (e.g., Bushey 2019).

On March 10, 2019, about four months after the Lion Air crash, Ethiopian Airlines Flight ET302 crashed 6 minutes after takeoff in a field 39 miles from Addis Ababa Airport. The accident caused the deaths of all 157 passengers and crew. The Preliminary Report of the Ethiopian Airlines accident investigation (Federal Democratic Republic of Ethiopia 2019), issued in April 2019, indicated that the pilots followed the checklist from the Boeing Flight Crew Operations Manual Bulletin posted after the Lion Air crash but still could not control the plane (Ahmed et al. 2019). This was followed by an Interim Report (Federal Democratic Republic of Ethiopia 2020), issued in March 2020, that exonerated the pilots and airline and placed blame for the accident on design flaws in the MAX (Marks and Dahir 2020). Following the second crash, the 737 MAX was grounded worldwide, with the U.S., acting through the FAA, being the last country to do so on March 13, 2019 (Kaplan et al. 2019).

Design Choices that Led to the Crashes

As noted above, believing that it had to keep pace with its main competitor, Airbus, Boeing elected to modify the latest generation of the 737 family, the 737NG, rather than design an entirely new aircraft. Yet this raised a significant engineering challenge. Mounting larger, more fuel-efficient engines, similar to those employed on the A320neo, on the existing 737 airframe posed a serious design problem because the 737 family sits closer to the ground than the Airbus A320. To provide adequate ground clearance, the larger engines had to be mounted higher and farther forward on the wings than on previous models of the 737 (see Fig. 2). This significantly changed the aerodynamics of the aircraft and created the possibility of a nose-up stall under certain flight conditions (Travis 2019; Glanz et al. 2019).

Fig. 2 Boeing 737 MAX (left) compared to Boeing 737NG (right), showing the larger 737 MAX engines mounted higher and farther forward on the wing (image source: https://www.norebbo.com)

Boeing’s attempt to solve this problem involved incorporating MCAS as a software fix for the potential stall condition. The 737 was designed with two AOA sensors, one on each side of the aircraft, yet Boeing decided that MCAS on the 737 MAX would use input from only one of them. If that single AOA sensor indicated a dangerous nose-up condition, MCAS would send a signal to the horizontal stabilizer located in the tail; movement of the stabilizer would then force the plane’s tail up and the nose down (Travis 2019). In both the Lion Air and Ethiopian Airlines crashes, the AOA sensor malfunctioned, repeatedly activating MCAS (Gates 2018; Ahmed et al. 2019). Since the two crashes, Boeing has made adjustments to MCAS, including having the system rely on input from both AOA sensors instead of just one. But still more problems with MCAS have been uncovered. For example, an indicator light that would alert pilots if the jet’s two AOA sensors disagreed, thought by Boeing to be standard on all MAX aircraft, would only operate as part of an optional equipment package that neither airline involved in the crashes had purchased (Gelles and Kitroeff 2019a).
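To make the significance of the single-sensor decision concrete, consider the following minimal Python sketch. It is purely illustrative and hypothetical: the function names, thresholds, and units are invented and bear no relation to Boeing’s actual flight control code. It simply contrasts a trigger that trusts one AOA sensor with one that cross-checks both sensors and stands down when they disagree.

```python
AOA_TRIGGER_DEG = 15.0   # hypothetical nose-up threshold for activation
MAX_DISAGREE_DEG = 5.5   # hypothetical tolerance for sensor cross-checking

def mcas_trigger_single(aoa_left_deg: float) -> bool:
    """Single-sensor logic: one faulty reading is enough to command nose-down trim."""
    return aoa_left_deg > AOA_TRIGGER_DEG

def mcas_trigger_crosschecked(aoa_left_deg: float, aoa_right_deg: float) -> bool:
    """Cross-checked logic: stand down (and alert the crew) if the sensors disagree."""
    if abs(aoa_left_deg - aoa_right_deg) > MAX_DISAGREE_DEG:
        return False  # sensors disagree: treat the data as unreliable, do not trim
    return min(aoa_left_deg, aoa_right_deg) > AOA_TRIGGER_DEG

# A failed left sensor reading 74.5 degrees while the right sensor reads 4.0 degrees:
print(mcas_trigger_single(74.5))             # True  -> erroneous nose-down trim
print(mcas_trigger_crosschecked(74.5, 4.0))  # False -> no activation
```

The point of the sketch is simply that a safety-critical function keyed to a single sensor has no way to distinguish a genuine aerodynamic threat from a failed sensor, whereas a cross-checked design can at least fail safe.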

Similar to its responses to previous accidents, Boeing has been reluctant to admit to a design flaw in its aircraft, instead blaming pilot error (Hall and Goelz 2019 ). In the 737 MAX case, the company pointed to the pilots’ alleged inability to control the planes under stall conditions (Economy 2019 ). Following the Ethiopian Airlines crash, Boeing acknowledged for the first time that MCAS played a primary role in the crashes, while continuing to highlight that other factors, such as pilot error, were also involved (Hall and Goelz 2019 ). For example, on April 29, 2019, more than a month after the second crash, then Boeing CEO Dennis Muilenburg defended MCAS by stating:

We've confirmed that [the MCAS system] was designed per our standards, certified per our standards, and we're confident in that process. So, it operated according to those design and certification standards. So, we haven't seen a technical slip or gap in terms of the fundamental design and certification of the approach. (Economy 2019 )

The view that MCAS was not primarily at fault was supported in an article by noted journalist and pilot William Langewiesche (2019). While not denying that Boeing made serious mistakes, he placed ultimate blame on the two airlines’ use of inexperienced pilots. Langewiesche suggested that the accidents resulted from the cost-cutting practices of the airlines and the lax regulatory environments in which they operated. He argued that more experienced pilots, despite their lack of information on MCAS, should have been able to take corrective action to control the planes using customary stall prevention procedures. Langewiesche (2019) concluded:

What we had in the two downed airplanes was a textbook failure of airmanship. In broad daylight, these pilots couldn’t decipher a variant of a simple runaway trim, and they ended up flying too fast at low altitude, neglecting to throttle back and leading their passengers over an aerodynamic edge into oblivion. They were the deciding factor here — not the MCAS, not the Max.

Others have taken a more critical view of MCAS, Boeing, and the FAA. These critics prominently include Captain Chesley “Sully” Sullenberger, who famously landed a crippled A320 on the Hudson River after bird strikes had knocked out both of the plane’s engines. Sullenberger responded directly to Langewiesche in a letter to the editor:

… Langewiesche draws the conclusion that the pilots are primarily to blame for the fatal crashes of Lion Air 610 and Ethiopian 302. In resurrecting this age-old aviation canard, Langewiesche minimizes the fatal design flaws and certification failures that precipitated those tragedies, and still pose a threat to the flying public. I have long stated, as he does note, that pilots must be capable of absolute mastery of the aircraft and the situation at all times, a concept pilots call airmanship. Inadequate pilot training and insufficient pilot experience are problems worldwide, but they do not excuse the fatally flawed design of the Maneuvering Characteristics Augmentation System (MCAS) that was a death trap.... (Sullenberger 2019 )

Noting that he is one of the few pilots to have encountered both accident sequences in a 737 MAX simulator, Sullenberger continued:

These emergencies did not present as a classic runaway stabilizer problem, but initially as ambiguous unreliable airspeed and altitude situations, masking MCAS. The MCAS design should never have been approved, not by Boeing, and not by the Federal Aviation Administration (FAA)…. (Sullenberger 2019 )

In June 2019, Sullenberger noted in Congressional Testimony that “These crashes are demonstrable evidence that our current system of aircraft design and certification has failed us. These accidents should never have happened” (Benning and DiFurio 2019 ).

Others have agreed with Sullenberger’s assessment. Software developer and pilot Gregory Travis (2019) argues that Boeing’s design for the 737 MAX violated industry norms and that the company unwisely used software to compensate for inadequacies in the hardware design. Travis also contends that the existence of MCAS was not disclosed to pilots in order to preserve the fiction that the 737 MAX was just an update of earlier 737 models, which served as a way to circumvent the more stringent FAA certification requirements for a new airplane. Reports from government agencies seem to support this assessment, emphasizing the chaotic cockpit conditions created by MCAS and poor certification practices. The U.S. National Transportation Safety Board (NTSB) (2019) Safety Recommendations to the FAA, issued in September 2019, indicated that Boeing underestimated the effect an MCAS malfunction would have on the cockpit environment (Kitroeff 2019). The FAA Joint Authorities Technical Review (2019), which included international participation, issued its Final Report in October 2019; the Report faulted both Boeing and the FAA for the manner in which MCAS was certified (Koenig 2019).

Despite Boeing’s attempts to downplay the role of MCAS, the company began work on a fix for the system shortly after the Lion Air crash (Gates 2019). MCAS operation will now be based on inputs from both AOA sensors, instead of just one, with a cockpit indicator light illuminating when the sensors disagree. In addition, MCAS will activate only once per AOA warning rather than multiple times; in other words, the system will attempt to prevent a stall only once per event. MCAS’s authority will also be limited in terms of how much it can move the stabilizer, and manual override by the pilot will always be possible (Bellamy 2019; Boeing n.d. b; Gates 2019). For over a year after the Lion Air crash, Boeing held that pilot simulator training would not be required for the redesigned MCAS system. In January 2020, Boeing relented and recommended that pilot simulator training be required when the 737 MAX returns to service (Pasztor et al. 2020).
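As a rough illustration of the revised behavior just described, the sketch below encodes the three reported constraints: cross-checked sensors, at most one bounded activation per AOA event, and pilot override. Again, this is a hypothetical illustration; the class name, thresholds, and trim units are invented and are not Boeing’s.

```python
from dataclasses import dataclass

@dataclass
class RevisedMcasSketch:
    """Hypothetical one-shot, authority-limited activation logic (illustrative values only)."""
    trigger_deg: float = 15.0       # assumed nose-up threshold
    disagree_deg: float = 5.5       # assumed sensor cross-check tolerance
    max_trim_units: float = 2.5     # assumed cap on stabilizer movement per event
    fired_this_event: bool = False  # enforces at most one activation per AOA event

    def command_trim(self, aoa_left: float, aoa_right: float,
                     pilot_override: bool) -> float:
        if pilot_override:
            return 0.0              # manual override by the pilot always wins
        if abs(aoa_left - aoa_right) > self.disagree_deg:
            return 0.0              # sensors disagree: stand down and alert the crew
        if self.fired_this_event:
            return 0.0              # already responded once to this event
        if min(aoa_left, aoa_right) > self.trigger_deg:
            self.fired_this_event = True
            return self.max_trim_units  # a single, bounded nose-down command
        return 0.0

mcas = RevisedMcasSketch()
print(mcas.command_trim(20.0, 19.0, pilot_override=False))  # 2.5: one bounded command
print(mcas.command_trim(20.0, 19.0, pilot_override=False))  # 0.0: no repeated activation
```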

Boeing and the FAA

There is mounting evidence that Boeing, and the FAA as well, had warnings about the inadequacy of MCAS’s design and about the lack of communication to pilots about its existence and functioning. In 2015, for example, an unnamed Boeing engineer raised in an email the issue of relying on a single AOA sensor (Bellamy 2019). In 2016, Mark Forkner, Boeing’s Chief Technical Pilot, flagged the erratic behavior of MCAS in a flight simulator in an email to a colleague, noting: “It’s running rampant” (Gelles and Kitroeff 2019c). Forkner subsequently came under federal investigation regarding whether he misled the FAA about MCAS (Kitroeff and Schmidt 2020).

In December 2018, following the Lion Air crash, the FAA (2018b) conducted a Risk Assessment estimating that fifteen more 737 MAX crashes would occur over the expected 45-year fleet life if the flight control issues were not addressed; this Risk Assessment was not publicly disclosed until Congressional hearings a year later, in December 2019 (Arnold 2019). After the two crashes, a senior Boeing engineer, Curtis Ewbank, filed an internal ethics complaint in 2019 about management’s squelching of a system that might have uncovered errors in the AOA sensors. Ewbank has since publicly stated that “I was willing to stand up for safety and quality… Boeing management was more concerned with cost and schedule than safety or quality” (Kitroeff et al. 2019b).

Boeing’s apparent reluctance to heed such warnings may be attributed, at least in part, to the seeming transformation of the company’s culture over time from an engineering and safety orientation to a finance orientation, beginning with Boeing’s merger with McDonnell–Douglas in 1997 (Tkacik 2019; Useem 2019). Critical changes after the merger included replacing many in Boeing’s top management, historically engineers, with business executives from McDonnell–Douglas and moving the corporate headquarters to Chicago while leaving the engineering staff in Seattle (Useem 2019). According to Tkacik (2019), the new management even went so far as “maligning and marginalizing engineers as a class”.

Financial drivers thus began to place an inordinate amount of strain on Boeing employees, including engineers. During the development of the 737 MAX, production pressure to keep pace with the Airbus A320neo was ever-present. For example, Boeing management allegedly rejected any design changes that would prolong certification or require additional pilot training for the MAX (Gelles et al. 2019). As Adam Dickson, a former Boeing engineer, explained in a television documentary (BBC Panorama 2019): “There was a lot of interest and pressure on the certification and analysis engineers in particular, to look at any changes to the Max as minor changes”.

Production pressures were exacerbated by the “cozy relationship” between Boeing and the FAA (Kitroeff et al. 2019a ; see also Gelles and Kaplan 2019 ; Hall and Goelz 2019 ). Beginning in 2005, the FAA increased its reliance on manufacturers to certify their own planes. Self-certification became standard practice throughout the U.S. airline industry. By 2018, Boeing was certifying 96% of its own work (Kitroeff et al. 2019a ).

The serious drawbacks of self-certification became acutely apparent in this case. Of particular concern, the safety analysis for MCAS that the FAA delegated to Boeing was flawed in at least three respects: (1) the analysis underestimated the power of MCAS to move the plane’s horizontal tail and thus how difficult it would be for pilots to maintain control of the aircraft; (2) it did not account for the system deploying multiple times; and (3) it underestimated the risk level if MCAS failed, thus permitting a design feature—the single AOA sensor input to MCAS—that had no built-in redundancy (Gates 2019). Related to these concerns, the ability of MCAS to move the horizontal tail was increased without properly updating the safety analysis or notifying the FAA about the change (Gates 2019). In addition, the FAA did not require pilot training for MCAS or simulator training for the 737 MAX (Gelles and Kaplan 2019). Since the MAX grounding, the FAA has become more independent in its assessments and certifications—for example, it will not rely on Boeing personnel to approve individual new 737 MAX planes (Josephs 2019).
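The redundancy point in item (3) can be illustrated with a back-of-the-envelope calculation. The failure rates below are invented for illustration and assume independent sensor faults; they are not actual AOA sensor reliability figures. The sketch only shows why a design that cross-checks two sensors makes an erroneous activation far less likely than one keyed to a single sensor.

```python
# Hypothetical, illustrative numbers only; not actual AOA sensor reliability data.
p_sensor_fault = 1e-4  # assumed probability per flight that one AOA sensor reads erroneously high

# Single-sensor design: any one faulty reading can trigger an erroneous activation.
p_erroneous_single = p_sensor_fault

# Cross-checked design: an erroneous activation requires both sensors to fail high
# in agreement (assuming independent faults); a single fault instead disables the
# system and alerts the crew rather than commanding trim.
p_erroneous_dual = p_sensor_fault ** 2

print(f"single-sensor design: {p_erroneous_single:.0e} per flight")  # 1e-04
print(f"cross-checked design: {p_erroneous_dual:.0e} per flight")    # 1e-08
```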

The role of the FAA has also been subject to political scrutiny. The report of a study of the FAA certification process commissioned by Secretary of Transportation Elaine Chao (DOT 2020 ), released January 16, 2020, concluded that the FAA certification process was “appropriate and effective,” and that certification of the MAX as a new airplane would not have made a difference in the plane’s safety. At the same time, the report recommended a number of measures to strengthen the process and augment FAA’s staff (Pasztor and Cameron 2020 ). In contrast, a report of preliminary investigative findings by the Democratic staff of the House Committee on Transportation and Infrastructure (House TI 2020 ), issued in March 2020, characterized FAA’s certification of the MAX as “grossly insufficient” and criticized Boeing’s design flaws and lack of transparency with the FAA, airlines, and pilots (Duncan and Laris 2020 ).

Boeing has incurred significant economic losses from the crashes and subsequent grounding of the MAX. In December 2019, Boeing CEO Dennis Muilenburg was fired and the corporation announced that 737 MAX production would be suspended in January 2020 (Rich 2019 ) (see Fig.  1 ). Boeing is facing numerous lawsuits and possible criminal investigations. Boeing estimates that its economic losses for the 737 MAX will exceed $18 billion (Gelles 2020 ). In addition to the need to fix MCAS, other issues have arisen in recertification of the aircraft, including wiring for controls of the tail stabilizer, possible weaknesses in the engine rotors, and vulnerabilities in lightning protection for the engines (Kitroeff and Gelles 2020 ). The FAA had planned to flight test the 737 MAX early in 2020, and it was supposed to return to service in summer 2020 (Gelles and Kitroeff 2020 ). Given the global impact of the COVID-19 pandemic and other factors, it is difficult to predict when MAX flights might resume. In addition, uncertainty of passenger demand has resulted in some airlines delaying or cancelling orders for the MAX (Bogaisky 2020 ). Even after obtaining flight approval, public resistance to flying in the 737 MAX will probably be considerable (Gelles 2019 ).

Lessons for Engineering Ethics

The 737 MAX case is still unfolding and will continue to do so for some time. Yet important lessons can already be learned (or relearned) from the case. Some of those lessons are straightforward, and others are more subtle. A key and clear lesson is that engineers may need reminders about prioritizing the public good, and more specifically, the public’s safety. A more subtle lesson pertains to the ways in which the problem of many hands may or may not apply here. Other lessons involve the need for corporations, engineering societies, and engineering educators to rise to the challenge of nurturing and supporting ethical behavior on the part of engineers, especially in light of the difficulties revealed in this case.

All contemporary codes of ethics promulgated by major engineering societies state that an engineer’s paramount responsibility is to protect the “safety, health, and welfare” of the public. The American Institute of Aeronautics and Astronautics Code of Ethics indicates that engineers must “[H]old paramount the safety, health, and welfare of the public in the performance of their duties” (AIAA 2013 ). The Institute of Electrical and Electronics Engineers (IEEE) Code of Ethics goes further, pledging its members: “…to hold paramount the safety, health, and welfare of the public, to strive to comply with ethical design and sustainable development practices, and to disclose promptly factors that might endanger the public or the environment” (IEEE 2017 ). The IEEE Computer Society (CS) cooperated with the Association for Computing Machinery (ACM) in developing a Software Engineering Code of Ethics ( 1997 ) which holds that software engineers shall: “Approve software only if they have a well-founded belief that it is safe, meets specifications, passes appropriate tests, and does not diminish quality of life, diminish privacy or harm the environment….” According to Gotterbarn and Miller ( 2009 ), the latter code is a useful guide when examining cases involving software design and underscores the fact that during design, as in all engineering practice, the well-being of the public should be the overriding concern. While engineering codes of ethics are plentiful in number, they differ in their source of moral authority (i.e., organizational codes vs. professional codes), are often unenforceable through the law, and formally apply to different groups of engineers (e.g., based on discipline or organizational membership). However, the codes are generally recognized as a statement of the values inherent to engineering and its ethical commitments (Davis 2015 ).

An engineer’s ethical responsibility does not preclude consideration of factors such as cost and schedule (Pinkus et al. 1997 ). Engineers always have to grapple with constraints, including time and resource limitations. The engineers working at Boeing did have legitimate concerns about their company losing contracts to its competitor Airbus. But being an engineer means that public safety and welfare must be the highest priority (Davis 1991 ). The aforementioned software and other design errors in the development of the 737 MAX, which resulted in hundreds of deaths, would thus seem to be clear violations of engineering codes of ethics. In addition to pointing to engineering codes, Peterson ( 2019 ) argues that Boeing engineers and managers violated widely accepted ethical norms such as informed consent and the precautionary principle.

From an engineering perspective, the central ethical issue in the MAX case arguably revolves around the decision to use software (i.e., MCAS) to “mask” a questionable hardware design—the repositioning of the engines that disrupted the aerodynamics of the airframe (Travis 2019). As Johnston and Harris (2019) argue: “To meet the design goals and avoid an expensive hardware change, Boeing created the MCAS as a software Band-Aid.” Though reliance on software fixes of this kind is common, it places a safety burden on the software that it may not be able to bear, as illustrated by the case of the Therac-25 radiation therapy machine. In the Therac-25 case, hardware safety interlocks employed in earlier models of the machine were replaced by software safety controls. In addition, the user manual lacked information about how the software might malfunction; thus, when certain types of errors appeared on its interface, the machine’s operators did not know how to respond. Software flaws, among other factors, contributed to six patients receiving massive radiation overdoses, resulting in deaths and serious injuries (Leveson and Turner 1993). A more recent example involves problems with the embedded software guiding the electronic throttle in Toyota vehicles. In 2013, “…a jury found Toyota responsible for two unintended acceleration deaths, with expert witnesses citing bugs in the software and throttle fail safe defects” (Cummings and Britton 2020).

Boeing’s use of MCAS to mask the significant change in hardware configuration of the MAX was compounded by not providing redundancy for components prone to failure (i.e., the AOA sensors) (Campbell 2019 ), and by failing to notify pilots about the new software. In such cases, it is especially crucial that pilots receive clear documentation and relevant training so that they know how to manage the hand-off with an automated system properly (Johnston and Harris 2019 ). Part of the necessity for such training is related to trust calibration (Borenstein et al. 2020 ; Borenstein et al. 2018 ), a factor that has contributed to previous airplane accidents (e.g., Carr 2014 ). For example, if pilots do not place enough trust in an automated system, they may add risk by intervening in system operation. Conversely, if pilots trust an automated system too much, they may lack sufficient time to act once they identify a problem. This is further complicated in the MAX case because pilots were not fully aware, if at all, of MCAS’s existence and how the system functioned.

In addition to engineering decision-making that failed to prioritize public safety, questionable management decisions were also made at both Boeing and the FAA. As noted earlier, Boeing managerial leadership ignored numerous warning signs that the 737 MAX was not safe. Also, FAA’s shift to greater reliance on self-regulation by Boeing was ill-advised; that lesson appears to have been learned at the expense of hundreds of lives (Duncan and Aratani 2019 ).

The Problem of Many Hands Revisited

Actions, or inaction, by large, complex organizations, in this case corporate and government entities, suggest that the “problem of many hands” may be relevant to the 737 MAX case. At a high level of abstraction, the problem of many hands involves the idea that accountability is difficult to assign in the face of collective action, especially in a computerized society (Thompson 1980 ; Nissenbaum 1994 ). According to Nissenbaum ( 1996 , 29), “Where a mishap is the work of ‘many hands,’ it may not be obvious who is to blame because frequently its most salient and immediate causal antecedents do not converge with its locus of decision-making. The conditions for blame, therefore, are not satisfied in a way normally satisfied when a single individual is held blameworthy for a harm”.

However, there is an alternative understanding of the problem of many hands. In this version of the problem, the lack of accountability is not merely because multiple people and multiple decisions figure into a final outcome. Instead, in order to “qualify” as the problem of many hands, the component decisions should be benign, or at least far less harmful, if examined in isolation; only when the individual decisions are collectively combined do we see the most harmful result. In this understanding, the individual decision-makers should not have the same moral culpability as they would if they made all the decisions by themselves (Noorman 2020 ).

Both of these understandings of the problem of many hands could shed light on the 737 MAX case. Yet we focus on the first version of the problem. We admit the possibility that some of the isolated decisions about the 737 MAX may have been made in part because of ignorance of a broader picture. While we do not stake a claim on whether this is what actually happened in the MAX case, we acknowledge that it may be true in some circumstances. However, we think the more important point is that some of the 737 MAX decisions were so clearly misguided that a competent engineer should have seen the implications, even if the engineer was not aware of all of the broader context. The problem then is to identify responsibility for the questionable decisions in a way that discourages bad judgments in the future, a task made more challenging by the complexities of the decision-making. Legal proceedings about this case are likely to explore those complexities in detail and are outside the scope of this article. But such complexities must be examined carefully so as not to act as an insulator to accountability.

When many individuals are involved in the design of a computing device, for example, and a serious failure occurs, each person might try to absolve themselves of responsibility by indicating that “too many people” and “too many decisions” were involved for any individual person to know that the problem was going to happen. This is a common, and often dubious, excuse in the attempt to abdicate responsibility for a harm. While it can have different levels of magnitude and severity, the problem of many hands often arises in large scale ethical failures in engineering such as in the Deepwater Horizon oil spill (Thompson 2014 ).

Possible examples in the 737 MAX case of the difficulty of assigning moral responsibility due to the problem of many hands include:

1. The decision to reposition the engines;

2. The decision to mask the jet’s subsequent dynamic instability with MCAS;

3. The decision to rely on only one AOA sensor in designing MCAS; and

4. The decision not to inform or properly train pilots about the MCAS system.

While overall responsibility for each of these decisions may be difficult to allocate precisely, at least points 1–3 above arguably reflect fundamental errors in engineering judgement (Travis 2019 ). Boeing engineers and FAA engineers either participated in or were aware of these decisions (Kitroeff and Gelles 2019 ) and may have had opportunities to reconsider or redirect such decisions. As Davis has noted ( 2012 ), responsible engineering professionals make it their business to address problems even when they did not cause the problem, or, we would argue, solely cause it. As noted earlier, reports indicate that at least one Boeing engineer expressed reservations about the design of MCAS (Bellamy 2019 ). Since the two crashes, one Boeing engineer, Curtis Ewbank, filed an internal ethics complaint (Kitroeff et al. 2019b ) and several current and former Boeing engineers and other employees have gone public with various concerns about the 737 MAX (Pasztor 2019 ). And yet, as is often the case, the flawed design went forward with tragic results.

Enabling Ethical Engineers

The MAX case is eerily reminiscent of other well-known engineering ethics case studies such as the Ford Pinto (Birsch and Fielder 1994 ), Space Shuttle Challenger (Werhane 1991 ), and GM ignition switch (Jennings and Trautman 2016 ). In the Pinto case, Ford engineers were aware of the unsafe placement of the fuel tank well before the car was released to the public and signed off on the design even though crash tests showed the tank was vulnerable to rupture during low-speed rear-end collisions (Baura 2006 ). In the case of the GM ignition switch, engineers knew for at least four years about the faulty design, a flaw that resulted in at least a dozen fatal accidents (Stephan 2016 ). In the case of the well-documented Challenger accident, engineer Roger Boisjoly warned his supervisors at Morton Thiokol of potentially catastrophic flaws in the shuttle’s solid rocket boosters a full six months before the accident. He, along with other engineers, unsuccessfully argued on the eve of launch for a delay due to the effect that freezing temperatures could have on the boosters’ O-ring seals. Boisjoly was also one of a handful of engineers to describe these warnings to the Presidential commission investigating the accident (Boisjoly et al. 1989 ).

Returning to the 737 MAX case, could Ewbank or others with concerns about the safety of the airplane have done more than filing ethics complaints or offering public testimony only after the Lion Air and Ethiopian Airlines crashes? One might argue that requiring professional registration by all engineers in the U.S. would result in more ethical conduct (for example, by giving state licensing boards greater oversight authority). Yet the well-entrenched “industry exemption” from registration for most engineers working in large corporations has undermined such calls (Kline 2001 ).

Engineers with safety concerns could be empowered if Boeing and other corporations strengthened their internal ethics processes, including sincere and meaningful responsiveness to anonymous complaint channels. Schwartz (2013) outlines three core components of an ethical corporate culture: strong core ethical values, a formal ethics program (including an ethics hotline), and capable ethical leadership. Schwartz points to Siemens’ creation of an ethics and compliance department following a bribery scandal as an example of a good solution. Boeing has had a compliance department for quite some time (Schnebel and Bienert 2004) and has made efforts in the past to evaluate its effectiveness (Boeing 2003). Yet it is clear that more robust measures are needed in response to ethics concerns and complaints. Since the MAX crashes, Boeing’s Board has implemented a number of changes, including establishing a corporate safety group and revising internal reporting procedures so that lead engineers primarily report to the chief engineer rather than to business managers (Gelles and Kitroeff 2019b; Boeing n.d. c). Whether these measures will be enough to restore Boeing’s former engineering-centered focus remains to be seen.

Professional engineering societies could play a stronger role in communicating and enforcing codes of ethics, in supporting ethical behavior of engineers, and by providing more educational opportunities for learning about ethics and about the ethical responsibilities of engineers. Some societies, including ACM and IEEE, have become increasingly engaged in ethics-related activities. Initially ethics engagement by the societies consisted primarily of a focus on macroethical issues such as sustainable development (Herkert 2004 ). Recently, however, the societies have also turned to a greater focus on microethical issues (the behavior of individuals). The 2017 revision to the IEEE Code of Ethics, for example, highlights the importance of “ethical design” (Adamson and Herkert 2020 ). This parallels IEEE activities in the area of design of autonomous and intelligent systems (e.g., IEEE 2018 ). A promising outcome of this emphasis is a move toward implementing “ethical design” frameworks (Peters et al. 2020 ).

In terms of engineering education, educators need to place a greater emphasis on fostering moral courage, that is, the courage to act on one’s moral convictions, including adherence to codes of ethics. This is of particular significance in large organizations such as Boeing and the FAA, where the agency of engineers may be limited by factors such as organizational culture (Watts and Buckley 2017). In a study of twenty-six ethics interventions in engineering programs, Hess and Fore (2018) found that only twenty-seven percent had a learning goal of developing “ethical courage, confidence or commitment”. This goal could be operationalized in a number of ways, for example through a focus on virtue ethics (Harris 2008) or professional identity (Hashemian and Loui 2010). This need should not only be addressed within the engineering curriculum but also during lifelong learning initiatives and other professional development opportunities (Miller 2019).

The circumstances surrounding the 737 MAX airplane could certainly serve as an informative case study for ethics or technical courses. The case can shed light on important lessons for engineers, including the complex interactions, and sometimes tensions, between engineering and managerial considerations. The case also tangibly displays that what seem to be relatively small-scale, and likely well-intended, decisions by individual engineers can combine to result in large-scale tragedy. No individual person wanted to do harm, but it happened nonetheless. Thus, the case can serve as a reminder to current and future generations of engineers that public safety must be the first and foremost priority. A particularly useful pedagogical method for considering this case is to assign students to the roles of engineers, managers, and regulators, as well as the flying public, airline personnel, and representatives of engineering societies (Herkert 1997). In addition to illuminating the perspectives and responsibilities of each stakeholder group, role-playing can also shed light on the “macroethical” issues raised by the case (Martin et al. 2019), such as airline safety standards and the proper role of engineers and engineering societies in the regulation of the industry.

Conclusions and Recommendations

The case of the Boeing 737 MAX provides valuable lessons for engineers and engineering educators concerning the ethical responsibilities of the profession. Safety is not cheap, but careless engineering design in the name of minimizing costs and adhering to a delivery schedule is a symptom of ethical blight. Using almost any standard ethical analysis or framework, Boeing’s actions regarding the safety of the 737 MAX, particularly decisions regarding MCAS, fall short.

Boeing failed in its obligations to protect the public. At a minimum, the company had an obligation to inform airlines and pilots of significant design changes, especially the role of MCAS in compensating for repositioning of engines in the MAX from prior versions of the 737. Clearly, it was a “significant” change because it had a direct, and unfortunately tragic, impact on the public’s safety. The Boeing and FAA interaction underscores the fact that conflicts of interest are a serious concern in regulatory actions within the airline industry.

Internal and external organizational factors may have interfered with Boeing and FAA engineers’ fulfillment of their professional ethical responsibilities; this is an all too common problem that merits serious attention from industry leaders, regulators, professional societies, and educators. The lessons to be learned in this case are not new. After large scale tragedies involving engineering decision-making, calls for change often emerge. But such lessons apparently must be retaught and relearned by each generation of engineers.

References

ACM/IEEE-CS Joint Task Force. (1997). Software Engineering Code of Ethics and Professional Practice, https://ethics.acm.org/code-of-ethics/software-engineering-code/.

Adamson, G., & Herkert, J. (2020). Addressing intelligent systems and ethical design in the IEEE Code of Ethics. In Codes of ethics and ethical guidelines: Emerging technologies, changing fields . New York: Springer ( in press ).

Ahmed, H., Glanz, J., & Beech, H. (2019). Ethiopian airlines pilots followed Boeing’s safety procedures before crash, Report Shows. The New York Times, April 4, https://www.nytimes.com/2019/04/04/world/asia/ethiopia-crash-boeing.html .

AIAA. (2013). Code of Ethics, https://www.aiaa.org/about/Governance/Code-of-Ethics .

Arnold, K. (2019). FAA report predicted there could be 15 more 737 MAX crashes. The Dallas Morning News, December 11, https://www.dallasnews.com/business/airlines/2019/12/11/faa-chief-says-boeings-737-max-wont-be-approved-in-2019/

Baura, G. (2006). Engineering ethics: an industrial perspective . Amsterdam: Elsevier.


BBC News. (2019). Work on production line of Boeing 737 MAX ‘Not Adequately Funded’. July 29, https://www.bbc.com/news/business-49142761 .

Bellamy, W. (2019). Boeing CEO outlines 737 MAX MCAS software fix in congressional hearings. Aviation Today, November 2, https://www.aviationtoday.com/2019/11/02/boeing-ceo-outlines-mcas-updates-congressional-hearings/ .

Benning, T., & DiFurio, D. (2019). American Airlines Pilots Union boss prods lawmakers to solve 'Crisis of Trust' over Boeing 737 MAX. The Dallas Morning News, June 19, https://www.dallasnews.com/business/airlines/2019/06/19/american-airlines-pilots-union-boss-prods-lawmakers-to-solve-crisis-of-trust-over-boeing-737-max/ .

Birsch, D., & Fielder, J. (Eds.). (1994). The ford pinto case: A study in applied ethics, business, and technology . New York: The State University of New York Press.

Boeing. (2003). Boeing Releases Independent Reviews of Company Ethics Program. December 18, https://boeing.mediaroom.com/2003-12-18-Boeing-Releases-Independent-Reviews-of-Company-Ethics-Program .

Boeing. (2018). Flight crew operations manual bulletin for the Boeing company. November 6, https://www.avioesemusicas.com/wp-content/uploads/2018/10/TBC-19-Uncommanded-Nose-Down-Stab-Trim-Due-to-AOA.pdf .

Boeing. (n.d. a). About the Boeing 737 MAX. https://www.boeing.com/commercial/737max/ .

Boeing. (n.d. b). 737 MAX Updates. https://www.boeing.com/737-max-updates/ .

Boeing. (n.d. c). Initial actions: sharpening our focus on safety. https://www.boeing.com/737-max-updates/resources/ .

Bogaisky, J. (2020). Boeing stock plunges as coronavirus imperils quick ramp up in 737 MAX deliveries. Forbes, March 11, https://www.forbes.com/sites/jeremybogaisky/2020/03/11/boeing-coronavirus-737-max/#1b9eb8955b5a .

Boisjoly, R. P., Curtis, E. F., & Mellican, E. (1989). Roger Boisjoly and the challenger disaster: The ethical dimensions. J Bus Ethics, 8 (4), 217–230.


Borenstein, J., Mahajan, H. P., Wagner, A. R., & Howard, A. (2020). Trust and pediatric exoskeletons: A comparative study of clinician and parental perspectives. IEEE Transactions on Technology and Society , 1 (2), 83–88.

Borenstein, J., Wagner, A. R., & Howard, A. (2018). Overtrust of pediatric health-care robots: A preliminary survey of parent perspectives. IEEE Robot Autom Mag, 25 (1), 46–54.

Bushey, C. (2019). The Tough Crowd Boeing Needs to Convince. Crain’s Chicago Business, October 25, https://www.chicagobusiness.com/manufacturing/tough-crowd-boeing-needs-convince .

Campbell, D. (2019). The many human errors that brought down the Boeing 737 MAX. The Verge, May 2, https://www.theverge.com/2019/5/2/18518176/boeing-737-max-crash-problems-human-error-mcas-faa .

Carr, N. (2014). The glass cage: Automation and us . Norton.

Cummings, M. L., & Britton, D. (2020). Regulating safety-critical autonomous systems: past, present, and future perspectives. In Living with robots (pp. 119–140). Academic Press, New York.

Davis, M. (1991). Thinking like an engineer: The place of a code of ethics in the practice of a profession. Philos Publ Affairs, 20 (2), 150–167.

Davis, M. (2012). “Ain’t no one here but us social forces”: Constructing the professional responsibility of engineers. Sci Eng Ethics, 18 (1), 13–34.

Davis, M. (2015). Engineering as profession: Some methodological problems in its study. In Engineering identities, epistemologies and values (pp. 65–79). Springer, New York.

Department of Transportation (DOT). (2020). Official report of the special committee to review the Federal Aviation Administration’s Aircraft Certification Process, January 16. https://www.transportation.gov/sites/dot.gov/files/2020-01/scc-final-report.pdf .

Duncan, I., & Aratani, L. (2019). FAA flexes its authority in final stages of Boeing 737 MAX safety review. The Washington Post, November 27, https://www.washingtonpost.com/transportation/2019/11/27/faa-flexes-its-authority-final-stages-boeing-max-safety-review/ .

Duncan, I., & Laris, M. (2020). House report on 737 Max crashes faults Boeing’s ‘culture of concealment’ and labels FAA ‘grossly insufficient’. The Washington Post, March 6, https://www.washingtonpost.com/local/trafficandcommuting/house-report-on-737-max-crashes-faults-boeings-culture-of-concealment-and-labels-faa-grossly-insufficient/2020/03/06/9e336b9e-5fce-11ea-b014-4fafa866bb81_story.html .

Economy, P. (2019). Boeing CEO Puts Partial Blame on Pilots of Crashed 737 MAX Aircraft for Not 'Completely' Following Procedures. Inc., April 30, https://www.inc.com/peter-economy/boeing-ceo-puts-partial-blame-on-pilots-of-crashed-737-max-aircraft-for-not-completely-following-procedures.html .

Federal Aviation Administration (FAA). (2018a). Airworthiness directives; the Boeing company airplanes. FR Doc No: R1-2018-26365. https://rgl.faa.gov/Regulatory_and_Guidance_Library/rgad.nsf/0/fe8237743be9b8968625835b004fc051/$FILE/2018-23-51_Correction.pdf .

Federal Aviation Administration (FAA). (2018b). Quantitative Risk Assessment. https://www.documentcloud.org/documents/6573544-Risk-Assessment-for-Release-1.html#document/p1 .

Federal Aviation Administration (FAA). (2019). Joint authorities technical review: observations, findings, and recommendations. October 11, https://www.faa.gov/news/media/attachments/Final_JATR_Submittal_to_FAA_Oct_2019.pdf .

Federal Democratic Republic of Ethiopia. (2019). Aircraft accident investigation preliminary report. Report No. AI-01/19, April 4, https://leehamnews.com/wp-content/uploads/2019/04/Preliminary-Report-B737-800MAX-ET-AVJ.pdf .

Federal Democratic Republic of Ethiopia. (2020). Aircraft Accident Investigation Interim Report. Report No. AI-01/19, March 20, https://www.aib.gov.et/wp-content/uploads/2020/documents/accident/ET-302%2520%2520Interim%2520Investigation%2520%2520Report%2520March%25209%25202020.pdf .

Gates, D. (2018). Pilots struggled against Boeing's 737 MAX control system on doomed Lion Air flight. The Seattle Times, November 27, https://www.seattletimes.com/business/boeing-aerospace/black-box-data-reveals-lion-air-pilots-struggle-against-boeings-737-max-flight-control-system/ .

Gates, D. (2019). Flawed analysis, failed oversight: how Boeing, FAA Certified the Suspect 737 MAX Flight Control System. The Seattle Times, March 17, https://www.seattletimes.com/business/boeing-aerospace/failed-certification-faa-missed-safety-issues-in-the-737-max-system-implicated-in-the-lion-air-crash/ .

Gelles, D. (2019). Boeing can’t fly its 737 MAX, but it’s ready to sell its safety. The New York Times, December 24 (updated February 10, 2020), https://www.nytimes.com/2019/12/24/business/boeing-737-max-survey.html .

Gelles, D. (2020). Boeing expects 737 MAX costs will surpass $18 Billion. The New York Times, January 29, https://www.nytimes.com/2020/01/29/business/boeing-737-max-costs.html .

Gelles, D., & Kaplan, T. (2019). F.A.A. Approval of Boeing jet involved in two crashes comes under scrutiny. The New York Times, March 19, https://www.nytimes.com/2019/03/19/business/boeing-elaine-chao.html .

Gelles, D., & Kitroeff, N. (2019a). Boeing Believed a 737 MAX warning light was standard. It wasn’t. New York: The New York Times. https://www.nytimes.com/2019/05/05/business/boeing-737-max-warning-light.html .

Gelles, D., & Kitroeff, N. (2019b). Boeing board to call for safety changes after 737 MAX Crashes. The New York Times, September 15, (updated October 2), https://www.nytimes.com/2019/09/15/business/boeing-safety-737-max.html .

Gelles, D., & Kitroeff, N. (2019c). Boeing pilot complained of ‘Egregious’ issue with 737 MAX in 2016. The New York Times, October 18, https://www.nytimes.com/2019/10/18/business/boeing-flight-simulator-text-message.html .

Gelles, D., & Kitroeff, N. (2020). What needs to happen to get Boeing’s 737 MAX flying again?. The New York Times, February 10, https://www.nytimes.com/2020/02/10/business/boeing-737-max-fly-again.html .

Gelles, D., Kitroeff, N., Nicas, J., & Ruiz, R. R. (2019). Boeing was ‘Go, Go, Go’ to beat airbus with the 737 MAX. The New York Times, March 23, https://www.nytimes.com/2019/03/23/business/boeing-737-max-crash.html .

Glanz, J., Creswell, J., Kaplan, T., & Wichter, Z. (2019). After a Lion Air 737 MAX Crashed in October, Questions About the Plane Arose. The New York Times, February 3, https://www.nytimes.com/2019/02/03/world/asia/lion-air-plane-crash-pilots.html .

Gotterbarn, D., & Miller, K. W. (2009). The public is the priority: Making decisions using the software engineering code of ethics. Computer, 42 (6), 66–73.

Hall, J., & Goelz, P. (2019). The Boeing 737 MAX Crisis Is a Leadership Failure, The New York Times, July 17, https://www.nytimes.com/2019/07/17/opinion/boeing-737-max.html .

Harris, C. E. (2008). The good engineer: Giving virtue its due in engineering ethics. Science and Engineering Ethics, 14 (2), 153–164.

Hashemian, G., & Loui, M. C. (2010). Can instruction in engineering ethics change students’ feelings about professional responsibility? Science and Engineering Ethics, 16 (1), 201–215.

Herkert, J. R. (1997). Collaborative learning in engineering ethics. Science and Engineering Ethics, 3 (4), 447–462.

Herkert, J. R. (2004). Microethics, macroethics, and professional engineering societies. In Emerging technologies and ethical issues in engineering: papers from a workshop (pp. 107–114). National Academies Press, New York.

Hess, J. L., & Fore, G. (2018). A systematic literature review of US engineering ethics interventions. Science and Engineering Ethics, 24 (2), 551–583.

House Committee on Transportation and Infrastructure (House TI). (2020). The Boeing 737 MAX Aircraft: Costs, Consequences, and Lessons from its Design, Development, and Certification-Preliminary Investigative Findings, March. https://transportation.house.gov/imo/media/doc/TI%2520Preliminary%2520Investigative%2520Findings%2520Boeing%2520737%2520MAX%2520March%25202020.pdf .

IEEE. (2017). IEEE Code of Ethics. https://www.ieee.org/about/corporate/governance/p7-8.html .

IEEE. (2018). Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Autonomous and Intelligent Systems (version 2). https://standards.ieee.org/content/dam/ieee-standards/standards/web/documents/other/ead_v2.pdf .

Jennings, M., & Trautman, L. J. (2016). Ethical culture and legal liability: The GM switch crisis and lessons in governance. Boston University Journal of Science and Technology Law, 22 , 187.

Johnston, P., & Harris, R. (2019). The Boeing 737 MAX Saga: Lessons for software organizations. Software Quality Professional, 21 (3), 4–12.

Josephs, L. (2019). FAA tightens grip on Boeing with plan to individually review each new 737 MAX Jetliner. CNBC, November 27, https://www.cnbc.com/2019/11/27/faa-tightens-grip-on-boeing-with-plan-to-individually-inspect-max-jets.html .

Kaplan, T., Austen, I., & Gebrekidan, S. (2019). The New York Times, March 13. https://www.nytimes.com/2019/03/13/business/canada-737-max.html .

Kitroeff, N. (2019). Boeing underestimated cockpit chaos on 737 MAX, N.T.S.B. Says. The New York Times, September 26, https://www.nytimes.com/2019/09/26/business/boeing-737-max-ntsb-mcas.html .

Kitroeff, N., & Gelles, D. (2019). Legislators call on F.A.A. to say why it overruled its experts on 737 MAX. The New York Times, November 7 (updated December 11), https://www.nytimes.com/2019/11/07/business/boeing-737-max-faa.html .

Kitroeff, N., & Gelles, D. (2020). It’s not just software: New safety risks under scrutiny on Boeing’s 737 MAX. The New York Times, January 5, https://www.nytimes.com/2020/01/05/business/boeing-737-max.html .

Kitroeff, N., & Schmidt, M. S. (2020). Federal prosecutors investigating whether Boeing pilot lied to F.A.A. The New York Times, February 21, https://www.nytimes.com/2020/02/21/business/boeing-737-max-investigation.html .

Kitroeff, N., Gelles, D., & Nicas, J. (2019a). The roots of Boeing’s 737 MAX Crisis: A regulator relaxes its oversight. The New York Times, July 27, https://www.nytimes.com/2019/07/27/business/boeing-737-max-faa.html .

Kitroeff, N., Gelles, D., & Nicas, J. (2019b). Boeing 737 MAX safety system was vetoed, Engineer Says. The New York Times, October 2, https://www.nytimes.com/2019/10/02/business/boeing-737-max-crashes.html .

Kline, R. R. (2001). Using history and sociology to teach engineering ethics. IEEE Technology and Society Magazine, 20 (4), 13–20.

Koenig, D. (2019). Boeing, FAA both faulted in certification of the 737 MAX. AP, October 11, https://apnews.com/470abf326cdb4229bdc18c8ad8caa78a .

Langewiesche, W. (2019). What really brought down the Boeing 737 MAX? The New York Times, September 18, https://www.nytimes.com/2019/09/18/magazine/boeing-737-max-crashes.html .

Leveson, N. G., & Turner, C. S. (1993). An investigation of the Therac-25 accidents. Computer, 26 (7), 18–41.

Marks, S., & Dahir, A. L. (2020). Ethiopian report on 737 Max Crash Blames Boeing, March 9, https://www.nytimes.com/2020/03/09/world/africa/ethiopia-crash-boeing.html .

Martin, D. A., Conlon, E., & Bowe, B. (2019). The role of role-play in student awareness of the social dimension of the engineering profession. European Journal of Engineering Education, 44 (6), 882–905.

Miller, G. (2019). Toward lifelong excellence: navigating the engineering-business space. In The Engineering-Business Nexus (pp. 81–101). Springer, Cham.

National Transportation Safety Board (NTSB). (2019). Safety Recommendations Report, September 19, https://www.ntsb.gov/investigations/AccidentReports/Reports/ASR1901.pdf .

Nissenbaum, H. (1994). Computing and accountability. Communications of the ACM , January, https://dl.acm.org/doi/10.1145/175222.175228 .

Nissenbaum, H. (1996). Accountability in a computerized society. Science and Engineering Ethics, 2 (1), 25–42.

Noorman, M. (2020). Computing and moral responsibility. In Zalta, E. N. (Ed.). The Stanford Encyclopedia of Philosophy (Spring), https://plato.stanford.edu/archives/spr2020/entries/computing-responsibility .

Pasztor, A. (2019). More Whistleblower complaints emerge in Boeing 737 MAX Safety Inquiries. The Wall Street Journal, April 27, https://www.wsj.com/articles/more-whistleblower-complaints-emerge-in-boeing-737-max-safety-inquiries-11556418721 .

Pasztor, A., & Cameron, D. (2020). U.S. News: Panel Backs How FAA gave safety approval for 737 MAX. The Wall Street Journal, January 17, https://www.wsj.com/articles/panel-clears-737-maxs-safety-approval-process-at-faa-11579188086 .

Pasztor, A., Cameron.D., & Sider, A. (2020). Boeing backs MAX simulator training in reversal of stance. The Wall Street Journal, January 7, https://www.wsj.com/articles/boeing-recommends-fresh-max-simulator-training-11578423221 .

Peters, D., Vold, K., Robinson, D., & Calvo, R. A. (2020). Responsible AI—two frameworks for ethical design practice. IEEE Transactions on Technology and Society, 1 (1), 34–47.

Peterson, M. (2019). The ethical failures behind the Boeing disasters. Blog of the APA, April 8, https://blog.apaonline.org/2019/04/08/the-ethical-failures-behind-the-boeing-disasters/ .

Pinkus, R. L. B., Shuman, L. J., Hummon, N. P., & Wolfe, H. (1997). Engineering ethics: Balancing cost, schedule, and risk-lessons learned from the space shuttle . Cambridge: Cambridge University Press.

Republic of Indonesia. (2019). Final Aircraft Accident Investigation Report. KNKT.18.10.35.04, https://knkt.dephub.go.id/knkt/ntsc_aviation/baru/2018%2520-%2520035%2520-%2520PK-LQP%2520Final%2520Report.pdf .

Rich, G. (2019). Boeing 737 MAX should return in 2020 but the crisis won't be over. Investor's Business Daily, December 31, https://www.investors.com/news/boeing-737-max-service-return-2020-crisis-not-over/ .

Schnebel, E., & Bienert, M. A. (2004). Implementing ethics in business organizations. Journal of Business Ethics, 53 (1–2), 203–211.

Schwartz, M. S. (2013). Developing and sustaining an ethical corporate culture: The core elements. Business Horizons, 56 (1), 39–50.

Stephan, K. (2016). GM Ignition Switch Recall: Too Little Too Late? [Ethical Dilemmas]. IEEE Technology and Society Magazine, 35 (2), 34–35.

Sullenberger, S. (2019). My letter to the editor of New York Times Magazine, https://www.sullysullenberger.com/my-letter-to-the-editor-of-new-york-times-magazine/ .

Thompson, D. F. (1980). Moral responsibility of public officials: The problem of many hands. American Political Science Review, 74 (4), 905–916.

Thompson, D. F. (2014). Responsibility for failures of government: The problem of many hands. The American Review of Public Administration, 44 (3), 259–273.

Tkacik, M. (2019). Crash course: how Boeing’s managerial revolution created the 737 MAX Disaster. The New Republic, September 18, https://newrepublic.com/article/154944/boeing-737-max-investigation-indonesia-lion-air-ethiopian-airlines-managerial-revolution .

Travis, G. (2019). How the Boeing 737 MAX disaster looks to a software developer. IEEE Spectrum , April 18, https://spectrum.ieee.org/aerospace/aviation/how-the-boeing-737-max-disaster-looks-to-a-software-developer .

Useem, J. (2019). The long-forgotten flight that sent Boeing off course. The Atlantic, November 20, https://www.theatlantic.com/ideas/archive/2019/11/how-boeing-lost-its-bearings/602188/ .

Watts, L. L., & Buckley, M. R. (2017). A dual-processing model of moral whistleblowing in organizations. Journal of Business Ethics, 146 (3), 669–683.

Werhane, P. H. (1991). Engineers and management: The challenge of the Challenger incident. Journal of Business Ethics, 10 (8), 605–616.



Author information

Authors and Affiliations

North Carolina State University, Raleigh, NC, USA

Joseph Herkert

Georgia Institute of Technology, Atlanta, GA, USA

Jason Borenstein

University of Missouri – St. Louis, St. Louis, MO, USA

Keith Miller


Corresponding author

Correspondence to Joseph Herkert.



About this article

Herkert, J., Borenstein, J., & Miller, K. The Boeing 737 MAX: Lessons for Engineering Ethics. Sci Eng Ethics 26, 2957–2974 (2020). https://doi.org/10.1007/s11948-020-00252-y


Received: 26 March 2020

Accepted: 25 June 2020

Published: 10 July 2020

Issue Date: December 2020



Keywords

  • Engineering ethics
  • Airline safety
  • Engineering design
  • Corporate culture
  • Software engineering


New Ethics Case Studies Published


NSPE’s Board of Ethical Review has published six new case studies that provide engineering ethics guidance using fact-based scenarios. The cases cover the topics of plan stamping; gifts; the public health, safety, and welfare; conflicts of interest; responsible charge; and job qualifications. NSPE established the Board of Ethical Review in June 1954 in response to many requests by engineers, state societies, and chapters for interpretations of the Code of Ethics in specific circumstances. Since the publication of the first case in 1958, which involved questionable actions on a World Bank-financed hydroelectric project, the case catalog has grown to nearly 650.

Today, there are many real-world examples in which engineering ethics has a direct impact on the public, especially those related to technology advancement. For example, NSPE encourages policymakers to protect the public health, safety, and welfare when developing artificial intelligence and autonomous vehicles. In comments to the National Institute of Standards and Technology in August, NSPE called for the involvement of ethically accountable licensed professional engineers or duly certified individuals in the AI development process. The Society has also called on NIST to create AI technical standards that include an ethical framework that can be applied universally in the development of AI decision-making.

Each of the BER’s just-released cases dives into subjects that practicing professional engineers and engineer interns can face on the job. In Case 20-4, a PE for a metropolitan water commission and a consulting engineer retained by the commission are faced with ethical dilemmas surrounding the commission’s consideration of a change in its water supply source—a change with public health, safety, and welfare implications. In another case (20-1), an engineer intern applies for a position at a consulting firm. The job requires the candidate to hold a PE license or to become licensed within 90 days. The firm offers the job to the engineer intern, but complications arise when the EI fails the PE exam and is found to have withheld information from the firm.


The Boeing 737 MAX: Lessons for Engineering Ethics


Overview of 737 MAX History and Crashes

In December 2010, Boeing’s primary competitor Airbus announced the A320neo family of jetliners, an update of their successful A320 narrow-body aircraft. The A320neo featured larger, more fuel-efficient engines. Boeing had been planning to introduce a totally new aircraft to replace its successful, but dated, 737 line of jets; yet to remain competitive with Airbus, Boeing instead announced in August 2011 the 737 MAX family, an update of the 737NG with similar engine upgrades to the A320neo and other improvements (Gelles et al. 2019 ). The 737 MAX, which entered service in May 2017, became Boeing’s fastest-selling airliner of all time with 5000 orders from over 100 airlines worldwide (Boeing n.d. a) (See Fig.  1 for timeline of 737 MAX key events).

Fig. 1: 737 MAX timeline showing key events from 2010 to 2019

The 737 MAX had been in operation for over a year when on October 29, 2018, Lion Air flight JT610 crashed into the Java Sea 13 minutes after takeoff from Jakarta, Indonesia; all 189 passengers and crew on board died. Data from the flight data recorder recovered from the wreckage indicated that MCAS, the software specifically designed for the MAX, forced the nose of the aircraft down 26 times in 10 minutes (Gates 2018). In October 2019, the Final Report of Indonesia’s Lion Air Accident Investigation was issued. The Report placed some of the blame on the pilots and maintenance crews but concluded that Boeing and the FAA were primarily responsible for the crash (Republic of Indonesia 2019).

MCAS was not identified in the original documentation/training for 737 MAX pilots (Glanz et al. 2019 ). But after the Lion Air crash, Boeing ( 2018 ) issued a Flight Crew Operations Manual Bulletin on November 6, 2018 containing procedures for responding to flight control problems due to possible erroneous AOA inputs. The next day the FAA ( 2018a ) issued an Emergency Airworthiness Directive on the same subject; however, the FAA did not ground the 737 MAX at that time. According to published reports, these notices were the first time that airline pilots learned of the existence of MCAS (e.g., Bushey 2019 ).

On March 10, 2019, a little over four months after the Lion Air crash, Ethiopian Airlines Flight ET302 crashed 6 minutes after takeoff in a field 39 miles from Addis Ababa Airport. The accident caused the deaths of all 157 passengers and crew. The Preliminary Report of the Ethiopian Airlines Accident Investigation (Federal Democratic Republic of Ethiopia 2019), issued in April 2019, indicated that the pilots followed the checklist from the Boeing Flight Crew Operations Manual Bulletin posted after the Lion Air crash but could not control the plane (Ahmed et al. 2019). This was followed by an Interim Report (Federal Democratic Republic of Ethiopia 2020), issued in March 2020, that exonerated the pilots and airline and placed blame for the accident on design flaws in the MAX (Marks and Dahir 2020). Following the second crash, the 737 MAX was grounded worldwide, with the U.S., through the FAA, being the last country to act on March 13, 2019 (Kaplan et al. 2019).

Design Choices that Led to the Crashes

As noted above, believing it had to keep pace with its main competitor, Airbus, Boeing elected to modify the latest generation of the 737 family, the 737NG, rather than design an entirely new aircraft. Yet this raised a significant engineering challenge for Boeing. Mounting larger, more fuel-efficient engines, similar to those employed on the A320neo, on the existing 737 airframe posed a serious design problem, because the 737 family was built closer to the ground than the Airbus A320. In order to provide appropriate ground clearance, the larger engines had to be mounted higher and farther forward on the wings than on previous models of the 737 (see Fig. 2). This significantly changed the aerodynamics of the aircraft and created the possibility of a nose-up stall under certain flight conditions (Travis 2019; Glanz et al. 2019).

Fig. 2: Boeing 737 MAX (left) compared to Boeing 737NG (right), showing the larger 737 MAX engines mounted higher and farther forward on the wing (image source: https://www.norebbo.com ).

Boeing’s attempt to solve this problem involved incorporating MCAS as a software fix for the potential stall condition. The 737 was designed with two AOA sensors, one on each side of the aircraft, yet Boeing decided that the 737 MAX would use input from only one of them. If that single AOA sensor indicated a dangerously high angle of attack, MCAS would activate and send a signal to the horizontal stabilizer located in the tail. Movement of the stabilizer would then force the plane’s tail up and the nose down (Travis 2019). In both the Lion Air and Ethiopian Air crashes, the AOA sensor malfunctioned, repeatedly activating MCAS (Gates 2018; Ahmed et al. 2019). Since the two crashes, Boeing has made adjustments to MCAS, including having the system rely on input from both AOA sensors instead of just one. But still more problems with MCAS have been uncovered. For example, an indicator light that would alert pilots if the jet’s two AOA sensors disagreed, thought by Boeing to be standard on all MAX aircraft, would only operate as part of an optional equipment package that neither airline involved in the crashes purchased (Gelles and Kitroeff 2019a).
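To make the single-point-of-failure issue concrete, the sketch below contrasts a stall-protection trigger that acts on one AOA reading with one that cross-checks both sensors. It is a simplified illustration for discussion only, not Boeing’s flight-control code; the threshold and disagreement values are hypothetical.

```python
# Illustrative sketch only -- not Boeing's MCAS implementation.
# AOA_TRIGGER_DEG and MAX_DISAGREE_DEG are hypothetical values chosen to show the logic.

AOA_TRIGGER_DEG = 14.0    # assumed angle-of-attack activation threshold (degrees)
MAX_DISAGREE_DEG = 5.5    # assumed allowable disagreement between the two sensors (degrees)

def single_sensor_command(aoa_left: float) -> bool:
    """Single-channel logic: one sensor alone decides whether to trim nose-down.
    A faulty reading on that sensor is indistinguishable from a real stall threat."""
    return aoa_left > AOA_TRIGGER_DEG

def cross_checked_command(aoa_left: float, aoa_right: float) -> bool:
    """Redundant logic: command trim only when both sensors agree; otherwise inhibit."""
    if abs(aoa_left - aoa_right) > MAX_DISAGREE_DEG:
        # Sensors disagree: inhibit automatic trim and leave the decision to the crew.
        return False
    return min(aoa_left, aoa_right) > AOA_TRIGGER_DEG

# A vane stuck at 25 degrees while the other sensor reads 5 degrees:
print(single_sensor_command(25.0))        # True  -> nose-down trim commanded
print(cross_checked_command(25.0, 5.0))   # False -> fault detected, no command
```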

Similar to its responses to previous accidents, Boeing has been reluctant to admit to a design flaw in its aircraft, instead blaming pilot error (Hall and Goelz 2019 ). In the 737 MAX case, the company pointed to the pilots’ alleged inability to control the planes under stall conditions (Economy 2019 ). Following the Ethiopian Airlines crash, Boeing acknowledged for the first time that MCAS played a primary role in the crashes, while continuing to highlight that other factors, such as pilot error, were also involved (Hall and Goelz 2019 ). For example, on April 29, 2019, more than a month after the second crash, then Boeing CEO Dennis Muilenburg defended MCAS by stating:

We've confirmed that [the MCAS system] was designed per our standards, certified per our standards, and we're confident in that process. So, it operated according to those design and certification standards. So, we haven't seen a technical slip or gap in terms of the fundamental design and certification of the approach. (Economy 2019 )

The view that MCAS was not primarily at fault was supported within an article written by noted journalist and pilot William Langewiesche ( 2019 ). While not denying Boeing made serious mistakes, he placed ultimate blame on the use of inexperienced pilots by the two airlines involved in the crashes. Langewiesche suggested that the accidents resulted from the cost-cutting practices of the airlines and the lax regulatory environments in which they operated. He argued that more experienced pilots, despite their lack of information on MCAS, should have been able to take corrective action to control the planes using customary stall prevention procedures. Langewiesche ( 2019 ) concludes in his article that:

What we had in the two downed airplanes was a textbook failure of airmanship. In broad daylight, these pilots couldn’t decipher a variant of a simple runaway trim, and they ended up flying too fast at low altitude, neglecting to throttle back and leading their passengers over an aerodynamic edge into oblivion. They were the deciding factor here — not the MCAS, not the Max.

Others have taken a more critical view of MCAS, Boeing, and the FAA. These critics prominently include Captain Chesley “Sully” Sullenberger, who famously landed an A320 on the Hudson River after bird strikes had knocked out both of the plane’s engines. Sullenberger responded directly to Langewiesche in a letter to the Editor:

… Langewiesche draws the conclusion that the pilots are primarily to blame for the fatal crashes of Lion Air 610 and Ethiopian 302. In resurrecting this age-old aviation canard, Langewiesche minimizes the fatal design flaws and certification failures that precipitated those tragedies, and still pose a threat to the flying public. I have long stated, as he does note, that pilots must be capable of absolute mastery of the aircraft and the situation at all times, a concept pilots call airmanship. Inadequate pilot training and insufficient pilot experience are problems worldwide, but they do not excuse the fatally flawed design of the Maneuvering Characteristics Augmentation System (MCAS) that was a death trap.... (Sullenberger 2019 )

Noting that he is one of the few pilots to have encountered both accident sequences in a 737 MAX simulator, Sullenberger continued:

These emergencies did not present as a classic runaway stabilizer problem, but initially as ambiguous unreliable airspeed and altitude situations, masking MCAS. The MCAS design should never have been approved, not by Boeing, and not by the Federal Aviation Administration (FAA)…. (Sullenberger 2019 )

In June 2019, Sullenberger noted in Congressional Testimony that “These crashes are demonstrable evidence that our current system of aircraft design and certification has failed us. These accidents should never have happened” (Benning and DiFurio 2019 ).

Others have agreed with Sullenberger’s assessment. Software developer and pilot Gregory Travis (2019) argues that Boeing’s design for the 737 MAX violated industry norms and that the company unwisely used software to compensate for inadequacies in the hardware design. Travis also contends that the existence of MCAS was not disclosed to pilots in order to preserve the fiction that the 737 MAX was just an update of earlier 737 models, which served as a way to circumvent the more stringent FAA certification requirements for a new airplane. Reports from government agencies seem to support this assessment, emphasizing the chaotic cockpit conditions created by MCAS and poor certification practices. The U.S. National Transportation Safety Board (NTSB) (2019) Safety Recommendations to the FAA in September 2019 indicated that Boeing underestimated the effect an MCAS malfunction would have on the cockpit environment (Kitroeff 2019a, b). The FAA Joint Authorities Technical Review (2019), which included international participation, issued its Final Report in October 2019. The Report faulted both Boeing and the FAA for the certification of MCAS (Koenig 2019).

Despite Boeing’s attempts to downplay the role of MCAS, it began to work on a fix for the system shortly after the Lion Air crash (Gates 2019). MCAS operation will now be based on inputs from both AOA sensors, instead of just one, with a cockpit indicator light when the sensors disagree. In addition, MCAS will only be activated once per AOA warning rather than multiple times, so the system will seek to prevent a stall only once per event. MCAS’s authority will also be limited in terms of how much it can move the stabilizer, and manual override by the pilot will always be possible (Bellamy 2019; Boeing n.d. b; Gates 2019). For over a year after the Lion Air crash, Boeing held that pilot simulator training would not be required for the redesigned MCAS system. In January 2020, Boeing relented and recommended that pilot simulator training be required when the 737 MAX returns to service (Pasztor et al. 2020).
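The revised behavior described above (input from both sensors, a single activation per high-AOA event, limited stabilizer authority, and preserved pilot override) can be expressed as a small state machine. The sketch below is a hedged illustration of those constraints using hypothetical limits; it is not the certified software update.

```python
# Illustrative sketch of the revised activation constraints -- hypothetical values,
# not the certified MCAS redesign.
from dataclasses import dataclass

AOA_TRIGGER_DEG = 14.0         # assumed activation threshold (degrees)
MAX_DISAGREE_DEG = 5.5         # assumed allowable sensor disagreement (degrees)
MAX_TRIM_INCREMENT_DEG = 2.5   # assumed cap on stabilizer authority per activation

@dataclass
class RevisedTrimSketch:
    activated_this_event: bool = False   # allow only one activation per high-AOA event

    def command(self, aoa_left: float, aoa_right: float, pilot_trimming: bool) -> float:
        """Return a bounded nose-down trim increment (degrees), or 0.0 for no action."""
        if pilot_trimming:
            return 0.0                   # pilot trim input always overrides automatic trim
        if abs(aoa_left - aoa_right) > MAX_DISAGREE_DEG:
            return 0.0                   # sensors disagree: inhibit and alert the crew
        high_aoa = min(aoa_left, aoa_right) > AOA_TRIGGER_DEG
        if high_aoa and not self.activated_this_event:
            self.activated_this_event = True
            return MAX_TRIM_INCREMENT_DEG    # single, bounded correction
        if not high_aoa:
            self.activated_this_event = False  # event over: re-arm for the next one
        return 0.0
```

The contrast with the single-sensor sketch shown earlier is the point: the same stall-protection goal is pursued, but with sensor agreement, bounded authority, and pilot priority built in.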

Boeing and the FAA

There is mounting evidence that Boeing, and the FAA as well, had warnings about the inadequacy of MCAS’s design, and about the lack of communication to pilots about its existence and functioning. In 2015, for example, an unnamed Boeing engineer raised in an email the issue of relying on a single AOA sensor (Bellamy 2019 ). In 2016, Mark Forkner, Boeing’s Chief Technical Pilot, in an email to a colleague flagged the erratic behavior of MCAS in a flight simulator noting: “It’s running rampant” (Gelles and Kitroeff 2019c ). Forkner subsequently came under federal investigation regarding whether he misled the FAA regarding MCAS (Kitroeff and Schmidt 2020 ).

In December 2018, following the Lion Air crash, the FAA (2018b) conducted a Risk Assessment that estimated that fifteen more 737 MAX crashes would occur over the expected fleet life of 45 years if the flight control issues were not addressed; this Risk Assessment was not publicly disclosed until Congressional hearings a year later in December 2019 (Arnold 2019). After the two crashes, a senior Boeing engineer, Curtis Ewbank, filed an internal ethics complaint in 2019 about management’s squelching of a proposed system that might have uncovered errors in the AOA sensors. Ewbank has since publicly stated that “I was willing to stand up for safety and quality… Boeing management was more concerned with cost and schedule than safety or quality” (Kitroeff et al. 2019b).

One factor in Boeing’s apparent reluctance to heed such warnings may be the transformation of the company’s engineering and safety culture into a finance-oriented culture, a shift that began with Boeing’s merger with McDonnell Douglas in 1997 (Tkacik 2019; Useem 2019). Critical changes after the merger included replacing many in Boeing’s top management, historically engineers, with business executives from McDonnell Douglas and moving the corporate headquarters to Chicago, while leaving the engineering staff in Seattle (Useem 2019). According to Tkacik (2019), the new management even went so far as “maligning and marginalizing engineers as a class”.

Financial drivers thus began to place an inordinate amount of strain on Boeing employees, including engineers. During the development of the 737 MAX, significant production pressure to keep pace with the Airbus A320neo was ever-present. For example, Boeing management allegedly rejected any design changes that would prolong certification or require additional pilot training for the MAX (Gelles et al. 2019). As Adam Dickson, a former Boeing engineer, explained in a television documentary (BBC Panorama 2019): “There was a lot of interest and pressure on the certification and analysis engineers in particular, to look at any changes to the Max as minor changes”.

Production pressures were exacerbated by the “cozy relationship” between Boeing and the FAA (Kitroeff et al. 2019a ; see also Gelles and Kaplan 2019 ; Hall and Goelz 2019 ). Beginning in 2005, the FAA increased its reliance on manufacturers to certify their own planes. Self-certification became standard practice throughout the U.S. airline industry. By 2018, Boeing was certifying 96% of its own work (Kitroeff et al. 2019a ).

The serious drawbacks to self-certification became acutely apparent in this case. Of particular concern, the safety analysis for MCAS delegated to Boeing by the FAA was flawed in at least three respects: (1) the analysis underestimated the power of MCAS to move the plane’s horizontal tail and thus how difficult it would be for pilots to maintain control of the aircraft; (2) it did not account for the system deploying multiple times; and (3) it underestimated the risk level if MCAS failed, thus permitting a design feature—the single AOA sensor input to MCAS—that did not have built-in redundancy (Gates 2019). Related to these concerns, the ability of MCAS to move the horizontal tail was increased without properly updating the safety analysis or notifying the FAA about the change (Gates 2019). In addition, the FAA did not require pilot training for MCAS or simulator training for the 737 MAX (Gelles and Kaplan 2019). Since the MAX grounding, the FAA has become more independent in its assessments and certifications; for example, it will not use Boeing personnel when certifying approvals of new 737 MAX planes (Josephs 2019).
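A back-of-the-envelope comparison shows why the lack of redundancy flagged in point (3) matters for the risk assessment. The failure probability below is a purely hypothetical number used to illustrate the arithmetic, and the calculation assumes the two sensors fail independently; real certification analyses use far more detailed models.

```python
# Hypothetical illustration of how redundancy changes exposure to a misleading input.
p_bad_reading = 1e-4   # assumed probability of a hazardously misleading AOA reading per flight

p_single_channel = p_bad_reading        # one sensor alone can mislead the system
p_cross_checked = p_bad_reading ** 2    # both sensors must err consistently (independence assumed)

print(f"single-sensor trigger misled: {p_single_channel:.0e} per flight")
print(f"cross-checked trigger misled: {p_cross_checked:.0e} per flight")
```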

The role of the FAA has also been subject to political scrutiny. The report of a study of the FAA certification process commissioned by Secretary of Transportation Elaine Chao (DOT 2020 ), released January 16, 2020, concluded that the FAA certification process was “appropriate and effective,” and that certification of the MAX as a new airplane would not have made a difference in the plane’s safety. At the same time, the report recommended a number of measures to strengthen the process and augment FAA’s staff (Pasztor and Cameron 2020 ). In contrast, a report of preliminary investigative findings by the Democratic staff of the House Committee on Transportation and Infrastructure (House TI 2020 ), issued in March 2020, characterized FAA’s certification of the MAX as “grossly insufficient” and criticized Boeing’s design flaws and lack of transparency with the FAA, airlines, and pilots (Duncan and Laris 2020 ).

Boeing has incurred significant economic losses from the crashes and subsequent grounding of the MAX. In December 2019, Boeing CEO Dennis Muilenburg was fired and the corporation announced that 737 MAX production would be suspended in January 2020 (Rich 2019) (see Fig. 1). Boeing is facing numerous lawsuits and possible criminal investigations. Boeing estimates that its economic losses for the 737 MAX will exceed $18 billion (Gelles 2020). In addition to the need to fix MCAS, other issues have arisen in recertification of the aircraft, including wiring for controls of the tail stabilizer, possible weaknesses in the engine rotors, and vulnerabilities in lightning protection for the engines (Kitroeff and Gelles 2020). The FAA had planned to flight test the 737 MAX early in 2020, and it was supposed to return to service in summer 2020 (Gelles and Kitroeff 2020). Given the global impact of the COVID-19 pandemic and other factors, it is difficult to predict when MAX flights might resume. In addition, uncertainty in passenger demand has led some airlines to delay or cancel orders for the MAX (Bogaisky 2020). Even after the aircraft regains flight approval, public resistance to flying in the 737 MAX will probably be considerable (Gelles 2019).

Lessons for Engineering Ethics

The 737 MAX case is still unfolding and will continue to do so for some time. Yet important lessons can already be learned (or relearned) from the case. Some of those lessons are straightforward, and others are more subtle. A key and clear lesson is that engineers may need reminders about prioritizing the public good, and more specifically, the public’s safety. A more subtle lesson pertains to the ways in which the problem of many hands may or may not apply here. Other lessons involve the need for corporations, engineering societies, and engineering educators to rise to the challenge of nurturing and supporting ethical behavior on the part of engineers, especially in light of the difficulties revealed in this case.

All contemporary codes of ethics promulgated by major engineering societies state that an engineer’s paramount responsibility is to protect the “safety, health, and welfare” of the public. The American Institute of Aeronautics and Astronautics Code of Ethics indicates that engineers must “[H]old paramount the safety, health, and welfare of the public in the performance of their duties” (AIAA 2013 ). The Institute of Electrical and Electronics Engineers (IEEE) Code of Ethics goes further, pledging its members: “…to hold paramount the safety, health, and welfare of the public, to strive to comply with ethical design and sustainable development practices, and to disclose promptly factors that might endanger the public or the environment” (IEEE 2017 ). The IEEE Computer Society (CS) cooperated with the Association for Computing Machinery (ACM) in developing a Software Engineering Code of Ethics ( 1997 ) which holds that software engineers shall: “Approve software only if they have a well-founded belief that it is safe, meets specifications, passes appropriate tests, and does not diminish quality of life, diminish privacy or harm the environment….” According to Gotterbarn and Miller ( 2009 ), the latter code is a useful guide when examining cases involving software design and underscores the fact that during design, as in all engineering practice, the well-being of the public should be the overriding concern. While engineering codes of ethics are plentiful in number, they differ in their source of moral authority (i.e., organizational codes vs. professional codes), are often unenforceable through the law, and formally apply to different groups of engineers (e.g., based on discipline or organizational membership). However, the codes are generally recognized as a statement of the values inherent to engineering and its ethical commitments (Davis 2015 ).

An engineer’s ethical responsibility does not preclude consideration of factors such as cost and schedule (Pinkus et al. 1997 ). Engineers always have to grapple with constraints, including time and resource limitations. The engineers working at Boeing did have legitimate concerns about their company losing contracts to its competitor Airbus. But being an engineer means that public safety and welfare must be the highest priority (Davis 1991 ). The aforementioned software and other design errors in the development of the 737 MAX, which resulted in hundreds of deaths, would thus seem to be clear violations of engineering codes of ethics. In addition to pointing to engineering codes, Peterson ( 2019 ) argues that Boeing engineers and managers violated widely accepted ethical norms such as informed consent and the precautionary principle.

From an engineering perspective, the central ethical issue in the MAX case arguably revolves around the decision to use software (i.e., MCAS) to “mask” a questionable hardware design—the repositioning of the engines that disrupted the aerodynamics of the airframe (Travis 2019). As Johnston and Harris (2019) argue: “To meet the design goals and avoid an expensive hardware change, Boeing created the MCAS as a software Band-Aid.” Though software is often relied on in this manner, doing so places a safety burden on the software that it may not be able to bear, as is illustrated by the case of the Therac-25 radiation therapy machine. In the Therac-25 case, hardware safety interlocks employed in earlier models of the machine were replaced by software safety controls. In addition, information about how the software might malfunction was lacking from the user manual for the Therac machine. Thus, when certain types of errors appeared on its interface, the machine’s operators did not know how to respond. Software flaws, among other factors, contributed to six patients being given massive radiation overdoses, resulting in deaths and serious injuries (Leveson and Turner 1993). A more recent case involves problems with the embedded software guiding the electronic throttle in Toyota vehicles. In 2013, “…a jury found Toyota responsible for two unintended acceleration deaths, with expert witnesses citing bugs in the software and throttle fail safe defects” (Cummings and Britton 2020).

Boeing’s use of MCAS to mask the significant change in hardware configuration of the MAX was compounded by not providing redundancy for components prone to failure (i.e., the AOA sensors) (Campbell 2019 ), and by failing to notify pilots about the new software. In such cases, it is especially crucial that pilots receive clear documentation and relevant training so that they know how to manage the hand-off with an automated system properly (Johnston and Harris 2019 ). Part of the necessity for such training is related to trust calibration (Borenstein et al. 2020 ; Borenstein et al. 2018 ), a factor that has contributed to previous airplane accidents (e.g., Carr 2014 ). For example, if pilots do not place enough trust in an automated system, they may add risk by intervening in system operation. Conversely, if pilots trust an automated system too much, they may lack sufficient time to act once they identify a problem. This is further complicated in the MAX case because pilots were not fully aware, if at all, of MCAS’s existence and how the system functioned.

In addition to engineering decision-making that failed to prioritize public safety, questionable management decisions were also made at both Boeing and the FAA. As noted earlier, Boeing managerial leadership ignored numerous warning signs that the 737 MAX was not safe. Also, FAA’s shift to greater reliance on self-regulation by Boeing was ill-advised; that lesson appears to have been learned at the expense of hundreds of lives (Duncan and Aratani 2019 ).

The Problem of Many Hands Revisited

Actions, or inaction, by large, complex organizations, in this case corporate and government entities, suggest that the “problem of many hands” may be relevant to the 737 MAX case. At a high level of abstraction, the problem of many hands involves the idea that accountability is difficult to assign in the face of collective action, especially in a computerized society (Thompson 1980 ; Nissenbaum 1994 ). According to Nissenbaum ( 1996 , 29), “Where a mishap is the work of ‘many hands,’ it may not be obvious who is to blame because frequently its most salient and immediate causal antecedents do not converge with its locus of decision-making. The conditions for blame, therefore, are not satisfied in a way normally satisfied when a single individual is held blameworthy for a harm”.

However, there is an alternative understanding of the problem of many hands. In this version of the problem, the lack of accountability is not merely because multiple people and multiple decisions figure into a final outcome. Instead, in order to “qualify” as the problem of many hands, the component decisions should be benign, or at least far less harmful, if examined in isolation; only when the individual decisions are collectively combined do we see the most harmful result. In this understanding, the individual decision-makers should not have the same moral culpability as they would if they made all the decisions by themselves (Noorman 2020 ).

Both of these understandings of the problem of many hands could shed light on the 737 MAX case. Yet we focus on the first version of the problem. We admit the possibility that some of the isolated decisions about the 737 MAX may have been made in part because of ignorance of a broader picture. While we do not stake a claim on whether this is what actually happened in the MAX case, we acknowledge that it may be true in some circumstances. However, we think the more important point is that some of the 737 MAX decisions were so clearly misguided that a competent engineer should have seen the implications, even if the engineer was not aware of all of the broader context. The problem then is to identify responsibility for the questionable decisions in a way that discourages bad judgments in the future, a task made more challenging by the complexities of the decision-making. Legal proceedings about this case are likely to explore those complexities in detail and are outside the scope of this article. But such complexities must be examined carefully so as not to act as an insulator to accountability.

When many individuals are involved in the design of a computing device, for example, and a serious failure occurs, each person might try to absolve themselves of responsibility by indicating that “too many people” and “too many decisions” were involved for any individual person to know that the problem was going to happen. This is a common, and often dubious, excuse in the attempt to abdicate responsibility for a harm. While it can have different levels of magnitude and severity, the problem of many hands often arises in large scale ethical failures in engineering such as in the Deepwater Horizon oil spill (Thompson 2014 ).

Possible examples in the 737 MAX case of the difficulty of assigning moral responsibility due to the problem of many hands include:

  1. The decision to reposition the engines;
  2. The decision to mask the jet’s subsequent dynamic instability with MCAS;
  3. The decision to rely on only one AOA sensor in designing MCAS; and
  4. The decision not to inform or properly train pilots about the MCAS system.

While overall responsibility for each of these decisions may be difficult to allocate precisely, at least points 1–3 above arguably reflect fundamental errors in engineering judgment (Travis 2019). Boeing engineers and FAA engineers either participated in or were aware of these decisions (Kitroeff and Gelles 2019) and may have had opportunities to reconsider or redirect them. As Davis (2012) has noted, responsible engineering professionals make it their business to address problems even when they did not cause the problem, or, we would argue, did not solely cause it. As noted earlier, reports indicate that at least one Boeing engineer expressed reservations about the design of MCAS (Bellamy 2019). Since the two crashes, one Boeing engineer, Curtis Ewbank, filed an internal ethics complaint (Kitroeff et al. 2019b) and several current and former Boeing engineers and other employees have gone public with various concerns about the 737 MAX (Pasztor 2019). And yet, as is often the case, the flawed design went forward with tragic results.

Enabling Ethical Engineers

The MAX case is eerily reminiscent of other well-known engineering ethics case studies such as the Ford Pinto (Birsch and Fielder 1994 ), Space Shuttle Challenger (Werhane 1991 ), and GM ignition switch (Jennings and Trautman 2016 ). In the Pinto case, Ford engineers were aware of the unsafe placement of the fuel tank well before the car was released to the public and signed off on the design even though crash tests showed the tank was vulnerable to rupture during low-speed rear-end collisions (Baura 2006 ). In the case of the GM ignition switch, engineers knew for at least four years about the faulty design, a flaw that resulted in at least a dozen fatal accidents (Stephan 2016 ). In the case of the well-documented Challenger accident, engineer Roger Boisjoly warned his supervisors at Morton Thiokol of potentially catastrophic flaws in the shuttle’s solid rocket boosters a full six months before the accident. He, along with other engineers, unsuccessfully argued on the eve of launch for a delay due to the effect that freezing temperatures could have on the boosters’ O-ring seals. Boisjoly was also one of a handful of engineers to describe these warnings to the Presidential commission investigating the accident (Boisjoly et al. 1989 ).

Returning to the 737 MAX case, could Ewbank or others with concerns about the safety of the airplane have done more than filing ethics complaints or offering public testimony only after the Lion Air and Ethiopian Airlines crashes? One might argue that requiring professional registration by all engineers in the U.S. would result in more ethical conduct (for example, by giving state licensing boards greater oversight authority). Yet the well-entrenched “industry exemption” from registration for most engineers working in large corporations has undermined such calls (Kline 2001 ).

Boeing and other corporations could empower engineers with safety concerns by strengthening internal ethics processes, including sincere and meaningful responsiveness to anonymous complaint channels. Schwartz (2013) outlines three core components of an ethical corporate culture: strong core ethical values, a formal ethics program (including an ethics hotline), and capable ethical leadership. Schwartz points to Siemens’ creation of an ethics and compliance department following a bribery scandal as an example of a good solution. Boeing has had a compliance department for quite some time (Schnebel and Bienert 2004) and has taken efforts in the past to evaluate its effectiveness (Boeing 2003). Yet it is clear that more robust measures are needed in response to ethics concerns and complaints. Since the MAX crashes, Boeing’s Board has implemented a number of changes, including establishing a corporate safety group and revising internal reporting procedures so that lead engineers primarily report to the chief engineer rather than to business managers (Gelles and Kitroeff 2019b; Boeing n.d. c). Whether these measures will be enough to restore Boeing’s former engineering-centered focus remains to be seen.

Professional engineering societies could play a stronger role in communicating and enforcing codes of ethics, in supporting ethical behavior of engineers, and by providing more educational opportunities for learning about ethics and about the ethical responsibilities of engineers. Some societies, including ACM and IEEE, have become increasingly engaged in ethics-related activities. Initially ethics engagement by the societies consisted primarily of a focus on macroethical issues such as sustainable development (Herkert 2004 ). Recently, however, the societies have also turned to a greater focus on microethical issues (the behavior of individuals). The 2017 revision to the IEEE Code of Ethics, for example, highlights the importance of “ethical design” (Adamson and Herkert 2020 ). This parallels IEEE activities in the area of design of autonomous and intelligent systems (e.g., IEEE 2018 ). A promising outcome of this emphasis is a move toward implementing “ethical design” frameworks (Peters et al. 2020 ).

In terms of engineering education, educators need to place a greater emphasis on fostering moral courage, that is, the courage to act on one’s moral convictions, including adherence to codes of ethics. This is of particular significance in large organizations such as Boeing and the FAA, where the agency of engineers may be limited by factors such as organizational culture (Watts and Buckley 2017). In a study of twenty-six ethics interventions in engineering programs, Hess and Fore (2018) found that only twenty-seven percent had a learning goal of developing “ethical courage, confidence or commitment”. This goal could be operationalized in a number of ways, for example through a focus on virtue ethics (Harris 2008) or professional identity (Hashemian and Loui 2010). This need should not only be addressed within the engineering curriculum but also during lifelong learning initiatives and other professional development opportunities (Miller 2019).

The circumstances surrounding the 737 MAX airplane could certainly serve as an informative case study for ethics or technical courses. The case can shed light on important lessons for engineers, including the complex interactions, and sometimes tensions, between engineering and managerial considerations. The case also tangibly displays that what seem to be relatively small-scale, and likely well-intended, decisions by individual engineers can combine to result in large-scale tragedy. No individual person wanted to do harm, but it happened nonetheless. Thus, the case can serve as a reminder to current and future generations of engineers that public safety must be the first and foremost priority. A particularly useful pedagogical method for considering this case is to assign students to the roles of engineers, managers, and regulators, as well as the flying public, airline personnel, and representatives of engineering societies (Herkert 1997). In addition to illuminating the perspectives and responsibilities of each stakeholder group, role-playing can also shed light on the “macroethical” issues raised by the case (Martin et al. 2019), such as airline safety standards and the proper role for engineers and engineering societies in the regulation of the industry.

Conclusions and Recommendations

The case of the Boeing 737 MAX provides valuable lessons for engineers and engineering educators concerning the ethical responsibilities of the profession. Safety is not cheap, but careless engineering design in the name of minimizing costs and adhering to a delivery schedule is a symptom of ethical blight. Using almost any standard ethical analysis or framework, Boeing’s actions regarding the safety of the 737 MAX, particularly decisions regarding MCAS, fall short.

Boeing failed in its obligations to protect the public. At a minimum, the company had an obligation to inform airlines and pilots of significant design changes, especially the role of MCAS in compensating for repositioning of engines in the MAX from prior versions of the 737. Clearly, it was a “significant” change because it had a direct, and unfortunately tragic, impact on the public’s safety. The Boeing and FAA interaction underscores the fact that conflicts of interest are a serious concern in regulatory actions within the airline industry.

Internal and external organizational factors may have interfered with Boeing and FAA engineers’ fulfillment of their professional ethical responsibilities; this is an all too common problem that merits serious attention from industry leaders, regulators, professional societies, and educators. The lessons to be learned in this case are not new. After large scale tragedies involving engineering decision-making, calls for change often emerge. But such lessons apparently must be retaught and relearned by each generation of engineers.

Acknowledgement

The authors would like to thank the anonymous reviewers for their helpful comments.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

  • ACM/IEEE-CS Joint Task Force. (1997). Software Engineering Code of Ethics and Professional Practice, https://ethics.acm.org/code-of-ethics/software-engineering-code/ .
  • Adamson, G., & Herkert, J. (2020). Addressing intelligent systems and ethical design in the IEEE Code of Ethics. In Codes of ethics and ethical guidelines: Emerging technologies, changing fields. New York: Springer (in press).
  • Ahmed, H., Glanz, J., & Beech, H. (2019). Ethiopian airlines pilots followed Boeing’s safety procedures before crash, Report Shows. The New York Times, April 4, https://www.nytimes.com/2019/04/04/world/asia/ethiopia-crash-boeing.html .
  • AIAA. (2013). Code of Ethics, https://www.aiaa.org/about/Governance/Code-of-Ethics .
  • Arnold, K. (2019). FAA report predicted there could be 15 more 737 MAX crashes. The Dallas Morning News, December 11, https://www.dallasnews.com/business/airlines/2019/12/11/faa-chief-says-boeings-737-max-wont-be-approved-in-2019/
  • Baura, G. (2006). Engineering ethics: An industrial perspective. Amsterdam: Elsevier.
  • BBC News. (2019). Work on production line of Boeing 737 MAX ‘Not Adequately Funded’. July 29, https://www.bbc.com/news/business-49142761 .
  • Bellamy, W. (2019). Boeing CEO outlines 737 MAX MCAS software fix in congressional hearings. Aviation Today, November 2, https://www.aviationtoday.com/2019/11/02/boeing-ceo-outlines-mcas-updates-congressional-hearings/ .
  • Benning, T., & DiFurio, D. (2019). American Airlines Pilots Union boss prods lawmakers to solve 'Crisis of Trust' over Boeing 737 MAX. The Dallas Morning News, June 19, https://www.dallasnews.com/business/airlines/2019/06/19/american-airlines-pilots-union-boss-prods-lawmakers-to-solve-crisis-of-trust-over-boeing-737-max/ .
  • Birsch, D., & Fielder, J. (Eds.). (1994). The Ford Pinto case: A study in applied ethics, business, and technology. New York: State University of New York Press.
  • Boeing. (2003). Boeing Releases Independent Reviews of Company Ethics Program. December 18, https://boeing.mediaroom.com/2003-12-18-Boeing-Releases-Independent-Reviews-of-Company-Ethics-Program .
  • Boeing. (2018). Flight crew operations manual bulletin for the Boeing company. November 6, https://www.avioesemusicas.com/wp-content/uploads/2018/10/TBC-19-Uncommanded-Nose-Down-Stab-Trim-Due-to-AOA.pdf .
  • Boeing. (n.d. a). About the Boeing 737 MAX. https://www.boeing.com/commercial/737max/ .
  • Boeing. (n.d. b). 737 MAX Updates. https://www.boeing.com/737-max-updates/ .
  • Boeing. (n.d. c). Initial actions: sharpening our focus on safety. https://www.boeing.com/737-max-updates/resources/ .
  • Bogaisky, J. (2020). Boeing stock plunges as coronavirus imperils quick ramp up in 737 MAX deliveries. Forbes, March 11, https://www.forbes.com/sites/jeremybogaisky/2020/03/11/boeing-coronavirus-737-max/#1b9eb8955b5a .
  • Boisjoly, R. P., Curtis, E. F., & Mellican, E. (1989). Roger Boisjoly and the Challenger disaster: The ethical dimensions. Journal of Business Ethics, 8 (4), 217–230.
  • Borenstein, J., Mahajan, H. P., Wagner, A. R., & Howard, A. (2020). Trust and pediatric exoskeletons: A comparative study of clinician and parental perspectives. IEEE Transactions on Technology and Society , 1 (2), 83–88.
  • Borenstein, J., Wagner, A. R., & Howard, A. (2018). Overtrust of pediatric health-care robots: A preliminary survey of parent perspectives. IEEE Robotics & Automation Magazine, 25 (1), 46–54.
  • Bushey, C. (2019). The Tough Crowd Boeing Needs to Convince. Crain’s Chicago Business, October 25, https://www.chicagobusiness.com/manufacturing/tough-crowd-boeing-needs-convince .
  • Campbell, D. (2019). The many human errors that brought down the Boeing 737 MAX. The Verge, May 2, https://www.theverge.com/2019/5/2/18518176/boeing-737-max-crash-problems-human-error-mcas-faa .
  • Carr, N. (2014). The glass cage: Automation and us . Norton.
  • Cummings, M. L., & Britton, D. (2020). Regulating safety-critical autonomous systems: past, present, and future perspectives. In Living with robots (pp. 119–140). Academic Press, New York.
  • Davis, M. (1991). Thinking like an engineer: The place of a code of ethics in the practice of a profession. Philosophy & Public Affairs, 20 (2), 150–167.
  • Davis, M. (2012). “Ain’t no one here but us social forces”: Constructing the professional responsibility of engineers. Science and Engineering Ethics, 18 (1), 13–34.
  • Davis, M. (2015). Engineering as profession: Some methodological problems in its study. In Engineering identities, epistemologies and values (pp. 65–79). Springer, New York.
  • Department of Transportation (DOT). (2020). Official report of the special committee to review the Federal Aviation Administration’s Aircraft Certification Process, January 16. https://www.transportation.gov/sites/dot.gov/files/2020-01/scc-final-report.pdf .
  • Duncan, I., & Aratani, L. (2019). FAA flexes its authority in final stages of Boeing 737 MAX safety review. The Washington Post, November 27, https://www.washingtonpost.com/transportation/2019/11/27/faa-flexes-its-authority-final-stages-boeing-max-safety-review/ .
  • Duncan, I., & Laris, M. (2020). House report on 737 Max crashes faults Boeing’s ‘culture of concealment’ and labels FAA ‘grossly insufficient’. The Washington Post, March 6, https://www.washingtonpost.com/local/trafficandcommuting/house-report-on-737-max-crashes-faults-boeings-culture-of-concealment-and-labels-faa-grossly-insufficient/2020/03/06/9e336b9e-5fce-11ea-b014-4fafa866bb81_story.html .
  • Economy, P. (2019). Boeing CEO Puts Partial Blame on Pilots of Crashed 737 MAX Aircraft for Not 'Completely' Following Procedures. Inc., April 30, https://www.inc.com/peter-economy/boeing-ceo-puts-partial-blame-on-pilots-of-crashed-737-max-aircraft-for-not-completely-following-procedures.html .
  • Federal Aviation Administration (FAA). (2018a). Airworthiness directives; the Boeing company airplanes. FR Doc No: R1-2018-26365. https://rgl.faa.gov/Regulatory_and_Guidance_Library/rgad.nsf/0/fe8237743be9b8968625835b004fc051/$FILE/2018-23-51_Correction.pdf .
  • Federal Aviation Administration (FAA). (2018b). Quantitative Risk Assessment. https://www.documentcloud.org/documents/6573544-Risk-Assessment-for-Release-1.html#document/p1 .
  • Federal Aviation Administration (FAA). (2019). Joint authorities technical review: observations, findings, and recommendations. October 11, https://www.faa.gov/news/media/attachments/Final_JATR_Submittal_to_FAA_Oct_2019.pdf .
  • Federal Democratic Republic of Ethiopia. (2019). Aircraft accident investigation preliminary report. Report No. AI-01/19, April 4, https://leehamnews.com/wp-content/uploads/2019/04/Preliminary-Report-B737-800MAX-ET-AVJ.pdf .
  • Federal Democratic Republic of Ethiopia. (2020). Aircraft Accident Investigation Interim Report. Report No. AI-01/19, March 20, https://www.aib.gov.et/wp-content/uploads/2020/documents/accident/ET-302%2520%2520Interim%2520Investigation%2520%2520Report%2520March%25209%25202020.pdf .
  • Gates, D. (2018). Pilots struggled against Boeing's 737 MAX control system on doomed Lion Air flight. The Seattle Times, November 27, https://www.seattletimes.com/business/boeing-aerospace/black-box-data-reveals-lion-air-pilots-struggle-against-boeings-737-max-flight-control-system/ .
  • Gates, D. (2019). Flawed analysis, failed oversight: how Boeing, FAA Certified the Suspect 737 MAX Flight Control System. The Seattle Times, March 17, https://www.seattletimes.com/business/boeing-aerospace/failed-certification-faa-missed-safety-issues-in-the-737-max-system-implicated-in-the-lion-air-crash/ .
  • Gelles, D. (2019). Boeing can’t fly its 737 MAX, but it’s ready to sell its safety. The New York Times, December 24 (updated February 10, 2020), https://www.nytimes.com/2019/12/24/business/boeing-737-max-survey.html .
  • Gelles, D. (2020). Boeing expects 737 MAX costs will surpass $18 Billion. The New York Times, January 29, https://www.nytimes.com/2020/01/29/business/boeing-737-max-costs.html .
  • Gelles, D., & Kaplan, T. (2019). F.A.A. Approval of Boeing jet involved in two crashes comes under scrutiny. The New York Times, March 19, https://www.nytimes.com/2019/03/19/business/boeing-elaine-chao.html .
  • Gelles, D., & Kitroeff, N. (2019a). Boeing Believed a 737 MAX warning light was standard. It wasn’t. New York: The New York Times. https://www.nytimes.com/2019/05/05/business/boeing-737-max-warning-light.html .
  • Gelles, D., & Kitroeff, N. (2019b). Boeing board to call for safety changes after 737 MAX Crashes. The New York Times, September 15, (updated October 2), https://www.nytimes.com/2019/09/15/business/boeing-safety-737-max.html .
  • Gelles, D., & Kitroeff, N. (2019c). Boeing pilot complained of ‘Egregious’ issue with 737 MAX in 2016. The New York Times, October 18, https://www.nytimes.com/2019/10/18/business/boeing-flight-simulator-text-message.html .
  • Gelles, D., & Kitroeff, N. (2020). What needs to happen to get Boeing’s 737 MAX flying again?. The New York Times, February 10, https://www.nytimes.com/2020/02/10/business/boeing-737-max-fly-again.html .
  • Gelles, D., Kitroeff, N., Nicas, J., & Ruiz, R. R. (2019). Boeing was ‘Go, Go, Go’ to beat airbus with the 737 MAX. The New York Times, March 23, https://www.nytimes.com/2019/03/23/business/boeing-737-max-crash.html .
  • Glanz, J., Creswell, J., Kaplan, T., & Wichter, Z. (2019). After a Lion Air 737 MAX Crashed in October, Questions About the Plane Arose. The New York Times, February 3, https://www.nytimes.com/2019/02/03/world/asia/lion-air-plane-crash-pilots.html .
  • Gotterbarn D, Miller KW. The public is the priority: Making decisions using the software engineering code of ethics. Computer. 2009; 42 (6):66–73. doi: 10.1109/MC.2009.204. [ CrossRef ] [ Google Scholar ]
  • Hall, J., & Goelz, P. (2019). The Boeing 737 MAX Crisis Is a Leadership Failure, The New York Times, July 17, https://www.nytimes.com/2019/07/17/opinion/boeing-737-max.html .
  • Harris CE. The good engineer: Giving virtue its due in engineering ethics. Science and Engineering Ethics. 2008; 14 (2):153–164. doi: 10.1007/s11948-008-9068-3. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Hashemian G, Loui MC. Can instruction in engineering ethics change students’ feelings about professional responsibility? Science and Engineering Ethics. 2010; 16 (1):201–215. doi: 10.1007/s11948-010-9195-5. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Herkert JR. Collaborative learning in engineering ethics. Science and Engineering Ethics. 1997; 3 (4):447–462. doi: 10.1007/s11948-997-0047-x. [ CrossRef ] [ Google Scholar ]
  • Herkert, J. R. (2004). Microethics, macroethics, and professional engineering societies. In Emerging technologies and ethical issues in engineering: papers from a workshop (pp. 107–114). National Academies Press, New York.
  • Hess JL, Fore G. A systematic literature review of US engineering ethics interventions. Science and Engineering Ethics. 2018; 24 (2):551–583. [ PubMed ] [ Google Scholar ]
  • House Committee on Transportation and Infrastructure (House TI). (2020). The Boeing 737 MAX Aircraft: Costs, Consequences, and Lessons from its Design, Development, and Certification-Preliminary Investigative Findings, March. https://transportation.house.gov/imo/media/doc/TI%2520Preliminary%2520Investigative%2520Findings%2520Boeing%2520737%2520MAX%2520March%25202020.pdf .
  • IEEE. (2017). IEEE Code of Ethics. https://www.ieee.org/about/corporate/governance/p7-8.html .
  • IEEE. (2018). Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Autonomous and Intelligent Systems (version 2). https://standards.ieee.org/content/dam/ieee-standards/standards/web/documents/other/ead_v2.pdf .
  • Jennings M, Trautman LJ. Ethical culture and legal liability: The GM switch crisis and lessons in governance. Boston University Journal of Science and Technology Law. 2016; 22 :187. [ Google Scholar ]
  • Johnston P, Harris R. The Boeing 737 MAX Saga: Lessons for software organizations. Software Quality Professional. 2019; 21 (3):4–12. [ Google Scholar ]
  • Josephs, L. (2019). FAA tightens grip on Boeing with plan to individually review each new 737 MAX Jetliner. CNBC, November 27, https://www.cnbc.com/2019/11/27/faa-tightens-grip-on-boeing-with-plan-to-individually-inspect-max-jets.html .
  • Kaplan, T., Austen, I., & Gebrekidan, S. (2019). The New York Times, March 13. https://www.nytimes.com/2019/03/13/business/canada-737-max.html .
  • Kitroeff, N. (2019). Boeing underestimated cockpit chaos on 737 MAX, N.T.S.B. Says. The New York Times, September 26, https://www.nytimes.com/2019/09/26/business/boeing-737-max-ntsb-mcas.html .
  • Kitroeff, N., & Gelles, D. (2019). Legislators call on F.A.A. to say why it overruled its experts on 737 MAX. The New York Times, November 7 (updated December 11), https://www.nytimes.com/2019/11/07/business/boeing-737-max-faa.html .
  • Kitroeff, N., & Gelles, D. (2020). It’s not just software: New safety risks under scrutiny on Boeing’s 737 MAX. The New York Times, January 5, https://www.nytimes.com/2020/01/05/business/boeing-737-max.html .
  • Kitroeff, N., & Schmidt, M. S. (2020). Federal prosecutors investigating whether Boeing pilot lied to F.A.A. The New York Times, February 21, https://www.nytimes.com/2020/02/21/business/boeing-737-max-investigation.html .
  • Kitroeff, N., Gelles, D., & Nicas, J. (2019a). The roots of Boeing’s 737 MAX Crisis: A regulator relaxes its oversight. The New York Times, July 27, https://www.nytimes.com/2019/07/27/business/boeing-737-max-faa.html .
  • Kitroeff, N., Gelles, D., & Nicas, J. (2019b). Boeing 737 MAX safety system was vetoed, Engineer Says. The New York Times, October 2, https://www.nytimes.com/2019/10/02/business/boeing-737-max-crashes.html .
  • Kline RR. Using history and sociology to teach engineering ethics. IEEE Technology and Society Magazine. 2001; 20 (4):13–20. doi: 10.1109/44.974503. [ CrossRef ] [ Google Scholar ]
  • Koenig, D. (2019). Boeing, FAA both faulted in certification of the 737 MAX. AP, October 11, https://apnews.com/470abf326cdb4229bdc18c8ad8caa78a .
  • Langewiesche, W. (2019). What really brought down the Boeing 737 MAX? The New York Times, September 18, https://www.nytimes.com/2019/09/18/magazine/boeing-737-max-crashes.html .
  • Leveson NG, Turner CS. An investigation of the Therac-25 accidents. Computer. 1993; 26 (7):18–41. doi: 10.1109/MC.1993.274940. [ CrossRef ] [ Google Scholar ]
  • Marks, S., & Dahir, A. L. (2020). Ethiopian report on 737 Max Crash Blames Boeing, March 9, https://www.nytimes.com/2020/03/09/world/africa/ethiopia-crash-boeing.html .
  • Martin DA, Conlon E, Bowe B. The role of role-play in student awareness of the social dimension of the engineering profession. European Journal of Engineering Education. 2019; 44 (6):882–905. doi: 10.1080/03043797.2019.1624691. [ CrossRef ] [ Google Scholar ]
  • Miller, G. (2019). Toward lifelong excellence: navigating the engineering-business space. In The Engineering-Business Nexus (pp. 81–101). Springer, Cham.
  • National Transportation Safety Board (NTSB). (2019). Safety Recommendations Report, September 19, https://www.ntsb.gov/investigations/AccidentReports/Reports/ASR1901.pdf .
  • Nissenbaum, H. (1994). Computing and accountability. Communications of the ACM , January, https://dl.acm.org/doi/10.1145/175222.175228 .
  • Nissenbaum H. Accountability in a computerized society. Science and Engineering Ethics. 1996; 2 (1):25–42. doi: 10.1007/BF02639315. [ CrossRef ] [ Google Scholar ]
  • Noorman, M. (2020). Computing and moral responsibility. In Zalta, E. N. (Ed.). The Stanford Encyclopedia of Philosophy (Spring), https://plato.stanford.edu/archives/spr2020/entries/computing-responsibility .
  • Pasztor, A. (2019). More Whistleblower complaints emerge in Boeing 737 MAX Safety Inquiries. The Wall Street Journal, April 27, https://www.wsj.com/articles/more-whistleblower-complaints-emerge-in-boeing-737-max-safety-inquiries-11556418721 .
  • Pasztor, A., & Cameron, D. (2020). U.S. News: Panel Backs How FAA gave safety approval for 737 MAX. The Wall Street Journal, January 17, https://www.wsj.com/articles/panel-clears-737-maxs-safety-approval-process-at-faa-11579188086 .
  • Pasztor, A., Cameron.D., & Sider, A. (2020). Boeing backs MAX simulator training in reversal of stance. The Wall Street Journal, January 7, https://www.wsj.com/articles/boeing-recommends-fresh-max-simulator-training-11578423221 .
  • Peters D, Vold K, Robinson D, Calvo RA. Responsible AI—two frameworks for ethical design practice. IEEE Transactions on Technology and Society. 2020; 1 (1):34–47. doi: 10.1109/TTS.2020.2974991. [ CrossRef ] [ Google Scholar ]
  • Peterson, M. (2019). The ethical failures behind the Boeing disasters. Blog of the APA, April 8, https://blog.apaonline.org/2019/04/08/the-ethical-failures-behind-the-boeing-disasters/ .
  • Pinkus RL, Pinkus RLB, Shuman LJ, Hummon NP, Wolfe H. Engineering ethics: Balancing cost, schedule, and risk-lessons learned from the space shuttle. Cambridge: Cambridge University Press; 1997. [ Google Scholar ]
  • Republic of Indonesia. (2019). Final Aircraft Accident Investigation Report. KNKT.18.10.35.04, https://knkt.dephub.go.id/knkt/ntsc_aviation/baru/2018%2520-%2520035%2520-%2520PK-LQP%2520Final%2520Report.pdf .
  • Rich, G. (2019). Boeing 737 MAX should return in 2020 but the crisis won't be over. Investor's Business Daily, December 31, https://www.investors.com/news/boeing-737-max-service-return-2020-crisis-not-over/ .
  • Schnebel E, Bienert MA. Implementing ethics in business organizations. Journal of Business Ethics. 2004; 53 (1–2):203–211. doi: 10.1023/B:BUSI.0000039409.58757.a8. [ CrossRef ] [ Google Scholar ]
  • Schwartz MS. Developing and sustaining an ethical corporate culture: The core elements. Business Horizons. 2013; 56 (1):39–50. doi: 10.1016/j.bushor.2012.09.002. [ CrossRef ] [ Google Scholar ]
  • Stephan K. GM Ignition Switch Recall: Too Little Too Late? [Ethical Dilemmas] IEEE Technology and Society Magazine. 2016; 35 (2):34–35. doi: 10.1109/MTS.2016.2554445. [ CrossRef ] [ Google Scholar ]
  • Sullenberger, S. (2019). My letter to the editor of New York Times Magazine, https://www.sullysullenberger.com/my-letter-to-the-editor-of-new-york-times-magazine/ .
  • Thompson DF. Moral responsibility of public officials: The problem of many hands. American Political Science Review. 1980; 74 (4):905–916. doi: 10.2307/1954312. [ CrossRef ] [ Google Scholar ]
  • Thompson DF. Responsibility for failures of government: The problem of many hands. The American Review of Public Administration. 2014; 44 (3):259–273. doi: 10.1177/0275074014524013. [ CrossRef ] [ Google Scholar ]
  • Tkacik, M. (2019). Crash course: how Boeing’s managerial revolution created the 737 MAX Disaster. The New Republic, September 18, https://newrepublic.com/article/154944/boeing-737-max-investigation-indonesia-lion-air-ethiopian-airlines-managerial-revolution .
  • Travis, G. (2019). How the Boeing 737 MAX disaster looks to a software developer. IEEE Spectrum , April 18, https://spectrum.ieee.org/aerospace/aviation/how-the-boeing-737-max-disaster-looks-to-a-software-developer .
  • Useem, J. (2019). The long-forgotten flight that sent Boeing off course. The Atlantic, November 20, https://www.theatlantic.com/ideas/archive/2019/11/how-boeing-lost-its-bearings/602188/ .
  • Watts LL, Buckley MR. A dual-processing model of moral whistleblowing in organizations. Journal of Business Ethics. 2017; 146 (3):669–683. doi: 10.1007/s10551-015-2913-9. [ CrossRef ] [ Google Scholar ]
  • Werhane PH. Engineers and management: The challenge of the Challenger incident. Journal of Business Ethics. 1991; 10 (8):605–616. doi: 10.1007/BF00382880. [ CrossRef ] [ Google Scholar ]


To Ship or Not to Ship


A quality assurance engineer must decide whether or not to ship products that might be defective.

Rachel works as a Quality Assurance Engineer at a large electronics company. She is responsible for the final testing of her company's servers and is part of the team that decides when new products will be shipped to distributors for sale.

Rachel's company has a contract with another company that makes the chips incorporated into the servers Rachel's company produces. The business model for this product is to release a new-generation server approximately every six months, which leaves Rachel a limited timeframe to conduct her quality control tests.

Because there is so little time between the release of each new product, the Quality Assurance department cannot perform every possible test on the servers to ensure they are defect-free. Rachel will not ship a product if there is any possibility that the server could malfunction and cause physical harm to the customer. However, she will ship a product that has a higher likelihood of failure resulting in data loss for the customer, because she knows that if she doesn't, her company's competitor will.

Is this an ethical way to conduct business? How should she determine when to ship a product with known defects?

Clare Bartlett was a 2014-2015 Hackworth Fellow in Engineering Ethics at the Markkula Center for Applied Ethics at Santa Clara University.

August 2015



Hyatt Regency Walkway Collapse

A summary of the Hyatt Walkway collapse that includes discussion questions. 

ENGINEERING ETHICS

Negligence and the Professional "Debate" Over Responsibility for Design: Instructor's Guide - Introduction to the Case

On July 17, 1981, the Hyatt Regency Hotel in Kansas City, Missouri, held a videotaped tea-dance party in their atrium lobby. With many party-goers standing and dancing on the suspended walkways, connections supporting the ceiling rods that held up the second and fourth-floor walkways across the atrium failed, and both walkways collapsed onto the crowded first-floor atrium below. The fourth-floor walkway collapsed onto the second-floor walkway, while the offset third-floor walkway remained intact. As the United States' most devastating structural failure, in terms of loss of life and injuries, the Kansas City Hyatt Regency walkways collapse left 114 dead and in excess of 200 injured. In addition, millions of dollars in costs resulted from the collapse, and thousands of lives were adversely affected.

The hotel had only been in operation for approximately one year at the time of the walkways collapse, and the ensuing investigation of the accident revealed some unsettling facts:

During January and February, 1979, the design of the hanger rod connections was changed in a series of events and disputed communications between the fabricator (Havens Steel Company) and the engineering design team (G.C.E. International, Inc., a professional engineering firm). The fabricator changed the design from a one-rod to a two-rod system to simplify the assembly task, doubling the load on the connector, which ultimately resulted in the walkways collapse.(1)

The fabricator, in sworn testimony before the administrative judicial hearings after the accident, claimed that his company (Havens) telephoned the engineering firm (G.C.E.) for change approval. G.C.E. denied ever receiving such a call from Havens.(2)

On October 14, 1979 (more than one year before the walkways collapsed), while the hotel was still under construction, more than 2700 square feet of the atrium roof collapsed because one of the roof connections at the north end of the atrium failed.(3) In testimony, G.C.E. stated that on three separate occasions they requested on-site project representation during the construction phase; however, these requests were not acted on by the owner (Crown Center Redevelopment Corporation), due to additional costs of providing on-site inspection.(4)

Even as originally designed, the walkways were barely capable of holding up the expected load, and would have failed to meet the requirements of the Kansas City Building Code.(5)

Due to evidence supplied at the Hearings, a number of principals involved lost their engineering licenses, a number of firms went bankrupt, and many expensive legal suits were settled out of court. The case serves as an excellent example of the importance of meeting professional responsibilities, and what the consequences are for professionals who fail to meet those responsibilities. This case is particularly serviceable for use in structural design, statics and materials classes, although it is also useful as a general overview of consequences for professional actions. The Hyatt Regency Walkways Collapse provides a vivid example of the importance of accuracy and detail in engineering design and shop drawings (particularly regarding revisions), and the costly consequences of negligence in this realm.

For purposes of this case study, we assume that the disputed telephone call was made by the fabrication firm and that the engineering firm did give verbal approval for the fatal design change. Students are, however, encouraged to view the case reversing these assumptions.

Guidelines For Presentation

Read student handout for a detailed description of the case.

At the class preceding case discussion, distribute student handouts: The Kansas City Hyatt Regency Walkways Collapse, which includes literature on negligence and the professional "debate" over responsibility for design, and an annotated bibliography. Have students come to the follow-up discussion class prepared to address the Kansas City Hyatt Regency Walkways Collapse in light of the ethical issues raised in the student handout.

Show the Hyatt Regency Walkways Collapse segment of the "To Engineer is Human" video. Discuss with students the five overheads:

  • The Hyatt Regency Walkways Collapse Cast of Characters
  • Hanger Rod Details Original Design and As Built
  • Chronology of the Hyatt Regency Walkways Collapse (four pages)
  • ASME Code of Ethics of Engineers; and
  • IEEE Code of Ethics.

Ask students some of the following questions:

  • Who is ultimately responsible for the fatal design flaw? Why?
  • Does the disputed telephone call matter to the outcome of the case? Why or why not?
  • What is the responsibility of a licensed professional engineer who fixes his/her seal to fabrication drawings?
  • In terms of meeting building codes, what are the responsibilities of the engineer? The fabricator? The owner?
  • What measures can professional societies take to ensure that catastrophes such as the Hyatt Regency Walkways Collapse do not occur?
  • Do you agree with the findings that the principal engineers involved should have been subject to discipline for gross negligence in the practice of engineering? Should they have lost their licenses, temporarily or permanently?
  • Was it fair that G.C.E., as a company, was held liable for gross negligence and engineering incompetence? Why or why not?

End the discussion with Overhead 6, Hyatt Regency Walkways Collapse: Ethical Issues of the Case. Discuss the ethical questions raised by the case: What are the professional responsibilities of the engineers, fabricators, and hotel contractors? How can professionals protect themselves, and the public, from the gross negligence of an incompetent few? What are the implications of this case in terms of state-by-state licensing procedures?

For a detailed discussion on these issues, see essay #5, "Negligence, Risk, and the Professional Debate Over the Responsibility for Design," appended at the end of the cases in the report. In addition, essays #1 through #4 appended at the end of the case listings in this report will have relevant background information for the instructor preparing to lead classroom discussion. Their titles are, respectively: "Ethics and Professionalism in Engineering: Why the Interest in Engineering Ethics?"; "Basic Concepts and Methods in Ethics"; "Moral Concepts and Theories"; and "Engineering Design: Literature on Social Responsibility Versus Legal Liability."

Case Notes

  • Missouri Board for Architects, Professional Engineers and Land Surveyors vs. Daniel M. Duncan, Jack D. Gillum and G.C.E. International, Inc., before the Administrative Hearing Commission, State of Missouri, Case No. AR840239, Statement of the Case, Findings of Fact, Conclusions of Law and Decision rendered by Judge James B. Deutsch, November 14, 1985, pp. 54-63. Case No. AR840239 hereinafter referred to as Administrative Hearing Commission.
  • Administrative Hearing Commission, pp. 63-66.
  • Administrative Hearing Commission, p. 384.
  • Administrative Hearing Commission, pp. 12-13.
  • Administrative Hearing Commission, pp. 423-425.

Copy of Administrative Hearing Commission: PDF version, Word version. Note that both of these were scanned from a very poor copy of the document. Sorry, but you get what you paid for.

The Hyatt Regency Walkways Collapse Cast Of Characters

In 1976, as owner, Crown Center Redevelopment Corporation commenced a project to design and build a Hyatt Regency Hotel in Kansas City, Missouri, and on April 4, 1978, Crown entered into a standard contract with G.C.E. International, Inc., a professional consulting firm of structural engineers (formerly called Jack D. Gillum & Associates, Ltd.; the name was changed to G.C.E. on May 5, 1983).

Jack D. Gillum, P.E., structural engineer, state licensed since February 26, 1968.

Daniel M. Duncan, P.E., structural engineer, state licensed since February 27, 1979.

PBNDML Architects, Planners, Inc., Architect.

G.C.E. agreed to provide "all structural engineering services for a 750-room hotel project located at 2345 McGee Street, Kansas City, Missouri."

On or about December 19, 1978, Eldridge Construction Company, the general contractor on the Hyatt project, entered into a subcontract with Havens Steel Company, a professional fabricator, who agreed to fabricate and erect the atrium steel for the Hyatt project.

Chronology Of The Hyatt Regency Walkways Collapse

Early 1976: Crown Center Redevelopment Corporation (owner) commences project to design and build a Hyatt Regency Hotel in Kansas City, Missouri.

July 1976: Gillum-Colaco, Inc. (G.C.E. International, Inc., 1983), a Texas corporation, selected as the consulting structural engineer for the Hyatt project.

July 1976: Hyatt project in schematic design development.

Summer 1977: G.C.E. assisted owner and architect (PBNDML Architects, Planners, Inc.) with developing various plans for hotel project, and decided on basic design.

Late 1977-Early 1978: Bid set of structural drawings and specifications for the project prepared, using standard Kansas City, Missouri, Building Codes.

April 4, 1978: Actual contract entered into by G.C.E. and the architect, PBNDML Architects, Planners, Inc.

G.C.E. agreed to provide "all structural engineering services for a 750-room hotel project located at 2345 McGee Street, Kansas City, Missouri."

Spring 1978: Construction on hotel begins.

August 28, 1978: Specifications on project issued for construction, based on the American Institute of Steel Construction (AISC) standards used by fabricators.

December 1978: Eldridge Construction Company, general contractor on the Hyatt project, enters into subcontract with Havens Steel Company. Havens agrees to fabricate and erect the atrium steel for the Hyatt project.

January 1979: Events and communications between G.C.E. and Havens.

February 1979: Havens makes design change from a single to a double hanger rod box beam connection for use at the fourth floor walkways. Telephone calls disputed; however, because of alleged communications between engineer and fabricator, Shop Drawing 30 and Erection Drawing E3 are changed.

February 1979: G.C.E. receives 42 shop drawings (including Shop Drawing 30 and Erection Drawing E-3) on February 16, and returns them to Havens stamped with engineering review stamp approval on February 26.

October 14, 1979: Part of the atrium roof collapses while the hotel is under construction. Inspection team called in, whose contract dealt primarily with the investigation of the cause of the roof collapse and created no obligation to check any engineering or design work beyond the scope of their investigation and contract.

October 16, 1979: Owner retains an independent engineering firm, Seiden-Page, to investigate the cause of the atrium roof collapse.

October 20, 1979: Gillum writes owner, stating he is undertaking both an atrium collapse investigation as well as a thorough design check of all the members comprising the atrium roof.

October: Reports and meetings from engineer to clients.

November 1979: Engineer assures owner/architect (his clients) of the overall safety of the entire atrium.

July 1980: Construction of hotel complete, and the Kansas City Hyatt Regency Hotel opens for business.

July 17, 1981: Connections supporting the rods from the ceiling that held up the 2nd and 4th floor walkways across the atrium of the Hyatt Regency Hotel collapse, killing 114 and injuring in excess of 200 others.

February 3, 1984: Missouri Board of Architects, Professional Engineers and Land Surveyors files complaint against Daniel M. Duncan, Jack D. Gillum and G.C.E. International Inc., charging gross negligence, incompetence, misconduct and unprofessional conduct in the practice of engineering in connection with their performance of engineering services in the design and construction of the Hyatt Regency Hotel in Kansas City, Missouri.

November, 1984: Duncan, Gillum, and G.C.E. International, Inc. found guilty of gross negligence, misconduct and unprofessional conduct in the practice of engineering. Subsequently, Duncan and Gillum lost their licenses to practice engineering in the State of Missouri, and G.C.E. had its certificate of authority as an engineering firm revoked. The American Society of Civil Engineers (ASCE) adopts a report that states structural engineers have full responsibility for design projects. Duncan and Gillum are now practicing engineers in states other than Missouri.

ASME Code Of Ethics Of Engineers

The Fundamental Principles

Engineers uphold and advance the integrity, honor, and dignity of the Engineering profession by:

I. Using their knowledge and skill for the enhancement of human welfare;

II. being honest and impartial, and serving with fidelity the public, their employers and clients; and

III. striving to increase the competence and prestige of the engineering profession.

The Fundamental Canons

Engineers shall hold paramount the safety, health and welfare of the public in the performance of their professional duties.

Engineers shall perform services only in areas of their competence.

Engineers shall continue their professional development throughout their careers and shall provide opportunities for the professional development of those engineers under their supervision.

Engineers shall act in professional matters for each employer or client as faithful agents or trustees, and shall avoid conflicts of interest.

Engineers shall build their professional reputation on the merit of their services and shall not compete unfairly with others.

Engineers shall associate only with reputable persons or organizations.

Engineers shall issue public statements only in an objective and truthful manner.

IEEE Code Of Ethics (Revised October 1990)

We, the members of the IEEE, in recognition of the importance of our technologies in affecting the quality of life throughout the world, and in accepting a personal obligation to our profession, its members and the communities we serve, do hereby commit ourselves to the highest ethical and professional conduct and agree:

to accept responsibility in making engineering decisions consistent with the safety, health, and welfare of the public, and to disclose promptly factors that might endanger the public or the environment;

to avoid real or perceived conflicts of interest whenever possible, and to disclose them to affected parties when they do exist;

to be honest and realistic in stating claims or estimates based on available data;

to reject bribery in all its forms;

to improve the understanding of technology, its appropriate application, and potential consequences;

to maintain and improve our technical competence and to undertake technological tasks for others only if qualified by training or experience, or after full disclosure of pertinent limitations;

to seek, accept, and offer honest criticism of technical work, to acknowledge and correct errors, and to credit properly the contributions of others;

to treat fairly all persons regardless of such factors as race, religion, gender, disability, age, or national origin;

to avoid injuring others, their property, reputation, or employment by false or malicious action;

to assist colleagues and coworkers in their professional development and to support them in following this code of ethics.

Hyatt Regency Walkways Collapse: Ethical Issues Of The Case

  • Who is ultimately responsible for checking the safety of final designs as depicted in shop drawings?
  • What measures can professional societies take to ensure catastrophes like the Hyatt Regency Walkways Collapse do not occur?

On July 17, 1981, the Hyatt Regency Hotel in Kansas City, Missouri, held a videotaped tea-dance party in their atrium lobby. With many party-goers standing and dancing on the suspended walkways, connections supporting the ceiling rods that held up the second and fourth-floor walkways across the atrium failed, and both walkways collapsed onto the crowded first-floor atrium below. The fourth-floor walkway collapsed onto the second-floor walkway, while the offset third-floor walkway remained intact. As the United States' most devastating structural failure, in terms of loss of life and injuries, the Kansas City Hyatt Regency walkways collapse left 114 dead and in excess of 200 injured. In addition, millions of dollars in costs resulted from the collapse, and thousands of lives were adversely affected.

The hotel had only been in operation for approximately one year at the time of the walkways collapse, and the ensuing investigation of the accident revealed some unsettling facts.

First, during January and February, 1979, over a year before the collapse, the design of the walkway hanger rod connections was changed in a series of events and communications (or disputed miscommunications) between the fabricator (Havens Steel Company) and the engineering design team (G.C.E. International, Inc., a professional engineering firm). The fabricator changed the design from a one-rod to a two-rod system to simplify the assembly task, doubling the load on the connector, which ultimately resulted in the walkways collapse.(1)

Second, the fabricator, in sworn testimony before the administrative judicial hearings after the accident, claimed that his company (Havens) telephoned the engineering firm (G.C.E.) for change approval. G.C.E. denied ever receiving such a call from Havens.(2)

Third, on October 14, 1979, while the hotel was still under construction, more than 2700 square feet of the atrium roof collapsed because one of the roof connections at the north end of the atrium failed.(3) In testimony, G.C.E. stated that on three separate occasions they requested on-site project representation to check all fabrication during the construction phase; however, these requests were not acted on by the owner (Crown Center Redevelopment Corporation), due to additional costs of providing on-site inspection.(4)

Fourth, even as originally designed, the walkways were barely capable of holding up the expected load, and would have failed to meet the requirements of the Kansas City Building Code.(5)

Individuals Involved In The Hyatt Regency Case

Several key players are involved in the case:

In 1976, as owner, Crown Center Redevelopment Corporation commenced a project to design and build a Hyatt Regency Hotel in Kansas City, Missouri, and on April 4, 1978 entered into a standard contract with G.C.E. International, Inc., a professional consulting firm of structural engineers (formerly called Jack D. Gillum & Associates, Ltd.; the name was changed to G.C.E. on May 5, 1983). The principals were Jack D. Gillum, P.E., structural engineer, state licensed since February 26, 1968, and Daniel M. Duncan, P.E., structural engineer, state licensed since February 27, 1979. PBNDML Architects, Planners, Inc. served as the architect. G.C.E. agreed to provide "all structural engineering services for a 750-room hotel project located at 2345 McGee Street, Kansas City, Missouri." On or about December 19, 1978, Eldridge Construction Company, the general contractor on the Hyatt project, entered into a subcontract with Havens Steel Company, the fabricator, who agreed to fabricate and erect the atrium steel for the Hyatt project.

Structural Failure During the Atrium Tea Dance

In 1976, Crown Center Redevelopment Corporation initiated a project for designing and building a Hyatt Regency Hotel in Kansas City Missouri. In July of 1976, Gillum-Colaco, Inc., a Texas corporation, was selected as the consulting structural engineer for the project. A schematic design development phase for the project was undertaken from July 1976 through the summer of 1977. During that time, Jack D. Gillum (the supervisor of the professional engineering activities of Gillum-Colaco, Inc.) and Daniel M. Duncan (working under the direct supervision of Gillum, the engineer responsible for the actual structural engineering work on the Hyatt project) assisted Crown Center Redevelopment Corporation (the owner) and PBNDML Architects, Planners, Inc. (the architect on the project) in developing plans for the hotel project and deciding on its basic design. A bid set of structural drawings and specifications for the project were prepared in late 1977 and early 1978, and construction began on the hotel in the spring of 1978. The specifications on the project were issued for construction on August 28, 1978.(6)

On April 4, 1978, the actual written contract was entered into by Gillum-Colaco, Inc. and PBNDML Architects, Planners, Inc. The contract was standard in nature, and Gillum-Colaco, Inc. agreed to provide all the structural engineering services for the Hyatt Regency project. The firm Gillum-Colaco, Inc. did not actually perform the structural engineering services on the project; instead, they subcontracted the responsibility for performing all of the structural engineering services for the Hyatt Regency Hotel project to their subsidiary firm, Jack D. Gillum & Associates, Ltd. (hereinafter referenced as G.C.E.).(7) According to the specifications for the project, no work could start until the shop drawings for the work had been approved by the structural engineer.(8)

Three teams, with particular roles to play in the construction system employed in building the Hyatt Regency Hotel, were contracted for the project: PBNDML and G.C.E. made up the "design team," and were authorized to control the entire project on behalf of the owner; Eldridge Construction Co., as the "construction team," was responsible for general contracting; and the "inspection team" was made up of two inspecting agencies (H&R Inspection and General Testing), a quality control official, a construction manager, and an investigating engineer (Seiden and Page).

On December 19, 1978, Eldridge Construction Company, as general contractor, entered into a subcontract with Havens Steel Company, who agreed to fabricate and erect the atrium steel for the Hyatt project.

G.C.E. was responsible for preparing structural engineering drawings for the Hyatt project: three walkways spanning the atrium area of the hotel. Wide flange beams with 16-inch depths (W16x26) were used along either side of the walkway and hung from a box beam (made from two MC8x8.5 rectangular channels, welded toe-to-toe). A clip angle welded to the top of the box beam connected these beams by bolts to the W section. This joint carried virtually no moment, and therefore was modeled as a hinge. One end of the walkway was welded to a fixed plate and would be a fixed support, but for simplicity, it could be modeled as a hinge. This only makes a difference on the hanger rod nearest this support (it would carry less load than the others and would not govern design). The other end of the walkway support was a sliding bearing modeled by a roller. The original design for the hanger rod connection to the fourth floor walkway was a continuous rod through both walkway box beams (Figure 1 below).

Events and disputed communications between G.C.E. engineers and Havens resulted in a design change from a single to a double hanger rod box beam connection for use at the fourth floor walkways. The fabricator requested this change to avoid threading the entire rod. They made the change, and the contract's Shop Drawing 30 and Erection Drawing E-3 were changed (Figure 2 shows the hanger rod as built).

On February 16, 1979, G.C.E. received 42 shop drawings (including the revised Shop Drawing 30 and Erection Drawing E-3). On February 26, 1979, G.C.E. returned the drawings to Havens, stamped with Gillum's engineering review seal, authorizing construction. The fabricator (Havens) built the walkways in compliance with the directions contained in the structural drawings, as interpreted by the shop drawings, with regard to these hangers. In addition, Havens followed the American Institute of Steel Construction (AISC) guidelines and standards for the actual design of steel-to-steel connections by steel fabricators.

As a precedent for the Hyatt case, Section 4.5 of the Guide to Investigation of Structural Failures, "Failure Causes Classified by Connection Type," states that:

Overall collapses resulting from connection failures have occurred only in structures with few or no redundancies. Where low strength connections have been repeated, the failure of one has led to failure of neighboring connections and a progressive collapse has occurred. The primary causes of connection failures are:

  • Improper design due to lack of consideration of all forces acting on a connection, especially those associated with volume changes.
  • Improper design utilizing abrupt section changes resulting in stress concentrations.
  • Insufficient provisions for rotation and movement.
  • Improper preparation of mating surfaces and installation of connections.
  • Degradation of materials in a connection.
  • Lack of consideration of large residual stresses resulting from manufacture or fabrication.

Figure 1. Hanger-rod/box-beam assembly as originally designed. (See Figures) Note that the nut only carries the load of the floor above it.

Figure 2. Schematic of original versus changed design. Note that now the upper nut at the far left carries only the load of the floor above it whereas the nut at the far right carries the load of both floors. (See Figures)
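
To make the effect of the change concrete, the short Python sketch below tallies the load each detail forces through the fourth-floor box beam-to-hanger rod connection. This is a rough illustration only: the 20.3-kip per-hanger design load is borrowed from the NBS figures quoted later in this case, and the comparison logic is ours, not the investigators'.

# Illustrative load comparison for the fourth-floor box beam / hanger rod
# connection. The 20.3-kip design load per walkway hanger point is an
# assumption taken from the NBS figures quoted later in this case.

P_FOURTH_FLOOR = 20.3  # kips: load delivered by the fourth-floor walkway at one hanger point
P_SECOND_FLOOR = 20.3  # kips: load delivered by the second-floor walkway at one hanger point

# Original detail: one continuous rod from the ceiling down to the second floor.
# The second-floor load travels past the fourth-floor box beam inside the rod,
# so the fourth-floor connection only transfers its own floor's load.
load_original = P_FOURTH_FLOOR

# As-built detail: two separate rods. The lower rod hangs the second-floor
# walkway from the fourth-floor box beam, so that beam's upper connection must
# now transfer both floors' loads to the upper rod.
load_as_built = P_FOURTH_FLOOR + P_SECOND_FLOOR

print(f"Original detail:  {load_original:.1f} kips at the fourth-floor connection")
print(f"As-built detail:  {load_as_built:.1f} kips at the fourth-floor connection")
print(f"Increase factor:  {load_as_built / load_original:.1f}x")
# The NBS report quotes 40.7 kips for the as-built connection, slightly above
# this simple doubling, reflecting its more detailed load takedown.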

On October 14, 1979, part of the atrium roof collapsed while the hotel was under construction. As a result, the owner called in the inspection team. The inspection team's contract dealt primarily with the investigation of the cause of the roof collapse and created no obligation to check any engineering or design work beyond the scope of their investigation and contract. In addition to the inspection team, the owner retained, on October 16, 1979, an independent engineering firm, Seiden-Page, to investigate the cause of the atrium roof collapse. On October 20, 1979, G.C.E.'s Gillum wrote the owner, stating that he was undertaking both an atrium collapse investigation as well as a thorough design check of all the members comprising the atrium roof. G.C.E. promised to check all steel connections in the structures, not just those found in the roof.

From October-November, 1979, various reports were sent from G.C.E. to the owner and architect, assuring the overall safety of the entire atrium. In addition to the reports, meetings were held between the owner, architect and G.C.E.

In July of 1980, the construction was complete, and the Kansas City Hyatt Regency Hotel was opened for business.

Just one year later, on July 17, 1981, the box beams resting on the supporting rod nuts and washers deformed to the point that they could no longer hold up the load. The box beams (and walkways) separated from the ceiling rods, and the fourth and second floor walkways across the atrium of the Hyatt Regency Hotel collapsed, killing 114 and injuring in excess of 200 others.

One investigation report gave the following summary:

The Hyatt Regency consists of three main sections: a 40-story tower section, a function block, and a connecting atrium. The atrium is a large open area, approximately 117 ft (36 m) by 145 ft (44 m) in plan and 50 ft (15 m) high. Three suspended walkways spanned the atrium at the second, third and fourth floor levels [see Figure 3 on following page]. These walkways connected the tower section and the function block. The third floor walkway was independently suspended from the atrium roof trusses while the second floor walkway was suspended from the fourth floor walkway, which in turn was suspended from the roof framing.

In the collapse, the second and fourth floor walkways fell to the atrium first floor with the fourth floor walkway coming to rest on top of the second. Most of those killed or injured were either on the atrium first floor level or on the second floor walkway. The third floor walkway was not involved in the collapse.

Figure 3. Schematic Layout of Walkways as Viewed from the South (See Figures)

Figure 4. Schematic representation of hanger-rod/box-beam assembly as actually built. Note that the two top hanger rods to the fourth floor no longer continue through that floor to the second floor. (See Figures)

Following the accident investigations, on February 3, 1984, the Missouri Board of Architects, Professional Engineers and Land Surveyors filed a complaint against Daniel M. Duncan, Jack D. Gillum, and G.C.E. International, Inc., charging gross negligence, incompetence, misconduct and unprofessional conduct in the practice of engineering in connection with their performance of engineering services in the design and construction of the Hyatt Regency Hotel. The NBS report noted that:

The hanger rod detail actually used in the construction of the second and fourth floor walkways is a departure from the detail shown on the contract drawings. In the original arrangement, each hanger rod was to be continuous from the second floor walkway to the hanger rod bracket attached to the atrium roof framing. The design load to be transferred to each hanger rod at the second floor walkway would have been 20.3 kips (90 kN). An essentially identical load would have been transferred to each hanger rod at the fourth floor walkway. Thus the design load acting on the upper portion of a continuous hanger rod would have been twice that acting on the lower portion, but the required design load for the box beam hanger rod connections would have been the same for both walkways (20.3 kips (90 kN)).(11)

The hanger rod configuration actually used consisted of two hanger rods: the fourth floor to ceiling hanger rod segment as originally detailed, and a second to fourth floor segment which was offset 4 in. (102 mm) inward along the axis of the box beam. With this modification the design load to be transferred by each second floor box beam-hanger rod connection was unchanged, as were the loads in the upper and lower hanger rod segments.

However, the load to be transferred from the fourth floor box beam to the upper hanger rod under this arrangement was essentially doubled, thus compounding an already critical condition. The design load for a fourth floor box beam-hanger rod connection would be 40.7 kips (181 kN) for this configuration. ...

Had this change in hanger rod detail not been made, the ultimate capacity of the box beam-hanger rod connection still would have been far short of that expected of a connection designed in accordance with the Kansas City Building Code, which is based on the AISC Specification. In terms of ultimate load capacity of the connection, the minimum value should have been 1.67 times 20.3, or 33.9 kips (151 kN). Based on test results the mean ultimate capacity of a single-rod connection is approximately 20.5 kips (91 kN), depending on the weld area. Thus the ultimate capacity actually available using the original connection detail would have been approximately 60% of that expected of a connection designed in accordance with AISC Specifications.(12)
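
The ratios quoted from the NBS report are easy to reproduce. The Python sketch below simply recomputes them from the numbers in the passage above; the extra comparison against the as-built demand is added for illustration and is not part of the quoted report.

# Recompute the capacity figures quoted from the NBS report above.
DESIGN_LOAD_ORIGINAL = 20.3   # kips per connection, original continuous-rod detail
DESIGN_LOAD_AS_BUILT = 40.7   # kips at a fourth-floor connection, as-built two-rod detail
SAFETY_FACTOR = 1.67          # factor implied by the AISC-based Kansas City Building Code, as quoted
MEAN_TEST_CAPACITY = 20.5     # kips, mean ultimate capacity of a single-rod connection from tests

required_ultimate = SAFETY_FACTOR * DESIGN_LOAD_ORIGINAL
print(f"Required ultimate capacity:  {required_ultimate:.1f} kips")                  # about 33.9 kips
print(f"Available vs. required:      {MEAN_TEST_CAPACITY / required_ultimate:.0%}")  # about 60%

# Against the as-built demand the shortfall is starker: the connection's mean
# tested capacity is only about half of the load it actually had to carry.
print(f"Available vs. as-built load: {MEAN_TEST_CAPACITY / DESIGN_LOAD_AS_BUILT:.0%}")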

During the 26-week administrative law trial that ensued, G.C.E. representatives denied ever receiving the call about the design change. Yet, Gillum affixed his seal of approval to the revised engineering design drawings.

Results of the hearing concluded that G.C.E., in preparation of their structural detail drawings, "depicting the box beam hanger rod connection for the Hyatt atrium walkways, failed to conform to acceptable engineering practice. [This is based] upon evidence of a number of mistakes, errors, omissions and inadequacies contained on this section detail itself and of [G.C.E.'s] alleged failure to conform to the accepted custom and practice of engineering for proper communication of the engineer's design intent."(13) Evidence showed that neither due care during the design phase, nor appropriate investigations following the atrium roof collapse were undertaken by G.C.E. In addition, G.C.E. was found responsible for the change from a one-rod to a two-rod system. Further, it was found that even if Havens failed to review the shop drawings or to specifically note the box beam hanger rod connections, the engineers were still responsible for the final check. Evidence showed that G.C.E. engineers did not "spot check" the connection or the atrium roof collapse, and that they placed too much reliance on Havens.

Due to evidence supplied at the Hearings, a number of principals involved lost their engineering licenses, a number of firms went bankrupt, and many expensive legal suits were settled out of court. In November, 1984, Duncan, Gillum, and G.C.E. International, Inc. were found guilty of gross negligence, misconduct and unprofessional conduct in the practice of engineering. Subsequently, Duncan and Gillum lost their licenses to practice engineering in the State of Missouri (and later, Texas), and G.C.E. had its certificate of authority as an engineering firm revoked.

As a result of the Hyatt Regency Walkways Collapse, the American Society of Civil Engineers (ASCE) adopted a report that states structural engineers have full responsibility for design projects.

Both Duncan and Gillum are now practicing engineers in states other than Missouri and Texas.

The responsibility for and obligation to design steel-to-steel connections in construction lies at the heart of the Hyatt Regency Hotel project controversy. To understand the issues of negligence and the engineer's design responsibility, we must examine some key elements associated with professional obligations to protect the public. This will be discussed in class from three perspectives: the implicit social contract between engineers and society; the issue of public risk and informed consent; and negligence and codes of ethics of professional societies.

Ethical Issues Of The Case - Points For Discussion

This case centers on the question of who is responsible for a design failure. As an ethical issue, when we take the implicit social contract between engineers and society, the issue of public risk and informed consent, and codes of ethics of professional societies into account, it seems clear that the engineer must assume this responsibility when any change in design involving public safety carries a licensed engineer's seal. Yet, if we assume the engineer in the Hyatt case received the fabricator's telephone call requesting a verbal approval of the design change for simplifying assembly, what would make him approve such an untenable change? Some possible reasons include:

  • saving time;
  • saving money;
  • avoiding a call for re-analysis, thereby raising the issue of a request to recheck all connector designs following the previous year's atrium roof collapse;
  • following his immediate supervisor's orders;
  • looking good professionally by simplifying the design;
  • misunderstanding the consequences of his actions; or
  • any combination of the above.

These reasons do not, however, fall within acceptable standards of engineering professional conduct. Instead, they pave the way for legitimate charges of negligence, incompetence, misconduct and unprofessional conduct in the practice of engineering. When the engineer's actions are compared to professional responsibilities cited in the engineering codes of ethics, an abrogation of professional responsibilities by the engineer in charge is clearly demonstrated. But what of the owner, or the fabricator?

What if the call was not made? While responsibility rests with the fabricator for violating building codes, would the engineers involved in the case be off the hook? Why or why not?

The Hyatt Regency walkways collapse has resulted in a nationwide reexamination of building codes. In addition, professional codes on structural construction management practices are changing in significant ways.(14) Finally, what is your assessment of this case, based on the following questions:

  • Should Gillum and Duncan be allowed to practice engineering in other states? Why or why not? What is the engineering society's responsibility in this realm?

Annotated Bibliography

Davis, Michael, "Thinking Like An Engineer: The Place of a Code of Ethics in the Practice of a Profession," Philosophy & Public Affairs, Vol. 20, No. 2, Spring 1991, pp. 150-167. (See also "Explaining Wrongdoing," Journal of Social Philosophy, Vol. 20, Numbers 1&2, Spring/Fall 1989, pp. 74-90.)

In these lucid essays, Davis argues that "a code of professional ethics is central to advising individual engineers how to conduct themselves, to judging their conduct, and ultimately to understanding engineering as a profession." Using the now infamous Challenger disaster as his model, Davis discusses both the evolution of engineering ethics as well as why engineers should obey their professional codes of ethics, from both a pragmatic and ethically responsible point of view. Essential reading for any graduating engineering student.

Engineering News Report

Throughout the hearings,  Engineering News Report , published by the National Society of Professional Engineers (NSPE), kept vigilant watch over the case. Of particular interest are their following articles:

  • "Hyatt Walkway Design Switched," July 30, 1981.
  • "Hyatt Hearing Traces Design Change," July 26, 1984.
  • "Difference of Opinion: Hyatt Structural Engineer Gillum Disputes NBS Collapse Report," September 6, 1984.
  • "Weld Aided Collapse, Witness Says," September 13, 1984.
  • "Judge Bars Hyatt Tests," September 20, 1984.
  • "Hyatt Engineers Found Guilty of Negligence," November 21, 1985.
  • "Hyatt Ruling Rocks Engineers," November 28, 1985.
  • "Construction Rescuers Sue," August 7, 1986.

Glickman, Theodore S., and Michael Gough (eds.),  Readings in Risk , Washington, D.C.: Resources for the Future, 1990.

This is an excellent collection of essays on managing technology-induced risk. As a starting-off point, of particular worth to the engineers are the essays: "Probing the Question of Technology-Induced Risk" and "Choosing and Managing Technology-Induced Risk," by M. Granger Morgan; "Defining Risk," by Baruch Fischhoff, Stephen R. Watson, and Chris Hope; "Risk Analysis: Understanding 'How Safe is Safe Enough?'," by Stephen L. Derby and Ralph L. Keeney; "Social Benefit Versus Technological Risk," by Chauncey Starr; and "The Application of Probabilistic Risk Assessment Techniques to Energy Technologies," by Norman C. Rasmussen.

Gibble, Kenneth (ed.),  Management Lessons from Engineering Failures , Proceedings of a symposium sponsored by the Engineering Management Division of the American Society of Civil Engineers in conjunction with the ASCE Convention in Boston, October 28, 1986, New York: American Society of Civil Engineers, 1986.

This short work examines a variety of engineering failures, including those involving individual planning, and project failures. In particular see Irvin M. Fogel's essay, "Avoiding 'Failures' Caused by Lack of Management," and Gerald W. Farquhar's "Lessons to be Learned in the Management of Change Orders in Shop Drawings," both excellent illustrations for use with the Hyatt case.

Hall, John C., "Acts and Omissions,"  The Philosophical Quarterly , Vol. 39, No. 157, October 1989, pp. 399-408.

This article is a discussion of the legal and ethical ramifications of professional choices and activities, both active and passive.

"Hyatt Notebook: Parts I and II,"  Kansas City , October 1984 and November 1984.

These are two articles written by a Kansas City television reporter for the local magazine,  Kansas City , detailing highlights from the 26-week Hyatt Regency Walkways Collapse hearings.

Janney, Jack R. (ed.),  Guide to Investigation of Structural Failures , prepared for the American Society of Civil Engineers' Research Council on Performance of Structures, sponsored by the Federal Highway Administration, U.S. Department of Transportation, Contract No. DOTFH118843, 1979.

This short volume gives an excellent overview of structural failure investigation procedures, and discusses failure causes by project type, structural type, and material, connection and foundation type. In addition, discussions on field operations, project management, and data analysis and reports are offered. Of particular interest to those studying the Hyatt case are sections 4.5-4.7, "Failure Causes Classified by Connection Type," and "Steel to Steel Connections."

Martin, Mike W. and Roland Schinzinger,  Ethics in Engineering  (2nd ed.), New York: McGraw-Hill Book Company, 1989.

An excellent text-book treatment of ethical issues in engineering. Of particular interest to this case is Part Two, "The Experimental Nature of Engineering," and Part Three, "Engineers, Management and Organizations."

McK Norrie, Kenneth, "Reasonable: The Keystone of Negligence,"  Journal of Medical Ethics , Vol. 13, No. 2, June 1987, pp. 92-94.

This article is a brief discussion of legal liability for professional actions. "The more knowledge, skill and experience a person has, the higher standard the law subjects that person to" (p. 92).

PDF version: Missouri Board for Architects, Professional Engineers and Land Surveyors vs. Daniel M. Duncan, Jack D. Gillum and G.C.E. International, Inc., before the Administrative Hearing Commission, State of Missouri, Case No. AR840239, Statement of the Case, Findings of Fact, Conclusions of Law and Decision rendered by Judge James B. Deutsch, November 14, 1985, 442 pp. Note this is a BIG file - 20 Mb!

Word version: Missouri Board for Architects, Professional Engineers and Land Surveyors vs. Daniel M. Duncan, Jack D. Gillum and G.C.E. International, Inc., before the Administrative Hearing Commission, State of Missouri, Case No. AR840239, Statement of the Case, Findings of Fact, Conclusions of Law and Decision rendered by Judge James B. Deutsch, November 14, 1985, 442 pp. This has been converted to Word format without any checking; many errors were introduced when the scanner transcribed the PDF file to Word, but no one has found the time to correct the conversion.

This volume contains the findings, conclusions of law and the final decision of the Hyatt Regency Walkways Collapse case, as rendered by Judge James B. Deutsch. The volume contains both the findings of the case and an excellent general discussion of responsibilities of the professional engineer.

Pfrang, Edward O. and Richard Marshall, "Collapse of the Kansas City Hyatt Regency Walkways,"  Civil Engineering-ASCE , July 1982, pp. 65-68.

Official findings of the failure investigation conducted by the National Bureau of Standards, U.S. Department of Commerce. Among its conclusions was this: "Even if the now-notorious design shift in the hanger rod details had not been made, the entire design of all three walkways, including the one which did not collapse, was a significant violation of the Kansas City Building Code."

The Kansas City Hyatt Regency Walkways Collapse.

Department of Philosophy and Department of Mechanical Engineering, Texas A&M University.

NSF Grant Number: DIR-9012252.


Five Disastrous Engineering Failures


Engineering failures are not new. From the Johnstown Flood in 1889 to the Fukushima Daiichi nuclear disaster in 2011, engineering failures have been caused by problems in design, construction and safety protocol.

The blame can often be laid on ignorance, miscommunication and, in some extreme cases, indifference or negligence. After many of these disasters, however, professionals and leaders have learned from the wrong decisions that were made. Here, we discuss some of the worst engineering disasters and what caused them.

The Ford Pinto

Not all engineering mistakes are associated with large-scale feats or impressive architectural marvels. From 1971 through 1976, the Ford Motor Company produced and sold more than 2.2 million Ford Pintos. The automaker set out to make a competitive, affordable car, but late in the development of its design, engineers discovered an issue with the fuel tank. Located between the rear axle and the bumper, the tank punctured and ruptured easily in rear-end collisions because of the car’s design. Ford’s engineers recommended a simple fix, one that would have cost an additional $11 per vehicle. In spite of this, the company decided to continue with the design as it was, both to keep costs low and to avoid delaying production.

After just a few years on the road, the National Highway Traffic Safety Administration began investigating accidents in which the small car caught fire, but it took an article in the magazine Mother Jones to bring the Pinto’s danger, and Ford’s prior knowledge of it, to the public’s attention. After losing a lawsuit, Ford recalled the Pinto in 1978 and fixed the vehicles with the originally suggested solution. Some estimate that between 27 and 180 people died because of the fuel tank issue. 1

Love Canal

The saga of Love Canal is one of the first major environmental disasters in the U.S. The project began in 1894, when an entrepreneur attempted to build a canal in Niagara Falls, New York, to bring water and hydroelectric power to the city. The project was never completed, and in 1947 the unfinished canal was sold to Hooker Chemicals and Plastic Corporation. The company lined the canal with clay and began dumping chemicals and waste into the then-isolated site. In 1953, the site was sold again, this time to build an elementary school and houses.

Controversy remains over whether Hooker or the Niagara Falls Board of Education, which chose the site despite strict restrictions detailed in the land deed, is responsible for the consequences of building on the site. During construction of the school, homes and a sewer line were built on and through the canal. The clay lining broke, and chemicals began seeping into the ground. New York eventually declared a state of emergency. Residents reported miscarriages, birth defects, cancer and other disorders, and they continued to fight to keep the site vacant years after they were evacuated. Today, the ramifications of this environmental and engineering failure still affect building practices and policy. 2

The Hyatt Regency Hotel Walkway

One year after the Hyatt Regency Hotel in Kansas City, Missouri, was completed, two walkways suspended over its atrium lobby collapsed in July 1981. The collapse happened in the middle of a dance, with attendees packed on the walkways and the floor below. More than 200 people were injured, and 114 were killed.

A series of decisions and miscommunications was found to be at fault. The original design for the walkways already violated the city’s weight-bearing codes: the second- and fourth-story walkways were to be suspended by slim sets of rods anchored to the ceiling. However, following a discussion with the fabricator during construction, the decision was made to hang the rods supporting the second-floor walkway from the bottom of the fourth-floor walkway instead of from the ceiling. That meant the fourth-floor connections were supporting twice the weight the original design intended. A lack of proper communication was blamed for the design change not being analyzed and approved properly, but the engineers involved with the project and the fabricators refused to accept responsibility. 3
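
A simplified static-load sketch (our own illustration, assuming both walkways carry comparable loads) shows why the change was so serious. In the original design, each continuous rod carried the second-floor load $P_2$ past the fourth-floor box beam directly up to the ceiling, so each fourth-floor connection resisted only its own walkway’s load:

$$F_{\text{connection}} = P_4.$$

In the as-built configuration, the second-floor walkway hung from the fourth-floor box beams, so

$$F_{\text{connection}} = P_4 + P_2 \approx 2P_4,$$

roughly doubling the demand on a connection that, according to the investigation, already fell short of code.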

New Orleans’ Levee System

The American Society of Civil Engineers notes that the destruction of the levees in New Orleans during Hurricane Katrina is unique among engineering failures: no single decision led to the disaster; rather, systemic failures were the cause.

During construction, the Army Corps of Engineers failed to follow its own guidelines when estimating the strength of the soil and designed the system to withstand only low hurricane wind speeds. The height of the levees was another of many engineering mistakes: in addition to using flawed data about land elevation, the Corps did not take into account the land’s natural, gradual sinking. Local, state and federal politics and mismanagement also played a role, both in the quality and speed of construction and in the failure to fund and maintain the system.

Across the Gulf Coast, more than 1,800 people died and more than $100 billion in damage was done. New Orleans was among the regions hardest hit by Hurricane Katrina: roughly 80 percent of the city and its surrounding area were flooded. 4

The Titanic

More than 1,500 people died when the Titanic struck an iceberg in 1912. Over the years, many have researched and investigated the details of the disaster, and it has been determined that a number of design issues and poor decisions led the ship to sink in just over two and a half hours.

As one of the biggest ocean liners of its day, the Titanic featured 16 watertight compartments; if four of them flooded, the ship would still be able to stay afloat. Six compartments flooded, however, because the bulkheads were not tall enough to contain the water, which spilled over from one compartment into the next. 5 Other potential causes behind the ship’s sinking include a design that failed to take into account its size and limited maneuverability, the speed at which the ship was traveling, ignored warnings about the likelihood of icebergs, and other factors. 6

One flaw, though, is undisputed: there were not enough lifeboats for everyone on board. The 20 lifeboats had space for only about 1,200 people, while more than 2,200 passengers and crew were aboard. Additional lifeboats had been removed from the design because the ship’s owners worried they would make the ship look unsafe and would crowd the deck.


Importance of Leadership

Decisions that affect the integrity of a design or its construction usually come from the top down, and lapses in leadership can lead to these kinds of engineering failures. That’s why it’s essential to have leaders trained in both ethical and technical decision-making.

At the Case School of Engineering, our online graduate programs focus on developing the leadership expertise that highly skilled engineers need to be successful. Joining our program means joining a network of experienced engineering leaders from a number of different industries. Learn more about who our students are.

  • Retrieved on March 20, 2020, from popularmechanics.com/cars/a6700/top-automotive-engineering-failures-ford-pinto-fuel-tanks
  • Retrieved on March 20, 2020, from encyclopedia.com/places/united-states-and-canada/us-political-geography/love-canal
  • Retrieved on April 6, 2020, from ascelibrary.org/doi/10.1061/(ASCE)1527-6988(2007)8:3(61)
  • Retrieved on April 9, 2020, from asce.org/question-of-ethics-articles/july-2015/
  • Retrieved on April 9, 2020, from nationalgeographic.org/media/sinking-of-the-titanic/
  • Retrieved on April 9, 2020, from nbcnews.com/sciencemain/10-causes-titanic-tragedy-620220

