The Boeing 737 Max Disaster Through The Eyes of a Software Developer


IEEE Spectrum published the article "How the Boeing 737 Max Disaster Looks to a Software Developer." In the article, pilot (and software executive) Gregory Travis argues Boeing tried to avoid costly hardware changes to its 737s with a flawed software fix -- specifically, the Maneuvering Characteristics Augmentation System (or MCAS): "It is astounding that no one who wrote the MCAS software for the 737 Max seems even to have raised the possibility of using multiple inputs, including the opposite angle-of-attack sensor, in the computer's determination of an impending stall. As a lifetime member of the software development fraternity, I don't know what toxic combination of inexperience, hubris, or lack of cultural understanding led to this mistake. But I do know that it's indicative of a much deeper problem. The people who wrote the code for the original MCAS system were obviously terribly far out of their league and did not know it."

So Boeing produced a dynamically unstable airframe, the 737 Max. That is big strike No. 1. Boeing then tried to mask the 737's dynamic instability with a software system. Big strike No. 2. Finally, the software relied on systems known for their propensity to fail (angle-of-attack indicators) and did not appear to include even rudimentary provisions to cross-check the outputs of the angle-of-attack sensor against other sensors, or even the other angle-of-attack sensor. Big strike No. 3... None of the above should have passed muster. None of the above should have passed the "OK" pencil of the most junior engineering staff... That's not a big strike. That's a political, social, economic, and technical sin...  
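The "rudimentary provisions to cross-check" that Travis describes amount to comparing redundant sensors before trusting either one. As a minimal sketch only -- all names, thresholds, and behavior here are hypothetical and not drawn from the actual MCAS implementation -- such a cross-check might look like:

```python
# Hypothetical sketch of a redundant-sensor cross-check, the kind of
# sanity check the article argues MCAS lacked. Thresholds and names
# are illustrative, not taken from any real avionics system.

DISAGREE_LIMIT_DEG = 5.5  # hypothetical maximum allowed disagreement


def aoa_cross_check(left_aoa_deg: float, right_aoa_deg: float):
    """Compare the two angle-of-attack vanes.

    Returns (valid, value): if the sensors disagree beyond the limit,
    the reading is reported invalid so a dependent function (e.g. a
    stall-protection system) can disengage rather than act on one
    possibly failed vane. Otherwise the two readings are averaged.
    """
    if abs(left_aoa_deg - right_aoa_deg) > DISAGREE_LIMIT_DEG:
        return False, None  # disagreement: trust neither sensor
    return True, (left_aoa_deg + right_aoa_deg) / 2.0


# Agreeing vanes: use the averaged value.
print(aoa_cross_check(4.0, 5.0))   # (True, 4.5)
# One vane stuck high: flag invalid so the augmentation disengages.
print(aoa_cross_check(4.0, 21.0))  # (False, None)
```

The point of the sketch is the failure mode, not the arithmetic: a system that acts on a single sensor has no way to distinguish "the aircraft is stalling" from "the sensor is broken."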

The 737 Max saga teaches us not only about the limits of technology and the risks of complexity; it teaches us about our real priorities. "Today, safety doesn't come first -- money comes first, and safety's only utility in that regard is in helping to keep the money coming. The problem is getting worse because our devices are increasingly dominated by something that's all too easy to manipulate: software... I believe the relative ease -- not to mention the lack of tangible cost -- of software updates has created a cultural laziness within the software engineering community. Moreover, because more and more of the hardware that we create is monitored and controlled by software, that cultural laziness is now creeping into hardware engineering -- like building airliners. Less thought is now given to getting a design correct and simple up front because it's so easy to fix what you didn't get right later."

The article also points out that not letting the pilot regain control by pulling back on the column was an explicit design decision: "Because if the pilots could pull up the nose when MCAS said it should go down, why have MCAS at all?

"MCAS is implemented in the flight management computer, even at times when the autopilot is turned off, when the pilots think they are flying the plane."

Did you like it? Why not also try...

FTC May Hold Zuckerberg Personally Responsible For Facebook Privacy Failures

According to NBC, FTC officials are discussing whether and how to hold Facebook Chief Executive Mark Zuckerberg personally accountable for the company's history of mismanaging users' private data. However, NBC said its sources wouldn't elaborate on which measures are specifically under consideration.

CIA Accuses Huawei Of Being Funded By Chinese Intelligence

The accusation comes at a time of trade tensions between Washington and Beijing and amid concerns in the United States that Huawei equipment could be used for espionage. The company has said the concerns are unfounded.

NYC Subway Denies Using Real-Time Face Recognition Screens in Times Square

Young says that the recordings aren't being monitored to identify individuals in the footage, though: "There is absolutely no facial recognition component to these cameras, no facial recognition software, or anything else that could be used to automatically identify people in any way."