Monday, April 22, 2019

How the Boeing 737 Max "denies [pilots'] sovereignty"

[T]he 737 Max . . . violated that most ancient of aviation canons and probably violated the certification criteria of the U.S. Federal Aviation Administration. But instead of going back to the drawing board and getting the airframe hardware right . . . Boeing relied on something called the “Maneuvering Characteristics Augmentation System,” or MCAS. . . .

MCAS is certainly much less expensive than extensively modifying the airframe to accommodate the larger engines. Such an airframe modification would have meant things like longer landing gear (which might not then fit in the fuselage when retracted), more wing dihedral (upward bend), and so forth. All of those hardware changes would be horribly expensive.

What’s worse, those changes could be extensive enough to require not only that the FAA recertify the 737 but that Boeing build an entirely new aircraft. Now we’re talking real money, both for the manufacturer as well as the manufacturer’s customers.

That’s because the major selling point of the 737 Max is that it is just a 737, and any pilot who has flown other 737s can fly a 737 Max without expensive training, without recertification, without another type of rating. Airlines—Southwest is a prominent example—tend to go for one “standard” airplane. They want to have one airplane that all their pilots can fly because that makes both pilots and airplanes fungible, maximizing flexibility and minimizing costs. . . .

The flight management computer is a computer. What that means is that it’s not full of aluminum bits, cables, fuel lines, or all the other accoutrements of aviation. It’s full of lines of code. And that’s where things get dangerous.

Those lines of code were no doubt created by people at the direction of managers. Neither such coders nor their managers are as in touch with the particular culture and mores of the aviation world as the people who are down on the factory floor, riveting wings on, designing control yokes, and fitting landing gear. Those people have decades of institutional memory about what has worked in the past and what has not worked. Software people do not.

In the 737 Max, only one of the flight management computers is active at a time—either the pilot’s computer or the copilot’s computer. And the active computer takes inputs only from the sensors on its own side of the aircraft.

When the two computers disagree, the solution for the humans in the cockpit is to look across the control panel to see what the other instruments are saying and then sort it out. In the Boeing system, the flight management computer does not “look across” at the other instruments. It believes only the instruments on its side. It doesn’t go old-school. It’s modern. It’s software.

This means that if a particular angle-of-attack sensor goes haywire—which happens all the time in a machine that alternates from one extreme environment to another, vibrating and shaking all the way—the flight management computer just believes it.
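
To make that failure mode concrete, here is a small Python sketch of the behavior described above. It is purely illustrative: the function name, the 15-degree trigger threshold, and the stuck-vane values are all invented for this post, not Boeing's actual logic.

```python
# Hypothetical sketch of the single-sensor design described above: the
# active computer reads only its own side's angle-of-attack (AoA) vane
# and acts on it with no cross-check. All names and numbers are invented.

AOA_TRIGGER_DEG = 15.0   # made-up activation threshold


def mcas_step(own_side_aoa_deg: float) -> str:
    """Return the trim action for one update cycle."""
    # The only input is the single vane on the active computer's side.
    if own_side_aoa_deg > AOA_TRIGGER_DEG:
        return "nose-down trim"      # the computer simply believes the vane
    return "no action"


# A vane stuck at an implausible value keeps commanding nose-down trim,
# regardless of what the other side's vane (or the pilot) says.
stuck_vane_readings = [74.5, 74.5, 74.5]   # degrees
for reading in stuck_vane_readings:
    print(reading, "->", mcas_step(reading))
```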

It gets even worse. There are several other instruments that can be used to determine things like angle of attack, either directly or indirectly, such as the pitot tubes, the artificial horizons, etc. All of these things would be cross-checked by a human pilot to quickly diagnose a faulty angle-of-attack sensor.
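
For contrast, the kind of cross-check the passage says a human pilot performs can be sketched the same way. Again this is a toy, with made-up readings and tolerances, not a description of how any certified avionics actually vote sensors.

```python
# Toy cross-check: compare a suspect AoA reading against independent
# evidence (the other side's vane and a rough pitch estimate from the
# attitude indicator). If it disagrees with both, flag the vane rather
# than acting on it. Values and tolerances are invented.

def aoa_sensor_plausible(suspect_aoa: float,
                         other_side_aoa: float,
                         attitude_pitch: float,
                         tolerance_deg: float = 5.0) -> bool:
    """Return True if the suspect vane roughly agrees with other evidence."""
    agrees_with_other_vane = abs(suspect_aoa - other_side_aoa) <= tolerance_deg
    agrees_with_attitude = abs(suspect_aoa - attitude_pitch) <= 2 * tolerance_deg
    return agrees_with_other_vane or agrees_with_attitude


# Left vane stuck at 74.5 deg; the right vane and the artificial horizon
# both say the aircraft is flying normally, so the left vane is rejected.
print(aoa_sensor_plausible(suspect_aoa=74.5,
                           other_side_aoa=2.1,
                           attitude_pitch=3.0))   # -> False
```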

In a pinch, a human pilot could just look out the windshield to confirm visually and directly that, no, the aircraft is not pitched up dangerously. That’s the ultimate check and should go directly to the pilot’s ultimate sovereignty. Unfortunately, the current implementation of MCAS denies that sovereignty. It denies the pilots the ability to respond to what’s before their own eyes.

Like someone with narcissistic personality disorder, MCAS gaslights the pilots. And it turns out badly for everyone. “Raise the nose, HAL.” “I’m sorry, Dave, I’m afraid I can’t do that.”

In the MCAS system, the flight management computer is blind to any other evidence that it is wrong, including what the pilot sees with his own eyes and what he does when he desperately tries to pull back on the robotic control columns that are biting him, and his passengers, to death. . . .

None of the above should have passed muster. None of the above should have passed the “OK” pencil of the most junior engineering staff, much less a DER [an FAA Designated Engineering Representative]. That’s not a big strike. That’s a political, social, economic, and technical sin.

It just so happens that, during the timeframe between the first 737 Max crash and the most recent 737 Max crash, I’d had the occasion to upgrade and install a brand-new digital autopilot in my own aircraft. . . .

1 comment:

Inkling said...

A great article, but there is one gotcha. There seems to be a debate about who is responsible for MCAS. Most, including this article, blame Boeing for adopting it so that it did not need to come up with a replacement for the 737. But others say that Boeing ran flight simulations of the 737 Max which showed that experienced pilots adapted easily to its different traits, essentially a greater (not different) tendency to nose up when accelerating on takeoff. In that scenario, Boeing added MCAS to please the FAA, which, everyone agrees, was doing a lousy job of vetting MCAS.

I have no way of knowing which claim is true. I'm eagerly awaiting reports and, after them, a definitive book on what went so badly wrong.