
A Fatal Airbus Crash Paints an Ugly Picture for the Future of Level 3 Cars

Aug 4, 2017 08:27 PM
Aug 4, 2017 10:13 PM

The fatal crash of Air France Flight 447 is one of the most tragic accidents in aviation history. It also serves as a stark reminder of what can go wrong when humans rely too heavily on automation.

The tragic 2009 plane crash off the coast of Brazil involved an Airbus A330-200, which carries some of the most technologically advanced autopilot systems found in modern commercial aircraft. The automation handles much of the flying, from shortly after takeoff through cruise and even landing. But humans still play a role: at least one pilot must be ready to take control of the plane at all times, much as drivers must always be prepared to take over in Level 1 through Level 3 autonomous vehicles.

Among the conclusions drawn from the post-crash analysis of Flight 447 is that the pilots were unprepared to take back control when the automation disengaged. The plane stalled, failed to regain altitude, and fell thousands of feet before crashing into the Atlantic Ocean, killing all 228 passengers and crew members on board.

The design of modern Airbus and Boeing aircraft allows automation to take over ever more of the piloting workload, which in turn has made them significantly safer to fly, since error-prone humans are less involved in flying them. But this is where the paradox comes into play: pilots, like the human drivers who are required to take over from self-piloted vehicles, become less able to respond to emergencies once they grow accustomed to relying on the automation.

A Case Study


Waymo is retiring its steering-wheel-less self-driving cars, but it is still skipping Level 3.

In the book Driverless, Hod Lipson, a professor of mechanical engineering at Columbia University, uses Air France Flight 447 to illustrate why humans cannot be trusted to safely take over a car's self-driving controls. When queried by Driverless, he also agreed that human fallibility and over-reliance on driverless vehicles' capabilities make Level 3 vehicles inherently more dangerous than Level 4 cars.

Human-machine cooperation is great for many applications, but not for driving. Humans get distracted, lose situational awareness, and lose practice. Reliance on humans to take over the wheel in emergency situations is false comfort. Perhaps it is a convenient legal solution, but it is not a practical solution in the long term.

— Hod Lipson, professor of mechanical engineering at Columbia University

Google's driverless arm (before it became Waymo) drew the same conclusion after the search giant allowed a group of employees to borrow the self-driving Lexus SUVs it had outfitted in 2012. The employees were warned that they had to be ready to take back control of the car at all times when prompted, especially in an emergency. Instead, they often climbed into the backseat, watched videos, or otherwise stopped paying attention. Google determined that humans simply couldn't be trusted to monitor the car's controls once driverless mode was switched on. In a report, Google described this as "automation bias."

The Big Skip to Level 4

On paper, Level 3 driving is safer than Level 2, since the driver gets a warning of 10 seconds or more to take over control of the vehicle instead of having to stay constantly alert in case something goes wrong. And yet Level 3 drivers will likely grow more reliant on the machine's driving capabilities as they get used to letting the car pilot itself, and more out of practice at reacting when things go wrong, just as the pilots in the Air France crash were ill-prepared to respond to a dire emergency.
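To make that handoff concrete, here is a minimal sketch in Python of the Level 3 takeover flow described above. It is purely illustrative: the function and sensor names are hypothetical, and no automaker's actual software is being quoted. The car drives itself until it hits a situation it can't handle, requests a takeover, counts down a grace period, and falls back to a minimal-risk maneuver (such as slowing and pulling over) if the human never responds.

```python
# Hypothetical Level 3 takeover sketch; illustrative only, not any OEM's code.
import time
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()        # car is driving itself
    HUMAN = auto()             # human accepted the handoff
    MINIMAL_RISK = auto()      # nobody responded; car slows and pulls over

TAKEOVER_GRACE_SECONDS = 10    # the warning window cited above (assumed value)

def request_takeover(driver_is_ready, now=time.monotonic):
    """Ask the human to take over, polling a (hypothetical) driver-monitoring
    check until the grace period expires."""
    deadline = now() + TAKEOVER_GRACE_SECONDS
    while now() < deadline:
        if driver_is_ready():          # e.g., hands on wheel, eyes on road
            return Mode.HUMAN          # handoff succeeded
        time.sleep(0.1)                # keep polling the driver check
    return Mode.MINIMAL_RISK           # handoff failed; car must cope alone

# Example: a driver who never responds forces the fallback after 10 seconds.
# print(request_takeover(lambda: False))  # -> Mode.MINIMAL_RISK
```

The crux is the final return: a true Level 3 system needs a safe answer for the driver who never responds, and as both Lipson and Google's experiment suggest, that driver may be the rule rather than the exception.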

In many ways, Level 3 also will not be smart enough, at least in the near future, to make up for human shortcomings. Neural networks are years away from having enough human-like common sense to realize that a ball stuck under a parked car means a child, whom the car does not see or detect, could be close by, Eyal Amir, CEO of Ai Incube and an associate professor of computer science at the University of Illinois at Urbana-Champaign, told Driverless.

I believe that Level 3 autonomy in current driving situations in dense city streets would not be realized before seven to 10 years from now.

— Eyal Amir, CEO of Ai Incube and associate professor of computer science at the University of Illinois at Urbana-Champaign
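Amir's ball-under-the-car example is worth unpacking, because it shows what "common sense" means in code. The toy Python sketch below (hypothetical labels, nothing like a real perception stack) contrasts a policy that reacts only to what the sensors directly report with one that carries the single hand-written inference a human driver makes without thinking.

```python
# Toy illustration of the common-sense gap; not a real perception system.
detections = ["parked_car", "ball_under_parked_car"]   # what the sensors report

def naive_policy(detections):
    """React only to what is directly detected: no child seen, no caution."""
    return "slow_down" if "child" in detections else "maintain_speed"

def common_sense_policy(detections):
    """Encode the inference a human makes automatically: a ball under a
    parked car suggests an unseen child may chase it into the street."""
    if "child" in detections or "ball_under_parked_car" in detections:
        return "slow_down"
    return "maintain_speed"

print(naive_policy(detections))         # maintain_speed: the child is invisible
print(common_sense_policy(detections))  # slow_down: risk inferred from context
```

Hand-writing one such rule is easy; the problem is that safe driving depends on countless rules like it, most of which no one has thought to write down, which is why Amir puts dense-city Level 3 years away.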

The inherent risks associated with Level 2 and Level 3 driverless cars also help account for why some OEMs are said to be skipping Level 3 altogether. Yet there is certainly a market for those who want to buy and experience self-driving vehicles long before they can push a button and watch TV in the back seat during a two-hour trip.

Humans may also have already started to become overly reliant on the self-driving capabilities of Level 2 cars currently on the road, such as Tesla's models. In theory, these cars should make far fewer driving mistakes than humans do while in self-drive mode. But when things do go south, drivers who have learned to trust the machine will be that much more ill-prepared to take back control. In fact, the captain of Air France Flight 447 was taking a nap when the crew was first alerted that something had gone horribly wrong.

Juliet Gallagher contributed to this report.

Cover photo via Air France
