The role of humans in self-driving cars is even more complicated after Uber's fatal crash

02.04.2018

The death of a pedestrian raises questions about how humans should act when robots are in control

A typical Uber driver has clearly defined responsibilities. Arrive on time, know your route, keep your car clean, and, most importantly, safely deliver your passenger to their destination. Sitting behind the wheel of a self-driving Uber—or any autonomous vehicle, for that matter—is, paradoxically, more complicated. A recent, tragic incident in which a self-driving Uber struck and killed a 49-year-old pedestrian, while a safety driver sat behind the wheel, has stirred up many conversations about blame, regulations, and the overall readiness of autonomous tech. The lingering question, however, is how we humans fit into this picture.

The Tempe Police Department released a 14-second clip of the moments leading up to the fatal Uber crash. It includes exterior footage showing the victim, as well as an interior view of the cabin showing the reaction—or lack of reaction—of the person in the driver’s seat.

What does it mean to be a safety driver?

Both Uber and Lyft (the latter of which hadn’t responded to a request for comment at the time of publication) have dedicated training programs to teach flesh-and-blood people how to act when behind the wheel of a car that drives itself. According to a schedule Uber provided to PopSci.com, the training program includes both theoretical and practical evaluations.

Our reading of these materials suggests that Uber expects a driver will sometimes need to take control of the vehicle, but the specific circumstances in which that should happen remain unclear. While Lyft is more tight-lipped about its onboarding process for new drivers, the company does provide a little more insight into when the human is meant to take over command.

Lyft’s FAQ about its self-driving program states that the “pilots” are “constantly monitoring the vehicle systems and surrounding environment, and are ready to manually take control of the vehicle if an unexpected situation arises.” It goes on to specifically mention complicated traffic patterns, like detours, or humans directing traffic around things like construction.

While Uber’s guidelines may differ from Lyft’s, the concept of constantly monitoring the car’s surroundings has been a keystone in discussions about Uber’s fatal crash. The video appears to show the car’s driver looking down into the cabin rather than out in the direction the car is traveling.

“On two separate occasions, the driver seems to be looking down at something for nearly five seconds,” says Bryant Walker Smith, a leading legal expert in the arena of autonomous vehicle deployment. “At 37 miles per hour, a car covers about 250 feet in 5 seconds.” Average reaction time for a human driver is in the neighborhood of 2.3 seconds, which suggests a driver offering their full attention to the road may have been able to at least attempt to brake or perform an evasive maneuver.
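For readers who want to check those figures, here is a quick back-of-envelope sketch. The 37 mph speed and 2.3-second reaction time are the numbers quoted above; the rest is a standard unit conversion that ignores braking distance entirely:

```python
# Back-of-envelope check of the figures quoted above. The 37 mph speed and
# the 2.3-second average reaction time come from the article; the conversion
# factors are standard (1 mile = 5,280 ft, 1 hour = 3,600 s).

speed_mph = 37
look_down_s = 5.0       # time the driver appears to look away from the road
reaction_time_s = 2.3   # average human driver reaction time cited above

feet_per_second = speed_mph * 5280 / 3600            # ~54 ft/s at 37 mph
distance_while_distracted = feet_per_second * look_down_s     # ~270 ft
distance_during_reaction = feet_per_second * reaction_time_s  # ~125 ft

print(f"Speed: {feet_per_second:.1f} ft/s")
print(f"Distance covered in {look_down_s:.0f} s: {distance_while_distracted:.0f} ft")
print(f"Distance covered during a {reaction_time_s} s reaction: {distance_during_reaction:.0f} ft")
```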

But the idea of a driver maintaining attention when not actively piloting the vehicle has been a sticking point since carmakers first started hinting at self-driving tech. Back in 2016, a Tesla got into a fatal collision when its autonomous systems couldn’t differentiate the white panels on the side of a truck from the brightness of the open sky. In that case, however, the driver reportedly didn’t notice—or chose to ignore—the vehicle’s calls for human intervention.

Researchers, including those at the Center for Automotive Research at Stanford, have been studying this moment of hand-off between autonomous systems and flesh-and-blood drivers for years, exploring options like haptic feedback in the steering wheel, as well as lights in and around the dashboard to indicate that something is wrong and intervention is required. Of course, all of that is irrelevant if the car doesn’t see trouble coming in the first place.

The navigation and self-driving tech in Uber’s vehicles is also considerably more advanced than Tesla’s semi-autonomous Autopilot mode, which isn’t meant to replace the driver entirely and, according to the company, is still in beta. Uber’s cars use LIDAR, a laser-based system that builds a 3D map of the car’s surroundings, along with conventional radar and cameras to detect objects before a collision.

On paper, these systems should have been able to detect a pedestrian in the road and send a signal to the driver to take over. While Uber hasn’t confirmed the exact way in which its system reacted just before and during the crash, it doesn’t appear that the driver received a warning—at least not with enough time to intervene....
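To make that sequence concrete, here is a minimal, hypothetical sketch of the “detect an obstacle, then warn the driver” logic described above. It is not Uber’s software: the sensor interface, the six-second warning horizon, and every name in it are assumptions made purely for illustration.

```python
# A purely illustrative sketch of "detect an obstacle, then warn the driver."
# Nothing here reflects Uber's actual software: the classes, function names,
# and the 6-second warning horizon are invented for the sake of the example.

from dataclasses import dataclass

@dataclass
class Detection:
    source: str         # "lidar", "radar", or "camera"
    distance_ft: float  # distance from the car to the detected object
    label: str          # e.g. "pedestrian", "vehicle", "unknown"

def seconds_to_impact(distance_ft: float, speed_ft_per_s: float) -> float:
    """Time until the car reaches the object at its current speed."""
    return distance_ft / speed_ft_per_s if speed_ft_per_s > 0 else float("inf")

def should_alert_driver(detections: list[Detection],
                        speed_ft_per_s: float,
                        warning_horizon_s: float = 6.0) -> bool:
    """Alert the safety driver if any detected object would be reached within
    the warning horizon (a hypothetical margin chosen for illustration)."""
    return any(
        seconds_to_impact(d.distance_ft, speed_ft_per_s) <= warning_horizon_s
        for d in detections
    )

# Example: a pedestrian detected by lidar 250 ft ahead at ~54 ft/s (37 mph)
detections = [Detection("lidar", 250.0, "pedestrian")]
if should_alert_driver(detections, speed_ft_per_s=54.3):
    print("ALERT: hand control back to the safety driver")
```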
