Fatal Tesla Crash Raises New Questions About Self-Driving System

Tesla’s semi-autonomous driving system, shown here in 2015, is coming under new scrutiny after a fatal crash on March 23 in California occurred while the Autopilot feature was engaged. Credit Beck Diefenbach/Reuters

In the fall of 2016, Tesla beamed new software over the air to cars on the road in the United States and elsewhere that added safeguards to its Autopilot system to prevent drivers from looking away from the road or keeping their hands off the steering wheel for long periods of time.

The move came in the wake of a crash in Florida in which an Ohio man died when his Model S sedan hit a tractor-trailer while Autopilot was engaged. Federal investigators found that the driver’s hands had been on the steering wheel for only a few seconds in the minute before the crash.

When the upgrades were released, Tesla’s chief executive, Elon Musk, said the new Autopilot system was “really going to be beyond what people expect” and would make the Tesla Model S sedan and the Model X sport utility vehicle the safest cars on the road “by far.”

Now, however, Tesla’s semiautonomous driving system is coming under new scrutiny after the company disclosed late on Friday that a fatal crash on March 23 in California occurred while Autopilot was engaged.


The company said the driver, Wei Huang, 38, a software engineer for Apple, had received several visual and audible warnings to put his hands back on the steering wheel but had failed to do so, even though his Model X S.U.V. had the modified version of the software. His hands were not detected on the wheel in the six seconds before the Model X slammed into a concrete divider near the junction of Highways 101 and 85 in Mountain View, and neither Mr. Huang nor Autopilot applied the brakes before the crash.


The accident renews questions about Autopilot, a signature feature of Tesla vehicles, and whether the company has gone far enough to ensure that it keeps drivers and passengers safe.

“At the very least, I think there will have to be fundamental changes to Autopilot,” said Mike Ramsey, a Gartner analyst who focuses on self-driving technology. “The system as it is now tricks you into thinking it has more capability than it does. It’s not an autonomous system. It’s not a hands-free system. But that’s how people are using it, and it works fine, until it suddenly doesn’t.”

On Saturday, Tesla declined to comment on the California crash or to make Mr. Musk or another executive available for an interview. In its blog post on Friday about the crash, the company acknowledged that Autopilot “does not prevent all accidents,” but said the system “makes them much less likely to occur” and “unequivocally makes the world safer.”

For the company, the significance of the crash goes beyond Autopilot. Tesla is already reeling from a barrage of negative news. The value of its stock and bonds has plunged amid increasing concerns about how much cash it is using up and the repeated delays in the production of the Model 3, a battery-powered compact car that Mr. Musk is counting on to generate much-needed revenue.

It is also facing an investor lawsuit related to Tesla’s acquisition of SolarCity, a solar-panel maker where Mr. Musk was serving as chairman. Meanwhile, competition is mounting from other luxury car makers that have developed their own electric cars, while Waymo, the Google spinoff, General Motors and others seem to have passed Tesla in self-driving technology.

“There’s a lot going on that undermines Elon’s credibility right now,” said Karl Brauer, a senior analyst at Kelley Blue Book.

Autopilot uses radar and cameras to detect lane markings, other vehicles and objects in the road. It can steer, brake and accelerate automatically with little input from the driver. Tesla readily points out that Autopilot — despite the implications in its name — is only a driver-assistance system and is not intended to pilot cars on its own.

Drivers are given warnings on the dashboard and in the owner’s manual to remain engaged and alert while using it. Tesla originally described it as a “beta” version, a term that usually refers to software still in the developmental stage.


At the time of the Florida crash, it was possible to engage Autopilot and cruise on highways for several minutes without the driver holding the steering wheel. In that crash, the Autopilot’s camera, then the primary sensor in the system, failed to recognize a white truck as it was crossing a rural highway. Tesla said the camera was confused because the truck appeared against a bright sky.

The modifications introduced in 2016, known as Autopilot 8.0, included more frequent warnings to drivers to keep their hands on the steering wheel. After three warnings, the software prevents Autopilot from operating until the driver stops, turns off the car and restarts it.
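
The warning-and-lockout behavior described above amounts to a simple escalation rule. The sketch below, in Python, is a hypothetical illustration of that logic rather than Tesla's actual code; the class and method names are invented, and the only detail taken from the article is the three-warning threshold followed by a lockout that lasts until the car is stopped and restarted.

```python
# Hypothetical sketch of the hands-on-wheel escalation behavior described
# in this article. Not Tesla's code; names and structure are illustrative.

class AutopilotSupervisor:
    MAX_WARNINGS = 3  # per the article: three warnings, then a lockout

    def __init__(self) -> None:
        self.warnings_issued = 0
        self.locked_out = False

    def on_hands_off_timeout(self) -> None:
        """Called when no steering-wheel input is detected for too long."""
        if self.locked_out:
            return
        self.warnings_issued += 1
        print(f"Warning {self.warnings_issued}: put your hands on the wheel")
        if self.warnings_issued >= self.MAX_WARNINGS:
            self.locked_out = True
            print("Autopilot unavailable until the car is stopped and restarted")

    def on_restart(self) -> None:
        """The lockout clears only after the car is turned off and restarted."""
        self.warnings_issued = 0
        self.locked_out = False

    def can_engage(self) -> bool:
        return not self.locked_out


if __name__ == "__main__":
    supervisor = AutopilotSupervisor()
    for _ in range(4):
        supervisor.on_hands_off_timeout()
    print("Can engage:", supervisor.can_engage())  # False until restart
    supervisor.on_restart()
    print("Can engage:", supervisor.can_engage())  # True again
```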

The new version also made radar the primary sensor, and Mr. Musk said the new radar would have been able to see the truck in the Florida crash despite the bright sky.

Autopilot does not use lidar, a laser-based sensing technology that Waymo and others have maintained is crucial for fully autonomous vehicles. Mr. Musk has said he believes lidar is not necessary for Autopilot to be safe.

At least three people have now died while driving with Autopilot engaged. In January 2016, a Chinese owner was at the wheel of a Model S when the car crashed into a road sweeper on a highway.

The National Transportation Safety Board is now investigating the March 23 crash that killed Mr. Huang. Its investigation of the 2016 Florida accident concluded that Autopilot “played a major role,” and said that it lacked safeguards to prevent misuse by drivers.

An earlier investigation, by the National Highway Traffic Safety Administration, concluded that the company’s Autopilot-enabled vehicles did not need to be recalled. That inquiry, however, focused only on whether any flaws in the system had led to the Florida crash; it found no such flaws.

A version of this article appears in print on April 1, 2018, on Page A21 of the New York edition with the headline: New Questions About Tesla’s Self-Driving System.
