
Consumer Reports flags "serious safety concerns" over Tesla's Autopilot

Tesla's Navigate on Autopilot allows autonomous lane changing on the highway, but not everybody is convinced it's ready for public use

Last month, Tesla introduced an update for its Autopilot self-driving software designed to allow the car to complete lane changes on its own. But independent testing by consumer-focused non-profit Consumer Reports (CR) has uncovered a few shortcomings in the technology, even stating that it could "create potential safety risks for drivers."

Tesla's Navigate on Autopilot was introduced in October of last year and allowed suitably equipped Tesla vehicles to autonomously navigate from the on-ramp of a highway to the off-ramp. Throughout, it would keep the car centered in the lane and at a safe distance from other vehicles, while drivers were required to authorize a lane change with a tug of the turn stalk.

Last month's update removed the need for this authorization, enabling drivers to set up their Teslas to complete the lane changes all on their own. But after spending some time behind the wheel with the updated Navigate on Autopilot engaged, Consumer Reports is of the view that it's not quite ready yet, to put it mildly.

"Tesla is showing what not to do on the path toward self-driving cars: release increasingly automated driving systems that aren't vetted properly," says David Friedman, vice president of advocacy at Consumer Reports. "Before selling these systems, automakers should be required to give the public validated evidence of that system's safety – backed by rigorous simulations, track testing, and the use of safety drivers in real-world conditions."

Screenshot of Tesla's Navigate on Autopilot in action

Friedman says that in its current form, the "automatic lane-change function raises serious safety concerns." Another descriptor used by CR is "far less competent than a human driver."

The rather damning summary is based on reports from multiple CR testers who drove a Tesla Model 3 using Navigate on Autopilot on several highways in the state of Connecticut. The testers reported that the system would change lanes without leaving enough space, pass other vehicles on the right, and fail to respond adequately to vehicles approaching quickly from behind, causing it to cut off traffic traveling at faster speeds.

"The system's role should be to help the driver, but the way this technology is deployed, it's the other way around," says Jake Fisher, Consumer Reports' senior director of auto testing. "It's incredibly nearsighted. It doesn't appear to react to brake lights or turn signals, it can't anticipate what other drivers will do, and as a result, you constantly have to be one step ahead of it."

Tesla is betting big on Autopilot. During the reveal of the Model Y in March, CEO Elon Musk said he expects the system to be feature-complete by the end of the year. And according to CNBC, in a call with investors last month he said that self-driving robo-taxis will turn Tesla into a US$500 billion company. For its part, the company has always maintained that a driver's hands must remain on the wheel when Autopilot is engaged.

Source: Consumer Reports

6 comments
Grunchy
Robotic cars are just another type of robot, except they’ve been given ludicrous lethal power and no separation from humans. As predicted prior to their rollout, Tesla robot cars have already killed and are definitely going to kill again. The only thing missing is if some bad actor “steals the crypto keys” and opens up the self-driving platform to external control. Believe me, Sony already suffered this with the PlayStation 3. Except when Tesla is hacked, imagine an army of hostile operators using tens of thousands of hijacked Teslas to murder North Americans indiscriminately. Honest to God, I’m deadly afraid this is going to happen.
guzmanchinky
I applaud Tesla for charging ahead. People in these cars are firmly told to keep an eye on the Autopilot, just like pilots do.
apprenticeearthwiz
CR seems to be assessing Autopilot as if it were a full self-driving system rather than a level 2 system requiring driver attention at all times. It's not level 3, 4 or 5, but it seems to be, by far, the best level 2 out there.
christopher
The trouble with ML and AI is that there's no intelligence whatsoever.
We don't let mentally handicapped people drive at all, so they need to stop using ML and AI and start over using sensible, auditable solutions which do what is expected, rather than "infinite monkeys" that mostly do "what seems to be OK in limited circumstances" - which is how ML/AI is trained.
Bob Stuart
Humans just are not built to pay close attention to good autopilots. Pigeons stay on task far better for Search and Rescue. Now, we have people who want to use their phones, not drive. There are also a lot of reasons to favour one side of the lane or another on many road sections.
ljaques
I think CR is becoming more and more corrupted of late. Several of their downgradings of Tesla are from people doing precisely what Tesla told them =not= to do.
No, autopilots are not ready for prime time. Yes, many improvements have been made to the old navigation helpers. People need to know that nothing on the road or in the air is sporting an actual auto-pilot. The guy who died in HelL.A. at the exact same spot he had complained to Tesla that it had "malfunctioned" before is a perfect example of the public being told one thing and them doing another. (Sorry, guy, but it was your own fault.)