Last month, Tesla introduced an update for its Autopilot self-driving software designed to allow the car to complete lane changes on its own. But independent testing by consumer-focused non-profit Consumer Reports (CR) has uncovered a few shortcomings in the technology, even stating that it could "create potential safety risks for drivers."
Tesla's Navigate on Autopilot was introduced in October of last year and allowed suitably equipped Tesla vehicles to autonomously navigate from the on-ramp of a highway to the off-ramp. Throughout, it would keep the car centered in the lane and at a safe distance from other vehicles, while drivers were required to authorize a lane change with a tug of the turn stalk.
Last month's update removed the need for this authorization, enabling drivers to set up their Teslas to complete the lane changes all on their own. But after spending some time behind the wheel with the updated Navigate on Autopilot engaged, Consumer Reports is of the view that it's not quite ready yet, to put it mildly.
"Tesla is showing what not to do on the path toward self-driving cars: release increasingly automated driving systems that aren't vetted properly," says David Friedman, vice president of advocacy at Consumer Reports. "Before selling these systems, automakers should be required to give the public validated evidence of that system's safety – backed by rigorous simulations, track testing, and the use of safety drivers in real-world conditions."
Friedman says that in its current form, the "automatic lane-change function raises serious safety concerns." CR also described the system as "far less competent than a human driver."
The rather damning summary is based on reports from multiple CR testers who drove a Tesla Model 3 using Navigate on Autopilot on several highways in the state of Connecticut. The testers reported that the system would cut off cars in adjacent lanes by merging without enough space, pass on the right-hand side, and fail to respond adequately to vehicles approaching quickly from behind, leading it to cut off faster-moving traffic.
"The system's role should be to help the driver, but the way this technology is deployed, it's the other way around," says Jake Fisher, Consumer Reports' senior director of auto testing. "It's incredibly nearsighted. It doesn't appear to react to brake lights or turn signals, it can't anticipate what other drivers will do, and as a result, you constantly have to be one step ahead of it."
Tesla is betting big on Autopilot. CEO Elon Musk said during the reveal of the Model Y in March that he expects it to be feature complete by year's end. And according to CNBC, in a call with investors last month he said that self-driving robo-taxis will turn Tesla into a US$500 billion company. For its part, the company has always maintained that a driver's hands must remain on the wheel when Autopilot is engaged.
Source: Consumer Reports