How much blame should Tesla accept if customers misuse its products?

A recent crash linked to misuse of Tesla's Autopilot feature has started some heated arguments
A few days ago, two men in Texas, aged 59 and 69, died in a bizarre road accident when their car, a 2019 Tesla Model S, ran off the road at high speed on a curve, hit a tree, and burst into a fireball that reportedly took four hours to fully extinguish. The shocking part, according to investigators, was that nobody was in the driver's seat when the car lost control.

Tesla, of course, leads the world in autonomous driving technology by a country mile. The company has been brazen in the rollout and rapid upgrading of its Autopilot driver-assistance feature, which is both incredible and terrifying to experience for the first time, and miles ahead of anything else on the market.

According to the New York Times, the two deceased men in this case were talking about the Autopilot system before they went out on their fatal drive – and anyone with a friend who owns a Tesla has probably had a version of the same conversation.

The timing was remarkable, too: Elon Musk had just tweeted out Tesla's Q1 safety report figures. Tesla collects a vast trove of data from all its customer cars, and it regularly crunches through that data to compare Autopilot against human drivers. The idea of an autonomous car crashing may somehow be more frightening than ordinary human error, but according to Musk, a Tesla running on Autopilot is already about 10 times less likely to have an accident, per mile driven, than the average American car.

Numbers can be misleading, though, and it's unclear from those statistics whether the comparison is fair. Perhaps they're skewed by where Autopilot gets used: if Autopilot miles are concentrated on highways, where crashes per mile are rarer (if often more severe) than on back streets, the system would look safer than it really is. Maybe Autopilot is shutting itself down moments before some of these incidents, not knowing what to do and handing responsibility back to the driver.
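To see how a road-mix confounder like that could distort a per-mile comparison, here's a minimal Python sketch. All the crash rates and mileage splits below are made up purely for illustration: two fleets that are exactly equally safe on any given road type end up with very different headline per-mile figures, simply because one logs most of its miles on highways.

```python
# Toy illustration (all figures hypothetical) of how a per-mile crash-rate
# comparison can be skewed when one fleet's miles are concentrated on highways.

# Hypothetical crash rates per million miles, by road type.
RATES = {"highway": 0.5, "city": 2.0}

# Hypothetical mileage mixes: the Autopilot-style fleet drives mostly on
# highways, while the "average" car splits its miles evenly.
autopilot_miles = {"highway": 9_000_000, "city": 1_000_000}
average_miles = {"highway": 5_000_000, "city": 5_000_000}

def blended_rate(miles_by_road: dict) -> float:
    """Overall crashes per million miles for a given mileage mix."""
    crashes = sum(miles_by_road[road] / 1e6 * RATES[road] for road in miles_by_road)
    total_million_miles = sum(miles_by_road.values()) / 1e6
    return crashes / total_million_miles

print(f"Autopilot-style mix: {blended_rate(autopilot_miles):.2f} crashes per million miles")
print(f"Average-style mix:   {blended_rate(average_miles):.2f} crashes per million miles")
# Output: 0.65 vs 1.25, so the highway-heavy mix looks roughly twice as safe
# even though, road for road, neither fleet is safer than the other.
```

None of this says Tesla's figure is wrong; it only shows that a headline per-mile ratio can't settle the question without knowing where those miles were driven.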

Either way, the incident immediately drew comparisons to a high-profile fatal Autopilot crash back in 2016, in which a semi-truck pulled out in front of a speeding, self-driving Tesla whose driver was paying no attention to the road ahead, and the car's systems failed to take evasive action.

As Senate Democrats urged "complete investigations" and "corrective actions" against Tesla, Consumer Reports was quick to release a video showing how the crash might've happened. The Autopilot system is designed to make sure the driver is ready to take over if it runs into situations it can't handle, but its method for checking this is incredibly crude compared to the rest of the car: a torque sensor on the steering wheel that registers the slight turning force of a human hand.

But it doesn't have to be a hand. As Consumer Reports demonstrated on a closed track, this torque sensor can be fooled simply by hanging a small weight off the wheel. Apparently lacking any other way to verify the driver's on deck, the car will continue to self-drive even if you slide over into the passenger seat.
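As a highly simplified sketch – not Tesla's actual code, and with a made-up torque threshold – this is roughly why a check based only on steering torque can't tell a resting hand from a hanging weight:

```python
# Hypothetical, heavily simplified driver-presence check based only on
# steering-column torque. Not Tesla's implementation; for illustration only.

HANDS_ON_TORQUE_NM = 0.3  # made-up threshold: any torque above this counts as "hands on"

def driver_present(measured_torque_nm: float) -> bool:
    """The check only sees force on the wheel, not who (or what) applied it."""
    return abs(measured_torque_nm) >= HANDS_ON_TORQUE_NM

resting_hand = 0.5      # a hand resting lightly on the rim
hanging_weight = 0.5    # a small weight strapped to the rim produces the same torque
empty_wheel = 0.0       # nothing touching the wheel

print(driver_present(resting_hand))    # True
print(driver_present(hanging_weight))  # True  <- the loophole
print(driver_present(empty_wheel))     # False -> nag the driver, then disengage
```

Anything that applies a steady torque passes the check – which is all a weight hung on the wheel needs to do.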

This is not news to many Tesla owners. Indeed, while almost any weight (including a grapefruit wedged into the steering wheel) can do the trick, it seems a small cottage industry has sprung up selling tastefully designed weight attachments specifically to fool the car's safety systems, so "drivers" can grab a nap, slide over to the passenger seat to show off to their friends, or simply spare their poor shoulders the task of holding their hands up on the wheel.

There's no evidence yet that the car in the Texas crash was using something like this. Indeed, at this stage, Elon Musk says that Tesla's initial look at the car's data logs indicates Autopilot wasn't even switched on when the car crashed, and that the owners hadn't purchased the Full Self Driving (FSD) package.

The truth may emerge in the fullness of time, but either way, the question remains: how much responsibility does Tesla bear to look after drivers who deliberately circumvent its safety systems and use its products in ways it explicitly tells them not to?

Perhaps this tragic, smoldering wreck in Texas was little different to a pair of teenagers sticking cruise control on and surfing the car's hood. In a country that prides itself on individual freedoms, blaming Tesla for a customer's willful misuse of its products and demanding more stringent preventative measures has a nasty whiff of Nanny State to it.

On the other hand, Autopilot is a seductive technology. It works so well, so much of the time, that it lulls drivers into a sense of security. Probably not a false sense, either – if Tesla's numbers tell the real story, it's already genuinely safer to let these cars drive than doing it yourself. So why wouldn't you learn to trust it over time, to the point where even keeping your hand on the wheel feels like a silly hangover from a bygone age – a needless bureaucratic requirement, written with lesser cars in mind, that yucks the yum of owning a self-driving car before such things are even legal?

Perhaps Tesla needs to up its game and police things better, using driver-facing cameras that can make sure you're not asleep, or playing Fortnite, or scooting over to the passenger seat – whether that's what customers want or not. Or perhaps the fact that it delivers safety if you use it correctly is enough.

Either way, as the market leader in autonomy, Tesla's every move in this strange teething period is of extreme importance to the industry, as well as to the regulators trying to figure out what the heck to do with these next-gen technologies.

Source: New York Times

20 comments
Vitor Amorim
Just a few clarifications:
The car couldn't possibly have been using FSD, since the owner hadn't subscribed to it. It is incorrect that the car wasn't kitted out with a Full Self Driving (FSD) computer; it was, but the service wasn't subscribed to.
The fire didn't take 4 hours to be put out, as the fire chief told Car and Driver here: https://www.caranddriver.com/news/a36189237/tesla-model-s-fire-texas-crash-details-fire-chief/
The Autopilot system shuts off immediately if it detects that nobody is in the driver's seat.
WillyDoodle
Well, gun manufacturers accept none, so why should car manufacturers?
SteveMc
Even the suggestion that Tesla should be held accountable for misuse of their vehicles is quite preposterous. These two middle-aged guys going out to 'test' their Tesla at that time of night reeks to me. Maybe we should turn this on its head and ask the question: "Why are other car manufacturers still making steel coffins now that we have the technological advancements not to?"
What's really driving this is the blame-to-claim industry. I'd hate to be Tesla Motors for this reason alone.
Spud Murphy
Why is this written as if Autopilot may have been engaged? Tesla has stated it wasn't, and Autopilot wouldn't engage on such a road anyway, since it needs lane lines and markings to operate, so Autopilot was not a factor in this crash. Most likely, they were driving without seatbelts and the driver got thrown into the passenger seat in the crash.
Tudor Montescu
First, the facts: Xpeng leads autonomous driving, not Tesla, and even worse, it's proven they're using original software.
Second, Tesla has no blame for customers misusing the product. Are they misusing something called FSD? That is debatable.
Gene Preston
There is also the possibility that the driver was in the driver's seat and then climbed into the back to get out of the burning car, trying to fold down the seats to reach the trunk, when fumes overcame them. It could have been a simple running-off-the-road accident: the car hit a tree and caught fire, and the occupants could not get out.
guzmanchinky
None. These guys were morons, and it did not take 4 hours to put the fire out; get your facts straight. They only had to keep it cool for a few hours after the fire was put out in a matter of minutes. It's this kind of blame-throwing that keeps big companies like Benz or GM from really pushing forward the way Tesla does.
aksdad
Tesla is not the only manufacturer blamed for the misuse of its products, nor is it the only car manufacturer blamed for the consequences of stupid driving. Fortunately innocent people weren't injured or killed by the poor choices of the driver and passenger in this incident. Unfortunately that isn't always the case. Bad drivers kill or maim hundreds of thousands each year worldwide. Be sober, drive wisely.
mutmod1
Thousands of drivers are killed on US roads every year in conventional gas-powered autos. Many accidents are caused by driver inattentiveness: taking their eyes off the road to put a CD into the entertainment system, texting, answering a cell phone call, eating while driving, etc. One Tesla owner even went to the extent of posting a photo of himself and his girlfriend having sex behind the wheel while his Tesla was driving.

All accidents causing death or pain and suffering should be investigated. Yet, Tesla is continuously placed under a microscope anytime a serious accident occurs.

I have a feeling that conventional auto dealers and manufacturers will go to any length to bad-mouth Tesla or cause it grief. 50% fewer moving parts, direct sales without dealers in the majority of states where Teslas are sold. No wonder the existing network of car dealers and manufacturers fears Tesla's success.
Jack Hodges
If it were AP or FSD, then all we should say to Tesla is that it should have stronger controls on how these features are activated or deactivated. But the sad reality is that the author of this article simply pulled feeds from sources that didn't do their homework. The WSJ and NYT jumped the gun. Here is an article debunking much of what they reported (and this author re-posted), and Elon Musk has already reported on the contents of the car's black box. If we start challenging black box data then we might as well give up everything.

https://www.teslarati.com/tesla-battery-fire-fud-debunked-tx-fire-chief/ (reported on 04/20/2021)

So, at best, NewAtlas is days behind the ball, and at worst it is not monitoring the veracity of claims by its posters very well.