A few days ago, two Texan men, aged 59 and 69, died in a bizarre road accident when their car, a 2019 Tesla Model S, ran off the road at high speed in a curve, hit a tree, and burst into a fireball that reportedly took four hours to fully extinguish. The shocking part, according to investigators, was that nobody was in the driver's seat when the car lost control.
Tesla, of course, is leading the world in autonomous driving technology by a country mile. The company has been audacious in the rollout and rapid upgrading of its Autopilot self-driving feature, which is both incredible and terrifying to experience for the first time, and miles ahead of anything else on the market.
According to the New York Times, the two deceased men in this case were talking about the Autopilot system before they went out on their fatal drive – and anyone with a friend who owns a Tesla has probably had a version of the same conversation.
The timing was remarkable, too; Elon Musk had just tweeted out Tesla's Q1 safety report figures. Tesla collects a vast trove of information on all its customer cars, and it regularly crunches through this data to compare Autopilot against human drivers. The idea of an autonomous car crashing may somehow be more frightening than ordinary human error, but according to Musk, a Tesla running on Autopilot is already about one-tenth as likely to have an accident as the average American car, on a per-mile basis.
Numbers can be misleading, and it's unclear from those statistics whether the comparison is fair. Perhaps the figures are skewed by location: Autopilot use may be heaviest on highways, where crashes are rarer, but more serious, than in the backstreets. Or maybe Autopilot is shutting itself down seconds before some of these incidents, handing responsibility back to the driver when it doesn't know what to do, so the resulting crash never counts against its tally.
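To see how road mix alone could skew a per-mile comparison, here's a minimal sketch using entirely invented numbers – none of them come from Tesla, Musk's tweet, or any regulator. In this hypothetical, Autopilot and human drivers crash at identical rates on each road type, yet Autopilot looks nearly twice as safe overall simply because it logs most of its miles on the highway.

```python
# Hypothetical illustration of how road-type mix can skew per-mile crash
# rates (a Simpson's-paradox-style effect). All figures are invented.

# (miles driven, crashes) per road type
autopilot = {"highway": (9_000_000, 18), "city": (1_000_000, 5)}
human     = {"highway": (2_000_000, 4),  "city": (8_000_000, 40)}

def per_million_miles(miles, crashes):
    """Crashes per million miles driven."""
    return crashes / (miles / 1_000_000)

for name, data in (("Autopilot", autopilot), ("Human", human)):
    total_miles = sum(m for m, _ in data.values())
    total_crashes = sum(c for _, c in data.values())
    print(f"{name} overall: {per_million_miles(total_miles, total_crashes):.1f} crashes/M miles")
    for road, (m, c) in data.items():
        print(f"  {road}: {per_million_miles(m, c):.1f} crashes/M miles")

# Output: Autopilot looks ~2x safer overall (2.3 vs 4.4 crashes/M miles),
# yet on each road type the rates are identical (2.0 highway, 5.0 city).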
Either way, the incident immediately drew comparisons with a high-profile fatal Autopilot crash back in 2016, when a semi-truck pulled out in front of a speeding, self-driving Tesla whose driver was paying no attention to the road ahead, and the car's systems failed to take evasive action.
As Senate Democrats urged "complete investigations" and "corrective actions" against Tesla, Consumer Reports was quick to release a video showing how the crash might have happened. The Autopilot system is designed to make sure the driver is ready to take over if it runs into situations it can't handle, but its method for doing this is incredibly crude compared to the rest of the car: a torque sensor on the steering wheel that registers the slight turning force of a human hand.
But it doesn't have to be a hand. As Consumer Reports demonstrated on a closed track, this torque sensor can be fooled simply by hanging a small weight off the wheel. Apparently lacking any other way to verify the driver's on deck, the car will continue to self-drive even if you slide over into the passenger seat.
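To make the weakness concrete, here's a minimal sketch of what a torque-only presence check amounts to. This is illustrative pseudocode under my own assumptions, not Tesla's actual implementation; the threshold, units and sample values are invented for the example.

```python
# Why a torque-only "hands on wheel" check is easy to fool.
# Illustrative sketch only -- not Tesla's real logic or thresholds.

HANDS_ON_TORQUE_NM = 0.3  # hypothetical minimum torque counted as "hands on"

def hands_detected(torque_samples_nm):
    """Return True if any recent steering-torque sample clears the threshold."""
    return any(abs(t) >= HANDS_ON_TORQUE_NM for t in torque_samples_nm)

# A weight hung off the wheel rim applies a constant gravitational torque,
# so the check passes even with nobody in the driver's seat:
weight_on_rim = [0.5] * 10            # steady ~0.5 Nm from a hanging weight
real_hands    = [0.0, 0.4, 0.1, 0.6]  # intermittent human corrections

print(hands_detected(weight_on_rim))  # True -- indistinguishable from a hand
print(hands_detected(real_hands))     # True
```

Because a hanging weight produces a steady torque, this kind of check can't tell it apart from a resting hand, which is why camera-based driver monitoring, watching eyes and posture rather than wheel force, is so much harder to cheat.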
This is not news to many Tesla owners. Indeed, while almost any weight (including a grapefruit wedged into the steering wheel) can do the trick, it seems a small cottage industry has sprung up selling tastefully designed weight attachments specifically to fool the car's safety systems, so "drivers" can grab a nap, slide over to the passenger seat to show off to their friends, or simply spare their poor shoulders the task of keeping a hand on the wheel.
There's no evidence yet that the car in the Texas crash was using something like this. Indeed, at this stage, Elon Musk says that Tesla's initial investigation of the car's data logs appears to indicate that Autopilot wasn't even switched on when the car crashed, and the car wasn't kitted out with a Full Self Driving (FSD) computer.
The truth may emerge in the fullness of time, but either way, the question remains: how much responsibility does Tesla bear to look after drivers who deliberately circumvent its safety systems and use its products in ways it explicitly tells them not to?
Perhaps this tragic, smoldering wreck in Texas was little different to a pair of teenagers sticking cruise control on and surfing the car's hood. In a country that prides itself on individual freedoms, blaming Tesla for a customer's willful misuse of its products and demanding more stringent preventative measures has a nasty whiff of Nanny State to it.
On the other hand, Autopilot is a seductive technology. It works so well, so much of the time, that it lulls drivers into a sense of security. Probably not a false sense, either – if Tesla's numbers tell the real story, it's already genuinely safer to let these cars drive than to do it yourself. So why wouldn't you learn to trust it over time, to the point where even keeping your hand on the wheel feels like a silly hangover from a bygone age – a needless bureaucratic requirement, written with lesser cars in mind, that yucks the yum of owning a self-driving car before such things are even legal?
Perhaps Tesla needs to up its game and police things better, using driver-facing cameras that can make sure you're not asleep, or playing Fortnite, or scooting over to the passenger seat – whether that's what customers want or not. Or perhaps the fact that it delivers safety if you use it correctly is enough.
Either way, as the market leader in autonomy, Tesla's every move in this strange teething period is of extreme importance to the industry, as well as to the regulators trying to figure out what the heck to do with these next-gen technologies.
Source: New York Times
The car couldn't possibly have been using FSD, since the owner hadn't subscribed to it. It is incorrect that the car wasn't kitted out with a Full Self Driving (FSD) computer: it was, but the service wasn't subscribed to.
The fire didn't take four hours to be put out, as the fire chief told Car and Driver here: https://www.caranddriver.com/news/a36189237/tesla-model-s-fire-texas-crash-details-fire-chief/
The Autopilot system is immediately shut off if it detects that nobody is in the driver's seat.
What's really driving this is the blame-to-claim industry. I'd hate to be Tesla Motors for this reason alone.
Second, Tesla bears no blame for customers misusing the product. Then again, whether they're really misusing something called Full Self-Driving is debatable.
All accidents causing death or pain and suffering should be investigated. Yet Tesla is continuously placed under a microscope any time a serious accident occurs.
I have a feeling that conventional auto dealers and manufacturers will go to any length to badmouth or cause Tesla grief. 50% fewer moving parts, and direct sales without dealers in the majority of states where Teslas are sold: no wonder the existing network of car dealers and manufacturers fears Tesla's success.
https://www.teslarati.com/tesla-battery-fire-fud-debunked-tx-fire-chief/ (reported on 04/20/2021)
So, at best NewAtlas is days behind the ball, and at worst it is not monitoring the veracity of its posters' claims very well.