Just days after a fervent Tesla enthusiast’s European road trip to show off his Model 3 ended with the car’s “Autopilot” driving him into a median, a new video emerged on Twitter this morning offering what appears to be an in-depth look at how Autopilot may be malfunctioning in a number of recent accidents.

This morning, video was posted showing what is allegedly a Tesla in Autopilot mode having a bout of “confusion” where a straight stretch of highway divides at an exit. The video appears to show the car unable to decide which lane to stay in, nearly hitting the center median that separates the roadway from the exit.

The original video states that the car “dives” at the divider. It was released just days after yet another confirmed Model 3 Autopilot-induced accident was reported, one which Elon Musk won’t be able to dismiss as bias on the part of the media or the driver: the accident occurred in a car operated by a longtime Tesla enthusiast and was reported by Electrek, a website that has been Tesla’s PR mouthpiece for the better part of the last couple of years.

The accident involves driver You You Xue, who in early 2018 famously toured North America in his Model 3 after buying it, in order to show other reservation holders what it looked like. That road trip was documented in all of its glory on Electrek and presented as a feel-good story. You You felt so good that he decided to repeat the trip across Europe to continue garnering free publicity for the Model 3. This trip was less of a resounding success, ending with the car driving itself into a median in Greece.

On You You’s Facebook page, he tells the story of the crash:

The road trip is over. I’m sorry.

Vehicle was engaged on Autopilot at 120 km/h. Car suddenly veered right without warning and crashed into the centre median (edit: divider at the exit fork). Both wheels (edit: one wheel) completely shattered, my door wouldn’t even open correctly. I’m unharmed.

The driver also posted photos of the fork in the road where the accident took place.

He then thanked everyone and the EV community in an updated post:

I’m just stating the facts in my post. My insurance is third-party only, which means I will receive no compensation for this collision. I am now calling a tow truck driver who will tow the car to Thessaloniki as it is not drivable. I will make further plans, most likely to repatriate the car back to San Francisco from there. I’m trying to stay positive guys, I’m so lucky to have had this opportunity to represent the EV community and movement, and so unlucky to have had this happen to me after driving without an issue through 25 countries.

Perhaps because this was a true enthusiast and because the trip had been widely documented, one might have expected Tesla to employ a different PR strategy and show some empathy and concern for the driver. Instead, the company followed what has largely been its playbook for these types of accidents, blaming the driver when asked for comment by Electrek:

Update: Tesla sent us the following statement:

“While we appreciate You You Xue’s effort to spread the word about Model 3, he was informed that Tesla does not yet have a presence in Eastern Europe and that there is no connectivity or service available for vehicles there. In addition, Model 3 has not yet been approved and homologated for driving outside of the U.S. and Canada. Although we haven’t been able to retrieve any data from the vehicle given that the accident occurred in an unsupported area, Tesla has always been clear that the driver must remain responsible for the car at all times when using Autopilot. We’re sorry to hear that this accident occurred, and we’re glad You You is safe.”

Then, in a follow-up attempt at spin, Electrek also blamed the driver, proudly carrying the Tesla company line, all but suggesting that the driver is lying while allowing that this could be “the first example of Autopilot causing an accident”:

Electrek’s Take

When he says that it hits the median at an “exit fork”, the accident sounds reminiscent of the fatal Model X accident on Autopilot in Mountain View where it confused the median for a lane.

But in that case, Tesla showed that the driver had a lot of the time to take control.

While using Autopilot, the driver always needs to stay attentive and be ready to take control at all time.

As for You You’s accident, he makes it sounds like he had no time to respond as the car “suddenly veered”, but I have never seen Autopilot do that before.

I am not saying that You You is lying, but it is certainly a strange situation. Also, it looks like he was driving late at night and has been driving for days.

If it’s not a misuse of Autopilot and the system indeed “suddenly veered” into the median, then it might be the first example of Autopilot causing an accident, but I think we need to have the data logs before we get into that.

You You tweeted at Elon Musk immediately after the accident.

He has allegedly received no response, despite Musk having tweeted up a storm in the past week (instead of being on the factory floor all night, as he had previously claimed). You You later wrote on Twitter that Tesla’s “autopilot becomes pointless when you can’t even look away for just a second, when your attention has to be more undivided then when you drive the car yourself, when your role becomes that of a beta tester risking your life to test a faulty software.”

This tweet was followed by others in which the (former) Tesla enthusiast charges Tesla with not doing enough to address the problem, especially since he is now facing significant out-of-pocket expenses.

This all comes as the Center for Auto Safety and Consumer Watchdog recently sent the FTC a joint request to investigate how Tesla has marketed its “Autopilot” feature. Their letter to the FTC begins:

Two Americans are dead and one is injured as a result of Tesla deceiving and misleading consumers into believing that the Autopilot feature of its vehicles is safer and more capable than it actually is. After studying the first of these fatal accidents, the National Transportation Safety Board (NTSB) determined that over-reliance on and a lack of understanding of the Autopilot feature can lead to death. The marketing and advertising practices of Tesla, combined with Elon Musk’s public statements, have made it reasonable for Tesla owners to believe, and act on that belief, that a Tesla with Autopilot is an autonomous vehicle capable of “self-driving”.

Consumers in the market for a new Tesla see advertisements proclaiming, “Full Self Driving Hardware on All Cars.” They are directed to videos of Tesla vehicles driving themselves through busy public roads, with no human operation whatsoever. They see press releases alleging that Autopilot reduces the likelihood of an accident by 40%. They also hear statements like “the probability of an accident with Autopilot is just less” from Tesla’s CEO, Elon Musk. Or they hear him relate Autopilot in a Tesla to autopilot systems in an aircraft. Such advertisements and statements mislead and deceive consumers into believing that Autopilot is safer and more capable than it is known to be.

Tesla Autopilot – and this incident in particular – was recently discussed on a podcast over Memorial Day weekend, where analyst Chris Irons of Quoth the Raven Research and noted Tesla critic Montana Skeptic reviewed the details and the company’s response (the discussion starts at about 27:17).

Critics continue to point to Elon Musk’s statements during the Model 3 handover event, where he basically told a crowd that Tesla vehicles are capable of driving themselves, allowing operators to “watch movies, talk to friends [and] go to sleep” while at the wheel.

“In the future, really – the future being now – the cars will be increasingly autonomous. So, you won’t really need to look at an instrument panel all that often, you’ll be able to do whatever you want. You’ll be able to watch movies, talk to friends, go to sleep. Every Tesla being produced now, the Model 3, the Model S, the Model X, has all the hardware necessary for full autonomy.”

Electrek and Tesla have been taking a very matter-of-fact tone that the driver needs to be especially alert when Autopilot is on, despite videos put out by Tesla itself showing a driver with Autopilot engaged and his hands off the wheel.

Judging by You You’s statements about his recent experience, it doesn’t just seem that Autopilot is being marketed incorrectly; given Elon’s comments, this “amenity” may in fact represent an extra liability to the driver. Why doesn’t Tesla simply deactivate the erroneously named “autopilot” feature indefinitely, until the company stops, in You You’s words, letting humans risk their lives as beta testers of what is clearly faulty software?

At this point, one has to wonder: how many more accidents will it take before the company stops blaming drivers for chronic “autopilot” crashes, and how many more injuries and casualties will regulators need to document before taking action?
