Just days after two teens were killed in a dramatic, fiery crash involving a Tesla Model S in Fort Lauderdale, another Tesla sedan equipped with the semi-autonomous Autopilot feature rear-ended a fire department truck on Friday night while traveling at 60 mph (97 kph), apparently without braking before impact. As the Associated Press reports, police say it is not yet known whether Autopilot was engaged.

The cause of the Friday evening crash, in which a Tesla Model S struck a fire department mechanic truck stopped at a red light, was under investigation, said police in South Jordan, a suburb of Salt Lake City. Police added that light rain was falling and the roads were wet when the crash occurred.

“Witnesses indicated the Tesla Model S did not brake prior to impact,” the statement said.

According to South Jordan police Sgt. Samuel Winkler, the Tesla's airbags deployed in the crash and the driver suffered a broken right ankle.

Speaking to MarketBeat, Sgt. Winkler said that there was no indication the Tesla's driver was under the influence of any substance, and that information on what the driver may have told investigators about the circumstances of the crash likely wouldn't be available before Monday.

The crash, in which the Tesla driver was injured, came at a troubling time for Tesla: just one day earlier, federal safety agencies, including both the NTSB and NHTSA, had launched a probe into the performance of Tesla's semi-autonomous driving system, following the "horrific" Florida accident days earlier in which two teens died after they were trapped in the burning vehicle.

Even more ominously for Tesla, on Saturday competitor Waymo announced that Tesla's Matt Schwall had joined its self-driving car unit. According to Schwall's LinkedIn bio, he had been Tesla's "primary technical contact" with both the NTSB and NHTSA, suggesting the company's troubles with government regulators may be set to escalate.

Schwall's departure comes just hours after Tesla's chief of engineering, Doug Field, took an extended leave of absence to "spend time with his family," at the most critical time possible for the company, just as the Model 3 rollout begins in earnest.

Police said they had been in contact with the National Transportation Safety Board about the crash. NTSB spokesman Keith Holloway said he didn’t know whether the agency would get involved with the crash.

Meanwhile, in its latest broadside against Tesla, on Sunday morning the WSJ published a scathing critique of Tesla's Autopilot, "In Self-Driving Car Road Test, We Are the Guinea Pigs," in which it questioned the validity of Elon Musk's claims about Tesla's safety record:

Tesla says that its cars with autonomous driving technology are 3.7 times safer than the average American vehicle. It’s true that Teslas are among the safest cars on the road, but it isn’t clear how much of this safety is due to the driving habits of its enthusiast owners (for now, those who can afford Teslas) or other factors, such as build quality or the cars’ crash avoidance technology, rather than Autopilot.

In the wake of a fatal 2016 crash, which happened when Autopilot was engaged, Tesla cited a report by the National Highway Traffic Safety Administration as evidence that Autopilot mode makes Teslas 40% safer. NHTSA recently clarified the report was based on Tesla’s own unaudited data, and NHTSA didn’t take into account whether Autopilot was engaged. Complicating things further, Tesla rolled out an auto-braking safety feature—which almost certainly reduced crashes—shortly before it launched Autopilot.

As the WSJ also notes, “there isn’t enough data to verify that self-driving vehicles cause fewer accidents than human-driven ones.” A Rand Corp. study concluded that traffic fatalities already occur at such relatively low rates—on the order of 1 per 100 million miles traveled—that determining whether self-driving cars are safer than humans could take decades.
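To see why the Rand researchers arrive at timescales measured in decades, a rough back-of-the-envelope sketch helps. The snippet below uses a normal approximation to a one-sample Poisson rate test to estimate how many autonomous miles would be needed to demonstrate a lower fatality rate; the baseline of roughly 1 fatality per 100 million miles comes from the figure above, while the 20% improvement target, 95% confidence level, and 80% power are illustrative assumptions, not numbers taken from the study.

```python
from statistics import NormalDist

def miles_needed(baseline_rate, improvement, alpha=0.05, power=0.80):
    """Rough miles required to show an AV fatality rate is `improvement`
    lower than a known human baseline, using a normal approximation to a
    one-sample Poisson rate test (illustrative assumptions only)."""
    r0 = baseline_rate                       # human fatality rate per mile
    r1 = baseline_rate * (1 - improvement)   # hypothesized AV rate per mile
    z_alpha = NormalDist().inv_cdf(1 - alpha)  # one-sided significance
    z_power = NormalDist().inv_cdf(power)
    # Solve M*(r0 - r1) >= sqrt(M)*(z_alpha*sqrt(r0) + z_power*sqrt(r1)) for M
    return ((z_alpha * r0**0.5 + z_power * r1**0.5) / (r0 - r1)) ** 2

# ~1 fatality per 100 million miles, 20% assumed improvement
miles = miles_needed(1 / 100e6, 0.20)
print(f"{miles / 1e9:.1f} billion miles")    # on the order of 10+ billion miles
```

Under these assumptions the answer comes out in the tens of billions of miles; even a fleet logging 100 million autonomous miles a year would need far more than a decade to accumulate that much evidence, which is precisely the Rand study's point.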

To be sure, Musk won’t be happy with the article’s punchline:

What we do have is evidence—acknowledged in Tesla’s own user manuals—that Tesla’s semiautonomous driving system is easily fooled by bright sunlight, faded lane markings, seams in the road, etc. Researchers continue to document other ways to trick these systems, as well… The company promised a cross-country drive accomplished entirely by its self-driving tech sometime in 2017 but decided the system wasn’t yet ready.

While Tesla has promised to release safety data on its self-driving tech regularly starting next quarter, it remains unclear what kind of data it will release, “but experts say public sharing of data, from all makers of autonomous vehicles, is the only way to ensure proper evaluation of the safety of these new technologies.”

Meanwhile, with every new report of a crash involving a Tesla, especially one with Autopilot engaged, the public mood is turning increasingly sour on the new and still largely untested "autopilot paradigm," and all it would take to set the industry back for years is an adverse NHTSA ruling, one which, in light of Elon Musk's recent and numerous public meltdowns, looks increasingly probable.
