Autopilot not used in Texas Tesla crash

On Monday, Tesla CEO Elon Musk tweeted a denial that his company’s automated driving systems were involved in a fatal crash in Spring, Texas.

Two federal agencies, the National Highway Traffic Safety Administration and the National Transportation Safety Board, are now investigating the crash.

In multiple press interviews, local police said their preliminary investigation indicated that nobody was behind the wheel of the 2019 Tesla Model S when it veered off the road, hit a tree and burst into flames.

In the tweet, Musk wrote: “Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD. Moreover, standard Autopilot would require lane lines to turn on, which this street did not have.”

Tesla sells its automated driving systems under the brand names Autopilot and Full Self-Driving, or FSD. It also releases a “beta” version of FSD software to some customers who have purchased the premium FSD option, which costs $10,000.

Tesla Autopilot and FSD are not capable of controlling the company’s electric vehicles in all normal driving circumstances, and Tesla’s owner’s manuals caution drivers to use them only with “active supervision.”

Autopilot, which now comes standard in Tesla vehicles, does not always identify lane markings correctly; for example, it can mistake sealed pavement cracks or bike lanes for lane markers.

The system can also be misused or abused by drivers. In a stunt video recently shared on social media, a teen driver demonstrated that he could leave the driver’s seat while his Tesla’s Autopilot remained engaged.
