
2018 Tesla Model X Autopilot Crash Story Takes New Turn with Fresh Evidence

Tesla is headed to trial again, and newly surfaced evidence could turn the tables against the electric automaker this time around.

Among the most widely scrutinized Autopilot accidents is the 2018 crash that killed Apple engineer Walter Huang.

Even though the car was on Autopilot, it failed to navigate properly and crashed into a highway barrier. The crash drew widespread attention because it exposed shortcomings in how Tesla’s Autopilot/ADAS (Advanced Driver Assistance System) worked at the time.

It reignited debate over whether these driver-assistance technologies are truly safe and reliable, and prompted closer scrutiny of what went wrong in the crash. Now, lawyers for the Huang family are challenging claims made by Tesla and statements made by Elon Musk.

Tesla’s defense lawyers even made the unusual argument that some statements attributed to Musk could be deepfakes and may never have been said by him at all. Bizarre, right?

Here’s the whole thing!

Tesla Model X Autopilot Crash

New Evidence in 2018 Tesla Autopilot Trial

Six weeks before the first known fatal US accident involving Tesla’s Autopilot in 2016, Jon McNeill, Tesla’s president at the time, decided to try out his Model X.

He then emailed his feedback to Sterling Anderson, who was in charge of automated driving at Tesla, copying Elon Musk. McNeill’s email, which has not been reported before, is now being used in a legal case against Tesla over Autopilot.

In the email, dated March 25 of that year, McNeill praised how well the Autopilot system worked, even comparing its smoothness to that of a human driver. However, he also admitted that he became so comfortable using Autopilot that he ended up missing exits because he was immersed in emails or calls, acknowledging it wasn’t the safest way to use the system.

“I got so comfortable under Autopilot, that I ended up blowing by exits because I was immersed in emails or calls (I know, I know, not a recommended use),”

This email has become a point of interest for plaintiffs’ lawyers in a wrongful death lawsuit in California. They questioned a Tesla witness about whether the company was aware that drivers might not pay attention to the road while using its driver-assistance system.

What the Lawyers Are Saying

Last year, Tesla’s lawyers suggested that Elon Musk’s previous remarks on self-driving safety might have been fabricated using deepfake technology. They made this argument to justify why Musk should not have to answer questions in a lawsuit alleging that Tesla’s Autopilot software caused the fatal crash.

Despite this argument, a judge tentatively ordered Musk to provide a deposition.

The trial over the fatal highway crash near San Francisco has drawn renewed attention to Tesla’s Autopilot system. Tesla contends that Huang was playing a video game before the crash and was not using Autopilot correctly.

Lawyers for Huang’s family want to establish whether Tesla knew that drivers, including its own president, Jon McNeill, might not use Autopilot the right way, and what the company did to keep drivers safe.

The trial is set to begin next week, and experts in self-driving technology consider it a major challenge for Tesla, particularly given the company’s position that Autopilot is safe only if drivers do their part.

Some experts argue that if Tesla knew drivers might misuse Autopilot, it should have made the system safer. Others expect Tesla to argue that Huang deliberately used Autopilot improperly.

Fatal Tesla Crashes

The crash that killed Huang is one of nearly 1,000 accidents in the United States in which Autopilot was suspected to be involved. The U.S. National Highway Traffic Safety Administration (NHTSA) has examined about 956 crashes in which Autopilot was reportedly in use.

The agency has also opened more than 40 investigations into Tesla accidents involving automated driving systems; 23 people died in those crashes.

Under pressure from NHTSA, Tesla recalled more than 2 million vehicles equipped with Autopilot in December 2023, adding safety warning alerts through a remote software update.


Is Tesla’s Autopilot System Safe?

Autopilot and other smart driving tools in electric cars have changed how many people drive today. These technologies make driving easier and safer, and give us a taste of what fully self-driving vehicles might be like in the future.

But we should be careful not to depend too much on Autopilot.

Why? Because several unexplained Autopilot-related fatal accidents have called its safety into question. The system is great as an aid, but drivers still need to pay attention and be ready to take control if something goes wrong.


If Huang’s family wins this case, it could set a precedent for future Autopilot lawsuits. Tesla is currently facing many lawsuits over Autopilot, several of which involve fatalities.

Relying too heavily on technology carries serious risks, as incidents like this crash show. Drivers must remain vigilant at all times, ready to take control at a moment’s notice.

Let’s see what new Autopilot revelations this trial brings.

Purnima Rathi
Purnima has a strong love for EVs. Whether it's classic cars or modern performance vehicles, she likes to write about anything with four wheels, especially if there's a cool story behind it.

