Remember the Tesla on Autopilot that hit a stationary truck? The reason may surprise you. Tesla’s Autopilot is one of the most advanced driver assistance systems on the market today. It is designed to help drivers by automatically sensing its surroundings, steering, braking, and changing lanes on highways. However, like any technology, it is not perfect.
A viral video that surfaced on Twitter in 2020 shows a Tesla Model 3 smashing into a stationary, overturned semi-truck on a highway in Taiwan. The truck driver tried to warn the Tesla driver to slow down and notice the truck in the middle of the road. However, the Tesla on Autopilot did not slow down at all and crashed straight into the truck.
This is not the first incident of its kind. In May 2016, a Tesla Model S on Autopilot collided with a truck in Florida, killing the driver. This Tesla Autopilot tragedy highlights the potential risks of using any driver assistance system, no matter how good it is.
A Tesla Model 3 collided with an overturned semi-truck on a highway in Taiwan. The driver reportedly had Autopilot engaged when the incident happened. The Model 3 did not sense the stationary overturned truck and rammed into it on the highway.
So, why did this happen? Investigators believe that Tesla’s radar system failed to detect the truck because it was overturned and sitting stationary in the middle of the road. The Autopilot system might have confused the semi with stationary signboards and other fixed objects along the road.
The Autopilot system relies on radar to detect objects on the road ahead, so it is possible that the stationary, overturned semi-truck confused the radar, which then missed the truck entirely.
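One way to see why a radar pipeline can miss a stationary obstacle is to look at a common clutter-rejection heuristic. This is an illustrative sketch, not Tesla’s actual code: automotive radar measures relative (Doppler) velocity, and a simple filter that discards returns closing at roughly the car’s own speed will drop fixed clutter like signs and bridges, but also a parked truck.

```python
# Hypothetical sketch of stationary-return filtering in an automotive radar
# pipeline. A stationary object approaches the car at exactly the car's own
# speed, so it looks like fixed roadside clutter and gets dropped.

def filter_returns(returns, ego_speed_mps, tolerance_mps=1.0):
    """Keep only returns that appear to be moving relative to the road.

    Each return is (object_id, closing_speed_mps): the speed at which the
    object approaches the car.
    """
    tracked = []
    for object_id, closing_speed in returns:
        ground_speed = ego_speed_mps - closing_speed  # ~0 for stationary objects
        if abs(ground_speed) > tolerance_mps:
            tracked.append(object_id)
        # else: treated as fixed clutter and ignored
    return tracked

returns = [
    ("car_ahead", 5.0),          # closing at 5 m/s -> a slower moving vehicle
    ("overturned_truck", 30.0),  # closing at exactly ego speed -> stationary
]
print(filter_returns(returns, ego_speed_mps=30.0))  # -> ['car_ahead']
```

The overturned truck is filtered out precisely because it is not moving, which is why such systems lean on the cameras and neural network to catch stationary obstacles.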
This is a cautionary tale for anyone who uses any kind of driver assistance system, including Tesla’s autopilot. These systems are not perfect and they should never be used without supervision. If you’re using autopilot, or any other driver assistance system, always be aware of your surroundings and be ready to take control of the vehicle if necessary.
Why Did the Accident Happen?
Tesla uses cameras and front-facing sensors to make sense of obstacles along the route, so it is surprising that the 12 ultrasonic sensors could not detect the semi-truck lying in the middle of the road.
There are a few potential reasons why Tesla’s autopilot system might have caused the car to hit an overturned truck on a highway. The autopilot system may not have been able to properly detect the truck due to its position on the road.
Tesla’s Autopilot system may have misjudged the distance to the truck, thinking it was further away than it actually was. The system may also simply have failed to brake in time to avoid the collision. Any of these failures would be consistent with how the collision unfolded.
However, many automobile experts point toward a more crucial aspect of this case: machine learning. The Tesla neural network might have confused the semi-truck with other stationary objects along the highway. The cameras responsible for cross-referencing every object along the way should have known better, but they had likely never seen an overturned semi-truck before and may have classified it as a regular overhead signboard.
If Tesla had been using a system like LiDAR, the problem could potentially have been avoided. Whatever system is used should be able to cross-reference such situations and prevent fatal mishaps.
How Does Tesla Autopilot Work?
When it comes to self-driving cars, Tesla is leading the pack. The company’s Autopilot system is undeniably the most advanced on the market, and it’s constantly improving. But how does Tesla Autopilot work?
Cameras & Ultrasonic Sensors
The key to Tesla Autopilot’s success is its use of sensors and cameras. The system uses eight cameras to create a 360-degree view of the car. These cameras are supplemented by 12 ultrasonic sensors that can detect objects up to 16 feet away.
Tesla Autopilot also uses GPS to keep track of the car’s location. This information is used to create a map of the car’s surroundings. This map is constantly being updated as the car moves.
Tesla Neural Network
The final piece of the puzzle is Tesla’s neural network. This artificial intelligence system is constantly learning and improving. It processes the data from the sensors and cameras to make driving decisions.
This combination of sensors gives Tesla Autopilot a comprehensive view of its surroundings. But how is all that 2D sensor data combined into a 3D picture that Autopilot can act on? That is where machine learning comes into action.
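The 2D-to-3D step can be illustrated with the standard pinhole camera model: given a pixel location and an estimated depth (for example, one predicted by a neural network), the detection can be lifted into 3D coordinates around the car. The intrinsic values below are made up for illustration, and this is a generic computer-vision sketch, not Tesla’s implementation.

```python
# Minimal sketch of back-projecting a 2D camera detection into 3D, assuming
# a pinhole camera model with known focal lengths (fx, fy) and principal
# point (cx, cy). All numbers are hypothetical.

def pixel_to_camera_xyz(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) at an estimated depth into camera coordinates."""
    x = (u - cx) * depth_m / fx   # lateral offset, metres
    y = (v - cy) * depth_m / fy   # vertical offset, metres
    return (x, y, depth_m)        # z points straight out of the camera

# A detection at the image centre, 20 m ahead, maps to a point directly in front.
print(pixel_to_camera_xyz(u=640, v=360, depth_m=20.0,
                          fx=1000.0, fy=1000.0, cx=640.0, cy=360.0))
# -> (0.0, 0.0, 20.0)
```

The hard part, of course, is the depth estimate itself, which is exactly what the neural network has to learn from camera data alone.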
These systems can identify lane markings, other vehicles, traffic signs, and obstacles in the road. This information is then used to navigate the car safely. So, the next time you’re in a self-driving car, remember to thank the machine learning algorithms that are keeping you safe!
Tesla Autopilot is constantly evolving. The system is regularly updated with new features and improvements. Tesla is also working on adding new capabilities, such as the ability to change lanes automatically. As self-driving cars become more common, Tesla Autopilot will continue to lead the way.
How Many Types of Sensors Are There?
As autonomous driving technology continues to develop, so too do the sensors that are used to power it. Here is a look at some of the different types of sensors that are being used by automobile manufacturers for autonomous driving:
Light Detection and Ranging (LiDAR) is a sensor that uses laser light to map out the surrounding area. It is often used in conjunction with other sensors, such as cameras and radar, to provide a more comprehensive view of the environment.
Radar is a sensor that uses radio waves to detect objects in the surroundings. It can detect both stationary and moving objects, making it well suited for use in autonomous vehicles.
Ultrasonic sensors emit sound waves and measure the time it takes for them to bounce back off of objects nearby. This information can be used to detect obstacles in the environment and determine their distance from the sensor.
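The time-of-flight arithmetic behind an ultrasonic sensor is simple enough to sketch: the pulse travels to the obstacle and back, so the one-way distance is half the round trip at the speed of sound (roughly 343 m/s in air at 20 °C). The echo time below is a made-up example.

```python
# Sketch of the time-of-flight calculation an ultrasonic sensor performs.

SPEED_OF_SOUND_MPS = 343.0  # approximate speed of sound in air at 20 degrees C

def echo_to_distance_m(round_trip_s):
    """Convert a round-trip echo time to a one-way distance in metres."""
    return SPEED_OF_SOUND_MPS * round_trip_s / 2.0

# A 0.02 s echo corresponds to an obstacle about 3.43 m away.
print(round(echo_to_distance_m(0.02), 2))  # -> 3.43
```

For the ~16-foot (about 5 m) range quoted above, the round trip is under 30 milliseconds, which is why these sensors can update many times per second.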
Cameras are perhaps the most important sensor for autonomous vehicles, as they provide a real-time view of the surroundings. Camera systems often use computer vision algorithms to interpret the images they capture.
Global Positioning System (GPS) sensors are used to determine the precise location of the vehicle. This information is critical for autonomous vehicles, as it allows them to navigate safely and efficiently.
Each of these sensors plays an important role in the development of autonomous vehicles. As technology continues to evolve, new and improved sensors will likely be developed to further enhance the capabilities of these vehicles.
We all know that self-driving cars are the future. But what many people don’t realize is that machine learning plays a crucial role in making this autopilot technology possible.
Machine learning is a type of artificial intelligence that allows computers to learn from data, identify patterns, and make predictions. This is exactly what a self-driving car needs to make split-second decisions on the road.
Why Doesn’t Tesla Install LiDAR in Its EVs?
Tesla says that LiDAR is not an efficient technology for autonomous driving because it is expensive and the data it provides is not as useful as that from other technologies.
Some experts have argued that LiDAR is not necessary for autonomous driving and that Tesla’s decision to not use the technology may be more about cost than anything else. However, some believe that LiDAR is a critical component of self-driving cars and that Tesla’s decision could ultimately prove to be a mistake.
Recently, the Tesla Model Y achieved the highest overall score under Euro NCAP’s most demanding protocol to date, surpassing its past rounds of safety testing. The findings could be seen as a validation of Tesla Vision, as the tests demonstrated that even without radar, the Model Y is as safe as before, perhaps safer. Simply put, Tesla’s camera-only cars are as safe as any other vehicle, with or without extra sensors.
Euro NCAP Secretary General Michiel van Ratingen officially congratulated Tesla on its stellar performance, saying, “Congratulations to Tesla for a truly outstanding, record-breaking Model Y rating. Tesla has shown that nothing but the best is good enough for them, and we hope to see them continue to aspire to that goal in the future.”
The Tesla Model Y scored exceptionally well in the adult occupant, child occupant, and pedestrian protection categories. This strong performance is a result of Tesla’s commitment to safety: all of Tesla’s vehicles are designed with safety as a top priority. Model Y safety features include automatic emergency braking, lane-keeping assist, active park assist, 360-degree cameras, and sensors.
Only time will tell which side is correct, but one thing is for sure: Tesla’s decision to eschew LiDAR in favor of other technologies is certainly an interesting one.
Why Are Other Automobile Makers Using LiDAR?
LiDAR significantly simplifies the data interpretation process by providing instant 3D data about the autonomous vehicle’s surroundings. LiDAR sensors do not have to depend on the vehicle’s neural network to make sense of 2D data. Waymo and other leading automobile makers are counting on how effective LiDAR sensors can be for their vision of autonomous driving.
LiDAR, which stands for light detection and ranging, uses lasers to map out the surrounding environment. This information is then used by autonomous vehicles to navigate safely. Mercedes, BMW, Volvo, General Motors, GMC, and other leading automobile makers are already using LiDAR sensors for autonomous models.
Tesla is one automobile company that is not in favor of LiDAR sensors. Elon Musk, Tesla’s CEO, has even called the technology unnecessary and ugly. To some extent that is true, as the bulky LiDAR sensors are not very attractive. But size and looks should not matter when weighed against the safety they could bring.
Leading automobile makers across the globe favor LiDAR for the added benefit of instant data interpretation over relying on machine learning alone. However, Tesla and Toyota beg to differ and consider their machine learning and neural networks sufficient for their autonomous approach.
Elon Musk is not very pleased with LiDAR technology, but LiDAR could potentially save lives in confusing situations where the other sensors do not know what to do. The semi-truck collision is a reminder of how fragile depending on cameras and ultrasonic sensors alone can be. LiDAR sensors might well have detected the semi-truck in this situation. According to Tesla,
“All new Tesla cars have the hardware needed in the future for full self-driving in almost all circumstances. The system is designed to be able to conduct short and long-distance trips with no action required by the person in the driver’s seat.”
It’s important to remember that Tesla’s autopilot system is not perfect, and accidents like this can happen. However, Tesla is constantly working to improve the system and make it safer. Hopefully, future versions of autopilot will be able to avoid accidents like this one.
“The Model 3 Tesla car did not sense the stationery overturned truck and rammed into it on the highway.”
Since it was “stationery” the Tesla determined it was paper thin and safe to pass through.
Autonomous braking in all newer cars, mandatory within 2 years on all new cars in NA, would prevent this.
This just proves Teslas don’t truly understand their environment. They certainly cannot handle unknowns via a set of generic fallbacks.
The truck blocked view of the road, which could be seen extending to either side of it and grew in view as it was approached.
It doesn’t matter how “smart” their system is if it can’t figure out the fundamental concept of “obstruction”. Something is in the way. Don’t know what it is, but it is there. Don’t hit it.
The big error in this article is that this version of Tesla did have LiDAR. May of 2021 is when they announced that they were not going to be using LiDAR in the future.
So the premise of this article doesn’t make much sense.
The opposite of what you said is true. Musk has always been against the use of LIDAR. No Tesla ever built has a LIDAR in it.
Recently, Tesla bought a company that makes LIDAR and is considering putting it in their cars in the future, probably because of all the autopilot failures and crashes.
Tesla doesn’t own a LIDAR company… it does test some vehicles with LIDAR sensors as a comparison study, not to implement it.
If you look at the Independent testing, you will see that Tesla has had the best Safety record and assessment of its systems. “probably because of all the autopilot failures and crashes”, hmmm are you talking about recent failures and crashes, or what has happened since Autopilot’s release?
BTW, you are correct, Tesla has never used LIDAR in their vehicles. I think Al may be thinking about radar. Radar was active on this car at the time of the crash… Tesla hadn’t switched to a vision-only system at the time of the accident.
The article is talking about a crash that happened over 6 years ago, I think many things have changed since then….no?
Read this article again. This article is about a Model 3 in 2020. It references a crash in Florida in 2016, but it’s about 2020. Please read before spouting off.
Nope. Tesla announced that they would only use cameras, not radar. LiDAR isn’t radar. LiDAR is used to map the entire environment in 3 dimensions. Radar is used by many vehicles to do ranging of objects, particularly for adaptive cruise control. Tesla think they can achieve the same thing using just cameras.
Haha! Yes! Exactly, but don’t expect any naysayers to pay attention to your comment, or mine. BTW, it’s on the driver to pay attention and make appropriate decisions, still to this day, but we won’t mention that. We might lose our clickbait crowd.
Why didn’t the driver take control?
Probably sleeping or on the phone, as always.
Bob makes a good point. Why doesn’t the article mention an attentive driver as a solution? This would have helped in this situation. Despite its misleading name, Autopilot is a Level 2 assistance system.
I’m a pilot and Autopilot is not misleading in any way. In fact a German Court just recently confirmed this meme.
Autopilot is an assistance tool, not something designed to take responsibility.
That may be the legal interpretation in Germany, but if you ask almost anyone who has heard of it, they will most likely tell you they think it is truly a “hands off autonomous system”.
That’s one of the reasons there are crashes. Too many people misunderstand its capabilities because of the name.
Have you used Tesla FSD? You must first agree to all of the conditions, and there are many, including keeping your hands on the wheel at all times, remaining attentive, and being able to take over control at a moment’s notice…
Looks like the car tried to brake just where the truck driver is standing. So my question is did the driver override the system?
2 year old story… really must be a slow day at home?
There are many fingers pointing at the technology.
Where were the driver’s mind and eyes that they didn’t apply the brakes?
You cannot blame the tools. They are only helpers. People need to take responsibility for their actions. Or in this case, their inaction.
In today’s society people blame criminals and go after the law enforcement. So, you think they won’t go after the tool?
That Tesla had the old version of Autopilot. The newest Tesla Full Self-Driving software would have detected the truck. The new software is made possible by Tesla Vision, which is based on an entirely new neural network platform and is currently available to cars in the US only. It is expected to start rolling out to cars in other countries in 2023. You can watch videos of Tesla owners in the US, and it is unbelievably skillful at driving along all kinds of roads, even with pedestrians, parked cars, multiple lanes, etc.
Why this happened? The driver wasn’t driving. 100% driver negligence.
No…. Why the f*** didn’t the driver see the big ass truck sitting in the middle of the road and not stop the car? What the f*** does this have to do with autopilot?????
TWO YEAR OLD STORY
Tesla vision in cars TODAY already solved this issue. Tesla doesn’t just sit still and do nothing. They innovate at a pace not seen before.
More regurgitating of old stories to spread Tesla FUD as usual via clickbait. Because nothing gets more clicks these days than negative articles with the words Tesla or Elon in their titles.
All a bit dated. Neural nets are the leading tech because they can continuously improve with new data. But they are not so good with things they have never seen before, hence this accident.
The issue is that our expectations are that AI should not only be as good as a human, but better. We tolerate human accidents, but not machines. So to get autonomous driving working we need the cars to be able to do things human can’t do. For example, ability to see around corners to see if a vehicle is approaching and at what speed. For that, we should have public real time LIDAR maps being broadcast from traffic lights or nearby power poles. If we had that, we could make autonomous driving 100% safe.
We do not tolerate human accidents; that’s why we have dangerous driving laws. People go to prison, people get banned from driving, and they have to take their driving test again after the ban period. So no, we do not tolerate them. If they kill someone, it’s murder.
This happened 4-5 years ago I think. Software and vision systems have improved tremendously since then.
The problem was NOBODY was driving the car!!! Automation should assist not take control.
Firstly, there are no self-driving cars currently registered for road use outside of experimental platforms.
Next, where was the driver? Only a moron would allow an untested system to self-drive.
Next, a Tesla warns you that it is not self-driving and that you have to be in control of your car.
Next, ultrasonic sensors are well known for not picking up white flat surfaces. I have owned several cars with front and rear sensors for parking, and they never detect my white garage door.
Stop blaming cars for what is nothing but dangerous driving. Forget all these driving aids; you, the driver, are totally in charge of that car and you are responsible. I ask again, where was the driver?
Please read up on the new Tesla occupancy networks and how this whole article is already behind the times.
I can’t believe no one has seen the obvious about the reason the Tesla didn’t see the truck. Think stealth fighter. The truck is on its side exposing the flat roof to the car, but at an angle. So the Tesla’s radar was scattered to the side – not reflected back to the car. It would have seen a vertical line at the right edge of the road created by the top rear edge of the truck’s box where it transitions to the back where the doors are. That would look like a sign post to it. The cab of the truck would have generated a signature but it’s in the other lane so the Tesla would ignore it.
I think that first bit of smoke you’re seeing was the driver mashing the brakes. Autobraking doesn’t apply the brakes hard enough to lock them up.
This is a realistic explanation of this event.
It would be, if that were how radar worked. Radar scatters like light scatters. Things don’t become invisible when they move away from the perpendicular.
I have a bigger concern with autopilot-type systems. While I am in favor of new technology and improvements such as this, I still have a safety concern.
My concern involves the possibility that the software used in this system could be compromised internally or externally. Could someone hack into this system and cause problems? It happens every day with software programs, including those at the Defense Department and other government agencies, where hackers have gotten through supposed safeguards. In addition, what if there were a deranged programmer working within the organization and on the software? Is it possible he or she could put in some code that could cause Tesla cars to go out of control on a mass basis? Again, this has been done with other products and programs.
I have not seen or heard anyone address these concerns, and unless I could be convinced that nothing could happen, I think I will still take control of my car.
Was there a passenger in the car, i.e., the driver? So why didn’t the person responsible for that journey avoid the collision, or is the future of road use going to be full of “ACCIDENTS”?
It’s not Tesla’s fault.
The safety message said “ Driver must have both hands on the steering wheel at all times and it is not a self driving car either “
It’s the driver’s fault always.
Tesla’s autopilot is not nearly as good as GM’s. No way I would buy a Tesla with all these deaths.
What happens if insects collide with the car and block the view of one or more of the cameras?
I’m inclined to think it’s harder to sit and second-guess the autopilot than to just drive the damn thing.
As an aside, don’t forget to trade it in before your 80,000 km warranty on everything except the drive train runs out.
There are several different forward-facing cameras. Rear and side cameras don’t generally get hit by bugs. The car automatically uses the windshield wipers and sprayers to clear the windshield of any debris, so if a bug blocks one of the cameras, it will just turn the wipers on and clean the windshield until its view is restored. I have a Tesla that is past warranty, and it is fine. No issues since the warranty ran out.
Wait a minute, fudsters, this is news over 2 years old! New improvements have been made. Must be a slow weekend in the news pub.
Prove to me that the Tesla was on autopilot at the time of the crash!!!!
Why not ask a more relevant question like:
Why is Tesla or any other company permitted to beta test an autonomous or self-driving vehicle on public roads, where the public around the beta test is not aware of the potential risk and has not been asked to participate in a beta test in which they could be killed, injured, or suffer loss of property?
Motor vehicle collisions are one of the leading causes of death in the United States. The earlier we develop an autonomous system that anyone can use and that is safer than the average driver, the sooner we can drastically reduce these deaths. The only realistic way to speed up this process is to have a lot of cars on the road collecting data that can be used to train neural networks. Forcing companies to use small manufactured courses, or simulations, will guarantee that it will not happen during our lifetimes and will kill many more people than allowing Tesla to do what it is doing. Anytime something happens with a Tesla, it makes the news. It looks like it even keeps being brought up years later, like this article has. There is an overrepresentation in the media of accidents that Teslas cause. Teslas are actually many times safer than the average driver already, but part of that is because the driver is supposed to also pay attention. So you basically have two different sets of eyes on the road. If the driver is doing their job and paying attention, it can only be safer than the driver alone. The system has improved over time, and I think it’s in everyone’s best interest to allow this to continue. The fact that this author had to reach back two years to find something interesting to talk about should tell you how safe it is currently.
Oh my God, this was such a long time ago. Tesla used to use radar. The problem with Doppler radar is that when something is stationary, it looks like everything else the radar can see, like the road. This is part of why Tesla did away with radar and why it switched to using videos to train its neural networks instead of still images. Instead of just classifying objects as a car, a truck, a pedestrian, or whatever, it now also determines the location of every pixel its cameras receive. So even if it doesn’t recognize something as a truck, because it’s never seen an overturned truck in that exact position before, it will still know that there is something in its path. This example is irrelevant when discussing Tesla’s current system.
Most cars would have done the same thing. If you’re going a certain speed, stopped objects are ignored. Their vision software 100% saw it but isn’t allowed to react to it. The driver was given warnings, just like in all other cars. Lidar would also have seen it but would have fallen into the same scenario of ignoring it because it was stopped. Radar in car systems by default ignores everything that isn’t moving. If it didn’t do that, it would read side walls and bumps in the road as immovable objects.
ARBE 4D radar is better than LIDAR and cheaper. Musk needs to be willing to consider outside sourcing.
Just another Tesla hit piece. This happened over 2 years ago and you’re rehashing it now? How stupid. The Tesla FSD software is light years ahead of where it was 2 years ago.
Sidebar: whenever you engage EAP or FSD, it warns you to keep your hands on the wheel and be ready to take over. Obviously that didn’t happen. FAKE NEWS!!!!!
The most technologically advanced commercial aircraft have “full” autopilot from takeoff to landing. These planes require fully alert pilots ready to take control. Yes, they have many passengers to protect, but it does not matter how many people are flying or driving. These “autopilot” driving systems will require an alert driver for the foreseeable future. No sleeping while you’re supposed to be driving!
What an unbelievably stupid and click-baity article, filled with vapid speculation and zero substance. It’s websites like this that highlight the despicable capitalist agenda of the modern internet.
The driver said it was on autopilot to avoid fines. But it’s clear that it was NOT on autopilot. You can also see that from the history/“black box” of the car. If it was on autopilot, the ultrasonic sensors would have alerted the car and it would have braked. So it was NOT on autopilot. It’s clear. So stop assuming this just because the driver said so. I am sure he wanted to avoid penalties. I fully trust the ultrasonic sensors/radar. They can’t do anything wrong. I am sure the driver was watching something on his phone or messaging and did not see it.
Well, unfortunately for the woke crowd, self-driving cars are not anywhere near perfect. Cars were originally designed to be driven and enjoyed. Battery-powered cars will never replace gasoline-powered vehicles, and failed self-driving computers will never replace a real person! Suggestion: wake up, all you woke clowns, before it’s too late!
People who have achieved nothing compared to what Elon has are becoming AI scientists over a silly internet story (without knowing what happened to the driver inside the Tesla; maybe he passed out and pressed the accelerator) and judging him and Tesla.
What a shame
Purnima is out of touch with Tesla FSD progress and spouts outdated nonsense.