
Tesla Vision vs Ultrasonic Sensors: Will Tesla’s New Sensor Tech Pass With Flying Colors?

The automotive industry has seen a rapid advancement in sensor technology over the years, with top manufacturers like Tesla constantly searching for new ways to improve the safety and convenience of their vehicles.

One such example is Tesla’s decision to switch from ultrasonic sensors to vision-based sensors for their parking assist system. The aim was to provide drivers with a more accurate and reliable system while also saving on component and assembly costs. However, despite these intentions, the reviews for Tesla’s new system have been mixed, with some drivers reporting issues with accuracy and reliability.

No, we are not trying to scare you away. This article explores Tesla’s shift to vision-based sensors for parking assist and the challenges the company has faced in implementing this new technology.


What Is Tesla Vision?

Tesla Vision is a suite of advanced driver-assistance systems (ADAS) that uses computer vision, machine learning, and artificial intelligence to provide a range of features to drivers. In late 2022, Tesla made the decision to remove ultrasonic sensors (USS) from its vehicles, leaving some drivers without the park assist feature they had grown accustomed to.

In response to these complaints, Tesla has now introduced a new vision-based park assist feature with its latest software update, which does not require the use of USS hardware.

Tesla’s vision-based park assist feature has been rolling out gradually, and some users, including the YouTuber behind the channel It’s Only Electric, have already shared their experiences with the new system.

The channel posted a video comparing Tesla Vision Park Assist with the ultrasonic sensors side by side, providing valuable feedback on the effectiveness and accuracy of Tesla Vision.

The latest update from Tesla has introduced a new vision-based park assist feature that utilizes the 360-degree camera system to detect nearby objects, replacing the previous ultrasonic sensors.

Unlike the USS, which only beeped when the driver approached an obstacle in front or in reverse, the new camera-only Park Assist option can visually display the distance to nearby objects. However, the new feature is currently only available for cars without USS and is limited to FSD buyers in North America who have applied for Beta access.


Tesla Vision Park Assist vs Ultrasonic Sensors

The YouTuber gave a detailed comparison of a 2022 Tesla Model Y Performance with USS and the latest QuickSilver Model Y Performance without USS, both running the latest software version. (By the way, we also love how pretty the QuickSilver paint looks.) Here is how the two systems behave when compared side by side across different tests.

Garage Test: Front Entrance

Tesla Vision Only

At first, the readout shows 74 cm, then updates to about 30 cm of remaining space as the YouTuber drives forward again. However, as he keeps creeping ahead on Vision alone, the car never even tells him to stop despite only 60 cm of space between the car and the bench set against the wall. Later, the car measures the distance incorrectly again, claiming there are 86 cm of space left in front when that is not the case in reality. So, not very accurate there.

  • The distance on Tesla Vision: 76 cm
  • The real distance from the bench: 10 cm

Ultrasonic Sensor

The sensors here work consistently, asking the driver to stop with 30 cm showing on the screen; when he steps out and checks, the tape measure reads roughly 31.9 cm.

  • The distance on Tesla: 30 cm
  • The real distance from the bench: 31.9 cm

Garage Test: Front Entrance With Obstacle

Tesla Vision Only

He then checks the accuracy with an obstacle placed in front of the car, using Vision only, to create a more complex situation for the park assist. The vision system does not show any measurement for the obstacle and asks him to stop while still quite far away. He keeps moving forward, and the Tesla asks him to stop just short of the obstacle but still does not recognize it. The real distances here measure:

  • Bench: 47 cm
  • Obstacle: 26 cm

Ultrasonic Sensor

The sensors work consistently again: the system notices the obstacle and asks him to stop at 31 cm. When he measures, the real distances are:

  • Bench: 50 cm
  • Obstacle: 30 cm

Garage Test: Backing Up

Tesla Vision With Backup Camera

The backup camera combined with the vision system works well here. The car first asks to stop at around 45 cm, even though it is not actually inside the garage yet and has plenty of space left. However, as he continues backing into the garage, the system tells him to stop at 30 cm.

  • The distance on Tesla: 30 cm
  • The real distance from the bench: 34.9 cm

Tesla Vision Backup Camera With Dirt

Vision measures distance surprisingly well despite the dirt on the camera, although the touchscreen warns that the camera cannot work to its full potential and advises caution. The system shows around 35 cm but does not ask him to stop for quite some time.

  • The distance on Tesla: 30 cm
  • The real distance from the bench: 22.6 cm

Ultrasonic Sensor With Backup View

The Tesla Model Y with ultrasonic sensors works effortlessly alongside the backup camera view and with consistent accuracy. The sensors ask him to stop as soon as the screen reads 50 cm.

  • The distance on Tesla Sensor: 50 cm
  • The real distance from the bench here measures: 38 cm

Box Test

Tesla Vision Only

When he tries the same test with the QuickSilver Model Y without USS, Tesla Vision twice fails to recognize the obstacle ahead. Even when he backs up further to give the cameras a clearer view of the box, the Model Y shows no trace of it on the system.

The YouTuber then stacks a second box on top of the first to see whether the Model Y will find it. The car briefly recognizes the boxes, but any sign of an obstacle vanishes as he moves the Tesla forward. He keeps going and shows how the Model Y without USS smoothly crashes into the boxes.

Tesla Blind Spot

The front camera on the Vision-only Model Y is not angled steeply enough, which creates a dead zone (the Tesla blind spot) in front of the hood and significantly limits the view.

Ultrasonic Sensor

The YouTuber runs the same test on the Model Y with ultrasonic sensors, placing a cardboard box in front of the car to measure its first reaction, then locking the car and unlocking it to try again.

  • The distance on Tesla Sensor: 30 cm
  • The real distance from the bench here measures: 38 cm

After the car has been locked and unlocked for the second attempt, the ultrasonic sensors do not quite sense the obstacle ahead. The YouTuber hits the box because the car fails to recognize it, and the car only registers the box once he puts it in Park and shifts back into Drive.

Pavement Test

When driving towards a traditional pavement, the QuickSilver Model Y without USS recognizes the pavement, measures the distance at 50 cm, and asks the driver to stop once Vision calculates the distance to be 30 cm. However, the real distance at that point is 50 cm.

He then locks the car and puts it into Drive again to see whether it remembers the pavement, and it does. He concludes that when objects and obstacles are long or wide enough to span the front of the car, the system can measure them within the front camera’s coverage.

How Accurate Is Tesla Vision?

The YouTube review of Tesla’s new vision-based park assist feature found it lacking compared to the previous ultrasonic sensors. After testing both systems for a full day, the reviewer stated that Tesla Vision Park Assist was “obviously not good enough yet” and suggested that Tesla may need to add more cameras or radar to improve it.

The frontal view of the camera system did not extend all the way down to the top of the bonnet as the reviewer had expected, which could be a contributing factor to its limitations.

The use of the B-pillar cameras in Tesla’s vision-based park assist system is an interesting feature that sets it apart from the ultrasonic sensors. These cameras help improve visibility of long objects that span the entire front of the car. While ultrasonic sensors can detect obstacles in front of the car, they may not be able to detect long objects that are very close to it. The B-pillar cameras help address this issue and give the driver additional visibility.

Despite the company’s optimistic expectations, the reviews for Tesla’s vision park assist feature have been poor, with many drivers complaining about the system’s accuracy and reliability. Even drivers who had prior experience with Tesla’s ultrasonic sensors have reported difficulties with the new system.

One common complaint is that the vision-based system struggles to accurately detect objects in low light conditions or when there are reflective surfaces nearby. Another issue is that the system can be slow to react, leading to near-miss situations where the driver has to take control to avoid a collision.

Why Does Tesla Rely Less on Sensors?

Tesla does trust sensors, and in fact, the company relies heavily on a combination of sensors to gather data about its vehicles’ surroundings. However, Tesla also believes that the data gathered by sensors alone may not always be sufficient for safe and efficient autonomous driving.

This is why the company has developed its Tesla Vision system, which uses the car’s 360-degree camera coverage to gather data about the environment, then processes that data using advanced artificial intelligence algorithms to provide a more complete and accurate understanding of the vehicle’s surroundings.
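Tesla has not published how Vision turns camera frames into the distance figures shown on the screen, but the basic geometry of estimating distance from a single forward-facing camera is simple enough to sketch. The toy Python example below is purely illustrative: it assumes a flat road, a pinhole camera, and made-up values for the camera’s height, tilt, and focal length, none of which come from Tesla. It also shows why a camera that is not tilted down steeply enough cannot see the ground directly in front of the hood, which is the blind-spot problem described earlier.

import math

# Purely illustrative, not Tesla's actual code: a toy flat-ground,
# pinhole-camera distance estimate. The camera height, tilt, and focal
# length below are assumptions chosen only to show the general idea.
CAMERA_HEIGHT_M = 1.2                 # assumed camera height above the road
CAMERA_PITCH_RAD = math.radians(5.0)  # assumed downward tilt of the camera
FOCAL_LENGTH_PX = 1000.0              # assumed focal length in pixels
IMAGE_HEIGHT_PX = 960                 # assumed image height in pixels

def ground_distance_from_pixel(bottom_row_px: float) -> float:
    """Estimate distance to the point where an object meets the ground.

    bottom_row_px is the image row (pixels from the top) of the object's
    lowest visible edge. Rows further below the optical centre correspond
    to ground points closer to the car.
    """
    # Angle of this pixel row below the camera's optical axis.
    offset_px = bottom_row_px - IMAGE_HEIGHT_PX / 2
    angle_below_axis = math.atan2(offset_px, FOCAL_LENGTH_PX)
    # Add the camera's own downward pitch to get the angle below horizontal.
    angle_below_horizontal = CAMERA_PITCH_RAD + angle_below_axis
    if angle_below_horizontal <= 0:
        # The ray never reaches the ground: the object's base is hidden,
        # which is what happens in the dead zone in front of the hood.
        return float("inf")
    return CAMERA_HEIGHT_M / math.tan(angle_below_horizontal)

# Example: a box whose base appears low in the frame vs. near the centre.
print(f"{ground_distance_from_pixel(900):.2f} m")  # roughly 2.3 m with these numbers
print(f"{ground_distance_from_pixel(500):.2f} m")  # roughly 11 m with these numbers

The takeaway from this simplified geometry is that anything whose base sits below the camera’s field of view has no usable pixel row at all, which is consistent with the low-box failures the YouTuber observed.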

Elon Musk, Tesla’s CEO, is known for his unique approach to designing and manufacturing electric vehicles. His “best part is no part” mentality means that he seeks to simplify and streamline everything about his cars, from the design to the technology used. This approach has led to the minimalist design and user-friendly interface that Tesla vehicles are known for.

When it comes to sensors, Musk has taken a controversial stance by removing them from Tesla vehicles and rejecting the use of LiDAR technology. He’s been quoted as saying that LiDAR is a “fool’s errand,” and has criticized companies that rely on it, claiming that it’s too expensive and impractical.

Instead, he believes that vision-based systems are the key to safe and efficient autonomous driving, which aligns with his overall philosophy of reducing complexity, cost, and weight wherever possible.

What to Expect in the Future?

European Tesla drivers with newer vehicles will continue to rely solely on their own spatial orientation and visual perception when parking, as the new USS-less Park Assist feature is currently only available to Full Self-Driving (FSD) subscribers based in the US or Canada.

It remains to be seen how effective the new system will be compared to the previous one, and whether Tesla will eventually expand the rollout to other regions and customers. The automated system is designed to improve the safety and convenience of driving by providing features such as lane departure warnings, automatic emergency braking, adaptive cruise control, and self-parking.

When it comes to which parking assistance technology wins here, there’s no clear winner. Our tip is to back into narrow spots using your backup camera, which is still great and provides a clear view of what’s behind you.

You don’t need to rely solely on Tesla Vision for that, just trust your own eyes. However, if you prefer to pull in forwards, Tesla Vision can provide additional assistance. Just remember to always bring a spray bottle of water and mild detergent to keep your cameras clean.

Dirty cameras can affect the accuracy of Tesla Vision, so it’s important to keep them clean. This is a simple and cheap solution that can help you avoid scratches or accidents that could damage your car.

Why Did Tesla Get Rid of Ultrasonic Sensors?

The vision-based parking assist system was designed to help Tesla drivers park their cars without hitting anything. The system uses cameras mounted on the car’s exterior to provide a bird’s-eye view of the surrounding environment. The cameras capture real-time video footage, which is processed by the car’s computer to detect nearby objects and provide guidance to the driver.

Tesla’s switch from ultrasonic sensors to vision-based sensors for parking assist was aimed at providing a more accurate and reliable system while also saving on costs. However, the reviews for the new system have been quite mixed if not outright poor, with many drivers reporting issues with accuracy and reliability.

Tesla continues to work on improving the system, but it remains to be seen if it will be able to meet the expectations of its customers.

Purnima Rathi
Purnima has a strong love for EVs. Whether it's classic cars or modern performance vehicles, she likes to write about anything with four wheels, especially if there's a cool story behind it.
