In one of the more alarming incidents involving Tesla’s Full Self-Driving (FSD) suite, a 2021 Model 3 nearly crashed into construction barricades in Arizona. The incident occurred last month, on March 17th, when Joseph Richards was driving in Maricopa County, Arizona.
He was on a relatively traffic-free road where barricades warned of ongoing road construction. With FSD activated, the vehicle was travelling at 45 mph (72 kph), the posted speed limit. The system, however, failed to detect the warning signs and nearly crashed into a barricade. Joseph took control immediately and swerved away, driving over the roadside pavement before getting back onto the road. Fortunately, he came away without a scratch.
Joseph managed to record the incident and has shared the video with us. You can watch it here:
It is scary to think what could have happened had Joseph intervened even a little later. At 45 mph, a collision with the barricades could have been serious.
Joseph bought his 2021 Model 3 at the end of last year; the vehicle was less than three months old when the incident occurred. According to him, he was running the latest version of the FSD software at the time. He is currently on version 2021.4.15, which Tesla released a few days ago.
Tesla Full Self Driving Suite
Plenty has been written about the Full Self-Driving suite Tesla released last year. The suite is, of course, still in beta, meaning the driver must stay alert at all times, even with self-driving activated. So far, the news has been mostly good.
We have seen FSD complete a 30-minute trip without a single intervention. It passed an incredibly hard night-time test in California with flying colours, and it handled the Lombard Street test in San Francisco, a 180-metre stretch with a natural grade of 27 per cent.
Of course, there have been bugs as well. One owner found the system reading a street number as the speed limit. In an incident in Vietnam, the vehicle got thoroughly confused by heavy, erratic traffic and asked the driver to take over. YouTuber Chuck Cook tested his vehicle making unprotected left turns into oncoming traffic and had to intervene almost 50% of the time. The system has its fair share of bugs, but the general consensus has been that the problems stem more from confusing signs and lane markings than from faults in the system itself. And with constant updates rolling out from Tesla’s platform, these bugs are quickly being ironed out.
This latest incident may be a bigger issue, though. You could dismiss it as a one-off, but an autonomous system cannot afford even one-off errors. A single major incident can derail a project and set it back in terms of regulatory approval. Tesla is already facing heat from other companies, which claim that Tesla’s approach to achieving full autonomy isn’t quite right.
The incident is now a month old, and Joseph has reported no further problems, so it does seem to have been a one-off. But as I mentioned before, for an autonomous system to be considered reliable, it has to be airtight. Companies release beta programs for exactly this purpose, but Tesla’s approach will look very different the moment a real accident occurs, and this incident came uncomfortably close.
Tesla has said that its next major update will base the system entirely on computer vision, minimising the use of radar and bringing the system as close as possible to how a human perceives the road. Let’s hope this update fixes bugs like this one, and that there are no further incidents.