Tesla’s recent push towards autonomous driving has drawn plenty of attention. Autopilot already has a number of incidents to its name (no crashes, of course), and since October 2020, the new FSD (Full Self-Driving) Beta release has also been making waves. Elon Musk has said that the FSD suite can evolve towards Level 5 autonomy, which would later pave the way for an autonomous RoboTaxi service, one of Musk’s long-standing visions.
However, for the current FSD to evolve into a completely autonomous system, it has to work on the toughest of streets. The FSD Beta has shown itself to be comfortable navigating California’s traffic, edge cases included. But there is a huge difference between the edge cases in California and those in some other countries. Southeast Asia, in particular, has a reputation for some of the most unpredictable traffic conditions in the world. Tesla’s Autopilot recently encountered one such situation in Vietnam. The system tried its best to make sense of the conditions, but in the end, it asked the driver to take over.
A Tesla Model X owner was driving on Autopilot through thick traffic in Ho Chi Minh City, Vietnam. There were vast numbers of cars and motorcycles on the road. The two-wheelers, in particular, were causing trouble, weaving in and out of lanes. The situation was quite chaotic, as evidenced by the system’s visualization, which was going haywire.
The car kept battling through the traffic, crawling at speeds of around 6 kph to stay close to the vehicle in front. But as the volume of traffic increased, it couldn’t keep going. It finally flashed a message saying “TAKE OVER IMMEDIATELY”.
The Implications of Such Situations
While you may see this situation as a negative, it is, in fact, some of the most positive feedback a vehicle can give. Artificial intelligence will only be able to replace humans when it makes the correct decisions, and admitting that the system isn’t ready for this kind of situation is definitely a correct decision. It shows how much the system prioritizes the safety of its passengers.
The entire conversation around autonomous vehicles is mainly about safety. It isn’t just about performing certain tasks; it is about performing them as safely as, or even more safely than, a human would. We have seen instances where the car thought taking a turn would be safe, but the driver felt the timing wasn’t right and intervened. Of course, there are also clips where the FSD suite has easily manoeuvred difficult streets.
But Tesla has to think about the entire world when it talks about actual self-driving. You cannot claim that your car is completely autonomous and then say it won’t drive through the small lanes of India, Vietnam or other Southeast Asian countries, where people play fast and loose with traffic laws. The road planning in some of these countries is also not up to the mark. A four-wheeler as big as a full-sized SUV could easily emerge from the narrowest of lanes, one that isn’t even visible until you actually reach its entrance.
The street seen in the above video is an example of what to expect in the rural regions of these countries. Such situations will be a common occurrence for Tesla as it expands its operations. Of course, in the video, the Model X in question was on Autopilot, not the FSD Beta. But if Tesla wants to develop the FSD suite as far as possible, it has to roll the Beta out to the rest of the world as well.
The easiest way to develop an autonomous system is through real-life testing. Companies hoping to achieve full autonomy need billions of miles simulated and millions of miles actually driven on their systems. But this doesn’t mean driving the same routes in the same cities over and over again. Tesla has to spread out the reach of the Beta release; only then will the system encounter the widest possible range of scenarios. Many of the edge cases in smaller countries are far riskier than anything you witness in California. Tesla can only benefit from expanding the Beta release to other countries.
What do you think about this incident? What are your views on Tesla’s approach to the testing of the FSD Suite? Let us know in the comments.