
Counterview: Tesla Vehicles Will Drive Themselves on Autopilot Even When There is No One in the Driver’s Seat

Recently, a Tesla Model S reportedly on Autopilot crashed into a tree, killing both of its occupants. The most striking detail is that, according to reports, neither of the two people in the car was in the driver’s seat at the time. This has led many to wonder whether Tesla should allow Autopilot to be activated at all when there is no one in the driver’s seat.

Consumer Reports Test

Consumer Reports is an American nonprofit consumer organization dedicated to unbiased product testing. Its engineers decided to trick a Model Y into driving on Autopilot without anyone in the driver’s seat. Keep in mind that this is an extremely dangerous test, and Consumer Reports carried it out on a closed test track. Remarkably, the system allowed them to activate Autopilot without any warning that there was no one in the driver’s seat.

Jake Fisher, senior director of auto testing at CR, stated:

In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all. Tesla is falling behind other automakers like GM and Ford that, on models with advanced driver assist systems, use technology to make sure the driver is looking at the road.

Regarding the crash itself, Harris County Precinct 4 Constable Mark Herman was at the scene. He told CR that he is almost certain no one was in the driver’s seat when the vehicle crashed. Tesla CEO Elon Musk tweeted Monday evening that data logs recovered from the crashed Model S “so far show Autopilot was not enabled”. He also suggested that it would not have been possible to activate Autopilot on the road where the crash took place because it lacks painted lane lines.

CR wanted to see whether they could prompt their Tesla to drive down the road without anyone in the driver’s seat.

How They Ran The Test

Fisher engaged Autopilot while the car was in motion on the track, then set the speed dial to 0, bringing the car to a complete stop. He then placed a small, weighted chain on the steering wheel to simulate the weight of a driver’s hand, and slid over into the front passenger seat without opening any of the vehicle’s doors, since opening a door disengages Autopilot. Using the same steering wheel dial, Fisher reached over and accelerated the vehicle from a full stop. He stopped it by dialling the speed back down to zero.

The car drove up and down the half-mile lane of the track repeatedly, yet the system never flagged that the driver’s seat was empty, that no hands were touching the steering wheel, or that there was no weight in the seat. Fisher says, “It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient.”

Disclaimer: CR could perform this experiment because it has a private test track, specifically configured to evaluate driver assistance systems. Safety crews were standing by, and at no time did the vehicle exceed 30 mph. Fisher and Kelly Funkhouser, CR’s program manager for vehicle interface testing, are trained test drivers who are extremely familiar with Autopilot, having each evaluated multiple Tesla vehicles over tens of thousands of miles.

Making Sure Drivers Pay Attention

Activating Autopilot, or even the “Full Self-Driving” option, does not make the car self-driving; truly self-driving cars don’t yet exist for consumers to buy. CR and other safety advocates, including the Insurance Institute for Highway Safety, recommend that all vehicles combining steering automation and adaptive cruise control also include systems to make sure drivers are present and looking at the road. GM already does this: an infrared camera monitors the driver.

Fisher says that this test shows Tesla’s Autopilot is incredibly risky. Autopilot can make mistakes. It does, however, evaluate whether it can handle the current driving conditions, and when it cannot, it tells the driver to take over, as it did in this incident in Vietnam.

As with many vehicles, the only way a Tesla determines whether a driver is present is by monitoring steering wheel inputs. If there is weight on the wheel, even if the driver’s hands are elsewhere, the vehicle assumes a driver is present and paying attention.
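To illustrate why a steering-wheel-only check is such a weak proxy for driver presence, here is a minimal Python sketch. It is not Tesla’s actual implementation; the torque threshold and function name are assumptions made purely for illustration.

# Hypothetical sketch of a presence check based only on steering-wheel torque.
# NOT Tesla's actual logic; it only illustrates the weakness that a weighted
# chain on the wheel can exploit.

TORQUE_THRESHOLD_NM = 0.3  # assumed minimum torque treated as "hands on wheel"

def driver_assumed_present(wheel_torque_nm: float) -> bool:
    """Return True if this naive check would treat the driver as present."""
    return abs(wheel_torque_nm) >= TORQUE_THRESHOLD_NM

hand_on_wheel = 0.5     # torque from a driver's hand resting on the wheel
weighted_chain = 0.5    # torque from a small weighted chain, no driver at all

print(driver_assumed_present(hand_on_wheel))    # True
print(driver_assumed_present(weighted_chain))   # True -- the check cannot tell the difference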

Funkhouser says it might be possible to defeat the active driving assistance systems of other manufacturers’ cars in the same way. If they lack technology that monitors whether a driver is present and paying attention, the system can be bypassed. “Even if the driver has a hand on the wheel, it doesn’t mean they are looking at the road”, she says.

What Companies Need To Improve

BMW, Ford, GM, Subaru, and others use camera-based systems that track a driver’s eye movements and head position to make sure they are looking at the road. Some vehicles, including those with GM’s Super Cruise, can even slow to a stop automatically if the driver ignores repeated warnings to look at the road.
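As a rough illustration of this kind of escalation, here is a short Python sketch. The states, timings, and the eyes-on-road signal are assumptions for illustration only, not any manufacturer’s actual parameters.

# Illustrative escalation logic: warn, warn harder, then slow to a stop.
from enum import Enum, auto

class AttentionState(Enum):
    ATTENTIVE = auto()
    WARNING = auto()
    FINAL_WARNING = auto()
    SLOWING_TO_STOP = auto()

def next_state(eyes_on_road: bool, seconds_ignored: float) -> AttentionState:
    """Escalate based on how long the driver-facing camera sees the driver looking away."""
    if eyes_on_road:
        return AttentionState.ATTENTIVE        # reset as soon as the driver looks back
    if seconds_ignored < 5:
        return AttentionState.WARNING          # visual and audible alerts
    if seconds_ignored < 10:
        return AttentionState.FINAL_WARNING    # stronger alerts (e.g. seat vibration)
    return AttentionState.SLOWING_TO_STOP      # driver unresponsive: bring the car to a stop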

At a minimum, Tesla could use the weight sensor in the driver’s seat to determine whether a human is sitting behind the wheel before allowing Autopilot to engage. These sensors are already used for seat belt warnings and airbags, among other things, so it wouldn’t be a major leap to program a vehicle to turn off features like cruise control if it senses that the driver’s seat is empty, Funkhouser says.
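A minimal sketch of the kind of gate Funkhouser describes, assuming the driver-assist software can read the same occupancy signal already used for seat-belt warnings; the function names and weight threshold below are hypothetical, not Tesla’s actual interfaces.

# Hypothetical gate on driver-assist engagement using the driver-seat occupancy sensor.
MIN_OCCUPANT_WEIGHT_KG = 25  # assumed threshold separating a person from light cargo

def may_engage_driver_assist(seat_weight_kg: float, seat_belt_latched: bool) -> bool:
    """Allow engagement only if the driver's seat is plausibly occupied by a person."""
    return seat_weight_kg >= MIN_OCCUPANT_WEIGHT_KG and seat_belt_latched

print(may_engage_driver_assist(seat_weight_kg=0.0, seat_belt_latched=True))   # False: empty seat
print(may_engage_driver_assist(seat_weight_kg=70.0, seat_belt_latched=True))  # True: seated driver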

Some Tesla vehicles do include a driver-facing camera, but it does not monitor the driver in real time. The camera, mounted above the rearview mirror in Model 3 and Model Y vehicles and called the cabin camera, can capture and share a video clip of the moments before a crash or automatic emergency braking (AEB) activation. According to Tesla’s website, this helps the company develop future safety features and software enhancements. The Model S, however, does not have this cabin camera.

Why This is an Important Issue

Driver monitoring systems will be part of the requirements for Europe’s Euro NCAP automotive safety program as of 2023. With active driving assistance systems becoming increasingly automated, NHTSA should take a similar step, says William Wallace, manager of safety policy at CR:

These systems come with a real risk of people checking out from the driving task. Fortunately, the technology exists to make sure their eyes are on the road, and it’s improving and becoming more prevalent. NHTSA should ensure that every car with an active driving assistance system comes with this standard.

A Counter-View: Are CR’s Testing Methods Questionable?

Tesla analyst Pierre Ferragu takes a different view, however. He is not at all impressed with the approach Consumer Reports took in its tests. He said that no one in their right mind would go through that series of loopholes just to activate Autopilot and sit in the passenger’s seat. Ultimately, one can devise a test for any car that makes its operation look unfavourable. He says it is hard to see how one could blame Tesla for owners tricking the FSD sensors.

Ferragu is a notable Tesla analyst and also a valuable critic of the company: he gives candid assessments and speaks frankly when the automaker isn’t at its best. Ferragu himself experienced Autopilot on a 70-mile (113-km) highway drive in February, and he was impressed.

Our Opinion

It is surprising that Tesla is not doing more to solve this issue. The company should adopt more effective driver monitoring systems, which have performed well in CR’s tests. A driver assist system undoubtedly makes an electric vehicle more attractive, but manufacturers should pair it with a driver monitoring system. As long as the vehicle is not genuinely self-driving, the driver must pay attention to the road.

At the same time, it is important to understand the approach used in these tests. No one is realistically going to put their vehicle into Autopilot mode, shift into the passenger’s seat, and control the Autopilot settings from there. But Ferragu should also understand that the purpose of the test was to show that the system can run without anyone in the driver’s seat. Few owners may do this deliberately, but shutting the system down when there is no driver could save lives if such a situation ever arises by accident.

Mihir Tasgaonkar
A mechanical engineer who loves reading and writing about new technologies in the automobile industry.
