Monday, 30 August 2021

Another Tesla Allegedly Collides With Emergency Vehicle in Autopilot Mode

Stop me if you’ve heard this one: A Tesla owner in Orlando collided with a parked police vehicle early on Saturday morning. The cruiser was on the side of the road, lights flashing, while the officer helped the driver of a stranded Mercedes. According to the Tesla driver, her car was in Autopilot mode, but it failed to see the police vehicle and ended up hitting both the cruiser and the Mercedes. An investigation is underway, but this doesn’t look great for Tesla, which is already fielding questions from federal regulators about this very issue; NHTSA opened an investigation into Autopilot crashes involving parked emergency vehicles earlier this month.

Tesla rolled out Autopilot to Model S owners in 2014, and the feature has since become a major selling point of its vehicles. Even the cheaper Model 3 and Model Y include Autopilot functionality, and owners can upgrade to “Full Self-Driving,” which adds automatic lane changes, Summon, traffic light and stop sign recognition, and the ability to set a destination and have the car handle highway driving (Navigate on Autopilot).

Following the accident, the driver told authorities that the vehicle was “in Autopilot.” Police are investigating, and Tesla will probably have its say as well. But even if that’s true, the driver will still be held responsible. Tesla’s self-driving technology is not true self-driving; it’s what’s known in the industry as SAE Level 2 automation. That means the car can control speed and lane position at the same time, and can take corrective actions like applying the brakes. However, the driver must remain aware of their surroundings, ready to take over from the car at a moment’s notice, and that seems to be where things break down.
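To make that division of labor concrete, here’s a minimal sketch of a Level 2-style control loop. Everything in it is hypothetical and illustrative (the names, the steering gain, and the 20-meter following gap are all made up, and none of this is Tesla’s actual software); the point is simply that the system only acts on hazards its perception layer actually flags, and at Level 2, catching the ones it misses is the driver’s full-time job.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    lane_center_offset_m: float   # lateral error from lane center, in meters
    lead_vehicle_gap_m: float     # distance to the vehicle ahead, in meters
    obstacle_detected: bool       # True only if perception flags a hazard

def level2_step(frame: SensorFrame) -> dict:
    """One control-loop tick: lateral and longitudinal control, nothing more."""
    steering = -0.1 * frame.lane_center_offset_m   # steer back toward center
    brake = frame.obstacle_detected or frame.lead_vehicle_gap_m < 20.0
    return {"steering": steering, "brake": brake}

# The catch: if perception never flags the hazard (say, a parked cruiser at an
# odd angle), the loop happily keeps lane-keeping at speed. Nothing in a
# Level 2 system hands that failure to anyone but the human behind the wheel.
frame = SensorFrame(lane_center_offset_m=0.2, lead_vehicle_gap_m=150.0,
                    obstacle_detected=False)
print(level2_step(frame))  # {'steering': -0.02..., 'brake': False}
```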

A person driving along and watching the road will almost always notice an emergency vehicle with flashing lights, so you’d expect that a computer vision system watching from multiple angles at all times could do the same. We’re starting to understand that these systems are still not perfect, but they’re good enough, often enough, that people become complacent. Humans are simply not as good at passively monitoring a task as we are at actively performing it. When someone does need to take over from Autopilot, they might not realize it until it’s too late.

Self-driving cars don’t become truly self-driving until SAE Level 3. At that point, the car should be able to handle an entire trip under certain conditions, and it will proactively alert the driver when they need to take over. While Tesla’s system does not meet this threshold, the marketing and features (the company even calls it “Full Self-Driving Capability”) can make people feel like they’re getting a true autonomous experience. Perhaps that’s a dangerous illusion.
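For contrast, here’s an equally hypothetical sketch of the supervisory logic that separates Level 3 from Level 2 (the condition names are invented for illustration): the system itself tracks whether it’s inside its operational design domain and requests a handover before it gets out of its depth, instead of silently counting on the driver to notice.

```python
def level3_supervisor(weather_ok: bool, on_divided_highway: bool,
                      perception_confident: bool) -> str:
    """Decide who is driving based on the operational design domain (ODD)."""
    in_odd = weather_ok and on_divided_highway and perception_confident
    if in_odd:
        return "SYSTEM_DRIVES"      # the driver may genuinely disengage
    # Outside the ODD, a Level 3 system must request a takeover with enough
    # lead time, falling back to a minimal-risk maneuver (e.g., pulling over)
    # if the driver doesn't respond.
    return "REQUEST_TAKEOVER"

print(level3_supervisor(weather_ok=True, on_divided_highway=True,
                        perception_confident=False))  # REQUEST_TAKEOVER
```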
