Tesla’s Project Rodeo Uses High-Risk Autonomous Vehicle Testing

4 November 2024

Project Rodeo is an autonomous vehicle safety testing initiative run by Tesla.

A Business Insider exposé claims that Project Rodeo pushed test drivers to purposefully expose Tesla’s self-driving software to hazardous conditions in order to observe how the system would react.

Tesla’s Risky Safety Testing

The most contentious part of Project Rodeo is the work of “critical-intervention” test drivers, who are expected to let the software keep operating even after it makes a mistake and to wait until the last possible moment to intervene. According to the report, several critical-intervention drivers said they felt pressured to put themselves in risky situations and feared their jobs would be jeopardized if they intervened too soon.

The report describes incidents during the testing program in which vehicles swerved into other lanes, ran red lights, and exceeded speed limits.

Test Vehicles Commit Traffic Safety Violations

One critical-intervention driver reportedly drove 35 mph below the speed limit on an expressway and allowed their car to speed through yellow lights, all without turning off the autonomous driving software.

Five Tesla test drivers reported narrowly avoiding collisions in 2024, including one near-miss with a group of pedestrians.

Autonomous Driving Data Collection

Tesla’s safety program gathers real-world data about high-risk scenarios in order to train its self-driving AI models. Other autonomous vehicle companies employ similar strategies: Waymo, for instance, computes and examines “counterfactual disengagements,” meaning the scenarios that might have occurred if a driver hadn’t stepped in.
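To make the concept concrete, here is a minimal, hypothetical Python sketch of a counterfactual disengagement check. The `DisengagementEvent` fields, the `would_have_collided` helper, and the 1.5-meter safety threshold are illustrative assumptions for this example only; they do not describe Waymo’s or Tesla’s actual pipeline.

```python
import math
from dataclasses import dataclass

@dataclass
class DisengagementEvent:
    """A moment where a safety driver took over from the autonomous system.

    Hypothetical structure: real systems log far richer state.
    """
    timestamp: float
    vehicle_speed_mps: float                        # speed at takeover, m/s
    planned_path: list[tuple[float, float]]         # path the software intended to follow (x, y in meters)
    obstacle_positions: list[tuple[float, float]]   # obstacles observed at takeover (x, y in meters)

def would_have_collided(event: DisengagementEvent, safety_radius_m: float = 1.5) -> bool:
    """Counterfactual check: had the driver not intervened, would the
    planned path have passed within safety_radius_m of any obstacle?"""
    for path_point in event.planned_path:
        for obstacle in event.obstacle_positions:
            if math.dist(path_point, obstacle) < safety_radius_m:
                return True
    return False

# Example: a takeover where the planned path would have passed
# within 0.7 m of an obstacle, inside the assumed safety radius.
event = DisengagementEvent(
    timestamp=1730700000.0,
    vehicle_speed_mps=12.0,
    planned_path=[(0.0, 0.0), (5.0, 0.2), (10.0, 0.5)],
    obstacle_positions=[(10.0, 1.2)],
)
print(would_have_collided(event))  # True
```

Aggregating these counterfactual outcomes across many disengagements is what lets a company estimate how often interventions prevented genuine hazards, as opposed to merely cautious takeovers.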

There is no evidence, however, that Waymo deliberately encouraged its drivers to hold off on intervening in hazardous situations.
