One of Tesla’s internal teams has a terrifying job. Project Rodeo is staffed with drivers whose task is to probe the limits of the automaker’s self-driving software, which is available on vehicles including the Model 3 and Model S. They are trained to wait as long as possible before taking control, even when they know a crash is coming, in an effort to fully test the system. Sometimes those tests have involved unwitting drunk pedestrians on public streets.
Tesla
Tesla, Inc. is an American electric vehicle manufacturer widely credited with driving the EV revolution. Through the Model S and subsequent products, Tesla has innovated and challenged industry conventions on numerous fronts, including over-the-air updates, self-driving technology, and automotive construction methods. Tesla was considered the world’s most valuable car brand as of 2023, and the Model Y was the world’s best-selling car that year, but the brand’s greatest achievement is arguably the Supercharger network of EV charging stations.
- Founder: Martin Eberhard, Marc Tarpenning
- Headquarters: Austin, Texas, USA
- Owned By: Publicly traded
- Current CEO: Elon Musk
Automaker Pressures Drivers To Avoid Taking Over
Business Insider spoke with nine current and former test drivers on the team internally known as Project Rodeo. The name comes from their mission: to ride their Tesla for as long as possible before taking over. They are trained to wait because the longer the car stays in control, the more data the team collects.
This makes sense when you’re testing on a closed course. Where obstacles are artificial and usually soft. Where the pedestrians are dummies who won’t get hurt. Where running off the pavement means wide spaces, not real trees.
But Project Rodeo drivers, according to the report, are doing this on public roads, pushing the software to its extremes around other drivers, cyclists, and pedestrians who don’t know they are being used as subjects for Musk’s self-driving dreams.
While none of the test drivers said they had been in a crash, they reported some very dangerous conditions. One told BI that they “sometimes ventured into their city’s bar district late at night to see how Tesla’s FSD software reacted to drunk patrons spilling out after last call.” A driver from San Francisco recalled testing how close FSD would get to pedestrians at crosswalks on the Stanford University campus.
Test Drivers Told “That Was Perfect” After Dangerous Near Misses
The drivers, also called “critical-intervention test drivers,” reported serious pressures to push the FSD system as far as possible. BI says five current and former employees said they weren’t told to break the law, but received feedback if supervisors thought they intervened too early.
One driver said they let the car speed through yellow lights and travel 35 miles per hour under the limit on an expressway, just to keep the system “driving” the car.
“I vividly remember this guy jumping off his bike. He was terrified,” one driver told BI, describing a near miss at a roundabout. “The car lunged at him, and all I could do was stomp on the brakes.” They said the trainer was pleased by the incident. “He told me, ‘That was perfect.’ That was exactly what they wanted me to do.”
BI spoke with safety experts about these practices, including Mark Rosekind, a former NHTSA administrator and current chief safety innovation officer at autonomous taxi company Zoox. “There are very few rules around autonomous testing and a lot of dependency on self-reporting,” he said. “If companies aren’t reporting, it’s hard to know what’s going on.”
What Tesla is doing doesn’t appear to be standard practice. Two former Waymo employees told BI that the company’s comparable team was limited to closed-track testing. Former employees of GM’s Cruise said that when they did open-road testing, they were instructed to take over much sooner. Cruise, notably, pulled all of its vehicles after one of them struck and dragged a pedestrian who had been knocked into its path by another car.
Philip Koopman, an autonomous driving expert at Carnegie Mellon University, called Tesla’s approach “irresponsible.” He told BI that these scenarios should be conducted on a closed course. “By allowing the software to continue misbehaving to the point a test driver needs to avoid a crash, Tesla would be imposing a risk on other road users who have not agreed to serve as test subjects,” Koopman said.
It is important to note that while Tesla instructs its test drivers to let the cars drive, its own documentation reminds customers of FSD’s limitations, stating that the “features require active driver supervision and do not make the vehicle autonomous.”
Source: Business Insider