
Volvo engineer: Tesla Autopilot is ‘an unsupervised wannabe’

Tesla Model X
The auto industry’s shift toward self-driving technology is so far more of a trickle than a flood. Each year, cars get steadily smarter, incorporating semi-autonomous features like adaptive cruise control and automatic lane keeping, and Tesla’s Autopilot system is one of the most high-profile examples out there. That said, not everyone is a fan.

In a recent interview with The Verge, Volvo’s senior technical leader of crash avoidance, Trent Victor, had some harsh words for the system. Victor claims Autopilot is not as capable as Tesla alleges, and could even be dangerous because of it.


“It gives you the impression that it’s doing more than it is,” said Victor. “[Tesla’s Autopilot] is more of an unsupervised wannabe.”


Tesla categorizes its self-driving technology as Level 2 automation, which the National Highway Traffic Safety Administration defines as the automation of “at least two primary control functions designed to work in unison.” Unlike Level 4 — which requires no driver input at all — Level 2 still holds the driver responsible for maintaining control of the vehicle; even so, equipped Teslas can operate by themselves on the highway for relatively long periods. It is worth noting that in the press kit for the Model S’ 7.0 software update, Tesla describes Autopilot’s Autosteer function as a beta.

Tesla Autopilot

As for Volvo, the brand will launch its Drive Me vehicle program next year, putting 100 Level 4 automated XC90s into regular traffic in Sweden. According to Victor, Level 4 is a much safer option than Level 2 or Level 3 because there is always an electronic safety net if something goes wrong. At lower levels of autonomy, features like automatic lane keeping can simply switch off if there is no driver input for a set amount of time.


“In our concept, if you don’t take over, if you have fallen asleep or are watching a film, then we will take responsibility still,” said Victor. “We won’t just turn [autonomous mode] off. We take responsibility and we’ll be stopping the vehicle if you don’t take over. That’s a really important step in terms of safety, to make people understand that it’s only an option for them [to] take over.”

Self-driving technology is already more common than most people think, and that trend is only going to continue. Is Tesla Autopilot safe? Are self-driving cars safe at all? Let us know your thoughts in the comments.

Andrew Hard
Former Digital Trends Contributor
Teslas likely won’t get California’s new EV tax rebate

California seems eager to reassert itself not only as one of the largest economies in the world, but also as one where EVs will continue to thrive.

Governor Gavin Newsom has announced California will seek to revive state-tax rebates for electric vehicles should the incoming Trump administration carry out its plans to end the existing $7,500 federal incentive on EVs.

Hertz is selling used Teslas for under $20K, Chevrolet Bolt EVs under $14K

Tesla CEO Elon Musk recently nixed hopes of a regular Tesla model ever selling for $25,000.

But he was talking about new models. For car rental company Hertz, the race to sell used Teslas and other EVs at ever-lower prices is not only still on but accelerating.

Tesla posts exaggerate self-driving capacity, safety regulators say

The National Highway Traffic Safety Administration (NHTSA) is concerned that Tesla’s use of social media and its website makes false promises about the automaker’s Full Self-Driving (FSD) software.
The warning dates back to May but was made public in an email to Tesla released on November 8.
The NHTSA opened an investigation in October into 2.4 million Tesla vehicles equipped with the FSD software, following three reported collisions and a fatal crash. The investigation centers on FSD’s ability to perform in “relatively common” reduced visibility conditions, such as sun glare, fog, and airborne dust.
In these instances, it appears that “the driver may not be aware that he or she is responsible” to make appropriate operational selections, or “fully understand” the nuances of the system, NHTSA said.
Meanwhile, “Tesla’s X (Twitter) account has reposted or endorsed postings that exhibit disengaged driver behavior,” Gregory Magno, the NHTSA’s vehicle defects chief investigator, wrote to Tesla in an email.
The postings, which included reposted YouTube videos, may encourage viewers to see FSD-supervised as a “Robotaxi” instead of a partially automated, driver-assist system that requires “persistent attention and intermittent intervention by the driver,” Magno said.
In one of a number of Tesla posts on X, the social media platform owned by Tesla CEO Elon Musk, a driver was seen using FSD to reach a hospital while suffering a heart attack. In another post, a driver said he had used FSD for a 50-minute ride home. Meanwhile, third-party comments on the posts promoted the advantages of using FSD while under the influence of alcohol or when tired, NHTSA said.
Tesla’s official website also promotes conflicting messaging on the capabilities of the FSD software, the regulator said.
NHTSA has requested that Tesla revisit its communications to ensure its messaging remains consistent with FSD’s approved instructions, namely that the software provides only a driver assist/support system requiring drivers to remain vigilant and maintain constant readiness to intervene in driving.
Tesla last month unveiled the Cybercab, an autonomous-driving EV with no steering wheel or pedals. The vehicle has been promoted as a robotaxi, a self-driving vehicle operated as part of a ride-paying service, such as the one already offered by Alphabet-owned Waymo.
But Tesla’s self-driving technology has remained under the scrutiny of regulators. FSD relies on multiple onboard cameras to feed machine-learning models that, in turn, help the car make decisions based on what it sees.
Meanwhile, Waymo’s technology relies on premapped roads, sensors, cameras, radar, and lidar (a laser-based ranging sensor), an approach that can be very costly but has met the approval of safety regulators.
