
Phoenix may get Tesla robotaxis an expert calls ‘completely defective’

Self-driving Teslas may soon compete with Waymos on Valley roads. One engineer says they're prone to “catastrophic failure.”
Policymakers are set to meet with Tesla executives about possibly letting the company test self-driving taxis in Arizona. Ivan Radic/Flickr/CC BY 2.0


Tesla wants to bring autonomous “robotaxis” to Arizona, despite safety concerns from critics. 

On Aug. 8, representatives from the governor’s office, Arizona Department of Public Safety, state universities and the Arizona Department of Transportation, among others, are set to meet with Tesla behind closed doors to discuss the company’s application to bring robotaxis to the Grand Canyon State.

“There are many, many, many, many serious problems,” Dan O’Dowd, founder of the Dawn Project and prolific Tesla critic, told the Arizona Mirror about Tesla’s self-driving and assisted driving technology. 

O’Dowd, an engineer who has created operating systems used in military aircraft such as the F-35 and B-1B, has been conducting tests on Tesla’s technology and sounding the alarm about what he says is a dangerous product being tested on the roadways. 

The president and CEO of Green Hills Software, O’Dowd unsuccessfully ran for U.S. Senate in 2022, with a campaign focused on criticism of Tesla’s automated driving systems.

Tesla recently applied to bring its autonomous “robotaxis” to Arizona, which would allow the company to conduct testing and operations with and without human drivers, similar to what Waymo has done in Phoenix.

The Self-Driving Vehicle Oversight Committee, a public body created under Gov. Doug Ducey in 2015 to advise ADOT and the governor’s office, was disbanded after only two public meetings and renamed the Connected and Automated Vehicle Team. 

“The team serves in an informal, information-sharing role and doesn’t make decisions on applications to operate autonomous vehicles,” ADOT spokesman Bill Lamoreaux told the Mirror. “This arrangement has been in place since the Arizona Self-Driving Vehicle Oversight Committee, created in 2015, concluded its work during the previous administration. The group typically has discussions monthly.”

That previous committee has a record of one public meeting in 2016, but according to ADOT, the committee met twice before wrapping up its work and becoming the CAV Team. 

The CAV Team, which is an “informal working group,” has “no requirement for attendance” and no “concrete list of members,” but membership includes many of the same people who were on the previous committee, including ADOT, DPS and representatives from the governor’s office.

The Aug. 8 closed-door meeting with Tesla representatives comes as the company faces increased scrutiny from critics like O’Dowd, who are concerned about a growing number of incidents involving Teslas and their self-driving technology, while regulators appear to be asleep at the wheel.

“They should at least do a minimal test to verify this,” O’Dowd said of state and federal regulators. “All of our tests are reproducible tests and there is catastrophic failure.”

And data analyzed by the Mirror shows that Tesla already has a track record of issues in the state.

Tesla CEO Elon Musk speaks during the New York Times annual DealBook summit on Nov. 29 in New York City.
Michael Santiago/Getty Images

Patchwork of regulations

Laws and regulations that govern automated vehicles differ vastly from state to state.

And the federal government has regulations of its own via the National Highway Traffic Safety Administration, which has powers that states do not, such as the ability to issue certain types of vehicle recalls. Those factors make for a confusing landscape for companies and policymakers alike.

Back in 2015, Ducey signed an executive order allowing the first testing and operation of autonomous vehicles in the state. Three years later, an Uber self-driving car killed Elaine Herzberg, and shortly after, Ducey signed another executive order that put more regulations in place on testing the technology.

Since 2015, Arizona has seen a boom in companies that are testing autonomous vehicles on its roads. ADOT currently lists Aurora, Beep, Cruise, Gatik, May Mobility, Nikola, Pony.ai, Nuro, Stack, Torc, Waabi and Waymo as all testing in the state. 

Waymo first started testing in Arizona in 2017, but it kept employees behind the wheel and in the passenger seat to monitor the autonomous driver. It wasn’t until 2020 that the company began operating its vehicles fully autonomously, eventually offering the service via an app available to users within a certain geographic area of Phoenix.

But there is more to the technology than just fully autonomous vehicles.

The systems

Automated driving systems, also known as ADS, is the term used to describe what most would recognize as a “robotaxi,” or fully autonomous vehicle. The Waymo cars that drive around Downtown Phoenix use ADS technology.

Alternatively, advanced driver assistance systems, or ADAS for short, only assist a driver who is expected to remain responsible for driving the car. This can include systems like adaptive cruise control, blind-spot detection, parking assistance and any system that “assists” the driver.

The Autopilot and self-driving functions on Tesla vehicles are considered ADAS, not ADS, systems. The NHTSA keeps track of incidents reported to it that involve both ADS and ADAS systems.

A review of that data by the Mirror found that, of the 97 incidents involving ADAS systems in Arizona reported from 2021, when the reporting period began, to May of this year, 81 involved a Tesla.

The data on the incidents is also largely incomplete, as the field that would identify which version of the system was in use during each incident reads “[REDACTED, MAY CONTAIN CONFIDENTIAL BUSINESS INFORMATION].”
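NHTSA publishes the incident reports collected under its Standing General Order as downloadable spreadsheets, so counts like the Mirror’s can in principle be rechecked. Below is a minimal sketch of such a tally in Python; the file name and the “State” and “Make” column names are assumptions about the download’s layout, not a description of the Mirror’s actual methodology.

    import pandas as pd

    # Hypothetical local copy of NHTSA's ADAS incident-report data.
    df = pd.read_csv("SGO-2021-01_Incident_Reports_ADAS.csv")

    # Keep only incident reports filed for Arizona.
    az = df[df["State"].str.strip().str.upper() == "AZ"]

    # Note: NHTSA's data can include multiple versions of the same report
    # (amendments), so a careful analysis would deduplicate by report ID first.
    total = len(az)
    tesla = (az["Make"].str.strip().str.upper() == "TESLA").sum()
    print(f"Arizona ADAS incident reports: {total}; involving a Tesla: {tesla}")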

Doesn’t recognize a school bus

O’Dowd’s group made national and international headlines recently when it performed a test on a Tesla in Austin, an area where the company is looking to deploy its “robotaxis.” 

The test had a Tesla driving down a road toward a school bus with its stop sign extended and a series of dummies dressed as schoolchildren nearby. As the Tesla, in self-driving mode, got closer, it didn’t slow down. Then a member of O’Dowd’s team pulled a string to make one of the dummies dart out in front of the Tesla.

The Tesla never stopped. 

His group has performed similar tests multiple times since the first one in 2023, a test that turned out to be prophetic.

In 2023, O’Dowd’s group bought airtime during the Super Bowl to run an ad showing its findings of how Tesla’s self-driving mode ignored “do not enter” signs and school bus stop signs. Tesla CEO Elon Musk responded online that the ad would boost the brand.

A month later, 17-year-old Tillman Mitchell was hit by a Tesla while exiting a school bus. The Tesla that hit him was allegedly in self-driving mode. 

“I don’t think they’ve even worked on it because we did a New York Times full-page ad disclosing the school bus thing, Tesla ignored us,” O’Dowd said. “We went out and we pushed that everywhere, still nobody has taken any action.” 

Tesla did not respond to questions about O’Dowd’s research.

O’Dowd has been conducting research into Tesla for years now, and he is concerned that the technology is being falsely marketed to consumers as safe. In one incident in 2022, a Tesla in full self-driving mode in Santa Barbara began to make a left turn into oncoming traffic.

O’Dowd has even hired driving instructors to take a Tesla on the same route a teenager would follow for their driving test. The Tesla had four failures that the driving instructor said would’ve been “auto fails” for a human. 

“We can’t let this stand. We can’t just let somebody get away with building completely defective products,” O’Dowd said. 

‘We have to do it without them’

Self-driving cars are here to stay. 

In places like San Francisco, they’re a major tourist attraction.

But regulation remains a patchwork, and it is not entirely clear how the technology will be overseen by state and federal governments.

“Regulators need to have recall authority,” Mark MacCarthy, an expert on technology, law and policy at the Brookings Institution, told the Mirror. “There should also be liability on the manufacturer for unreasonable driving performance of their self-driving cars.”

One of the major pushes behind self-driving technology is the promise that it will make roadways safer by removing human error from driving. However, MacCarthy has pointed out that may not be entirely the case.

Last year the Association for Computing Machinery warned policymakers that they should not make this assumption because, simply put, computers make mistakes that humans don’t.

Additionally, the code for the machines is written by fallible human beings.

MacCarthy says that a “third layer of protection” for consumers in this space is “proactive regulation,” where an entity like the NHTSA would make sure that self-driving vehicles meet certain standards. 

“That is nowhere in sight at this point but it will have to happen at some point,” MacCarthy said. 

Data analyzed by the Mirror found that Waymo vehicles account for 337 of the 403 incident reports in the NHTSA ADS database in Arizona from 2021 to May of this year. The majority of those incidents were caused by human error, typically a human-driven car colliding with a Waymo.
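A rough sketch of the same check against the ADS file, under the same assumptions about file and column names, might break the Arizona reports down by manufacturer:

    import pandas as pd

    # Hypothetical local copy of NHTSA's ADS incident-report data.
    ads = pd.read_csv("SGO-2021-01_Incident_Reports_ADS.csv")
    az_ads = ads[ads["State"].str.strip().str.upper() == "AZ"]

    # Count Arizona ADS incident reports by manufacturer (e.g. Waymo vs. others).
    print(az_ads["Make"].value_counts())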

O’Dowd said he gives Waymo credit for starting cautiously and developing its tech with employees rather than “testing it with civilians.” 

He said he plans to continue testing Tesla’s features and trying to bring his concerns to regulators; he’s already sent his findings to Congress, Tesla and a number of other agencies. 

“The regulators knew about that problem, did a detailed study but then did nothing,” O’Dowd said, referencing an issue found with Tesla’s autopilot feature by the NHTSA. “We have to do it without them.” 

This story was first published by Arizona Mirror, which is part of States Newsroom, a network of news bureaus supported by grants and a coalition of donors as a 501(c)(3) public charity.