In another 2020 incident, a police officer claimed that a sudden stop by a Waymo vehicle caused a rear-end collision, but it was the officer who was cited.
The incidents are detailed in newly released police reports obtained by Phoenix New Times that shed fresh light on the function and operations of the driverless vehicles, which are often cloaked in secrecy. The reports from Chandler and Tempe, released last week under state public records law, detail all Waymo-related cases since January 2020.
Earlier this month, the company bragged about how well its vehicles would perform if they replaced human drivers, and how many lives they could have saved in Chandler. Yet the company hasn't been totally transparent with metro Phoenix residents, refusing to turn over data showing how many times the vehicles' autonomous function has failed while driving around Chandler, Tempe, and other Valley areas. The latest police reports help to explain why the company has struggled in its deployment of a fully driverless fleet. Although its vehicles can operate in driverless mode, nearly all still have backup drivers behind the wheel when they're on the road.
Unexpected braking by the vehicles has been noted for years, but rear-end collisions are usually seen as the fault of the driver who didn't stop in time. The October 8 incident is harder to sort out. At about 11:30 a.m. that day, a white Waymo minivan was traveling in the middle of three westbound lanes on Chandler Boulevard, in autonomous mode, when it braked suddenly for no apparent reason.
A Waymo backup driver behind the wheel at the time told Chandler police that "all of a sudden the vehicle began to stop and gave a code to the effect of 'stop recommended' and came to a sudden stop without warning."
A red Chevrolet Silverado pickup behind the vehicle swerved to the right but clipped its back panel, causing minor damage. Nobody was hurt. An officer who filled out the crash report checked boxes indicating that the Waymo vehicle had "stopped in trafficway," but also that the pickup was driving "too fast for the conditions." The pickup had been going 45 mph in a 45 mph zone, and no distractions for either driver were listed.
The officer didn't issue either the Silverado's driver or the Waymo backup driver a citation. Sergeant Jason McClimans, a spokesman for Chandler police, said that officers have discretion on whether to cite anyone in minor collisions, and "it depends on the situation."
Following a January 2020 collision, the department issued a citation to one of its own officers for rear-ending a Waymo vehicle. Yet in that case, like the later case involving the pickup, the collision followed an unexpected move by the driverless vehicle. When a traffic light turned green, the Waymo vehicle "began to move forward and then stopped prior to entering the intersection per the programming of the vehicle." The Waymo backup driver reportedly saw the other vehicle coming up behind him "and attempted to disengage the autonomous mode by pressing the accelerator but it did not disengage in time," and the unmarked police vehicle hit the Waymo minivan. Both vehicles had slight damage.
The department declined to make the officer available for an interview; all names were blacked out of the reports.
In the other two Chandler incidents involving minor rear-end collisions, the Waymo vehicles were in manual mode, and only one of the drivers received a citation.
How Safe Are They?
While Waymo's achievements in autonomous vehicle technology have been praised worldwide, fully driverless vehicles remain rare on the roads, even in Chandler, where local Waymo operations are based. The company has tested the vehicles in the Phoenix metro area since 2016, when Governor Doug Ducey invited autonomous vehicle companies with the promise of fewer government regulations. Uber, the other company prominently testing the vehicles in Arizona, left the state after one of its vehicles struck and killed a pedestrian in Tempe. The backup driver, Rafaela Vasquez, was charged with negligent homicide in the case and is tentatively scheduled for trial in May.
Waymo has served select passengers with its Waymo One program in Chandler and parts of metro Phoenix since late 2018, sometimes utilizing vehicles with no backup drivers. The secretive company, owned by Alphabet, the parent of Google, won't say how much of its fleet is running without backup drivers at any given time, but previous reports have estimated it's no more than 10 percent.
Overall, the company has fallen far short of a 2018 boast that it would soon have a working fleet of up to 62,000 driverless vehicles. Delays in deploying a failsafe, fully driverless fleet, which is Waymo's goal, caused the valuation of the company to drop from about $200 billion in 2018 to $30 billion by March 2020. The pandemic put up another obstacle, causing the company to temporarily suspend service.
"It's an extraordinary grind," Waymo CEO John Krafcik told the Financial Times in January. "I would say it's a bigger challenge than launching a rocket and putting it in orbit around the Earth... because it has to be done safely over and over and over again."
Waymo had raised $3 billion as of May 2020 under Krafcik's leadership. In October, it released details on 18 Phoenix-area crashes its vehicles had been involved in. That report, which indicated that Waymo was blameless in essentially all of the crashes, examined crashes from January 2019 to September 2020 and didn't include the October 2020 crash.
In early March of this year, Waymo released results of a simulation showing that if its vehicles could replace the vehicles involved in 72 fatal collisions in Chandler from 2008 to 2018, almost no one would have died.
"In total, the simulated Waymo Driver completely avoided or mitigated 100% of crashes aside from the crashes in which it was struck from behind, including every instance that involved a pedestrian or cyclist," wrote Trent Victor, Waymo director of safety research and best practices, in a blog post.
For its modeling, Waymo used the same software that has guided its vehicles' performance over millions of real and simulated driving miles, though it didn't calculate what would have happened in the collisions if the driverless vehicle hadn't performed as expected. As an associated paper on the simulations explains, the study of simulations doesn't, by itself, actually show how safe Waymo's driverless vehicles are "across all possible conflict scenarios."
While the study of real-world collisions is useful, "this alone" doesn't show all the ways a driverless vehicle "may induce a collision when deployed."
Andrew Maynard, a professor and associate dean for Arizona State University's School for the Future of Innovation in Society who researches issues related to autonomous vehicles, said the Waymo simulation shows "strong evidence" that its program could reduce crashes. But he noted that it assumes the driverless system is working perfectly, and that the vehicles aren't themselves doing something that leads to fatal crashes.
"The chances are that unique AV risks will be far less significant than human driver risks, but this is something that needs to be addressed," he said.
While the research may be sound, he added, "if you work out the probability of a future potential car crash involving an AV, we'd need an awful lot of AVs on the road to reduce fatality rates!"
For almost three years, Governor Ducey has declined to take Waymo up on its offer for a ride in a fully driverless vehicle with no backup driver.
Potential 'Hazard'
The 15 months' worth of reports from Tempe and Chandler include several other notable safety-related incidents:
* A bicyclist reported that a Waymo vehicle with no backup driver seemed to prepare for a turn on a residential street, but wasn't slowing down, causing the bicyclist to stop. The vehicle "appeared not to detect him as it accelerated through the northbound turn." The man reported "he would have been struck had he not stopped." An officer noted that Waymo would be contacted about the incident, but no investigation occurred.
* In Tempe, a caller reported that a Waymo vehicle and another vehicle were "parked in the middle of [the] street" and "causing a major traffic hazard." The cars apparently soon moved, and police had nothing further on the incident.
* Waymo vehicles were involved in four other collisions, including two hit-and-runs. No injuries were reported. In one of the hit-and-run collisions, police noted that Waymo provided video of the Tesla that hit its minivan, but blurred out the Tesla's license plate.
Waymo vehicles and the people riding in them as backup drivers or passengers may face other problems besides being in a collision:
* One man in Chandler was found passed out in a Waymo vehicle from suspected narcotics use.
* Another man believed to be inebriated got into a Waymo vehicle that a registered user had just exited, and closed all doors. Waymo representatives arrived at the scene but had trouble waking the man up. Tempe police didn't write a report on the incident.
The Chandler reports also detail vandalism or implied threats to the vehicles and their backup drivers, a phenomenon first reported in late 2018 by the Arizona Republic:
* A police officer saw a car reverse rapidly toward a Waymo vehicle, barely avoiding a collision. The driver was arrested for driving on a revoked license.
* Someone threw an ice-cream cone into the open window of a Waymo vehicle.
* A man in a black car threw eggs at multiple Waymo vehicles. The Waymo backup drivers didn't want to be contacted by police.
* A Waymo backup driver switched to manual mode after seeing traffic slowing and maneuvering around what turned out to be a dead dog in a lane. As the backup driver passed the dog, "she heard someone yelling and swearing at her to slow down which she had already done." A man standing in the bicycle lane, who she believed might have been the dog's owner, lashed out at the vehicle, punching and breaking the mirror. The backup driver told an officer she'd have to ask Waymo if she should press charges. No further investigation took place.
* After a Waymo vehicle turned left in front of two motorcycles, the two motorcyclists — a man on a Harley Davidson and a woman with long dark hair dressed in an "orange-colored Tigger costume or pajama set" riding a sportbike — blocked a second Waymo vehicle in a parking lot temporarily. The man got off his Harley and began yelling at the second vehicle's backup driver, who was able to soon escape. Waymo didn't provide the police with any video, although the vehicles are outfitted with several exterior cameras. No further investigation took place.
Waymo's Take
Waymo insisted for this article that New Times attribute information directly to the company rather than to any specific spokesperson or executive. The company was well prepared for New Times' questions — Sergeant McClimans said he forwarded word of the request, along with the requested documents, to Waymo on the same day he sent them to New Times. Chandler police sometimes tips off local corporations to news media inquiries as a courtesy, he said.
According to Waymo, its vehicles are designed to slow down and stop when they're having a technical problem. The company insisted that technical failures that result in a disengagement, or manual takeover, are rare. While the company won't release its local disengagement data, it reported to California in 2020 that it had 21 disengagements over 629,000 miles of autonomous driving in that state, for a rate of 3.3 per 100,000 miles.
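The reported rate works out from the raw figures. As a quick sanity check (a sketch; the 629,000-mile figure is rounded from Waymo's California filing):

```python
# Sanity check on Waymo's reported California disengagement rate.
# Figures from the article: 21 disengagements over roughly 629,000
# miles of autonomous driving, reported as 3.3 per 100,000 miles.
disengagements = 21
miles = 629_000

rate_per_100k = disengagements / miles * 100_000
print(f"{rate_per_100k:.1f} disengagements per 100,000 miles")  # → 3.3
```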
"Given how rare both of these scenarios are, the likelihood that a Waymo vehicle is involved in a fatal crash as a result of being struck from behind at a high speed, at the exact time that there is a technical failure, is extremely small," Waymo said.
On the crash involving the police officer, Waymo said its vehicle was "taking a cautious approach to entering the intersection after a traffic light phase change from red to green" when it began to move and then stopped. The company specifically pointed to Arizona Revised Statute 28-730, which prohibits following a vehicle more closely than is reasonable and prudent.
However, Arizona also has a law that bans driving that is too slow unless "reduced speed is necessary for safe operation."
Whether "safe operation" covers a driverless vehicle with failed software stopping in traffic may be among the legal questions that city judges soon have to sort out.
As to the problems with vandals or drunks, Waymo promised that it has its riders covered. The company can help facilitate contact with authorities, and Rider Support allows riders to get assistance whenever they need it.
"In short, we're learning every day and have policies and procedures in place to address the types of events you mention," Waymo said.
(Correction: The article has been changed to note that Waymo claimed it would add "up to" 62,000 driverless vehicles to its fleet, and that its crash modeling utilized the vehicles' software, not an average of vehicle performance.)