Metro Phoenix has been a hotbed for the testing of so-called autonomous, or self-driving, vehicles, with radar-equipped cars sometimes swarming local surface streets.
One challenge these vehicles have shied away from, though, is freeway driving. Although some of Alphabet's Waymo vehicles, and possibly others, have ventured onto the freeways at times, freeway sightings remain relatively scarce.
The companies probably fear what would happen at freeway speeds: One good screw-up at 65 mph could cause a Hollywood-style, multivehicle pileup.
Waymo, which has been testing vehicles in metro Phoenix since April 2016, announced last month it would begin putting self-driving vehicles on Valley streets with no backup drivers behind the wheel. But it's unclear when such vehicles might be tested on local freeways, where the margin for error is much smaller.
Waymo has had ride-alongs for public officials and the news media, but not on freeways, Waymo spokesperson Amanda Ventura said.
Ventura declined to say whether no-backup-driver vehicles have been tested on freeways, or when they might be tested on freeways.
Uber said through a spokesperson that self-driving vehicles don't operate on freeways, but that surface streets were actually "much more complex than freeway driving given the many objects on the road ..."
The Uber self-driving team is, however, examining the "challenges that come with freeway driving" as part of its goal to create self-driving long-haul trucks.
Freeway driving by self-driving cars is "not impossible," said Ashraf Gaffar, an assistant professor at Arizona State University's School of Computing, Informatics, and Decision Systems Engineering. But he doesn't expect totally pilotless vehicles driving on the freeway "like you and I drive on the freeway" for quite a while.
One problem is something he calls the window of intervention. When an airplane or ship is on autopilot and something happens that requires human attention, seconds or even minutes are available for the pilot to regain manual control.
"They have room to take over, which we do not have in autonomous vehicles," he said.
Studies show it might take a driver five to 10 seconds under ideal conditions to switch a vehicle from autonomous to manual mode and fully understand "what's going on" in terms of necessary traffic maneuvering, Gaffar said.
That might rise to 20 to 30 seconds if the person at the wheel of the autonomous vehicle is severely distracted. Yet the National Highway Traffic Safety Administration suggests that distractions of more than two seconds can elevate the risk of a crash, he pointed out.
Autonomous vehicles have a "negative window of intervention," according to Gaffar. In other words, there just wouldn't be time to take over if you needed to.
Potentially, this could make it tough to test autonomous vehicles in normal, high-speed, congested freeway conditions. To be safe, the backup driver would need to remain undistracted and ready to take over vehicle operations at literally a second's notice.
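The arithmetic behind that concern is easy to sketch. As a back-of-the-envelope illustration (the calculation is ours, not from Gaffar; it simply combines the 65 mph figure and the takeover times cited above), a vehicle at freeway speed covers roughly 29 meters every second, so even a best-case takeover window means hundreds of meters traveled before the human is fully in control:

```python
# Illustrative only: distance a vehicle covers during a human-takeover window.
MPH_TO_MPS = 0.44704  # meters per second, per mile per hour

def distance_traveled(speed_mph, takeover_seconds):
    """Meters covered before a human driver fully regains control."""
    return speed_mph * MPH_TO_MPS * takeover_seconds

# Using the figures cited in the article: 65 mph, 10 s (attentive) vs. 30 s (distracted).
for label, secs in [("attentive driver, 10 s", 10), ("distracted driver, 30 s", 30)]:
    print(f"{label}: {distance_traveled(65, secs):.0f} m")
```

At 65 mph, a 10-second takeover spans roughly 290 meters, and a 30-second one roughly 870 meters, which is the sense in which the intervention window can be effectively "negative" at freeway speeds.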
Gaffar said he sees two other significant challenges for autonomous vehicles.
First, it's unclear how vehicles with the technological sophistication of a fighter jet will survive the normal rigors, if not outright abuse, to which people subject their motor vehicles on a daily basis.
Adverse road conditions are an even bigger problem, Gaffar said. Fog, snow, sleet, dim lighting — such cumulative adverse effects cause autonomous vehicle performance to degrade "exponentially," he said.
At ASU, Gaffar is helping to develop artificial intelligence systems that monitor people's driving styles and adjust intervention technology, such as warning signals, as needed.
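A system like that could take many forms; the article gives no details, so the following is a purely hypothetical sketch (the function name, scores, and thresholds are all invented for illustration) of the general idea of escalating interventions as a driver-attention estimate drops:

```python
# Hypothetical sketch, not Gaffar's actual system: map a driver-attention
# score (1.0 = fully engaged, 0.0 = fully distracted) to an intervention.

def choose_intervention(attention_score, warn_below=0.6, alarm_below=0.3):
    """Pick an intervention level from an attention estimate in [0, 1].

    The score and threshold values are illustrative placeholders.
    """
    if attention_score >= warn_below:
        return "none"            # driver appears engaged; stay quiet
    if attention_score >= alarm_below:
        return "visual_warning"  # mild distraction; nudge eyes back to road
    return "audible_alarm"       # severe distraction; escalate loudly

print(choose_intervention(0.9))
print(choose_intervention(0.5))
print(choose_intervention(0.1))
```

The design choice the sketch illustrates is graduated escalation: rather than a single alarm, the system tailors the strength of the warning to how distracted the driver appears to be.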
Some high-tech vehicles already allow drivers to take their hands off the wheel on freeways. But systems like Cadillac's Super Cruise technology also may monitor a driver's eyes and hands, ensuring that the driver is never distracted. That's a far cry from the goal of fully self-driving cars, which, conversely, is to allow distraction and let everyone in the car be a passenger.
Waymo may or may not be having success with no backup drivers in its vehicles. Only the company knows how many fully driverless vehicles are being tested, and how much vehicle control is being handled remotely.
Whether autonomous vehicles with no backup drivers are legal, and who would be responsible in a crash, isn't entirely clear. Governor Doug Ducey's executive order authorizing testing of autonomous vehicles implies such driverless testing should occur only on university campuses. It also states that all autonomous vehicles require a responsible person with a valid driver's license, even if that person can only take control remotely.
Since Waymo announced the no-backup-driver testing phase last month, no crashes have made the news, which suggests things might be working out so far.
But it seems safe to say that in this beginning phase of autonomous vehicles, cruising along at freeway speeds in the back seat, with no one at all in the front seat, would be downright thrilling.