On at least two occasions in the past three months, police officers and other first responders have physically taken control of Waymo autonomous vehicles during active emergency situations, manually steering the driverless cars out of incident zones when the vehicles' software proved unable to navigate the scenario on its own. The incidents, reported by TechCrunch on March 26, highlight an operational reality of autonomous vehicle deployment that rarely appears in the promotional materials: when a driverless car encounters a situation it cannot handle, someone still has to deal with it, and that someone is increasingly a firefighter, paramedic, or patrol officer who never signed up to be a robotaxi valet.

Waymo, the Alphabet subsidiary that operates the most advanced commercial autonomous vehicle service in the United States, confirmed the incidents in a statement to TechCrunch but characterized them as rare exceptions within a fleet that has completed more than 10 million fully autonomous miles across San Francisco, Phoenix, Los Angeles, and Austin. The company emphasized that its vehicles are designed to pull over safely and stop when they encounter situations that exceed their operational capabilities, and that the first-responder interactions occurred because that "pull over and stop" behavior, while safe, impeded time-critical emergency operations.

That distinction, between a vehicle that is safe and a vehicle that is helpful, turns out to matter quite a bit when the flashing lights are real and the clock is running.

What Happened

The first incident occurred in late January in San Francisco's South of Market neighborhood. A four-alarm fire had broken out in a mixed-use building, and fire crews needed to close multiple lanes of traffic on a major arterial road to position ladder trucks and run hose lines. Two Waymo vehicles, operating without passengers at the time (Waymo vehicles frequently reposition themselves between rides, driving empty to areas of anticipated demand), entered the closed area before barricades were fully in place and stopped in positions that blocked the path of an incoming ladder truck.

The vehicles did what they were programmed to do: they detected the unusual road conditions (flashing lights, stopped vehicles, people on foot in the roadway) and pulled to the side of the road, activating their hazard lights and entering a stationary safe state. The problem was that "the side of the road" in this case was exactly where the ladder truck needed to position itself. A fire captain, unable to reach Waymo's remote assistance team quickly enough (the company operates a 24/7 fleet response center that can communicate with and remotely maneuver its vehicles), got into the driver's seat of one of the Waymos and manually drove it approximately 100 yards to clear the obstruction. A firefighter moved the second vehicle by releasing its parking brake and pushing it with the help of two colleagues.

The second incident occurred in mid-February in Phoenix, where a multi-vehicle collision on a major road required paramedics to access injured occupants from the driver's side. A Waymo vehicle whose passenger had already exited was stopped in a position that prevented the ambulance crew from pulling alongside the damaged vehicles. The passenger later told a local news outlet that the Waymo vehicle stopped and told her to exit, but that it remained stationary in the roadway after she left. A Phoenix police officer entered the vehicle and used the manual steering controls to move it out of the way.

Neither incident involved a collision, injury, or damage to the Waymo vehicles. Both involved first responders losing time during active emergencies to deal with a vehicle that could not deal with itself.

The Operational Problem

The incidents illuminate a category of challenge that is distinct from the safety challenges that typically dominate autonomous vehicle coverage. The question is not whether Waymo's vehicles can avoid crashing (they have an exceptionally strong safety record, with zero at-fault fatalities and a collision rate that Waymo says is lower than the human-driver average in its operating areas). The question is whether a fleet of vehicles that lack the social intelligence, situational judgment, and physical adaptability of a human driver can integrate smoothly into the messy, unpredictable reality of urban operations, particularly during emergencies when every second matters and the rules of normal traffic no longer apply.

Consider what a human driver does when encountering an active fire scene. They assess the situation visually, identify where emergency vehicles need to go, anticipate what the firefighters are trying to do, and find a path out of the way that avoids creating additional obstruction. They might back up illegally, drive over a curb, or make a three-point turn in a no-turn zone. The behavior is technically a series of traffic violations, but it is the correct behavior in context because the human driver understands the priority hierarchy: emergency response takes precedence over normal traffic rules.

A Waymo vehicle, operating within its programmed behavioral constraints, does not violate traffic rules even when doing so would be the contextually correct choice. It will not drive over a curb. It will not back up on a one-way street. It will not enter a lane that is blocked by cones, even if a firefighter is waving it through. Its "minimal risk condition," the behavior it defaults to when it encounters a situation it cannot resolve, is to pull over and stop. That behavior is safe in the sense that a stopped vehicle is not a danger to anyone. But a stopped vehicle in the wrong location can be a significant operational obstacle to people who are trying to save lives.
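The fallback behavior described above is, in effect, a deliberately conservative state machine: when the planner cannot resolve a scene within the traffic rules, the vehicle stops rather than escalating to a rule-breaking maneuver a human would attempt. A minimal illustrative sketch of that logic (the state names and function are hypothetical, not Waymo's actual software):

```python
from enum import Enum, auto

class VehicleState(Enum):
    DRIVING = auto()
    PULLED_OVER = auto()  # the "minimal risk condition": stopped, hazards on

def next_state(scenario_resolvable: bool, rule_violation_required: bool) -> VehicleState:
    """Hypothetical fallback logic for an AV that never breaks traffic rules.

    A human driver at a fire scene would accept rule_violation_required
    (backing up illegally, mounting a curb) to clear the way; this sketch,
    like the behavior described above, refuses and stops in place instead.
    """
    if scenario_resolvable and not rule_violation_required:
        return VehicleState.DRIVING
    # Anything the planner cannot resolve within the rules means stop safely,
    # even if the stopping spot happens to block a ladder truck.
    return VehicleState.PULLED_OVER
```

The gap the incidents expose lives in that second branch: "stop safely" and "clear the scene" are different objectives, and the sketch, like the fleet, only knows the first.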

"The cars aren't dangerous. I want to be clear about that. The problem is they're like a tourist who freezes in the middle of an intersection because they don't understand the local traffic patterns. Except this tourist weighs 5,000 pounds, has no window you can knock on, and the only person who can move it is sitting in a call center somewhere."

San Francisco Fire Department officer, speaking to TechCrunch on condition of anonymity

Waymo's Response and Remote Assistance

Waymo's fleet response center, based in the company's San Francisco and Phoenix offices, is staffed around the clock by trained operators who can communicate with Waymo vehicles via two-way audio, view the vehicles' camera feeds in real time, and, in certain circumstances, issue high-level navigation commands that direct the vehicle to a new destination or route. The system is designed to handle exactly the type of situation that occurred in these incidents: when a vehicle is stopped in a location that is problematic, a remote operator can instruct the vehicle to move to a specified location.

The challenge, as the incidents demonstrated, is latency. The time between a first responder realizing that a Waymo vehicle is in the way and a remote operator successfully directing the vehicle to move can be several minutes, a timeframe that is acceptable in most situations but not during an active emergency. The San Francisco fire captain who manually moved the vehicle told TechCrunch that he initially tried to communicate with Waymo through the vehicle's exterior speaker and intercom system but received an automated response asking him to wait while a fleet response operator was connected. He estimated that he waited approximately 90 seconds before deciding to get in the vehicle himself.

Ninety seconds does not sound like a long time. In a fire that is actively spreading through a multi-story building, it is an eternity.

Waymo has acknowledged the latency issue and outlined several measures it is implementing to address it. The most significant is a dedicated first-responder communication channel, a phone number and app that emergency services can use to reach Waymo's fleet response team directly, bypassing the general queue. The company says it has distributed this information to fire departments and police agencies in all four of its operating cities and has conducted training sessions with emergency services personnel on how to interact with its vehicles. The company has also updated its vehicle software to improve the vehicles' ability to detect and respond to emergency scenarios, including better recognition of emergency vehicle light bars, sirens, and hand signals from uniformed officers.
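The dedicated channel amounts to priority routing: emergency calls jump ahead of the general assistance queue instead of waiting behind rider questions. A toy model of that idea (this is an illustrative sketch, not Waymo's dispatch system):

```python
import heapq

# Priority levels: lower number is served first.
FIRST_RESPONDER, GENERAL = 0, 1

class FleetResponseQueue:
    """Toy model of a fleet-assistance queue with a first-responder bypass."""

    def __init__(self) -> None:
        self._heap: list[tuple[int, int, str]] = []
        self._counter = 0  # tie-breaker preserves FIFO within a priority level

    def submit(self, caller: str, priority: int = GENERAL) -> None:
        heapq.heappush(self._heap, (priority, self._counter, caller))
        self._counter += 1

    def next_call(self) -> str:
        return heapq.heappop(self._heap)[2]

q = FleetResponseQueue()
q.submit("rider with a lost phone")
q.submit("rider with a routing question")
q.submit("SF fire captain", priority=FIRST_RESPONDER)  # bypasses the queue
print(q.next_call())  # -> SF fire captain
```

The design point is latency, not capacity: the 90-second wait in San Francisco came from sitting in the general queue, and a priority lane removes exactly that wait.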

The Roadside Assistance Gap

Emergency response is the most dramatic context in which the limitations of driverless vehicles become visible, but it is not the only one. Waymo vehicles also encounter difficulties with roadside assistance scenarios that a human driver would handle trivially: flat tires, windshield damage, sensor obstructions from mud or debris, and charging needs (Waymo's Jaguar I-PACE vehicles are battery electric). In each of these cases, the vehicle must either drive itself to a service facility or wait for a Waymo technician to arrive, a process that can take 30 minutes or more in the company's more geographically spread-out operating areas.

The company has addressed this through a network of service hubs strategically located throughout its operating areas, staffed by technicians who can respond to disabled vehicles and perform roadside maintenance. But the economics of maintaining that network are significant. Each operating city requires multiple service facilities, a fleet of chase vehicles (conventional cars driven by Waymo employees who respond to vehicle issues), and 24/7 staffing. Those costs do not appear in Waymo's published ride prices but are embedded in the company's operating expenses, which Alphabet does not break out separately in its financial reporting.

The roadside assistance challenge is, in a sense, a preview of the broader operational complexity that autonomous vehicle deployment entails. A human-driven taxi fleet handles flat tires, traffic diversions, and unexpected road closures through the judgment and adaptability of its drivers. A driverless fleet must replicate that capability through a combination of software, remote human operators, and on-the-ground service personnel. The technology handles the easy 95 percent of driving. The hard 5 percent, the situations that require judgment, creativity, and social intelligence, still requires humans, just humans who are not in the car.

The Broader Regulatory Question

The first-responder incidents add weight to a regulatory question that has been building as Waymo's fleet has grown: who is responsible for a driverless vehicle's behavior in the real world, and what obligations does the operator of a driverless fleet have to the communities in which it operates?

Currently, Waymo operates under a patchwork of state and local permits. California's DMV requires autonomous vehicle operators to hold a deployment permit, submit disengagement reports, and carry minimum insurance. Arizona, where Waymo's Phoenix operations are based, has a lighter regulatory framework that does not require permits for autonomous vehicle testing or deployment. Texas, where Waymo operates in Austin, has virtually no autonomous-vehicle-specific regulation.

None of these regulatory frameworks specifically address the interaction between autonomous vehicles and emergency services. There is no requirement for autonomous vehicle operators to provide first responders with the ability to move or control their vehicles during emergencies. There is no mandated response time for fleet operators to assist first responders who encounter a stopped autonomous vehicle. And there is no standardized communication protocol that emergency services can use across different autonomous vehicle fleets (a gap that will become more significant as additional operators, including GM's nascent highway testing program, deploy vehicles on public roads).

The National Highway Traffic Safety Administration has issued voluntary guidelines on autonomous vehicle interaction with emergency vehicles but has not promulgated binding regulations. The National Fire Protection Association (NFPA) published a technical guidance document in 2025 on firefighter interaction with autonomous and electric vehicles, but it is advisory rather than mandatory and has not been widely adopted by local fire departments.

San Francisco Supervisor Aaron Peskin, who has been one of the most vocal local officials on autonomous vehicle regulation, told TechCrunch that the incidents "demonstrate exactly why we need binding operational standards, not voluntary guidelines, for autonomous vehicle fleets operating in our cities. Our firefighters should not have to learn how to operate a Waymo any more than they should have to learn how to fly a drone to clear an obstacle from an emergency scene."

What Comes Next

The first-responder incidents are not existential threats to Waymo's business or to the autonomous vehicle industry. They are, in industry parlance, "edge cases," situations that fall outside the normal operating parameters that the system was designed for. Every transportation system has edge cases. Human-driven vehicles block fire lanes, ignore police directions, and create obstructions during emergencies with regularity. The difference is that a human-driven obstruction can be resolved by yelling at the driver, whereas a driverless obstruction requires a different set of tools and protocols.

Waymo's track record suggests the company will address these incidents with engineering solutions. The company has a well-documented pattern of identifying operational weaknesses, developing software or procedural fixes, and deploying them across the fleet. The dedicated first-responder communication channel, the improved emergency vehicle detection algorithms, and the updated vehicle behavior in emergency zones are all responsive measures that should reduce the frequency and severity of similar incidents in the future.

But the incidents also point to a more fundamental truth about autonomous vehicles that the industry has been slow to acknowledge publicly: removing the human driver from the vehicle does not remove the need for human judgment from the system. It relocates that judgment to remote operators, service technicians, and, when the technology falls short, to the firefighters and police officers who encounter these vehicles in the course of doing their own jobs. The question is not whether autonomous vehicles can drive safely. By most metrics, Waymo's vehicles already do. The question is whether the operational infrastructure surrounding those vehicles, the remote assistance, the emergency protocols, the community integration, can match the technology's capabilities. Right now, based on the evidence from San Francisco and Phoenix, it is still catching up.

Sources

  1. TechCrunch: Who's driving Waymo's self-driving cars? Sometimes police
  2. NFPA: Technical guidance for firefighter interaction with autonomous and electric vehicles