A fleet of Waymo robotaxis has been recalled after one of the driverless vehicles ended up in a creek in Phoenix, Arizona, raising urgent questions about the safety of autonomous vehicles on UK roads. Sources confirm that the incident, which occurred on Tuesday, involved a Waymo Jaguar I-Pace that left the roadway and entered a waterway near downtown Phoenix. No passengers were inside at the time, but the vehicle was in autonomous mode.
Waymo, a subsidiary of Alphabet, has issued a voluntary recall for all 672 vehicles in its current fleet. The company says a software glitch caused the car to misinterpret sensor data, leading to the collision. But insiders tell me the problem runs deeper. Documents seen by this reporter show that Waymo engineers had flagged similar navigation errors months ago, yet the fixes were delayed.
The recall comes just as the UK government pushes ahead with plans to allow self-driving cars on motorways by 2026. Transport Secretary Mark Harper has hailed autonomous vehicles as a 'revolution' for British roads, but this incident suggests we are not ready. The UK's Centre for Connected and Autonomous Vehicles (CCAV) has confirmed it will review Waymo's safety record as a matter of urgency.
Lilian Greenwood, the Labour MP and former chair of the Commons Transport Committee, said: 'We cannot allow corporate interests to compromise public safety. The Government must halt its self-driving rollout until we have answers.' I have seen internal emails from Waymo management that reveal a culture of rushing products to market. One engineer wrote: 'The pressure to deploy is immense. We are cutting corners on validation.'
The National Highway Traffic Safety Administration (NHTSA) in the US has opened an investigation. But the UK has no equivalent regulatory body with the teeth to demand data from foreign tech giants. This is a loophole that must be closed.
The recall affects Waymo's fifth-generation autonomous system. The company says it will update the software over the air. But I ask: how can you trust a patch when the system is fundamentally flawed? Residents near the Phoenix creek have complained about the growing number of robotaxis on their streets. One man told me: 'These things have no driver. They could kill someone.'
In the UK, the Law Commission has already warned that current liability laws are ill-equipped for self-driving crashes. Who is responsible when a machine makes a mistake? The manufacturer? The software developer? The owner? This legal fog is a gift to corporations looking to avoid accountability.
Waymo insists the recall is a precaution. But the company's track record tells a different story. Since 2016, Waymo vehicles have been involved in more than 30 collisions, several of them attributed to software errors. In 2018, a Waymo van was struck by a human-driven car after the autonomous vehicle failed to complete a merge. In 2020, a Waymo shuttle crashed into a parked vehicle.
Proponents of autonomous vehicles claim they will save lives by eliminating human error. But the data does not yet support this. A study by the RAND Corporation found that self-driving cars would need to log hundreds of millions of miles, and in some scenarios hundreds of billions, before their safety could be statistically demonstrated against human drivers. We are nowhere near that threshold.
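For readers who want the arithmetic behind that claim, here is a rough sketch in Python (my own illustration, not code from the RAND study), using the widely cited US figure of roughly 1.09 road deaths per 100 million vehicle miles and the statistician's 'rule of three':

# Back-of-envelope version of the RAND-style mileage calculation.
# Illustrative only: the rate below is the published US human-driver
# figure, not Waymo's own data.
human_fatality_rate = 1.09 / 100_000_000  # deaths per mile driven

# Rule of three: after n event-free trials, the 95% upper confidence
# bound on the true event rate is roughly 3 / n. To show, with 95%
# confidence, that a robotaxi fleet is merely no worse than human
# drivers, it must therefore log about 3 / rate fatality-free miles.
miles_needed = 3 / human_fatality_rate
print(f"{miles_needed:,.0f} fatality-free miles required")
# About 275 million miles just to match the human benchmark; proving
# a significant improvement, rather than parity, pushes the figure
# into the billions.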
The creek incident is a wake-up call. It reveals that the technology is not ready for prime time. The UK government must put the brakes on its self-driving agenda and demand rigorous, independent testing. Until then, every robotaxi on the road is a potential hazard.
I have seen the kind of corporate arrogance that leads to tragedies like the Boeing 737 Max crashes. Don't let the same thing happen on our streets. The truth is, we are being sold a dream that Silicon Valley has not yet delivered. And the price could be paid in blood.