Waymo to Recall Self-Driving Software Over School Bus Behavior
Waymo, the autonomous vehicle division of Alphabet Inc., is preparing to issue a voluntary software recall for its robotaxi fleet after a series of incidents in which its self-driving cars failed to properly respond to stopped school buses. The move comes amid growing scrutiny from federal regulators and local school officials over how the company’s AI-driven vehicles interpret school bus stop signals, particularly in Texas, where multiple violations have been documented.
The recall centers on how Waymo’s fifth-generation autonomous driving system handles scenarios involving school buses with flashing red lights and extended stop arms. In several cases, Waymo robotaxis initially slowed or stopped for a school bus but then proceeded to drive around it while children were present, violating traffic laws designed to protect students. No injuries have been reported in these incidents, but the behavior has raised serious safety concerns and triggered a federal investigation.
Federal Probe Sparks Recall Decision
The National Highway Traffic Safety Administration (NHTSA) opened a formal investigation in October 2025 into how Waymo’s autonomous vehicles respond to school buses. The probe intensified after a widely circulated video showed a Waymo robotaxi in Atlanta driving in front of a stopped school bus while children were crossing the street. The agency later requested that Waymo explain how its system interprets school bus signals, whether recent software updates resolved the issue, and whether a recall was warranted.
In response, Waymo announced it will file a voluntary software recall with NHTSA early next week. The recall will focus on improving how its vehicles “appropriately slow and stop” when encountering school buses with active stop signs and flashing lights. Mauricio Peña, Waymo’s chief safety officer, stated that while the company has a strong safety record—reporting twelve times fewer injury crashes involving pedestrians than human drivers—it recognizes that its behavior in school bus scenarios must meet the highest standards.
Austin School District Reports Dozens of Violations
The issue has been particularly acute in Austin, Texas, where the Austin Independent School District (AISD) has documented at least 19 incidents since the start of the school year in which Waymo vehicles illegally passed stopped school buses. In some cases, the vehicles were seen driving past buses with flashing red lights and extended stop arms while children were visible on or near the roadway.
AISD officials told CBS News that as of early December, Waymo had received its 20th citation related to school bus violations. Notably, at least five of these incidents occurred after Waymo implemented a November 17 software update that the company said would improve performance around school buses. The fact that violations continued after the update has fueled concerns among local authorities and parents about the reliability of the autonomous system in high-risk, child-heavy environments.
How the Software Issue Is Being Addressed
Waymo says the problem stems from a software issue in its autonomous driving stack that affects how the vehicle interprets and responds to school bus signals. In some cases, the system correctly identifies the bus and initially slows or stops, but then incorrectly determines it is safe to proceed, even when the stop arm is extended and lights are flashing.
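The failure mode described above — detecting the bus, stopping, then wrongly concluding it is safe to proceed — can be sketched as a simplified decision rule. This is purely illustrative and not Waymo's actual code; all names and the structure are invented for this example:

```python
from dataclasses import dataclass

@dataclass
class SchoolBusObservation:
    """Hypothetical perception output for a detected school bus."""
    lights_flashing: bool
    stop_arm_extended: bool

def may_proceed(bus: SchoolBusObservation, path_clear: bool) -> bool:
    """Illustrative rule: remain stopped for as long as a school bus
    displays active stop signals, even if the path momentarily looks
    clear. The reported failure resembles logic that re-evaluated
    path_clear after the initial stop and let the vehicle move on.
    """
    if bus.lights_flashing or bus.stop_arm_extended:
        return False  # active stop signal: never proceed
    return path_clear

# A bus with its stop arm out must block movement even on a clear path.
print(may_proceed(SchoolBusObservation(True, True), path_clear=True))   # False
print(may_proceed(SchoolBusObservation(False, False), path_clear=True)) # True
```

The key design point the recall targets, per Waymo's description, is that the stop-signal check must dominate any later "path is clear" judgment rather than being re-evaluated once the vehicle has stopped.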
The upcoming software recall will involve a targeted update to the perception and decision-making modules of Waymo’s self-driving system. The company emphasized that it “moved quickly” to identify the root cause and implement fixes, and that it continues to analyze vehicle performance data to make ongoing improvements. The recall will apply to Waymo’s current fleet of robotaxis operating in cities like San Francisco, Los Angeles, Phoenix, and Austin.
Waymo also highlighted broader safety metrics, noting a fivefold reduction in overall injury-related crashes compared to human drivers. However, the company acknowledged that scenarios involving school buses and children require an extra layer of caution and precision, which the recall is designed to address.
Frequently Asked Questions
What is Waymo recalling?
Waymo is issuing a voluntary software recall for its self-driving robotaxis to fix how the vehicles respond to stopped school buses with flashing red lights and extended stop arms. The recall focuses on improving the software’s ability to appropriately slow and stop in these high-risk scenarios.
Why is Waymo recalling its self-driving software?
The recall follows multiple incidents in which Waymo vehicles drove around stopped school buses while children were present, violating traffic laws. Federal regulators opened an investigation after videos showed robotaxis crossing in front of buses with active stop signals, prompting Waymo to take corrective action.
Have there been any injuries in these school bus incidents?
No injuries have been reported in connection with the incidents where Waymo robotaxis passed stopped school buses. However, the behavior has raised serious safety concerns, especially given the presence of children near the roadway.
How many incidents have been reported in Austin?
The Austin Independent School District has documented at least 19 incidents since the start of the school year in which Waymo vehicles illegally passed stopped school buses. As of early December, Waymo had received its 20th citation related to school bus violations in the area.
Did a recent software update fix the problem?
Waymo implemented a software update on November 17 intended to improve its vehicles’ performance around school buses. However, local officials reported that at least five incidents occurred after this update, indicating the issue was not fully resolved and leading to the decision to issue a formal software recall.
Is Waymo still operating in Austin and other cities?
Yes, Waymo continues to operate its robotaxi services in Austin, San Francisco, Los Angeles, Phoenix, and other cities. The company is working with NHTSA and local authorities to deploy the software fix as part of the recall while maintaining its safety protocols across its fleet.
🔄 Updated: 12/6/2025, 3:20:40 AM
Public and consumer reaction to Waymo's recall of its self-driving software has been sharply critical, especially after the company received at least 20 citations in Austin for illegally passing stopped school buses since the school year began, despite a software update implemented on November 17. The Austin Independent School District asked Waymo to halt operations during student loading times, but the company refused, fueling local safety concerns and calls for stricter oversight. Waymo's chief safety officer acknowledged that the problem stems from a software issue and emphasized a commitment to continuous improvement, but the incidents have alarmed community members and regulators alike.
🔄 Updated: 12/6/2025, 3:30:42 AM
Safety experts warn that Waymo's decision to issue a voluntary software recall for about 1,200 robotaxis, after at least 19 incidents in Texas in which vehicles passed stopped school buses with flashing red lights and deployed stop arms, reveals critical gaps in how autonomous systems interpret complex traffic laws. "This isn't just a software bug, it's a behavioral edge-case failure: AVs must reliably recognize school bus protocols as non-negotiable, not optional maneuvers," said Missy Cummings, director of Duke University's Humans and Autonomy Lab, who noted that NHTSA's expanded investigation into Waymo's compliance with school bus rules could set a precedent for stricter validation standards across the industry.
🔄 Updated: 12/6/2025, 3:40:41 AM
Safety experts warn that Waymo’s planned software recall over robotaxis illegally passing stopped school buses exposes critical gaps in how autonomous systems interpret complex traffic scenarios, with one NHTSA official noting that all 50 states require vehicles to stop for buses with flashing red lights and deployed stop arms. “This isn’t just a software bug—it’s a fundamental test of whether AVs can reliably obey some of the most basic and safety-critical traffic laws,” said Jessica Cicchino, an independent automated vehicle safety researcher. Industry analysts add that the incident could delay broader robotaxi expansion, as even Waymo’s own Nov. 17 software update failed to prevent the company from receiving its 20th citation in Austin this school year.
🔄 Updated: 12/6/2025, 4:00:47 AM
Waymo announced a voluntary recall covering 1,200 self-driving vehicles to fix a software issue that caused them to illegally pass stopped school buses in Texas, where at least 19 such incidents have been reported since the school year began. Despite software updates deployed by November 17 that were intended to improve vehicle behavior, the Austin Independent School District had cited Waymo 20 times as of December 1 and requested a suspension of robotaxi operations during school bus loading times, a request the company declined. The National Highway Traffic Safety Administration has expanded its ongoing investigation into Waymo's compliance with traffic laws concerning school buses and is seeking a detailed explanation by January 20, 2026.
🔄 Updated: 12/6/2025, 5:00:47 AM
The National Highway Traffic Safety Administration (NHTSA) is actively investigating Waymo after obtaining footage of a robotaxi crossing in front of a stopped school bus with its stop sign extended and lights flashing in Atlanta, prompting the agency to open a formal probe in October. On December 3, NHTSA’s Office of Defects Investigation sent Waymo a detailed letter demanding information about its fifth-generation self-driving system and operations, following reports from Austin ISD that Waymo vehicles had illegally passed school buses 19 times this year. In response, Waymo confirmed it will file a voluntary software recall with NHTSA to address how its vehicles slow and stop around school buses, stating it is committed to meeting “the highest safety standards.”
🔄 Updated: 12/6/2025, 5:10:38 AM
Experts and industry voices have expressed serious concerns about Waymo's handling of school bus safety violations, with the Austin Independent School District reporting 20 citations for illegal passing since the 2025 school year began, despite the software update issued on November 17. JJ Maldonado, the district's communications specialist, criticized Waymo for refusing to halt operations during student loading times, underscoring persistent risks to road safety. The National Highway Traffic Safety Administration's expanded investigation signals heightened regulatory scrutiny of the autonomous vehicles' compliance with traffic laws.