8730 Wilshire Boulevard, California 90211

Get Help Now

1-800-529-8255
April 23, 2026 · 5 min read

Are Self-Driving Vehicles Safe? Legal and Safety Considerations


This question took center stage again when, in December 2025, multiple reports of Waymo self-driving cars illegally passing school buses drew the attention of the National Highway Traffic Safety Administration.

In Austin, Texas, Waymo self-driving vehicles have failed to yield to school buses at least twenty times during the current school year, according to the Austin Independent School District (AISD). Video footage posted by a local TV station shows Waymo robotaxis driving past buses with red lights flashing, behavior that violates Texas law. In one widely shared clip, a Waymo vehicle passes a stopped school bus as children are present near the roadway.

AISD officials say the violations continued even after Waymo deployed a software update intended to address the issue. According to reports, the district has issued multiple stop-arm violations and asked Waymo to pause operations during student pickup and drop-off hours, a request the company has declined.

Similar incidents have been reported in Atlanta, Georgia. According to a local investigation, at least six Waymo vehicles have illegally passed stopped Atlanta Public Schools buses. Video footage shows a Waymo vehicle slowing near a stopped bus before driving around it, while its crossing arm was extended and while students were exiting.

Waymo has acknowledged shortcomings in how its autonomous driving system responded to stopped school buses and issued a voluntary software recall affecting more than 3,000 vehicles. The company said the update improved school bus red light detection and reiterated that safety remains its top priority.

According to a report, NHTSA investigators are reviewing Waymo’s autonomous driving system, focusing on the Austin and Atlanta incidents. NHTSA has also requested detailed information from Waymo on system design, incident data, and corrective actions. The evaluation remains ongoing.

Software Safety Issues

Autonomous vehicle companies have touted self-driving cars as much safer alternatives to human-operated vehicles. This contention isn’t entirely misplaced: driver error causes over 90 percent of car crashes in California.

But autonomous vehicles, like all complex electronic systems, have issues. Technological reliability may be the most pressing one.

Driverless cars rely on a complex array of systems such as sensors, cameras, radar, lidar, and artificial intelligence. These systems often can’t adapt to poor weather conditions like heavy rain, fog, or snow. That’s especially true if weather conditions change suddenly.

Even with a human backup driver behind the wheel, software errors or system failures can lead to incorrect decisions, potentially causing accidents. These systems also give many drivers a false sense of security.

Adaptability of Self-Driving Cars

Usually, autonomous vehicles have no problem navigating empty streets in clear weather. But streets in Los Angeles are far from empty. Instead, they’re crowded with other vehicles and pedestrians who often make poor and unpredictable decisions.

Human drivers make sudden lane changes, ignore traffic rules, and otherwise drive aggressively. No software program can predict what a drunk driver will do next.

Pedestrians and cyclists rely heavily on eye contact, gestures, and informal cues to communicate with drivers. Without this communication link, everyone is at risk. Autonomous vehicles may have difficulty interpreting or seeing these subtle social signals, increasing the risk of misunderstandings and collisions, especially in busy urban environments.

Cybersecurity Concerns

Crashes are the primary concern, but not the only one. Some bad actors aren’t trying to cause crashes at all. They simply want to create chaos, often by hacking into computer systems.

As mentioned, driverless cars are highly connected systems that rely on software updates, GPS, and communication with other vehicles and infrastructure. This connectivity creates vulnerability. A successful cyberattack could allow hackers to take control of a vehicle, disable safety features, or cause traffic disruptions.

To keep self-driving vehicles safe, autonomous vehicle companies must remain diligent and stay at least one step ahead of hackers.

Ethical and Legal Issues

Ethical and decision-making challenges also raise safety questions. Autonomous vehicles operate in uncertain environments, so they may face situations where a crash is unavoidable and the system must choose between two harmful outcomes.

A sudden emergency is a good example. A driverless car may be able to swerve suddenly and avoid a pothole. But that maneuver might put other drivers and pedestrians at risk.

Programming these decisions into algorithms is controversial, to say the least. It forces developers to make value judgments that may not be universally accepted.

On a related note, lack of transparency during this process reduces public trust in autonomous vehicle technology.

Finally, legal and regulatory issues affect the safety of self-driving vehicles. Determining responsibility in the event of an accident involving a driverless car is complex. A Los Angeles self-driving car accident lawyer must identify the responsible party, which could be the vehicle owner, manufacturer, software developer, or another party.

Unclear legal frameworks often slow safety improvements and make manufacturers cautious about deploying fully autonomous systems. For this reason, a self-driving car accident lawyer in Los Angeles also advocates for victims in the statehouse, encouraging legislators to pass victim-friendly laws.

Self-driving vehicles are not completely safe. For a free consultation with an experienced self-driving car accident attorney in California, contact us today at the Law Offices of Eslamboly Hakim. After-hours visits are available.

Overwhelmed by your injury and its aftermath?

We’re here to ease your stress and guide you through every step of recovery.

Get Support Now

FAQs

Are self-driving vehicles safe?
Self-driving vehicles can reduce crashes caused by human error, but safety concerns remain. Recent incidents show software and system limitations still exist.

What is NHTSA investigating?
NHTSA is reviewing reports of Waymo vehicles illegally passing stopped school buses. The investigation focuses on system design, incident data, and corrective actions.

What happened in the recent Waymo incidents?
Waymo vehicles failed to stop for school buses in Texas and Georgia. Video footage showed cars passing buses with flashing red lights and extended stop arms.

Can self-driving technology fail?
Yes, self-driving cars rely on sensors, cameras, and AI that can fail or misinterpret situations. Sudden weather changes and system errors can lead to unsafe decisions.

How do autonomous vehicles handle busy city streets?
Autonomous vehicles struggle in crowded urban environments like Los Angeles. Human drivers, pedestrians, and cyclists often behave unpredictably.

Can self-driving cars read pedestrian signals?
People rely on eye contact and gestures to communicate with drivers. Autonomous vehicles may not detect or interpret these social cues effectively.

Are self-driving cars vulnerable to hacking?
Yes, autonomous vehicles are highly connected and vulnerable to hacking. Cyberattacks could interfere with vehicle controls or safety systems.

What ethical issues do autonomous vehicles raise?
Autonomous systems may face unavoidable crash scenarios requiring split-second decisions. Programming these choices involves controversial value judgments.

Who is liable after a self-driving car accident?
Liability may rest with the vehicle owner, manufacturer, software developer, or another party. Determining responsibility is legally complex.

How do legal gaps affect self-driving car safety?
Unclear legal frameworks slow safety improvements and deployment. Victim-focused legislation can help improve accountability and public trust.

Category: Self Driving Cars
Posted by Sharona Hakim

I like the fight – the fight to hold Big Insurance accountable, the fight to find justice for real people, and the fight to level the playing field for...