As autonomous vehicles (AVs) move out of science fiction and onto roads worldwide, it’s important to establish who’s liable if you’re hit by a self-driving car.
Road accident liability laws weren’t written with self-driving cars in mind. As the technology in autonomous cars evolves, so will the surrounding law.
AVs ‘make roads safer’
South African roads are some of the most dangerous in the world and the vast majority of accidents are caused by human error.
Improving road safety through self-drive vehicles that employ artificial intelligence (AI), forward-collision warnings and automatic braking systems can help remove human error, especially by drunk and speeding drivers.
No AVs in South Africa – yet
Currently, there are no self-drive vehicles on South African roads. In 2019, Blade Nzimande, then the Minister of Transport, told Parliament that the government had plans to introduce them, but not in the immediate future.
The operation of motor vehicles on public roads is governed by the National Road Traffic Act. It contains no regulations regarding the operation of self-driving vehicles. Fortunately, there’s time to update legislation before AVs arrive.
What’s happening in other countries?
In the meantime, we need to look elsewhere, principally to the United States, where AVs are increasingly common on the roads, to see how the law is being applied in cases involving accidents with self-driving vehicles.
There are 38 states in the US that have enacted legislation or issued executive orders regarding AVs. Eighteen states now allow testing or deployment without a human operator in the vehicle. That said, most AVs are still deployed with a human operator in the vehicle.
At the moment, vehicles are classified into six levels of autonomy, from level zero to level five. At the highest level, five, there is full autonomy and “no longer a need for human vehicle control inputs”.
Many cars in SA are already at autonomous level one or two, but moving beyond that would require a massive investment in our infrastructure.
Who is liable: human or technology?
There’s no doubt AVs will eventually become ubiquitous on roads around the world. They will reduce the number of accidents, but accidents won’t disappear altogether.
Proving liability is far more complex when an AV is involved. Who’s liable if, hypothetically, the AV fails to brake and hits a pedestrian? It could be:
- the operator of the vehicle
- the manufacturer of the vehicle
- the manufacturers of artificial intelligence that enables autonomous systems
- the manufacturers of hardware technology that enables autonomous systems
- the passengers
- the highway authority
- the pedestrian.
AV kills pedestrian in Tempe
Inevitably, collisions and fatal accidents have occurred involving self-drive vehicles. In 2018, the first fatality involving an AV and a pedestrian took place in Tempe, Arizona.
An autonomous car operated by Uber during real-world testing struck and killed Elaine Herzberg, 49, a woman pushing a bicycle across a highway at 10pm. Herzberg was jaywalking and had crossed two lanes of traffic before being hit by the Uber Volvo XC90.
Neither the AV nor the backup human operator, Rafaela Vasquez, in the driver’s seat noticed Herzberg until it was too late. A backup driver behind the wheel is expected to take over when necessary to avoid an accident.
Video from the crash showed Vasquez looking down at her lap just prior to the accident. She applied the brakes only after Herzberg had been hit. But why did the self-drive car fail to brake?
Liability buried in settlement
Herzberg’s daughter retained a law firm and, together with Herzberg’s husband, reached an undisclosed settlement. Local and federal authorities continued to investigate the crash.
Because the settlement was confidential, the question of liability was buried. Still, the fact that a settlement was reached suggests there was a sufficient legal cause of action, and there was an abundance of information from the vehicle’s event data recorders.
The Yavapai County Attorney declined to charge Uber with a criminal violation in 2019 for the death of Herzberg, but a Maricopa County grand jury indicted the safety driver on one count of negligent homicide in 2020.
The incident caused some companies to temporarily cease road testing of self-driving vehicles. The accident raised the question – is the artificial intelligence that drives a self-drive car ready to handle the real world?
Out-of-court settlements
So far, accidents involving self-driving cars have all been settled outside of court. Uber and microchip developer Nvidia Corp settled within 10 days of the accident in Tempe.
This means companies avoid the risk of a judgment going against them and precedent can’t be set.
What we offer at DSC Attorneys
It remains to be seen how judicial systems, including our own, will approach the issue of liability in accidents involving self-driving cars and other fully automated machines.
For now, pursuing any road accident claim is a lengthy, complex process. For several reasons, it’s not a good idea to submit a Road Accident Fund (RAF) claim without professional legal representation.
At DSC Attorneys, we’re experts in road accident claims, with extensive experience in handling claims against the RAF. Contact us online or call 0861 465 879 for legal support and representation that’s effective, ethical and caring.