Carmakers, software firms must be liable for self-driving vehicles, UK law bodies recommend
26 January 2022 00:01 by Jakub Krupa
The UK needs a new law on self-driving vehicles with a robust two-step regulatory framework to ensure they can be safely rolled out, a joint report by the UK's key legal review authorities has recommended after three years of study.
The Law Commission of England and Wales and the Scottish Law Commission today released a final 315-page report that sets a high standard for self-driving cars: carmakers and software developers must prove their features are "safe even if an individual is not monitoring the driving environment, the vehicle or the way that it drives."
Approval and authorization
The recommendations for an Automated Vehicles Act, aimed at facilitating the "safe and responsible" deployment of automated vehicles, would require every AV to pass an exacting regulatory process before being put on the roads, and would make an "authorized self-driving entity," or ASDE — a carmaker, software developer or partnership of the two — primarily responsible for the safety of the vehicle.
The procedure would consist of a technical approval of the vehicle — either at international level or through a more flexible domestic program — and subsequent authorization.
As part of this, a new authorization authority would have to check whether the vehicle and its systems meet the proposed threshold of "self-driving features" and ensure the ASDE complies with technical and legal requirements for safe deployment.
But the commissions stressed that AVs would then face further scrutiny, with a second regulator leading an in-use safety review and armed with a set of sanctions to ensure compliance.
While the recommendations do not propose to name a specific organization in charge of both processes in legislation, they suggest the new roles should be, at least initially, given to the Vehicle Certification Agency, or VCA, and the Driver and Vehicle Standards Agency, or DVSA.
In a move that could be seen as a pointed criticism of the arguably misleading marketing of some systems, such as Tesla's "Full Self-Driving" feature, the commissions have also proposed a new criminal offense of "engaging in a commercial practice in connection with the driving automation technology" without the necessary authorization.
But in the most eye-catching proposal, one that could radically overhaul the public perception of driving, the commissions have recommended that the person using automated features — what they call the "user-in-charge" — should have statutory immunity from a wide range of criminal offenses and civil liabilities, with the responsibility lying primarily with the ASDE.
The commissions called the proposal "an essential plank of the scheme," saying that "if users are told that they do not need to pay attention to the dynamic driving task, they cannot be prosecuted for failures." One primary exception to this would be if they fail to respond to alerts urging them to take over for safety reasons.
The report further said the aim is to "promote a no-blame safety culture that learns from mistakes," with a system of regulatory sanctions replacing the criminal sanctions applying to human drivers.
The new in-use regulator could impose informal and formal warnings, civil penalties or compliance orders, or it could even suspend or withdraw authorization altogether.
The proposed framework would also introduce criminal offenses linked with non-disclosure or misrepresentations in the regulatory process, with personal responsibility for senior managers.
The report includes an outline of rules for AVs with more advanced "no user-in-charge" features, turning people aboard the car into passengers supported by a licensed operator who oversees the vehicle remotely and helps with unexpected problems such as road construction or lane obstruction.
This would be most beneficial for highly automated passenger services or for offering mobility services for older or disabled users. But the report notes associated risks related to connectivity, such as potential network latency when a vehicle is controlled remotely and cybersecurity risks.
The recommendations also include a statutory duty on data sharing that would oblige the ASDE to collect and store data on safety incidents for insurance purposes while complying with broader privacy legislation, the UK General Data Protection Regulation.
In a proposal likely to attract criticism from privacy campaigners, the commissions recommend that the extent of data stored by AVs should exceed current industry standards and include more sensitive information such as location data. Furthermore, they suggest this data should be kept for at least 39 months to facilitate possible insurance claims.
Separately, a new data-focused road collision unit would investigate the most serious collisions involving AVs. The commissions indicated that this role could be assigned to the new Road Collision Investigation Branch proposed in October last year and under consideration by the UK government.
The report comes amid heightened concerns that, despite promised safety and comfort benefits, the rollout of AVs may be hampered by public concerns about the technology. A YouGov poll last year found that only 12 per cent of people in the UK would trust a self-driving car to make better decisions than a human driver when faced with high-risk situations on the roads.
In a sign of challenges ahead, the commissions conceded they could not find a consensus on a safety benchmark for automated vehicles: consulted bodies agreed only broadly that AVs "should be safer than human drivers," but were radically split on what level of risk the public would find acceptable.
The commissions' recommendations, three years in the making, are not binding on the UK government, which will have to decide whether it wants to progress them further.
Industry group TechUK has previously warned that the UK should "speed up" establishing the rules in law or risk "falling behind in the regulatory space" relative to other countries such as Germany and France.
A separate EU framework is being discussed at the working group level and is expected to be adopted by July.