
Automated Vehicles – Navigating the Road to Regulatory Reform

The joint consultation by the Law Commission of England and Wales and the Scottish Law Commission on automated vehicles seeks views by 8th February 2019 on whether the UK’s current regulatory framework can cope with this new and ever-evolving technology, or whether reform is needed. Road traffic safety and regulation rely for the most part on the criminal law, which concentrates on the behaviour of the human driver. However, some existing offences may not be entirely suitable for crimes involving driverless vehicles, and in some cases completely new offences will be needed. We consider some of the issues identified by the consultation below.


Shared Control of Automated Vehicles

Most driving offences are aimed at a human driver, but the advent of driverless vehicles will shift responsibility for certain offences elsewhere. The consultation proposes two new concepts, the “user in charge” (UIC) and the Automated Driving System Entity (ADSE), which you can find out more about in our previous blog here. Where an automated vehicle requires a UIC to be in the vehicle ready to take control if required, should it be a requirement that that person be qualified and fit to drive, not under the influence of drink or drugs, and in a position to operate the controls? Should there be additional obligations placed on a UIC, such as a duty to intervene to avoid an accident, with criminal sanctions for failure to do so? These are all questions posed by the consultation.


Causing Death or Serious Injury

Currently, offences of causing death or injury by driving apply only to a human driver. What if an automated vehicle “driving itself” causes an accident which results in the death or serious injury of another road user or pedestrian? Where an automated vehicle causes a death, the public will expect the law to respond appropriately and hold the right people accountable, such as an owner who fails to install the correct software, or the ADSE responsible for the automated driving system. Without reform, that may not happen.


Individuals

Under the current law, where an individual is considered to be at fault, gross negligence manslaughter and manslaughter (or culpable homicide in Scotland) could be considered. However, whether any of those offences has been committed in the context of automated vehicles would depend on a court’s view of the particular circumstances. Case law would also need to develop in this area over time to provide the courts with guidance. The solution might not be immediate, which could lead to uncertainty.


Companies

For the ADSE or corporate owner of an automated vehicle, there is a real possibility of prosecution for corporate homicide or manslaughter. It is for a jury to decide whether there has been a gross breach and whether failings by senior managers were a substantial element of that breach. With most prosecutions for corporate manslaughter to date being of small to medium-sized companies, serious failings by senior management could be difficult to prove in larger organisations with complicated management structures.


Creation of New Offences

All of the offences discussed above only apply where a death has occurred, and so new offences are likely to be needed to deal with situations where a driverless vehicle causes serious injury, in particular where the breach involves a company. This could apply to software developers, manufacturers of driverless vehicles and even fleet owners. Views are also sought on whether new offences are needed in relation to public behaviour, such as interfering with a vehicle’s sensors or road infrastructure.

There is also the suggestion of a “safe harbour” for human users, including UICs, so that where an automated vehicle is “driving itself” and causes an accident, no offence can be committed by the human driver. This would cover situations where vehicles have a UIC ready to take the controls, as well as those where a UIC has not been considered necessary. A UIC would not be a driver for the purposes of civil and criminal law while the system is driving itself. Where the system is authorised to function without a UIC, any occupants would be regarded as passengers rather than drivers. Reform would be necessary if the “safe harbour” concept were adopted.


What’s Next?

The overall aim of encouraging the use of new and innovative technology needs to be balanced against ensuring the safety of users of automated vehicles and other road users. There is no doubt that automation will play a major part in our everyday lives in the future. However, current road traffic law could see prosecutions of individuals and companies where they are not justified, or conversely not at all even where the facts seem to justify it. A robust regulatory regime is needed to plug the gaps in existing law and ensure that the right party or parties are held to account when things go wrong.

The consultation is part of a three-year project and can be found here. Responses are sought by 8th February 2019. Although the full programme of work has yet to be decided, the next phase of the project will be to consult on the use of driverless vehicles in mobility as a service, focussing on buses, taxis and private hire vehicles. The project aims to provide recommendations for legislative reform by March 2021.


By Lynne Moss

