Driverless vehicles will need to be programmed with a clear and agreed set of rules for decision-making, according to new research published by law firm Gowling WLG.

In its report on “The Moral Algorithm”, Gowling WLG finds that concerns over the so-called “trolley problem” – in which a vehicle must choose which of several individuals to hit – may have been exaggerated, with most of the experts interviewed agreeing that autonomous vehicles will never be programmed to make such distinctions.

The report concludes with eight recommendations, including the creation of an independent regulator to balance the legal, safety and commercial issues surrounding autonomous vehicles; the development of a policy on how the moral algorithm should operate in major safety situations; and a programme of public education and consultation.

Stuart Young, partner at Gowling WLG, said: “It is important not to equate regulation with a burden. It can, in fact, facilitate new markets and important developments. Unless completely new legislation is implemented that accommodates new products in advance of their being produced, regulatory uncertainty is likely to impose huge additional risks on the companies producing them.”

The “Moral Algorithm” study took the form of interviews with industry specialists and representatives from the UK Autodrive consortium during September and October 2016, as well as desktop research and analysis of publicly available information.

Tim Armitage, Arup’s UK Autodrive project director, said: “As with any complex new technology, autonomous cars cannot be specifically programmed to respond to every possible scenario. This simply isn’t practical when a machine is expected to interact with humans, in a complex environment, on a day-to-day basis.

"Autonomous cars will drive to the speed limits and will not be distracted from the task of safe driving; they will make practical decisions based on their programming, but they cannot be expected to make moral decisions around which society provides no agreed guidance.

"To allow autonomous cars to demonstrate their capacity for practical decision-making in complex environments, and to begin to establish public trust through contact, the first step is allowing testing in relatively simple and well-defined environments. Of course, regulation will need to keep up, so in echoing Stuart's sentiments, it is vital the legal industry act now in order to help create a realistic and viable route to market for AVs."