A driverless dream or nightmare in waiting?

“If we could make all vehicles fully autonomous overnight there wouldn’t be the difficulty.

"The transition phase is the dangerous bit.”

Matthew Avery - Thatcham Research

Safer roads. The end of congestion. More free time for travellers. The introduction of self-driving cars promises all this and more. But how likely are these scenarios to become reality? What are the potential costs – and are they worth paying?

Last year, Chancellor of the Exchequer Philip Hammond said he aimed to have fully driverless cars on Britain’s roads by 2021. As part of its push, the government has established a series of funding initiatives and projects aimed at developing driverless technology. It even created a dedicated research body - the Centre for Connected and Autonomous Vehicles.

Meanwhile, industry has been hard at work too, with technology companies and carmakers alike vying to be the first to get self-driving cars on the road.

There are some obvious incentives in this drive towards autonomy, not least safety. Human error has been cited as a causal factor in 90-95% of road traffic accidents, according to a House of Lords Science and Technology Select Committee report [2017]. Eliminating this from the driving process has the potential to save countless lives.

Then there are the economic benefits. For example, the market for connected and autonomous vehicles in the UK is estimated to be worth £28bn in 2035, according to a report by Transport Systems Catapult [2017].

Other potential pros include less congestion, improved mobility for elderly and disabled people, and, without the distraction of driving, more time for road users to do other things instead. Like read, perhaps, nap or, more likely, go online.

But despite such opportunities, getting to a point where self-driving cars can safely navigate the road network and integrate with vehicles that are controlled by people is a massive challenge. Recent media coverage, such as the case involving a self-driving Uber that killed a pedestrian in Arizona, has highlighted just how difficult this is likely to be.

In addition, the advance of artificial intelligence (AI) – a key component of driverless vehicles – raises a number of questions, both technical and moral. For example, can machines really replicate the highly complex human judgment calls that take place when driving? And, in extreme cases, are we happy for machines to make life-and-death decisions on our behalf?

In light of such uncertainties, the government target of 2021 seems particularly ambitious. And even if this is achieved, will autonomous vehicles really benefit society in the ways we’re hoping for?

Source: Department for Transport, 2016 (RAS50001/RAS50007)



Nissan ProPilot technology

Source: Nissan


Volvo DriveMe camera sensors

Volvo DriveMe radar sensors

Volvo DriveMe ultrasonic sensors

Volvo DriveMe laser sensors

Stu McInroy is chief executive of the Road Safety Markings Association. He argues that there are still many issues to be resolved before self-driving cars rule the road.

“This is not going to happen quickly,” McInroy says. “My concern is that government probably hasn't as yet fully understood and embraced the challenges of how we get there by way of the supporting infrastructure that would be required.”

One of the key pieces of infrastructure is road markings.

At present, most autonomous vehicles navigate using a technology called optical sensing, which already underpins safety systems such as lane assist. Optical sensing relies on a camera that monitors features such as road markings to help the car navigate safely. This reliance on optical sensing – and therefore on the quality and visibility of road markings – presents some significant problems, says McInroy.

“The road markings in the UK tend to be refreshed on a six- to eight-year basis, and given the financial strictures that have been placed upon local authorities and the like across the country, that period is being extended. Of course, the result is that if lines aren't replenished, refurbished or replaced, they lose reflectivity.”

McInroy says in the worst cases these road markings “get worn away to such an extent that they can't even be seen by a human eye, never mind the optical sensors of a vehicle”.
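The dependency McInroy describes can be sketched in code. A lane-keeping system typically scores how clearly each marking is detected and disengages below some threshold; the scores and the 0.5 cut-off below are illustrative assumptions, not any manufacturer's actual logic.

```python
# Illustrative sketch: how worn road markings could force a lane-keeping
# system to hand control back to the driver. The confidence scores and
# the 0.5 threshold are hypothetical, not taken from any real system.

def lane_assist_available(marking_confidences, threshold=0.5):
    """Return True if both lane boundaries are detected clearly enough.

    marking_confidences: dict mapping 'left'/'right' to a 0-1 score
    from the optical sensor (1.0 = freshly painted, 0.0 = invisible).
    """
    return all(score >= threshold
               for score in marking_confidences.values())

fresh = {"left": 0.9, "right": 0.85}  # recently refreshed markings
worn = {"left": 0.9, "right": 0.2}    # one line worn away

print(lane_assist_available(fresh))  # True: system can engage
print(lane_assist_available(worn))   # False: driver must steer
```

On this model, a county that lets its markings fade below the threshold effectively switches the feature off for every car that crosses its border.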

In terms of funding, McInroy believes the government needs to step up to the plate. But he argues this needs to be thought through properly.

“Ultimately, government has stated that it intends to have autonomous vehicles operating by 2021. The bottom line being that, with that intent, government has to support that financially.

“Now, it may be that rather than being centrally funded, it is devolved down to local authorities with a particular interest. But the danger of that, of course, will be that when economic decisions are being made, certain areas will embrace the opportunity and spend the money but others may not.”

McInroy agrees that in practice this may lead to a situation where a person travelling through one county, whose local authority has invested in road markings, finds their autonomous vehicle navigates properly – but on crossing into a neighbouring county where maintenance hasn’t kept up, the car potentially won’t be able to function in autonomous mode.

McInroy adds that there is a great difference between getting one self-driving car to work on a limited set of roads with the appropriate level of markings and getting a whole fleet of cars working across the country. He talks of the problems associated with a so-called mixed fleet – where manually operated vehicles and those with high degrees of automation occupy our roads together.

As a rough estimate, he cites 2040 as a date when we’ll have cars with high degrees of autonomy on our roads in substantial numbers, but thinks there’ll still be “plenty of other cars kicking around.”


Google Waymo in action - used with permission.


Matthew Avery, head of research at car safety group Thatcham Research, agrees that the transition towards autonomy is likely to be problematic.

“If we could make all vehicles fully autonomous overnight then there wouldn’t be the difficulty,” he says. “The transition phase is the dangerous bit.”

In 2014, the US-based Society of Automotive Engineers identified six levels of driving automation, from Level 0 (no automation) to Level 5 (full automation). Cars with Level 4 technology and upwards are generally regarded as fully autonomous, or self-driving. But car manufacturers are already bringing out cars with Level 3 technology that, in certain conditions, can control many aspects of driving, such as braking, accelerating and even steering.
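The six levels, and the key question of whether the driver must stay engaged at each, can be summarised like this (the one-line descriptions are paraphrased from the SAE J3016 taxonomy):

```python
# The six SAE levels of driving automation (SAE J3016), as referenced
# in the text. The one-line summaries are paraphrased.

SAE_LEVELS = {
    0: "No automation - human does all the driving",
    1: "Driver assistance - e.g. adaptive cruise control",
    2: "Partial automation - steering plus speed, driver supervises",
    3: "Conditional automation - car drives, driver takes over on request",
    4: "High automation - no driver needed within a defined domain",
    5: "Full automation - no driver needed anywhere",
}

def driver_must_stay_engaged(level):
    """At Levels 0-2 the driver supervises continuously; at Level 3 they
    must still be ready to resume control when prompted."""
    return level <= 3

print(driver_must_stay_engaged(3))  # True
print(driver_must_stay_engaged(4))  # False
```

The danger Avery describes sits exactly at Level 3: the car does the driving, but the human is still part of the control loop.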

According to Avery, this new technology is confusing many car buyers into thinking their vehicles can fully drive themselves when they can’t.

“We’re starting to see real-life examples of the hazardous situations that occur when motorists expect the car to drive and function on its own. Specifically, where the technology is taking ownership of more and more of the driving task, but the motorist may not be sufficiently aware that they’re still required to take back control in problematic circumstances.

“Names like ‘Autopilot’ or ‘ProPilot’ are deeply unhelpful,” Avery says, “as they infer the car can do a lot more than it can.” He argues that “absolute clarity” is needed to help drivers understand when and how these technologies are designed to work, and that they should always remain engaged in the driving task.

“Fully automated vehicles that can own the driving task from A to B, with no need for driver involvement whatsoever, won’t be available for many years to come,” says Avery. “Until then, drivers remain criminally liable for the safe use of their cars and as such, the capability of current road vehicle technologies must not be oversold.”

One car that offers Level 3 automation technology is the new Audi A8 – not yet available in the UK – which features an AI system called Traffic Jam Pilot. When active, the system takes charge of driving in slow-moving traffic, controlling functions such as accelerating, steering and braking at speeds of up to 37 mph. However, when the system encounters a situation it can’t handle, the driver is alerted that they must resume manual control.

“As vehicles become fully autonomous, even the most observant human driver’s attention will begin to wane. Their mind will wander …"
Professor Neville Stanton, Southampton University

This process of resuming control is referred to as “getting back in the loop”, and it poses a number of potential issues. For example, researchers in the Venturer trials, a three-part UK-based study into self-driving cars, found that under test conditions a driver could react to take control after about one second – but then needed a further one to two seconds to resume 'active' driving control.

This is a total response time of between two and three seconds. The study showed that, at 50 mph, a car would travel nearly 45 metres before its driver managed to assume active control of the vehicle. What’s more, it found drivers tended to drive sub-optimally – typically more slowly – for up to 55 seconds after the handover period.
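The Venturer figure can be checked with simple arithmetic: 50 mph is about 22.4 metres per second, so a two-second handover covers nearly 45 metres.

```python
# Distance covered during the handover delay reported by the Venturer
# trials: roughly 2 seconds of total response time at 50 mph.

MPH_TO_MS = 1609.344 / 3600   # metres per second in one mph

speed_ms = 50 * MPH_TO_MS         # ~22.4 m/s
handover_distance = speed_ms * 2  # two-second response time

print(round(handover_distance, 1))  # 44.7 - "nearly 45 metres"
```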

Outside of test conditions, it is reasonable to imagine that this lag time could be much longer, given a driver’s attention is likely to wander. The report by the House of Lords Science and Technology Select Committee [2017], which contained evidence from Professor Neville Stanton of Southampton University, sums up the issue:

“As vehicles become fully autonomous, even the most observant human driver’s attention will begin to wane. Their mind will wander … This is particularly true if they are engaging in other activities such as reading, answering emails, engaged in conversations with passengers, watching movies or surfing the internet.”

This handover period between car and human poses a number of legal and insurance implications too, which the government is aiming to address through the Automated and Electric Vehicles Act 2018. For example, when an accident occurs and a car is in fully autonomous mode, liability could lie with the manufacturer of the vehicle, not the driver. However, this is a complicated and developing subject, with plenty of grey areas.

Tesla Autopilot technology


Audi A8 Level 3 autonomous car



Mercedes-Benz concept autonomous car

Source: Mercedes-Benz


How Google Waymo interacts with other vehicles.


Even if such issues are addressed, as in time they may be, there is still the problem that the first fully autonomous cars will have to mix with vehicles controlled by human drivers.

Avery says, “Driving is really complicated. Drivers are constantly negotiating and taking chances.”

He gives the example that when making decisions on the road, such as, “can I pull out in front of this vehicle?”, we look into the eyes of the other driver to see if they will let us go. “That process you do as a human, AI can’t do yet. And, although it will get there, the first self-driving cars are likely to be really cautious.” Avery therefore wonders whether travelling in a self-driving vehicle would be frustrating rather than relaxing.

“For example, [an autonomous vehicle] might leave a large gap between it and the car in front, but another car being driven manually may seek to fill that gap. And so the car will readjust and move further backwards to keep the gap, giving the passenger the feeling he or she is going backwards or not making progress,” he says.

The report How Autonomous Vehicles Could Relieve or Worsen Traffic Congestion [2016], by technology consultancy SBD Automotive, highlights a number of similar issues it sees occurring in what it calls a “risky” medium term of five to 20 years ahead.

“Why would a pedestrian walk to a crossing ... if it was possible just to step into the road, knowing that all the traffic would stop?"
Professor Martyn Thomas, Gresham College

“In the early days, the sudden introduction of a small number of highly-autonomous cars among millions of traditional cars will create unanticipated consequences,” the report states.

“Other drivers may be surprised by the different behaviour of autonomous cars, leading to more accidents. Slower acceleration and deceleration rates among autonomous cars may be required to enhance passenger comfort, but would lead to the overall flow of the road decreasing. Even as the penetration rate of highly autonomous cars rises, knock-on changes in travel patterns and human behaviours could also have negative effects on traffic congestion.”

For example, it suggests, “people who are less likely to drive or own cars now, such as the elderly and young, may suddenly start buying cars, leading to busier roads”.

And then there is the issue of how pedestrians would react to self-driving cars.

In a paper, Is Society Ready for Driverless Cars? [2017], Professor Martyn Thomas of Gresham College in London, says: “Why would a pedestrian walk to a crossing and wait for the traffic lights to change if it was possible just to step into the road, knowing that all the traffic would stop? Pedestrians and cyclists would rule the roads in cities, possibly making it impossible for vehicle traffic to flow.”


Source: Mercedes-Benz


In addition to the many practical difficulties that need to be overcome, there are a few societal issues to explore too. For example, in a potential sign of things to come, earlier this year the website Motor1 discovered that Ford had filed a patent for an autonomous police car.

It said the car would assume “routine” police tasks, such as detecting speeding drivers and even handing out tickets, all controlled by a super-advanced AI. As the article says, a large majority of patent applications like this never move past the initial stage, but “it’s a slightly unnerving sign-of-the-times that such things are even being considered”.

Concept of a driverless police car


In 'The Conscious Car', players are forced to make moral decisions on autonomous cars.


The issue of machines taking over human tasks is, of course, nothing new. However, self-driving cars pose a number of particularly tricky questions. An article featured by the British Psychological Society asks the question of whether we can entrust moral choices to autonomous vehicles.

The authors talk of the so-called trolley problem – “an emergency situation where a driverless vehicle must choose between two courses of action, each of which would cause harm to or the death of one or more humans”.

The piece concludes: “It’s up to human trainers and programmers to make the moral choices in advance and to provide sufficient learning experiences to build an appropriate model.”

In order to see how humans would like self-driving cars to tackle such life-and-death situations, we created our own experiment: The Conscious Car. The name comes from the Oxford English Dictionary definition of conscious – “aware of and responding to one's surroundings” – and the experiment was inspired by a similar project, the Moral Machine, by the Massachusetts Institute of Technology.

As part of our experiment, people were presented with different driving scenarios in which an autonomous car had only two choices of action – A or B. In either course of action, a human (or an animal in some scenarios) would die. So the experiment looks at who people would prefer to save.

While not an easy thing to decide upon, the findings are interesting and help shed some light on how humans would like self-driving cars to behave.

For example, in a situation where a self-driving car had to choose whether to protect a car’s adult occupant versus an adult pedestrian, we found 68% of people would prefer the car to be programmed to protect the pedestrian. This dropped to 64% when the pedestrian in question was an elderly person, but increased to 83% when the pedestrian was a child.

“It’s up to human trainers and programmers to make the moral choices in advance"
British Psychological Society

Meanwhile, the experiment also looked at whether humans would prefer a car to protect more lives where possible. When presented with a situation in which a car must choose between protecting either one adult occupant or two adult pedestrians, 86% chose the latter. And when asked whether a car should protect either two adult occupants or one adult pedestrian, 71% chose the occupants.

Looking at the results as a whole, 58% of participants showed a preference for protecting pedestrians over occupants, compared with 27% showing a preference for occupants, while 16% remained neutral.

Meanwhile, 33% showed a preference for protecting many lives over fewer, with 22% preferring cars to protect individuals and 46% remaining neutral. And in situations in which a self-driving vehicle had a choice of protecting older or younger people, 59% showed a preference for children to be protected, while 9% preferred the elderly, and 32% were neutral.
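Tallying a scenario like the ones above is straightforward: count which party each participant chose to protect and convert the counts to shares. The response counts below are invented for illustration, not the experiment's actual data.

```python
# Illustrative tally of a single Conscious Car scenario. The response
# counts below are made up for the example, not the survey's real data.

from collections import Counter

# Each response records which party the participant chose to protect.
responses = ["pedestrian"] * 68 + ["occupant"] * 32

def preference_shares(responses):
    """Return each option's share of responses as a whole-number percentage."""
    counts = Counter(responses)
    total = sum(counts.values())
    return {option: round(100 * n / total) for option, n in counts.items()}

print(preference_shares(responses))  # {'pedestrian': 68, 'occupant': 32}
```

Rounding each share independently is also why some of the percentage breakdowns above sum to slightly more or less than 100.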


Of course, just because something is difficult to do, doesn’t mean it shouldn’t be done. But in this case, will the introduction of self-driving cars actually solve many of the problems we hope they will? In short, is it all worth it?

One person who thinks self-driving cars shouldn’t be held up as the transport equivalent of the Holy Grail is Dr Ian Walker, an environmental psychologist at Bath University.

"There is a real danger that all of today’s problems are being batted 30 years down the road."
Dr Ian Walker, Bath University

“There’s this huge list of problems we have and in the best possible science-fiction scenario, automation can’t fix most of them,” he says. “It can’t fix social isolation – if anything it makes it worse. It can’t fix people getting insufficient exercise – if anything it makes it worse; it can’t fix road wear; it doesn’t fix out-of-town retail harming city centres; it doesn’t fix suburban sprawl; and it doesn’t fix or reduce employee unproductivity and absenteeism because of a lack of activity. If anything it makes all of these things worse.

“And so I think we need to be incredibly careful for a couple of reasons: one, it might just make everything worse. But second, there is a real danger, and I’m seeing this occur more and more, that all of today’s problems are being batted 30 years down the road.

“So you say to people, ‘look there’s a problem here in this city. But it’s going to be fine because sooner or later we’ll all be in autonomous cars so let’s not think about it.’ And I really worry that they’re being used, both deliberately and unconsciously, as a way of just batting away today’s problems in the hope they’ll eventually fix themselves.”

“We’re failing to address urgent, pressing problems because of this vague promise of a messiah that’s going to come along in the future and fix everything,” he adds.

One of the main issues, according to Dr Walker, is the car manufacturers themselves.

“At the moment, the car manufacturers’ business model is to sell you one car per person. And so we have multiple-car households, with vehicles that are sat about doing nothing most of the time.”

Instead, he says: “Probably the only system that really makes any sense is to have a small number of self-driving cars that are shared. But that requires car manufacturers to completely change their business models. And it’s unlikely they’ll do that spontaneously.”

Although some manufacturers have talked about shared mobility, Dr Walker cites Ford as an example. He says: “It’s such a mind shift of 100 years of simply selling everybody a car that I’ll believe it when I see it”.

Source: (nts0205)


1950s Ford advert encouraging each family member to own a car.


Instead of seeing driverless cars as a solution to many of our problems, Dr Walker says there’s a much more important question over how we live and work.

“Probably one of the bigger issues is planning. You know, we need to be thinking about where we put new housing, where we put workplaces, and have a much better plan of how these things are linked.”

Dr Walker adds: “If what we did as a society was to say, our vision is that everybody in Britain is able to live without having to buy a car, then that can feed into every decision. It can feed into health decisions; it can feed into school locations; it can feed into planning. And it would hopefully feed into individual people’s decisions about where they live and work.”

He says the issue of car ownership also raises some quite interesting questions about fairness.

“If you think about someone on a low income, for example, living in such a way that he or she is obliged to spend a large part of that income on a car and running it, and society doesn’t allow an alternative to that, then that’s actually really trapping people.”


In conclusion, it appears that the government’s target of having fully autonomous cars on our roads by 2021 is not going to happen easily. And even if it does, we’re most likely looking at a relatively small number of cars with certain automated abilities, rather than a big switch where everyone ditches their manually driven vehicles overnight.

More likely is a long and potentially troublesome transition period, in which not only does the technology need to develop much further, but legal and insurance frameworks need to keep up too.

Even when fully autonomous cars do reach us, we may not be ready for them.

Professor Martyn Thomas says: “It will take at least a decade and maybe far longer to solve all the problems that currently prevent manufacturers from building Level 5 autonomous cars and showing that they are at least as safe as human drivers under all conditions. Unfortunately, hubris, commercial pressures, competition between nations and the limited understanding of complex issues by policymakers will certainly lead to Level 5 cars being used on public roads long before they have been proved to be safe enough.”

As for our original question, is it all worth it? Are we on the road towards a self-driving dream or nightmare? It really depends on what we’d like autonomous cars to achieve. Fully autonomous vehicles, once they eventually arrive, are likely to lead to a significant reduction in fatalities. So in that respect, the drive towards autonomy is definitely worth it.

And in the long term, traffic congestion should ease significantly too. As the report by SBD Automotive puts it: “At some stage in the distant future (likely due to government regulation), autonomous vehicles will become ubiquitous and manual driving will be restricted. At that stage, central management of all vehicle movements will become feasible, and alongside the elimination of traffic accidents, congestion will become a distant memory.”

However, both these outcomes could arguably be achieved by simply having fewer cars on our roads, perhaps through a combination of both shared mobility and encouraging people to take fewer journeys by car. Therefore, before we’re sold the self-driving dream, we should think carefully about what we’d like this technology to do for us.


Matthew Avery
Head of Research, Thatcham Research

Stu McInroy
Chief executive, Road Safety Markings Association

Dr Ian Walker
Environmental psychologist, Bath University

Written by Adam Jolley

Graphics by Jamie Gibbs

Video edits by Alice Campion

Interviews by Maria McCarthy

Edited by Chris Torney

Concept images by James O'Brien

Additional graphics by Jessica Morgan