MPs mull challenges of shift to autonomous vehicles

News | Published: Thursday 8 December 2022

A cross-party committee of MPs heard last month from witnesses on the challenges of autonomous vehicle rollout, as part of its inquiry into the development and deployment of the technology.

The experts highlighted a variety of technical, logistical and regulatory hurdles that would need to be addressed by vehicle and systems manufacturers, governments and users of the technology, and outlined potential implications for the logistics sector.

Among the witnesses appearing before the House of Commons transport select committee was Ian Wainwright, chair of the freight and logistics policy group at the Chartered Institute of Logistics and Transport (CILT).

He said self-driving technology offered “some clear potential advantages around safety”, as well as cost saving benefits – but highlighted some issues he said needed to be tackled.

“The biggest issue is probably about human behaviour,” he said.

“People walking around cities with mobile phones perhaps do not tend to look at the traffic so well. There is also the fact that, if a vehicle is failsafe, currently if somebody steps out in front of it, it should stop.

“If we all, as humans, get used to that, do we just wander off any pavement to cross the street, knowing the vehicles will stop for us? The potential for congestion could actually get worse.

“From a commercial vehicle perspective, that is a nightmare because they need journey time reliability.

“Buses are only going to work if they arrive on a regular basis. Deliveries only happen if they are there on time. There are some issues on those sides of things that need to be proven.”

He also highlighted the limitations of the potential cost benefits, citing the Department for Transport’s involvement in the platooning trial with TRL.

“The benefits that people thought they were going to get from the energy saving of putting vehicles together on the motorway were not necessarily as big as people thought they were going to be,” he said.

“There are some issues. There are some potential benefits, but the minute you get into the public domain there is a huge risk.

“Clearly, as a business, it is the business taking the risk. No business wants to put a driver in a situation where they are going to cause a problem, because they have a duty of care to their employees, apart from anything else. I think there are some real questions in there.”

On trials of driverless supermarket delivery vehicles, Mr Wainwright expressed scepticism.

“Clearly, if it is a Tesco or Sainsbury’s home delivery vehicle, there are several orders on there; it is not just one person’s,” he said.

“Who knows what to pick out, particularly if you have different temperature regimes because you want the product stored properly? Do you start at the motorway end and go for fewer people involved, so therefore it is more vehicle to vehicle, or do you start at the slower urban end, where you deal with speed? There is a big issue on both sides of that.”

He added: “There will be some opportunities in that, but I think it is about the pace of change. A lot of the conversation at the moment is dictated by companies trying to push the technology or the vehicle… We have to deal with the people side of it. I think that is important in how we deal with any of this stuff, going forward.”

Access to commercial data for safety purposes was also a consideration, he said.

“Clearly, at the moment we struggle to get commercial data from commercial vehicles, whether they be passenger vehicles or logistics vehicles, into the public domain because there is commercial sensitivity to it… We have to work out how to deal with some of that.

“Is it publicly accessible data, and what does that mean? Clearly, the road incident investigation branch will have to understand some of that data. How far back do they go? They may examine the crash, but are they examining what is there in terms of the driver’s previous behaviour…? There is a whole series of issues about the data and access to it, which may be more than just the incident itself.”

Mark Shepherd of the Association of British Insurers also emphasised the importance of data access.

“A big ask for us in the potential upcoming transport Bill is insurer access to collision data that will be collected or stored on these vehicles… We have set out a series of pieces of data that we think insurers need in order to be able to determine who was in charge and who, therefore, is liable.

“From our perspective, in particular on the data requirements in order to be able to determine liability and, we think, for these vehicles to be able actually to be deployed on the roads, we certainly have assurance from the Department for Transport that those are important issues that they plan to legislate for. There is a question about when that happens and whether this transport Bill is the vehicle that they use for it. That is why I think we would like this committee’s influence to push home the importance, we feel, of making sure that that happens.”

Mr Shepherd added: “If we are to meet the UK’s ambition of being one of the world leaders in bringing this technology to the fore, which I think from an insurance industry point of view is a great thing — it is fantastic and will no doubt reduce collisions, make our roads safer and our cities and urban areas probably better and more liveable spaces, so there are huge benefits to this technology being adapted — we need to have the regulations and the legislation that underpin it keeping up with the technology.”

Becky Guy, road safety manager at the Royal Society for the Prevention of Accidents, told the MPs that while automated vehicles had been shown in trials to operate in some contexts, their performance in other conditions needed further development before they could become a common sight.

“We believe that safe, fully autonomous vehicles are still quite some way off and that probably, in the first instance, they would need to be confined to the simplest types of roads, and only when it is proven that they are safe and can operate reliably can they start to operate on other roads too,” she said.

Like Mr Wainwright, Ms Guy also highlighted a range of human factors to take into account when adopting autonomous technology.

“One of the key issues for us at the moment is the idea of a partially automated vehicle that issues transition demands,” she said.

“The role of the driver effectively moves from operating the vehicle to becoming a system supervisor.

“The real challenge is keeping that person engaged and in the loop of the vehicle, especially if drivers begin to believe that the system can operate in lots of contexts and it is very unlikely that they would have to intervene, or that the vehicle would crash. They might tend not to pay as much attention to their driving.

“What we came across in this sphere was a study that spoke about ironies of automation. Far from alleviating the driver of the driving task, these systems ask the driver to take control of the vehicle and make quite complex decisions.

“In terms of task allocation, the moment the things that the average human driver can handle well are handed over to automated systems, it leaves the more challenging tasks to the human driver.

“There is also disengagement. By not driving all the time, and perhaps zoning out of situational awareness, drivers are less skilled, and they can be a bit more delayed in reacting to the vehicle when it issues the transition demand.

“If we think about it in an aviation context, commercial pilots are highly trained. They are given perhaps several minutes to make themselves aware of the situation again and take the appropriate action, but with vehicles it looks as though it could be seconds.”

She highlighted the danger of potential biases in the designs of autonomous vehicles.

“Current facial recognition shows a bias towards white male faces,” she said.

“When we look at people from different ethnicities and women, for example, the accuracy of those systems goes down. There needs to be a lot of testing and proof that these vehicles can detect different genders and ethnicities.

“We are looking at whether they are going to be looking at leg movements. Can they detect people who are wearing dresses or skirts rather than trousers?

“Can the vehicles detect disabled road users if they are in a wheelchair or on a mobility scooter? There is quite a lot that we need to do to look at the interactions with all the different types of road users, such as pets and horses. The list goes on.”

Meanwhile, Professor Jack Stilgoe of University College London told the committee that public perception would also play a key role in the viability of self-driving vehicles.

“At the moment we simply do not know what the levels of public acceptability of risk would be,” he pointed out.

“Even if we saw dramatic improvements in average safety as a substantial number of vehicles came to be automated, we might see a redistribution in risk and new types of risks — cyber-security risks and systemic risks — and therefore the defenders of these vehicles, whether they are the secretary of state or the manufacturers, might have to face difficult questions.

“Even if there were dramatic average improvements in safety, if one of their vehicles was found to have caused a death, how do they defend that?”

He added that important and potentially difficult conversations would be needed around the rules of the road, which he pointed out are currently configured for human contexts.

“They are designed around a world that is human readable rather than machine readable,” he said.

“We can think about particular parts of the road network; zebra crossings are a really interesting one, a conversational piece of infrastructure in which there is negotiation between a pedestrian and a driver.

“That makes it a very hard thing for a self-driving vehicle to navigate, dealing with those moments of uncertainty.”

He cited Exhibition Road in London as an example of ‘shared space’, where traditional separations between cars and pedestrians were removed.

“We can ask whether self-driving vehicles would be as comfortable there as they are in segregated spaces, where the rules of the road are much more hard and fast and where the expectations on pedestrians are much clearer and you are more likely to have cycle lanes, bus lanes and all the rest of it,” Professor Stilgoe said.

“As soon as you start to blur the boundaries, things become much harder for an automated system. We might see pressure to ‘upgrade’ the rules of the road or upgrade infrastructure in order to suit automated vehicles. That might be a good thing for overall safety, but it is also likely to be contentious, particularly for vulnerable road users.”

Ashley Feldman, transport and smart cities programme and policy manager at trade association techUK, echoed Professor Stilgoe’s point that cyber-security was an area of concern.

“As we have more connected vehicles and more connected systems — what we call the cyber-physical infrastructure — we are going to need more security professionals in our companies and in our public sector,” said Mr Feldman.

“Frankly, at the moment we do not have nearly enough. We need to ensure that we have those skills coming through because with cyber, which is an absolutely critical part of a vehicle’s safety case, we need to ensure that we have a culture of security baked in.”

He continued: “When we are talking about connecting automated vehicles and cyber, it is the connected part where the risk or the attack surface is. It is less in the actual self-driving part; it is more in the connected part. Any time there is an exchange of data or a command issued, you create a vulnerability.

“We can be frank about what the risks are. They are very significant. For example, the safety features of a vehicle could be hacked. What I mean by that is things like your steering, your braking, your acceleration and even the operation of the airbags. Those systems could become vulnerable and could be taken over by a malicious actor.

“We also have concerns about data. Vehicles store within their infotainment systems, or similar systems, potentially personal and sensitive information. There is a risk that that data could be breached.

“A final way is something like a ransomware attack, which is a particularly frightening incident to happen to somebody. It would be a message that you see in your car that says, ‘You are no longer in control of this car. We’ve got control and you have to pay a ransom.’ It is a really scary scenario to paint for people.

“The good news is that there is some positive work being done. Just as safety is being architected into the vehicles that are being developed, so security is being architected from first principles, ensuring that we have the right security people who understand secure by design principles at every stage, from inception to design to manufacturing and deployment.”

Mr Feldman added: “Emerging regulations will mean that a developer who wants to put a vehicle forward for approval will have to have demonstrated that they have met robust cyber-security standards.

“There are two regulations at UN level which the UK was very instrumental in helping to develop. There is good work under way. Yes, the risk is significant, but it is not insurmountable.”

Ben Gardner, senior associate at law firm Pinsent Masons LLP, agreed that the consequences of a cyber-attack on a connected automated vehicle could be “incredibly severe”, and emphasised the ongoing obligation on manufacturers and systems developers to keep onboard systems up to date.

“There is then an obligation on the user, the owner, of the vehicle to download those updates, to not jailbreak the vehicle, and to keep the vehicle security system to the required standard, as dictated by the entity that sold the vehicle to them in the first place.”

He added: “There are already criminal offences such as hacking and the like, but probably more needs to be done now to put some flesh on the bone of all the different instances, consequences and the grave risks that could result from hacking a [vehicle] versus an iPhone.”