Autonomous vehicles are already on our roads – but are drivers really ready for self-driving cars? A look at the remaining legal and technological challenges to be overcome.
It was just past 10 pm and completely dark when the woman in Tempe in the US state of Arizona stepped into the street. A car driving past at just that moment – an autonomous vehicle from the US ride service Uber – made no attempt to stop, according to the police. The resulting collision caused the first known fatality involving a self-driving car. How could it happen? Many of the vehicle’s high-tech sensors had been turned off, so they could not recognize that a person with a bicycle was in the road. According to Uber, emergency braking is disabled when the car is driven by the computer, in order to avoid unpredictable maneuvering that could confuse other traffic participants. Instead, the safety driver has to take action, relying solely on their eyes, as the system will not warn them.
The accident in Tempe reignited the discussion about self-driving technology and the legal framework for it – even though studies predict that autonomous vehicles will actually make traffic safer. Developers of self-driving cars argue that 90 percent of all accidents are due to human errors that autonomous technology would prevent. One way to improve a vehicle’s ability to scan its surroundings is by complementing its sensors with a wireless connection. This allows self-driving cars to communicate with other traffic participants over wider distances, as well as predict conditions behind curves in the road and recognize cyclists around corners.
German drivers generally have a positive opinion of autonomous vehicles. According to a recent study by TÜV Rheinland on the safety of autonomous cars, three quarters of German drivers are willing to travel in such a vehicle. Still, they remain skeptical about the implications these vehicles have for data protection, cyber crime and road safety amid ever greater automation. This is due to several remaining technical challenges and the lack of a complete legal framework. Here are just a few examples:
ABILITY TO COPE WITH URBAN TRAFFIC
Although test vehicles have racked up plenty of miles on highways, self-driving cars have considerably less experience dealing with the more chaotic conditions of urban traffic. There’s still not enough practical testing to ensure the safety of children, cyclists and older people, or to handle stop-and-go traffic and busy intersections – all scenarios requiring a direct Car2X connection via LTE-V or, in the near future, 5G-V2X.
200 OPENINGS FOR HACKERS
A connected car provides plenty of opportunities for hackers to create mischief – from the on-board diagnostics (OBD) interface to wireless or USB connections. The more technology involved, the more security measures are required. IT security is crucial to ensuring driving safety, which can be a matter of life or death. According to the study from TÜV Rheinland, consumers are usually willing to provide carmakers with their data. But 30 percent of those surveyed in Germany are skeptical about whether this data is secure from misuse. The automobile industry is aware of this new environment and is working intensively to protect its cars from cyber attacks. One aspect of this effort is the AUTOSAR project, which is developing standards for control device software and offers rewards for uncovering and reporting security gaps.
LAWS FROM 1968
The laws relevant to autonomous driving have not yet been updated in many places. In Germany, for example, the Vienna Convention dating back to 1968 still applies. It states that drivers must always have control over their vehicles. The German parliament passed changes to the country’s traffic laws in March 2017, but changes are still needed to the international regulations that continue to rule out autonomous driving.
AN UNAVOIDABLE MORAL DILEMMA
An ethics commission established by the German Transport Ministry last year presented 20 principles for programming autonomous driving systems. Should it be impossible to avoid an accident, the experts understandably recommended giving the safety of people priority over objects and animals. But should a self-driving car prioritize the life of a child over that of a pensioner? It’s an unavoidable moral dilemma.
DATA SOVEREIGNTY FOR DRIVERS
Increasing connectivity means the amount of data collected will correspondingly grow. “Autonomous and connected driving will cause data volumes to explode,” said Udo Di Fabio, the director of the German ethics commission, while presenting its report last June. Accordingly, the commission recommended drivers must be able to decide who receives their data – in accordance with Europe’s new data protection regulations.
The two German houses of parliament passed a law in spring 2017 regulating liability for self-driving cars. Essentially, when a vehicle is traveling in autopilot mode, the carmaker is liable; at other times, the driver is. But who is ultimately responsible for any errors? The software supplier? The manufacturer? The wireless provider?
“In the end, acceptance will depend less on the technology than on what will be offered on top of it,” says Tim Lehmann from the Institute for Urban Mobility. That could mean a gaming console becomes more important than data protection.
Expert Digital Marketing
Digitization and the Internet of Things are among Daniel Kunz’s favorite topics. He has been with Deutsche Telekom since 2017 and regularly writes about technology trends and many exciting topics, especially for the retail trade and the logistics industry.