Software is steadily taking over tasks that today are still performed by humans. One example is the self-driving car, which most car manufacturers have on their roadmap. This development may be complicated by a growing individualism of the vehicles. Given both trends, the question arises: will each car need its own individual driving license, including a practical test, just as every human driver does? Tesla explained the first fatal accident involving its Autopilot as follows: “Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied.” This is a problem that could be solved electronically with software updates, or mechanically with changes to the camera, much as a person must report needing glasses when obtaining or renewing a driver’s license. In a next development step, the software can include Artificial Intelligence and learn from experience. This would lead to a situation in which each autopilot differs from every other. Since the car is connected to the cloud, it has access to other cars’ experiences, but differences remain because of its own individual history. Furthermore, mechanical parts such as camera lenses are subject to wear and therefore require regular maintenance and inspection.

1970: Ferrari 512 S Modulo

Even if luxury cars such as Tesla are an important communication channel for self-driving technology, faster initial growth could come from commercial vehicles, where the “pleasure of driving” is not a relevant factor. Intelligent software can replace the human driver and thereby remove the risk that drivers, fatigued under cost pressure, cause accidents. Furthermore, the self-driving vehicle can be integrated into a company’s just-in-time production process: speed and route can be adapted so that the vehicle arrives exactly when it is required, taking into account current weather conditions and the traffic situation. It is no surprise that McKinsey & Company forecasts that by 2025 one third of trucks will already use advanced self-driving technology.
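The just-in-time speed adaptation mentioned above can be sketched as a small calculation. The function name, the 80 km/h cap, and the numbers are illustrative assumptions, not a real fleet-management API:

```python
# Toy sketch of just-in-time speed adaptation: pick the cruising speed
# that makes the truck arrive exactly at its delivery slot, or signal
# that the slot can no longer be reached. All values are invented.

def plan_cruising_speed(distance_km, hours_until_slot, max_kmh=80):
    """Return the speed (km/h) for an exact on-time arrival,
    or None if the slot is unreachable even at the speed limit."""
    needed = distance_km / hours_until_slot
    if needed > max_kmh:
        return None  # reschedule: even at full speed the truck is late
    return needed

print(plan_cruising_speed(180, 2.25))  # 80.0 -> just reachable
print(plan_cruising_speed(300, 3.0))   # None -> slot cannot be met
```

In a real system, weather and traffic data from the cloud would continuously update `distance_km` and the effective speed limit.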

A modern company’s compliance system works with values and controls; Asimov’s Laws provide the basic fixed values for a robot or intelligent software. But just as with a compliance system, we cannot leave the machine alone with these. The Artificial Intelligence researcher and author Andrew Rosenblum gives the example of a self-driving car facing a truck approaching head-on, a collision that will certainly destroy the car and kill its passenger. The only way to avoid it is to swerve and drive the car into a group of 15 pedestrians. Based purely on Asimov’s Laws, the car would have to do the mathematics and decide to sacrifice its own passenger. This may conflict with the car’s obligation to protect its passengers and owner. The car manufacturer may therefore feel tempted to add a guideline to the software that the car must protect its owner, as this is the person who pays the company for the intelligent car. But what about the 1:1 situation, where the choice is between sacrificing the passenger and sacrificing one pedestrian? Must the car always decide in favor of its passenger? A government cannot place such a decision on the car manufacturer, the programmer, or the software alone. Laws and guidelines that an intelligent software has to follow must be established, especially in such grey areas. In such a near-future scenario, the Compliance Officer must be able to audit the potential “if-then” rules in the software. As the discussion about Volkswagen’s defeat-device software and emission controls shows, software engineers under high pressure to meet ambitious external and internal goals are tempted to find ways to bypass the relevant controls.
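To make the audit problem concrete, here is a minimal sketch of the kind of “if-then” rule a Compliance Officer would need to inspect. The function name, the casualty-count policy, and the owner-protection flag are assumptions for illustration, not any manufacturer’s actual logic:

```python
# Hypothetical "if-then" decision rule for the dilemma described above.
# Staying on course kills the passenger; swerving kills the pedestrians.

def choose_maneuver(passengers_at_risk, pedestrians_at_risk,
                    protect_owner=False):
    """Return 'stay_course' or 'swerve'."""
    if protect_owner:
        # A hidden owner-protection override always favors the passenger,
        # no matter how many pedestrians are endangered.
        return "swerve"
    # A purely utilitarian rule minimizes the total number of victims.
    if pedestrians_at_risk < passengers_at_risk:
        return "swerve"
    return "stay_course"

# Rosenblum's 1-passenger vs. 15-pedestrians case:
print(choose_maneuver(1, 15))                      # stay_course
print(choose_maneuver(1, 15, protect_owner=True))  # swerve
```

A single boolean flag flips the outcome, which is exactly why such parameters must be visible to a regulator or Compliance Officer rather than buried in the code.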

Chip tuning is already used today to change engine management and find additional horsepower. In most cases this is legal, but it releases the car manufacturer from its warranty. Once self-driving cars are a relevant market, it is only a question of time before programmers offer software promising higher safety for the owner: a programmed preference for the passenger over pedestrians. Different countries also have different legal systems and underlying values, for example Roman versus Anglo-Saxon law, so an autopilot will most probably require different decision-making processes per country. In one country, choosing the option with the smaller number of victims may be adequate; in another, actively steering the car toward that one person may be interpreted as an active act of killing, as murder. Governments and car manufacturers must find solutions to prevent this, through law but also through technical protection against non-approved software. As with today’s computer viruses, it will be a continuous competition between new viruses and the anti-virus industry.
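The country-by-country difference can be sketched as a small policy table. The policy names and their rules are assumptions made for the sake of the argument, not actual legal requirements of any jurisdiction:

```python
# Illustrative sketch of jurisdiction-dependent decision rules for the
# same physical situation. Each policy maps the two possible outcomes
# (victims if the car stays on course vs. if it swerves) to an action.

POLICIES = {
    # Minimize the total number of victims, whoever they are.
    "utilitarian": lambda if_stay, if_swerve:
        "stay" if if_stay <= if_swerve else "swerve",
    # Never actively redirect harm: swerving into someone is an
    # active act of killing, so the car must hold its course.
    "no_active_harm": lambda if_stay, if_swerve: "stay",
}

def decide(policy_name, victims_if_stay, victims_if_swerve):
    """Apply the jurisdiction's rule to the two possible outcomes."""
    return POLICIES[policy_name](victims_if_stay, victims_if_swerve)

print(decide("utilitarian", 15, 1))     # swerve: fewer victims
print(decide("no_active_harm", 15, 1))  # stay: swerving would be murder
```

The same inputs produce opposite actions under the two policies, which is why an autopilot sold across borders would need its decision rules certified per country, and protected against after-market tampering.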
