The Federal Minister of Transport, Alexander Dobrindt, followed the mandate of the Federal Cabinet and his Chancellor and constituted an Ethics Commission. Among other things, it is supposed to clarify who is liable if an autonomous vehicle causes an accident – the driver or the manufacturer.

After all, it is quite possible that one of those crazy autonomous computers will cause an accident because it made the car speed! Who will then get the ticket – or even face charges?

The ethics commission is also supposed to determine whether there are ethical norms an autonomous vehicle must adhere to in conflict situations. The former Federal Constitutional Court judge Udo Di Fabio will preside over the commission. The minister gave the “Wirtschaftswoche” an interview about it.

Ever since my first seminar with Rupert Lay in the early 1980s, ethics has been a topic that interests me very much. As I understand it, ethics is also concerned with moral dilemmas. One of the fundamental examples is the trolley problem.

Let me cite a Wikipedia article:

Due to an incorrectly set switch, a freight train threatens to crash into a stationary train full of passengers. A worker discovers the danger and throws the switch so that the freight train is diverted onto a siding, where it runs into a group of maintenance workers, all of whom die. How culpable is the person who threw the switch?

Welzel is said to have asked this question in 1951. In the years since, many thought experiments of this or a similar nature have been formulated. One of the most pointed – at least one of those that impressed me most – is the following:

A doctor has ten patients waiting in his medical practice. Every one of them is at death’s door because one of his internal organs (a different one for each patient) is completely destroyed. In order to get well, each of them immediately needs an organ transplant. But there is no chance that any organs will become available.

By chance, a healthy person enters the practice. He has all the organs the doctor would need in order to save his patients. Should the doctor kill this one man in order to save the other ten?

Well, the example pushes the topic to its extreme. Even though it might be ethically defensible to consider killing one person in order to save ten, most people will find this solution completely unacceptable. Why? Perhaps because then nobody would ever again dare to go and “see the doctor”.

To me, this seems to be the real purpose of morality: we want to make things we are afraid of impossible – things that we want to avoid at all costs. Consequently, those are the things where you have to say: this is a no-go! The very idea is a taboo.

For me, this thought experiment is so valuable because it perhaps teaches us what lies behind morals (“You just do not do this!”).

The public television channels, too, are now concerned with ethics. On October 17th, 2016, the ARD broadcast the TV experiment “Terror – Your Verdict“, and asked the audience to decide how the film ends (guilty or not guilty for the pilot facing the ethical dilemma). However, the critical voices I read afterwards were not really enthusiastic about the experiment.

Incidentally, I find the doctor example a lot more realistic than the one with the trolley. I imagine that doctors are actually faced with this sort of dilemma once in a while, for instance if, after a catastrophe such as the Bad Aibling train crash, they have to decide which patients to help first. Even if this, too, is a weak example.

Let us go back to all those thought experiments with trolleys, trams, freight trains, etc. They are rather exciting material for an intellectual discussion. But for practical application, they all seem extremely useless to me.

All those constructs are built around rail-bound traffic. However, I have never heard of a single case where anything like this actually happened. Which means that probably no transport worker worldwide has ever been confronted with this kind of situation. So we discuss and work, intellectually and ethically, with pure mind games.

In the week-end edition of the SZ, you can find a well-written “digital” article about the Bad Aibling train accident. Twelve persons were killed and 89 injured on the morning of February 9th. The digital article is titled
Chronologie eines vermeidbaren Unglücks
(Chronology of an accident that could have been avoided).
I strongly recommend reading the article by clicking on the link.

This shows that reality looks totally different, especially in an actual accident situation. We learn that:

  • With electronic signal-boxes that meet current DB technological standards, the station inspector would have been warned much more insistently about his first erroneous clear signal: at least by a thick, red, blinking arrow. However, there is no such display at the Bad Aibling signal-box, because its technology was older. This was a safety risk the Deutsche Bahn had long been aware of: an internal guideline had recommended as early as the 1980s that the relay signal-box be modernized. If the signal-box had been “digitalized” to the “current state of technology”, the accident would probably not have happened. Maybe we should discuss whether that is ethical?

What else do we learn?

  • Shift work is not a good thing! 
The station inspector had started work at 5 a.m. The drive from his family’s farm to his workplace at Bad Aibling – ten kilometres west of Rosenheim – takes forty-five minutes. Because of a storm the German Weather Service had forecast for that night, the station inspector had probably left home even earlier than usual. This makes me assume that his alarm clock rang around 3.00 a.m. In other words: he cannot have had a very long night. 
Shift work is always a problem. It is detrimental to your health; many studies prove this. And whenever I sit in an S-Bahn train early in the morning (by which I mean before 6.00 a.m.), I see only grey faces (except those of the young girls and boys who get on at Ostbahnhof on their way home from the “Kunstpark Ost”). None of those people are really at their best at this time of day. At least I am not. But here is the good news:  
Computers (digital systems) do not mind night shifts!

We also learn that you should not play computer games while working.

  • Computer games are dangerous! 
At 5.11 a.m., the station inspector starts the video game “Dungeon Hunter 5“ on his smartphone. In this virtual role-playing game, he hunts monsters and villains as a bounty hunter. The railway service regulations say that station inspectors may use their smartphones at work only when it is necessary for their job. Games are explicitly forbidden. And everybody will immediately say that, of course, computer games are not allowed at work. 
But is that realistic? Who abides by the rule? After all, we get more and more stand-by workplaces. The best example is the extremely well-paid job of the pilot. Pilots are top earners and their job is tough: shifting rosters, night shifts, climate changes, etc. 
Except that I have been told that the average pilot on a long-haul flight of around eight hours has only two five-minute intervals during which he actually has to work hard. So what to do during all the other hours? Drink? That is something you are not allowed to do. So the only thing left is to play. I also remember well all the fairs I attended where the bored personnel enjoyed playing solitaire on their PCs – and I freely admit that I, too, went through a phase of solitaire addiction. Mind you, I mention this not because of gaming addiction – anybody can become addicted to games – but because this game was probably one reason why Windows became so big. The good news, again: 
Computers (digital systems) do not play! They focus on their work!

That is why I believe we should – first and foremost – push digitalization forward in order to make life healthier and safer.

Except that the cars of the future are now supposed to solve these problems with software – at least that is what the ethics commission thinks. The software would have to decide which cyclist is to be sacrificed if in a situation (thought experiment!) there is a choice between killing one or the other. Let us assume one cyclist is a man riding without a helmet; the other is a woman wearing a helmet. Should the system decide that the woman will be run over because – thanks to her helmet – she has the better chance of survival? Or the man, as punishment for not wearing a helmet? Or should it base the decision on gender or age? Or on what social responsibilities the man or the woman has …
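To see how absurd this becomes, consider what codifying such a rule would literally look like. The following is a deliberately naive sketch – every rule, name, and number in it is invented for illustration, not taken from any real system – of a machine-readable “victim selection” policy of the kind the commission would have to specify:

```python
# A deliberately naive sketch: all rules and numbers are invented
# for illustration only, not drawn from any real vehicle software.

from dataclasses import dataclass

@dataclass
class Cyclist:
    wears_helmet: bool
    age: int

def survival_chance(c: Cyclist) -> float:
    # Hypothetical figures: a helmet is assumed to improve the
    # odds of surviving a collision.
    return 0.6 if c.wears_helmet else 0.3

def choose_victim(a: Cyclist, b: Cyclist) -> Cyclist:
    # Invented policy: steer towards whoever has the better
    # chance of surviving the impact.
    return a if survival_chance(a) >= survival_chance(b) else b

man = Cyclist(wears_helmet=False, age=40)
woman = Cyclist(wears_helmet=True, age=35)

# Under this rule the helmeted woman is selected - she is
# "punished" for her own precaution, which is exactly the
# kind of result that makes such rule-making look absurd.
victim = choose_victim(man, woman)
```

Whatever criterion you pick – survival chance, helmet use, age, social role – it has to be written down as explicitly as this, and every choice looks equally indefensible once it is spelled out.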

To me, this all looks like nonsense. Consequently, I do not think much of the Dobrindt ethics commission. As likely as not, it is just another small piece in the mosaic for the next election campaign, with which the Grand Coalition wants to show what important topics it – supposedly as the only administration worldwide, just as with data security – has been tackling so courageously and prudently, thus occupying a particularly responsible position in digitalization. Even if such a position is actually far from today’s reality.
Someone once said: all politicians talk about digital change and throw around terms such as blockchain and big data. Yet they have no idea what those terms mean! Just as they want reforms but no change (a reform is violence-free change). And innovation is promoted, but nobody promotes destruction. Except that innovation is basically nothing other than creative destruction. I always get the impression that politicians who hear stories of bloggers and blogs secretly want to call the block warden in order to prevent things from happening.

If anything, I would wish for an ethics commission in the ministry of Frau von der Leyen. Such a commission could examine how ethically desirable the use of combat drones and robots is, for instance for the targeted killing of humans. The fact that the internet runs on the motto “the winner takes it all”, and the question of whether it is ethical that some day a corporation like Google might determine the world’s alphabet, are perhaps useful commission topics, too. Why not also for the Ministry of Economics and Social Affairs?

(Translated by EG)

1 comment on “#Digitalisation – The “Ethics” of IT and “Artificial Intelligence”.”

  1. Guido Bruch (Sunday November 6th, 2016)

    In the book Silicon Germany, the ethical dimension is discussed. Example: a child runs onto the street from the right. On the left, a pensioner is walking with a rollator. A human would decide either randomly or consciously whom to hit if he can no longer brake. But what rule are you supposed to give a machine? Always protect the child – and thus practise selection (and that with Germany’s past) – or build in a random generator? I think this is what it is all about.
    But surely these are only theoretical questions, since the number of potential accidents should be small. Presumably the issue would thereby take care of itself. Have there already been comparable accidents?
    Another question would be what the car manufacturers will do if they are given different requirements in other countries – e.g., in certain Gulf states: protect locals and, if need be, run over guest workers.
