In China, computers are now prosecuting
A key question surrounding artificial intelligence is how its use will change the world of work. For a long time it was assumed that certain professions were safe from being replaced by computers because of the complexity of their tasks. This assumption is increasingly being shaken. In China, for example, computer programs are now being used as public prosecutors for the first time.
Computer decides on charges
Prosecutors have neither a relaxed nor an easy job. The part of their work most visible to the public is representing the state as the prosecuting party in criminal proceedings. But that is only one side of the job. The other consists partly of investigative work and partly of preparing the indictment so that a trial can take place at all.
In China, part of this work is now done by a computer program. The AI, called System 206, has been used in China for some time to evaluate evidence and to assess the danger that offenders pose to the general public.
Now the algorithm is also intended to replace public prosecutors to a certain extent. Chinese AI researchers prepared this expansion in collaboration with the Shanghai Public Prosecutor’s Office, the largest prosecution office in the country.
The AI has learned a lot
Until now, using the AI to decide on an indictment had failed because the algorithm was unable to filter irrelevant information out of a case. In addition, it could not process natural language, for example in the form of interrogation transcripts.
The updated algorithm can do both and can decide whether or not to bring charges for the eight most common crimes in Shanghai: credit card fraud, gambling, reckless driving, willful assault, obstructing public officials, theft, fraud, and political offenses. The program reportedly achieves an accuracy of 97 percent.
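Conceptually, the system maps the text of a case file to one of a fixed set of charge categories, or to no charge at all. The details of System 206’s model are not public; the following is a drastically simplified, purely hypothetical sketch of that idea, using made-up keyword lists in place of a real trained model.

```python
# Toy illustration of charge classification over case text.
# All categories and keywords here are invented for illustration;
# they do not reflect how System 206 actually works.

CHARGE_KEYWORDS = {
    "credit card fraud": {"card", "unauthorized", "transaction"},
    "reckless driving": {"vehicle", "speeding", "collision"},
    "theft": {"stole", "property", "took"},
}

def classify_charge(case_summary: str):
    """Return the charge whose keywords best match the summary,
    or None if nothing matches (i.e. no charge is recommended)."""
    words = set(case_summary.lower().split())
    best_charge, best_score = None, 0
    for charge, keywords in CHARGE_KEYWORDS.items():
        score = len(words & keywords)
        if score > best_score:
            best_charge, best_score = charge, score
    return best_charge

print(classify_charge("suspect stole property from a parked vehicle"))
```

A real system would of course use a trained language model over full case files rather than keyword overlap, but the interface is the same: text in, a charge recommendation (or none) out.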
However, there are objections to expanding System 206 and using AI as a replacement for the public prosecutor’s office, including from prosecutors themselves. The crucial question is who is liable in the event of a mistake. Whether responsibility falls on the supervising prosecutor, the AI itself, or the AI’s developers is still completely unclear.
On the other hand, there are of course plenty of cases in which errors in a prosecution can be traced back to a person in the public prosecutor’s office. To what extent the AI will affect the internal workings of prosecutors’ offices can ultimately only be shown in practice.