Chinese Scientists Develop AI ‘Prosecutor’ that Can Press its Own Charges

“Making such decisions would require a machine to identify and remove any contents of a case file that are irrelevant to a crime, without removing the useful information. The machine would also need to convert complex, ever-changing human language into a standard mathematical or geometric format that a computer could understand.”

What could possibly go wrong lol

By Stephen Chen
via South China Morning Post

Researchers in China say they have achieved a world first by developing a machine that can charge people with crimes using artificial intelligence. The AI “prosecutor” can file a charge with more than 97 per cent accuracy based on a verbal description of the case, according to the researchers.

The machine was built and tested by the Shanghai Pudong People’s Procuratorate, the country’s largest and busiest district prosecution office.

The technology could reduce prosecutors’ daily workload, allowing them to focus on more difficult tasks, according to Professor Shi Yong, director of the Chinese Academy of Sciences’ big data and knowledge management laboratory, who is the project’s lead scientist.

“The system can replace prosecutors in the decision-making process to a certain extent,” said Shi and his colleagues in a paper published this month in the domestic peer-reviewed journal Management Review.

The application of AI technology in law enforcement has been increasing around the world. Some German prosecutors have used AI technology such as image recognition and digital forensics to increase case processing speed and accuracy.

China’s prosecutors were early adopters when they began using AI in 2016. Many of them now use an AI tool known as System 206. The tool can evaluate the strength of evidence, conditions for an arrest and how dangerous a suspect is considered to be to the public. But all existing AI tools have a limited role, because “they do not participate in the decision-making process of filing charges and [suggesting] sentences”, Shi and colleagues said.

Making such decisions would require a machine to identify and remove any contents of a case file that are irrelevant to a crime, without removing the useful information. The machine would also need to convert complex, ever-changing human language into a standard mathematical or geometric format that a computer could understand.
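As an illustration of what "converting language into a mathematical format" can mean in practice, here is a minimal Python sketch that turns short case descriptions into fixed-length numeric vectors using a TF-IDF bag-of-words representation via scikit-learn. The paper's actual feature-extraction method is not described in this article, so the library choice, the sample texts and the feature count below are illustrative assumptions only.

    # Illustrative sketch only: one common way to turn free text into numeric vectors.
    # The actual method used by Shi's team is not described in the article.
    from sklearn.feature_extraction.text import TfidfVectorizer

    # Hypothetical case descriptions standing in for real case-file text.
    case_descriptions = [
        "Suspect used a stolen credit card to withdraw cash on three occasions.",
        "Driver was found operating a vehicle while heavily intoxicated.",
    ]

    # Cap the vocabulary at 1,000 terms, echoing the 1,000 "traits" mentioned below.
    vectorizer = TfidfVectorizer(max_features=1000)
    X = vectorizer.fit_transform(case_descriptions)

    print(X.shape)  # (2, number_of_terms): each row is one case as a numeric vector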

China’s internet companies have developed powerful tools for natural language processing, but their operation often requires large computers that prosecutors do not have access to.

The AI prosecutor developed by Shi’s team could run on a desktop computer. For each suspect, it would press a charge based on 1,000 “traits” obtained from the human-generated case description text, most of which are too small or abstract to make sense to humans. System 206 would then assess the evidence.

The machine was “trained” using more than 17,000 cases from 2015 to 2020. So far, it can identify and press charges for Shanghai’s eight most common crimes. They are credit card fraud, running a gambling operation, dangerous driving, intentional injury, obstructing official duties, theft, fraud and “picking quarrels and provoking trouble” – a catch-all charge often used to stifle dissent.
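To make the description above concrete, the following is a hypothetical sketch of a text classifier that maps a case description to one of the eight charge labels, assuming a scikit-learn pipeline with roughly 1,000 text-derived features and a simple logistic-regression model. The team's actual architecture, training data and accuracy figures are not reproduced here; the placeholder training examples exist only so the sketch runs end to end.

    # Hypothetical sketch of an eight-way charge classifier; not the team's actual model.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # The eight charge categories named in the article.
    CHARGES = [
        "credit card fraud", "running a gambling operation", "dangerous driving",
        "intentional injury", "obstructing official duties", "theft", "fraud",
        "picking quarrels and provoking trouble",
    ]

    # In practice this would be ~17,000 labelled case descriptions (2015-2020);
    # the two examples below are placeholders.
    train_texts = [
        "Suspect used a cloned credit card at several ATMs.",
        "Suspect drove a car into a pedestrian while heavily intoxicated.",
    ]
    train_labels = ["credit card fraud", "dangerous driving"]

    model = make_pipeline(
        TfidfVectorizer(max_features=1000),  # ~1,000 text features ("traits")
        LogisticRegression(max_iter=1000),   # linear classifier over those features
    )
    model.fit(train_texts, train_labels)

    # Predict a charge for a new, unseen case description.
    print(model.predict(["Suspect withdrew cash with a stolen bank card."]))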

Shi and colleagues said that the AI prosecutor would soon become more powerful with upgrades. It would be able to recognise less common crimes and file multiple charges against one suspect. It was unclear when or whether the technology would find applications in other fields. The team could not be reached for comment when the report was published.

A prosecutor in the southern city of Guangzhou said he had some concerns about the use of AI in filing charges. The prosecutor, who requested not to be named because of the sensitivity of the issue, said:

“The accuracy of 97 per cent may be high from a technological point of view, but there will always be a chance of a mistake. Who will take responsibility when it happens? The prosecutor, the machine or the designer of the algorithm?”

Direct involvement of AI in decision-making could also affect a human prosecutor’s autonomy. Most prosecutors did not want computer scientists “meddling” in a legal judgment, the Guangzhou-based prosecutor said.

Another issue is that an AI prosecutor could file a charge based only on its previous experience. It could not foresee the public reaction to a case in a changing social environment.

“AI may help detect a mistake, but it cannot replace humans in making a decision,” the prosecutor said.

Nonetheless, China is making aggressive use of AI in nearly every sector of the government to try to improve efficiency, reduce corruption and strengthen control. Some Chinese cities have used machines to monitor government employees’ social circles and activities to detect corruption, according to researchers involved.

Many Chinese courts have been using AI to help judges process case files and make decisions such as whether to accept or reject an appeal.

Most Chinese prisons have also adopted AI technology to track prisoners’ physical and mental status, with the goal of reducing violence.

About the Author:
Stephen Chen investigates major research projects in China, a new powerhouse of scientific and technological innovation. He has worked for the Post since 2006. He is an alumnus of Shantou University, the Hong Kong University of Science and Technology, and the Semester at Sea programme, which he attended with a full scholarship from the Seawise Foundation.


