Artificial intelligence looks to justice, but raises ethical doubts


A woman appears at the reception desk of a court in Buenos Aires, Argentina. It is 10 in the morning on a cold Monday in August 2018. She arrives with her three children, who sit on the floor to play while their mother talks to the clerk. “I have come to file an amparo to collect the homelessness allowance,” the woman says quietly, with some embarrassment. After a few questions, whose answers go onto a paper form, the clerk asks for her identity card and walks back into the office. The woman decides to sit and wait. She is tired, and she knows it will be months before this little snowball she has just set rolling becomes the possibility of sleeping indoors.

What she does not know is that, if her case is not resolved in the first and second instance, the Public Prosecutor’s Office will intervene as a last resort. And there, unlike in the previous instances, cases are resolved in minutes. How is that possible? Because the office works with a computer system that uses artificial intelligence (AI). Its name is Prometea.

When a judicial officer of that office takes the file in hand, all they have to do is answer, by voice or in writing, the questions posed by a WhatsApp-like chat. In exactly four minutes they will have the draft opinion, along with the relevant statistics for the case and links of interest to support the decision. Then the office’s lawyers simply review the draft, print it and sign it. They will have completed in half an hour a job that usually takes months.
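As a rough sketch of the kind of workflow described above (Prometea’s actual code is not reproduced in this article, so the questions, case types and template below are invented), a questionnaire-driven draft generator can be as simple as a script that maps answers onto a document template through transparent rules:

```python
# Hypothetical sketch of a questionnaire-driven draft generator,
# loosely inspired by the workflow described above. The questions,
# case types and template are invented for illustration only.

QUESTIONS = [
    ("case_type", "What type of case is it? (e.g. housing amparo)"),
    ("claimant", "Claimant's full name?"),
    ("first_instance", "Was the claim rejected in first instance? (yes/no)"),
    ("second_instance", "Was the claim rejected in second instance? (yes/no)"),
]

TEMPLATE = (
    "DRAFT OPINION\n"
    "Case type: {case_type}\n"
    "Claimant: {claimant}\n"
    "Procedural history: first instance rejected = {first_instance}, "
    "second instance rejected = {second_instance}\n"
    "Suggested outcome: {suggestion}\n"
)

def suggest_outcome(answers: dict) -> str:
    # A transparent, auditable rule rather than a black-box model:
    # every branch can be read and traced by a human reviewer.
    if answers["first_instance"] == "yes" and answers["second_instance"] == "yes":
        return "review admissibility of the extraordinary appeal"
    return "return the file to the lower instance"

def build_draft() -> str:
    answers = {key: input(prompt + " ") for key, prompt in QUESTIONS}
    return TEMPLATE.format(suggestion=suggest_outcome(answers), **answers)

if __name__ == "__main__":
    print(build_draft())
```

Because every rule is plain code, a reviewing lawyer can trace exactly why a given suggestion was produced, which is the property Corvalán emphasizes later in this article.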

A model that interests Spain

In October 2019, officials from the modernization area of the Spanish Ministry of Justice visited the offices of the Public Prosecutor’s Office of the City of Buenos Aires. The objective was to get to know Prometea, the computer system used in the Argentine capital to resolve cases that are varied but simple to settle: minor infractions, traffic accidents or social policy claims, among others.

Sofía Duarte Domínguez, director general of the Spanish body formerly called Modernization of Justice (in January 2020 it was renamed Digital Transformation of the Administration of Justice), told the Argentine press: “We have studied everything about Prometea. We know it is a fabulous system and we want to see whether we can bring it to Spain. Even the [then] Secretary of State for Justice, Manuel Dolz, gave us carte blanche to move forward with this, which is, without a doubt, the future of justice.”

The subject should not take us by surprise. A few days before the Spanish delegation’s visit to the Buenos Aires office, David Martínez, professor of Law and Political Science at the Open University of Catalonia (UOC), explained in an article published by La Vanguardia that AI could well be used in Spain for cases with an “easy legal response,” thereby reducing the backlog of judicial files. Although Duarte Domínguez stresses that the digitization of the entire ministry is fertile ground for automating justice, she herself warns that one of the main obstacles to the process lies in the resistance of judicial workers, who fear the technology will take away their jobs.

In favor of automating justice

Martínez’s observations are in line with the thinking of some Argentine experts committed to making justice intelligent. One of them is Mario Adaro, a judge on the Supreme Court of Justice of the province of Mendoza, who uses Prometea daily and recently took part in the first Ibero-American Summit on Artificial Intelligence, held at MIT (Boston). “AI can process information in large volumes, which shortens bureaucratic deadlines considerably because, usually, the more cases and the fewer decision-makers there are, the more time each case takes,” he tells EL PAÍS RETINA. “Using automatic processes, the judge has greater capacity for analysis.”

The Deputy Attorney General of Buenos Aires, Juan G. Corvalán, created Prometea after detecting that, in half of the cases handled by judicial staff, most of the time goes into verifying personal data, information that is repeated over and over, and so on. Adaro illustrates this with the example of tax cases: “They are serial, high-volume rulings, where decisions can be grouped into clear sets and everything is quite mechanical and predictable. By using AI for these types of problems, Prometea significantly reduces errors in data entry, typing and redundancy,” says the judge from Mendoza.

“AI can process information in large volumes, which shortens bureaucratic deadlines”

Origin: United States

There are three iconic cases of AI applied to justice, in addition to Prometea. The most famous is Compas (Correctional Offender Management Profiling for Alternative Sanctions), a program used in several US states. It is software that has been used since 1998 to estimate, based on a defendant’s criminal record, their likelihood of reoffending. The program puts a questionnaire to the accused. Once they have answered all the questions, the system calculates the risk of recidivism, and the judge uses it to decide, for example, whether or not to grant parole while the judicial process runs its course.
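The Compas model itself is proprietary, so the following is only a schematic sketch of how questionnaire answers can be turned into a risk score; the questions, weights and cut-off below are invented for illustration and do not reproduce the real system:

```python
# Schematic illustration of turning questionnaire answers into a
# risk score. Questions, weights and threshold are invented; the
# real Compas model is proprietary and is not reproduced here.

ANSWERS = {
    "prior_arrests": 3,         # numeric answer from the questionnaire
    "age_at_first_arrest": 19,
    "currently_employed": 0,    # 1 = yes, 0 = no
}

WEIGHTS = {
    "prior_arrests": 0.6,
    "age_at_first_arrest": -0.05,
    "currently_employed": -1.0,
}

def risk_score(answers: dict, weights: dict) -> float:
    """Weighted sum of answers: higher means higher estimated risk."""
    return sum(weights[key] * answers[key] for key in weights)

score = risk_score(ANSWERS, WEIGHTS)
label = "high risk" if score > 0.5 else "low risk"  # arbitrary cut-off
print(f"score={score:.2f} -> {label}")
```

Even in this toy version, the core objection raised in the Loomis case is visible: unless the weights and cut-off are disclosed, the defense has nothing concrete to contest.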

Compas rose to fame with the Loomis case in 2013. Accused of fleeing from the police and driving a vehicle without its owner’s authorization, Eric Loomis was sentenced to six years in prison and five years of probation because Compas estimated a high risk of recidivism. Loomis appealed, arguing that his defense could not contest Compas’ methods because the algorithm was not public. The Wisconsin Supreme Court dismissed the appeal. Years later, in 2018, it emerged that the system analyzes 137 aspects of each defendant. But when the accuracy of Compas’ predictions was compared with that of flesh-and-blood jurists, it turned out that the AI’s accuracy was no higher, and that it even made serious errors.

“Statistical averages say something about common behavior patterns in a collective. They do not describe individual profiles and are unable to capture the uniqueness of the human being,” explains Lorena Jaume-Palasí, an expert in ethics and technology and founder of Algorithm Watch and The Ethical Tech Society. “With this we can understand collectives with a somewhat more architectural view, but we also run the risk of forcing individuals into standards that do not fit them.”

To determine whether it is feasible to try someone criminally using AI, it is necessary to understand the criteria by which the algorithm works (exactly what Loomis’ defense demanded). Jaume-Palasí argues that, after all, law is an algorithm that was applied long before computer science existed. “[With the Loomis case] everyone set their eyes on the computer system and was scandalized by the racism, but Compas let us learn about the biases that judges have, because the system was created by humans who had been working and deciding with the very biases that the program later revealed.”

“Statistical averages help understand groups, but are unable to capture the uniqueness of the human being”

Is Prometea like Compas?

In addition to his post in the judiciary, Juan G. Corvalán directs the Laboratory of Innovation and Artificial Intelligence at the University of Buenos Aires Law School. In 2017, he created the Prometea software together with his collaborators.

Corvalán highlights, among the system’s qualities, that “Prometea does not use black-box AI techniques, or what is known as deep learning; in other words, the entire algorithmic process is open, auditable and traceable.” Compas, on the other hand, applies two neural networks whose inner workings are unknown because “it was developed by a private company that owns the intellectual property rights to the algorithm.”

The Argentine software, Corvalán argues, does nothing more than reproduce the behavior of the country’s judiciary. “Prometea’s predictions are based on an analysis of the history of what judges have decided; it is they who train the system. For example, in the Constitutional Court of Colombia [a country where the program is also used], it is the magistrates themselves who make the ongoing adjustments to Prometea’s predictions, with our technical assistance, of course.”

Databases and biases

There is no AI worth its salt without data. And when data comes up, the ghost of bias appears, like the racism Compas has been accused of. Numbers build a discourse of objectivity that sometimes prevents decisions from being questioned. “Algorithms are nothing more than opinions contained in mathematics,” Cathy O’Neil wrote in her famous Weapons of Math Destruction.

“What algorithms undoubtedly allow is to standardize decisions; in other words, to unify criteria so that two different responses are not given to the same problem,” says Pablo Mlynkiewicz, a graduate in statistics and former head of Buenos Aires’ General Directorate of Information Sciences. “But, of course, for that to translate into real progress for justice, the database must represent all groups. If not, there will be mistakes.”
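As a toy numerical illustration of Mlynkiewicz’s warning (all figures below are invented), a group that is barely present in the historical database ends up being “standardized” on the basis of almost no evidence:

```python
# Toy illustration: if a group is barely represented in the
# historical database, any rule "standardized" from it rests on
# very little evidence for that group. All figures are invented.

from collections import Counter

# Each record: (group, was the past ruling favourable to the claimant?)
history = [("A", True)] * 480 + [("A", False)] * 320 + \
          [("B", True)] * 3 + [("B", False)] * 7    # group B: only 10 cases

counts = Counter((group, favourable) for group, favourable in history)

for group in ("A", "B"):
    favourable = counts[(group, True)]
    total = favourable + counts[(group, False)]
    rate = favourable / total
    print(f"group {group}: {total:4d} cases, favourable rate = {rate:.2f}")
    # Group A's rate rests on 800 cases; group B's on just 10, so a
    # criterion built from this history treats B with far less
    # statistical support, which is where the "mistakes" creep in.
```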

In this, Mlynkiewicz agrees with Jaume-Palasí and Adaro in highlighting a strong point in favor of automating judicial processes: it avoids giving two different responses to the same problem. In other words, it brings argumentative consistency to rulings. Even though she is the most critical of these systems, the Mallorcan philosopher admits that automating judicial processes on the basis of statistics can help correct errors that the justice system today refuses to acknowledge. “We have known for a long time that judges and the judicial system as we know them are not very consistent. Being able to trace and compile statistics on judicial decisions thanks to AI is not a bad thing,” she emphasizes.

Robot judges in China

In October 2019, the Internet Court, described as an “online litigation center,” was launched in Beijing. According to official information, it is a platform on which the parties upload the details of the dispute to be resolved and the AI does the rest: it searches for case law, analyzes the issue, weighs the evidence and issues a judgment.

Technically, the system is not very different from Estonia’s, where there is also a strong commitment to automating justice: there is no human intervention at any point in the process. But between the two countries there is a great distance in democratic standards. In the small Baltic country, considered the most digitally advanced on the planet, the project is led by the young Ott Velsberg, who says that the claims brought before the digital court do not exceed 7,000 euros in damages.

Everything flows there, because it is a society with high civic standards. But when it comes to the Asian giant, things take on another tenor. “The development of virtual or cyber judges in China has followed the same line as the Social Credit System: from the bottom up,” explains Dante Avaro, a specialist in the Chinese government’s model of control, referring to the controversial citizen-scoring mechanism launched by Beijing to determine whether or not people are trustworthy. “Both got under way at the beginning of the new millennium. In the case of AI in justice, it was first tried out in places like Shandong, then in Hangzhou, Beijing and Guangzhou. The objective was to bring efficiency to judicial proceedings in matters of electronic commerce, virtual payments, cloud transactions and intellectual property disputes,” he illustrates.

The catch is that, in the hands of an undemocratic state that intends to order society through a cross-cutting scoring scheme that Avaro calls “citizen traceability,” the application of AI to justice is dangerous, because it is linked to the Social Credit System and to the Yitu Dragonfly Eye facial recognition system. “A huge state surveillance apparatus is being built,” Avaro concludes.
