argia.eus
Artificial Intelligence
Visualizing what algorithms hide
  • Although the word "algorithm" is fairly well known, society at large does not know what algorithms are or what they are used for. Given their importance in the field of artificial intelligence, Lore Martinez Axpe, EH Bildu's member of parliament and secretary for digitalization, discussed them on Ez Dago Hodeirik ("There Is No Cloud"), the podcast produced by the company Iametza. On the programme she explained the agreement reached in the Basque Parliament to create a public register of the algorithms used by the Basque Government, and discussed, among other things, the possibilities and risks of artificial intelligence.
Irati Irazusta Jauregi · 26 July 2023
Photo: Eneritz Arzallus / ARGIA CC BY-SA

In the text agreed by the PNV, EH Bildu, the PSE-EE and Elkarrekin, the Basque Government is urged to approve an Ethical Data Declaration guaranteeing the ethical principles and values that guide the use of data in public administration. A transparent public register will also be created covering all the AI algorithms and systems used by the administration, including those of suppliers subcontracted by it. The agreement stems from a motion tabled by Lore Martinez Axpe.

“Discriminatory” results

In essence, artificial intelligence amounts to "decision-making" built on data: computers with enormous computational capacity are fed millions of data points, algorithms process them, and patterns are identified. An algorithm is a set of operations or procedures carried out to find a solution. Where does one of the problems lie? The results can often be "discriminatory," according to Martinez. This also applies to public administration: when we talk about feeding artificial intelligence with floods of data, we must bear in mind that the algorithms processing that information can discriminate against citizens.

One example: a study by AlgorithmWatch concluded that the Ertzaintza uses a xenophobic system to measure the risk of gender-based violence. The EPV-R algorithm is based on a psychological questionnaire which asks, among other things, whether the victim or the aggressor is "foreign." Moreover, the questionnaire counts only non-Westerners as foreign. The Ertzaintza's inspector general, Oskar Fernández, explained that this category covers "those who are not European and have a different culture."

Cornell University in the United States also analyzed the performance of EPV-R and concluded that a majority of the cases (53%) considered high-risk through other channels were rated "low risk" by the algorithm. In other words, there is a high risk that serious cases will not be identified as such.
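To make the problem concrete, here is a minimal sketch of how a questionnaire-based risk score can encode bias through a single item. The items, weights and threshold below are invented for illustration; they are not the real EPV-R questionnaire or its scoring.

```python
# Hypothetical questionnaire-based risk score. All item names, weights and
# the threshold are invented for illustration; this is NOT the real EPV-R.

def risk_score(answers: dict) -> str:
    """Sum weighted yes/no questionnaire items into a risk label."""
    weights = {
        "previous_violence": 3,   # invented weight
        "threats": 2,             # invented weight
        "aggressor_foreign": 2,   # the problematic item: origin as a risk factor
    }
    score = sum(weights[item] for item, answer in answers.items() if answer)
    return "high" if score >= 4 else "low"

# Two otherwise identical cases diverge only on the "foreign" item:
case_a = {"previous_violence": False, "threats": True, "aggressor_foreign": False}
case_b = {"previous_violence": False, "threats": True, "aggressor_foreign": True}
print(risk_score(case_a))  # low
print(risk_score(case_b))  # high
```

Because one item keys on nationality or origin rather than behaviour, two identical situations receive different risk labels, which is exactly the kind of discrimination the AlgorithmWatch study points to.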

Refusal of grants

These artificial intelligence systems can also make decisions that affect citizens' finances. One of them is Bosco, which manages the bono social, the discounted social tariff on electricity bills. It takes part in reviewing applications and has the power to decide their outcome. The Civio organization denounced that the algorithm discriminates, excluding people who are entitled to the aid.

Low-income households, large families, and people receiving a minimum disability or retirement pension are eligible for this aid. A widow's pension falls into the latter category, but for those applications Bosco returns the result "incalculable," and that is the message sent to the electricity companies. It thus denies the aid to widows even though they are entitled to receive it. Members of large families, whatever their income, are legally entitled to the aid; Bosco nevertheless analyzes applicants' income, and anyone who does not allow their accounts to be examined is left out by the algorithm and denied the aid.
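The decision flow described above can be sketched in a few lines. The field names, the income threshold and the ordering of checks are assumptions for illustration only; this is not Civio's reconstruction of the actual Bosco code, only a sketch of the reported behaviour.

```python
# Hypothetical sketch of the eligibility logic described in the article.
# Field names and the income threshold are invented; the outcomes mirror
# the behaviour reported by Civio, not the real Bosco implementation.

def bosco_decision(applicant: dict) -> str:
    if applicant.get("pension") == "widowhood":
        return "incalculable"      # reported outcome: aid effectively denied
    if not applicant.get("income_check_consented"):
        return "denied"            # no consent to examine accounts -> left out
    # Large families are legally entitled regardless of income, yet the
    # sketch mirrors the reported flow: income is examined anyway.
    if applicant.get("large_family") or applicant.get("income", 0) <= 11_000:
        return "granted"
    return "denied"

print(bosco_decision({"pension": "widowhood"}))                   # incalculable
print(bosco_decision({"large_family": True,
                      "income_check_consented": False}))          # denied
```

The two printed cases show the problem: a widow who is legally entitled gets "incalculable," and a large family that declines the income check is rejected even though income is legally irrelevant to their entitlement.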

Martinez explains that when artificial intelligence is applied to discriminatory data, algorithms can "exponentially increase" that discrimination. "That is why it is so important to prepare the data well before it reaches the algorithms. We must ensure that the data is not discriminatory and that it has a sound basis before entering the algorithm."

Public register of algorithms

What will the public be able to consult? Essentially, they will have the right to know where and for what purpose their personal data has been used. The EH Bildu parliamentarian gave an example on the podcast: suppose someone has had an illness, and their data has been used to establish the conditions that people who have had that illness must meet. The owner of that data will be able to know whether it has had discriminatory effects. In addition, those who do not agree with how their data has been used will be told how they can appeal.

Any citizen will be able to know what has been done with their personal data, but many still do not know what an algorithm is. Martinez therefore stressed the importance of the language used: "It must be explained in a way the public understands. The register is meant to be consulted by anyone, not only by computer scientists."

The creation of the public register is "just phase zero," according to the EH Bildu parliamentarian. "The biggest benefit is that we have started to socialize the issue of algorithms. Knowing how our data is processed is a fundamental right of citizens, although much remains to be done." Martinez considers audits necessary to verify that algorithms do what they promise and do not discriminate.

First Autonomous Community

The Basque Autonomous Community has been the first autonomous community in the Spanish state to approve a public register of algorithms, though there are precedents in several European cities: Barcelona, Helsinki and Amsterdam. Having consulted the EU's artificial intelligence committee about EH Bildu's proposal in the Basque Parliament, Martinez explains that, according to its members, public registers will be a "trend" across the rest of Europe as well.

The Catalan capital stands out. In April, four experts on creating public registers of algorithms in administration spoke in the Basque Parliament, among them Michael Donaldson, Barcelona City Council's commissioner for Technological Innovation, Electronic Administration and Good Government, who reported on the Barcelona case.

He described how, in late 2022, the city council approved a protocol defining the measures required before and after using an algorithm. The protocol stipulates that before an algorithmic system is used to deliver a public service, the IT department must carry out a risk assessment. It classifies as high-risk those processes that affect social rights or citizens' data, as well as those that could discriminate in the granting of subsidies.
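A pre-use risk assessment of this kind can be pictured as a simple classification step. The field names below are assumptions; the article does not describe the protocol's actual form, so this is only a sketch of the criteria it reports.

```python
# Minimal sketch of the kind of pre-deployment check the Barcelona protocol
# describes: classify a proposed algorithmic system before it is used.
# Field names are invented; the criteria follow the article's description.

def assess_risk(system: dict) -> str:
    """Return 'high' if any of the reported high-risk criteria applies."""
    high_risk_triggers = (
        system.get("affects_social_rights", False),
        system.get("processes_citizen_data", False),
        system.get("decides_subsidies", False),
    )
    return "high" if any(high_risk_triggers) else "standard"

print(assess_risk({"decides_subsidies": True}))   # high
print(assess_risk({"sends_reminders": True}))     # standard
```

A system like Bosco, which decides on subsidies, would land in the high-risk category and trigger the stricter measures before deployment.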

To market before testing

Martinez warns of the risk of putting technology to use without testing it. "We have a strong tendency to rush ahead with technology, and we use new products without prior reflection. We research, innovate, produce new digital technological solutions and launch them onto the market as soon as they are ready." She draws a comparison with medicines: "Medicines are also the product of research and innovation, but thousands of tests are carried out before they reach the market to learn their consequences, which are then set out in a leaflet. That leaflet explains the medicine's harms and risks." In the parliamentarian's opinion, the same should be done with artificial intelligence: "When the product is ready, ask: what harm can it do? Does it comply with data protection law? Is it discriminatory?"

She says the same happened with the artificial intelligence ChatGPT: "In four months the product was finished and marketed directly." The tool does not comply with data protection law, and for that reason it was banned in Italy after it went on the market. "I think that should have been checked from the very beginning. A technology should not be allowed onto the market if it has not been tested first." She considers this work the responsibility of companies: "A company should not market anything that could harm people."

What about the Basque Country?

Martinez has advocated a "km-0 digital transformation," meaning that technological solutions would be developed and improved in Euskal Herria. That would keep data here. Moreover, with "good public policies," technological talent would also be trained and promoted, and if good conditions were guaranteed it would stay here, in her words.
She considers that Euskal Herria already has a "strong ecosystem" of technology experts, not least in language technology. But she warns that without funding it stands no chance. Among other things, she sees it as necessary to finance projects for Basque-language technologies, such as GAITEK. "Basque must be part of the digital transformation to survive; otherwise it has no future."

Martinez hopes that artificial intelligence will never get the better of human beings, "but it will not be easy." She says regulations and laws should help raise awareness: "Let's see if, between us all, we can make digital technologies a tool for improving human lives, and not the opposite."