A Chilling Ciao? Facial Recognition on the Borders of Italy
Information - 90%
Insight - 90%
Relevance - 90%
Objectivity - 85%
Authority - 95%
A short percentage-based assessment of the qualitative value of this post, which highlights a new EDRi article on the dehumanizing use of facial recognition at Italy's borders.
Editor’s Note: Shared with permission* from European Digital Rights (EDRi), an association of civil and human rights organizations from across Europe, the article titled Chilling use of face recognition at Italian borders shows why we must ban biometric mass surveillance explores the use of facial recognition technologies and biometric mass surveillance by the Italian Police at the country’s borders. The article highlights commentary from EDRi member Hermes Center, an Italian civil rights organization that promotes awareness of transparency, accountability, and freedom of speech online, and works to protect rights and personal freedoms in a connected world.
Chilling use of face recognition at Italian borders shows why we must ban biometric mass surveillance
Authored by EDRi
The Reclaim Your Face campaign has been investigating and exposing abusive and rights-violating uses of facial recognition tech, and other biometric mass surveillance, since its launch last year. In the latest in a long line of examples that show that these inherently discriminatory technologies are being used to further exclude some of society’s most marginalized people, Hermes Center explains how the Italian Police are deploying dehumanizing biometric systems against people at Italy’s borders. Now, more than ever, we need to call for a ban on these biometric mass surveillance practices. The Reclaim Your Face campaign’s major EU petition (European Citizens’ Initiative), launching on 17th February, will give us the legal means to demand just that.
The introduction of facial recognition systems in Italy continues to show the same symptoms that we denounce in the Reclaim Your Face campaign: lack of transparency, absolute disregard for the respect of human rights, and inability to admit that some uses of this technology are too dangerous.
The latest episode concerns the Automatic Image Recognition System (SARI), initially acquired by the Italian police in 2017 and now, as revealed in an investigation by IrpiMedia, at the center of a new public tender aimed at upgrading the system and deploying it to monitor the arrival of migrants and asylum seekers on the Italian coasts, along with related activities.
“In order to do so, the Ministry of Interior has used two strategies: taking advantage of the European Internal Security Funds and, as shown by some documents obtained by IrpiMedia thanks to an FOIA request, ignoring the questions of the Italian Data Protection Authority (DPA) that has been waiting for two years to close an investigation on the facial recognition system that the police wants to use,” reads the article.
In our Reclaim Your Face requests, we ask the Ministry of the Interior to publish all the evaluations of the algorithms used, the numbers on the use of the system, and all the data on the type of faces in the database used by SARI.
This information is fundamental in order to understand the effects of the algorithms that act on a database that is already strongly unbalanced and discriminatory: as revealed almost two years ago by Wired Italia, 8 out of 10 people in SARI’s database are foreigners. It is not clear how many of these are migrants and asylum seekers.
The biometric and digital identity processing of migrants and refugees in Italy has been studied in a Data&Society report carried out in 2019 by researcher Mark Latonero in partnership with Reclaim Your Face partner CILD, an Italian NGO. The field analysis uncovered an entire ecosystem composed of NGOs, government, researchers, media, and the private sector that collects, analyses, and manages digital information about migrants and refugees to provide them with support, regulate them, and study their behaviors. Collecting this data can lead to varying degrees of discrimination due to existing biases related to the vulnerability of migrants and refugees. Mindful of this study, we can imagine how pervasive and unprotected a facial recognition system aimed at precisely one category of people could become. This is an additional level of scrutiny that we do not want to see normalized or made part of everyone's daily lives.
While requests for transparency about the algorithm and the database go unmet, and even the DPA is still waiting for an impact assessment of the system, the Ministry is also drawing on European money from the Internal Security Fund.
IrpiMedia details the subject of the contract as follows: “The budget allocated for the enhancement of the system is €246,000, and the enhancement includes the purchase of a license for facial recognition software owned by Neurotechnology, one of the best-known manufacturers in the world, able to process the video stream from at least two cameras and to manage a watch-list of up to 10,000 subjects. In addition, the hardware and software configuration must be small enough to fit in a backpack, allowing officers to carry out ‘strategic installations in places that are difficult to access with the equipment provided,’ reads the technical specification of the Ministry of Interior’s public tender.”
Biometric surveillance dehumanizes us into lifeless bits of data, depriving us of our autonomy and the ability to express who we are. This is even more dangerous when applied to people who reach our countries fleeing violence, economic disaster, and environmental catastrophe. Greeting human beings with biometric surveillance technologies destroys our humanity.
The story is shocking, but it is not inevitable. Brutal technologies that amplify already persecutory anti-migration strategies are only the latest illustration of these structural problems. Banning biometric mass surveillance means not only stopping the use of such tools but also addressing the underlying inequalities and discrimination in our societies. You can support Reclaim Your Face’s campaign against discriminatory and intrusive biometric mass surveillance. Sign up to be the first to know when our new legal petition launches, and much more, at https://ReclaimYourFace.eu
- Say Cheese? Facial Recognition Technology Use Warning for Dutch Supermarket
- An EDPB Update: Guidelines on Examples Regarding Data Breach Notification
* Redistributed with Permission Under the Creative Commons Attribution-ShareAlike 4.0 License