Facial recognition probed across Europe under GDPR ahead of new AI rules
20 October 2021 14:33
Facial recognition tools are coming under intense scrutiny in Europe, with privacy watchdogs using the General Data Protection Regulation to regulate the fast-developing technology, rather than waiting for a dedicated EU law on AI to be passed.
The draft Artificial Intelligence Act aims to impose strict limitations on “high-risk” applications of the technology, including market-entry authorization requirements, while applying a lighter touch to less risky uses. While there is some debate over what constitutes a high-risk application, facial recognition will certainly be included.
But those rules are some way off. Proposed by the European Commission in April, the bill must still pass through the EU’s legislative process and probably won’t enter into force for at least another two years. Several EU privacy regulators have decided that’s too long to wait, and are pushing ahead with the rules they already have.
The enforcement push comes as companies and regulators seek more legal certainty on the use of facial recognition. The UK's Information Commissioner, Elizabeth Denham, has warned that the rapid spread of live facial recognition can be "overly invasive" in people's "lawful daily lives" and could damage trust both in the technology and in the "policing by consent" model.
Biometric data, including that generated by facial recognition software, is considered a special category of personal data under the GDPR, since it makes it possible to uniquely identify a person. The privacy law, which has applied since 2018, prohibits the processing of such data unless an exception applies, such as explicit consent, a legal obligation or a substantial public interest.
Last month, Italy’s Garante per la Protezione dei Dati Personali fined a Milan university 200,000 euros for using a remote exam-supervision system to record students and to identify any suspicious behavior. This process, which included the processing of biometric data, breached several principles of the GDPR, the authority found.
The Italian regulator will continue to scrutinize facial recognition applications, alongside other AI tools such as machine learning, data mining, and scoring systems, a spokesperson told MLex, without giving details of ongoing investigations: “Facial recognition systems are included in our six-monthly plan for inspection activities, implemented with the help of the Italian financial police.”
In 2019, the Swedish data protection authority issued a similar fine of around 20,000 euros after finding that a school's facial recognition pilot program, which tracked the attendance of a small group of students by comparing images, violated the GDPR.
Live investigations
Elsewhere in the EU, several regulators have begun compliance audits of facial recognition software this year.
In May, the data watchdog of the German state of Baden-Württemberg began investigating PimEyes, a facial recognition search engine, for its processing of biometric data.
The company, which scans online images containing faces and saves biometric data such as face shape and eye color to its database, faces allegations that it has no lawful basis under the GDPR for collecting and processing individuals' biometric data.
In Greece, US facial-recognition technology company Clearview AI came under investigation over potential breaches of the GDPR last month, following a complaint by a Greek digital rights NGO. The regulator didn't disclose when the probe would be finalized.
Clearview also faces complaints in at least four other European countries — Austria, France, Italy and the UK — over its automated "scraping" of images of faces on the Internet, which it can use to identify individuals in new images.
Legal tests
These investigations follow a call by the EU's privacy watchdogs for a general ban on any use of AI technologies to recognize human features in public places.
In June, the European Data Protection Board — the umbrella body of the EU's data protection authorities — and the European Data Protection Supervisor — the watchdog that ensures EU institutions abide by their own rules — said that applications such as live facial recognition "interfere with fundamental rights and freedoms to such an extent that they may call into question the essence of these rights and freedoms".
Nevertheless, the use of facial recognition technologies and other biometric applications will become more widespread across the EU in the near future. Technology is marching on, with many potentially lucrative niches to be filled.
Until the AI Act comes into force, regulators and companies will be feeling out what is acceptable. Some uses of facial recognition are already becoming established, such as airport security, where there is a strong public-interest case, and smartphone unlocking, which relies on the owner's consent; both are legitimate justifications for processing data under the GDPR.
Applications without a clear consent framework or public-interest justification are likely to remain the focus of regulators, at least to begin with. Large-scale use of the technology in public spaces appears a particularly pressing concern, as the various probes into Clearview attest.
In the UK, whose privacy rules haven’t much diverged from the EU’s since Brexit, a court ruled last year that the use of facial recognition by a police force was unlawful. But judges focused on the specifics of how it was used, leaving open the possibility that such an application could in principle be lawful, if better implemented.
Case law in the EU is also thin on the ground. The GDPR leaves a lot of room for interpretation on issues such as consent and public interest, and existing privacy rulings may not map cleanly over to facial recognition applications.
Legal certainty for developers probably requires the EU courts to rule specifically on a case involving facial recognition. That ruling will come eventually, as it's only a matter of time before such a case reaches them on appeal.
But the EU courts are notoriously slow. By the time the question of facial recognition under the GDPR is addressed, the Artificial Intelligence Act may already have come into force, bringing with it a whole new range of legal principles to test.