Facial Recognition & GDPR: The Good, The Bad, and the Problematic

Nov 21, 2019 | EU

Earlier this week, France announced that it will be the first country in the EU to introduce facial recognition for access to government services. The decision has raised a few eyebrows over people’s privacy and what it may mean under the GDPR. Didier Baichere, a lawmaker from French president Emmanuel Macron’s party, insisted that the general public shouldn’t be worried. But should they be? What implications does facial recognition technology carry, and just how well protected is the public’s sensitive data? Let’s take a look.


Commenting on the French announcement, Baichere added that there are “interesting and positive ways” of using such technology. At the same time, several civil rights groups have greeted the decision with great concern. One of them is La Quadrature du Net, a group that defends digital rights and civil liberties, which is suing the French government at France’s highest administrative court.
Looking back a few months, Sweden issued its first ever GDPR fine to a school that had used facial recognition technology to track student attendance. The Swedish DPA concluded that the school’s use of facial recognition violated the GDPR in three ways: it breached the principles of Article 5 by processing personal data beyond what was necessary for the purpose, Article 9 by collecting and processing such sensitive personal data without a legal basis, and Articles 35 and 36 by failing to carry out a proper DPIA or prior consultation.

As the GDPR continues to protect the general public’s sensitive information (such as biometric data) through a strict set of rules, let’s take a deeper look at what mass surveillance means for the EU’s sweeping privacy law.

What does the GDPR say about facial recognition?

The GDPR classifies the facial images used in facial recognition as “special categories of data” under Article 9. Facial recognition technology extracts information from an individual’s facial features. This is classed as biometric data, which is also considered sensitive personal data. The GDPR defines biometric data as “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data.”

To sum up, the GDPR breaks biometric data into two main categories:
Physical characteristics: fingerprints, facial features, weight, etc.
Behavioural characteristics: certain actions, habits, personality traits, etc.
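
To make the distinction concrete, here is a minimal Python sketch of that classification. The attribute names below are hypothetical examples for illustration, not terms from the GDPR text:

```python
from enum import Enum
from typing import Optional

# The GDPR's two broad biometric categories, as summarised above.
class BiometricCategory(Enum):
    PHYSICAL = "physical characteristics"        # e.g. fingerprints, facial features
    BEHAVIOURAL = "behavioural characteristics"  # e.g. habits, typing rhythm

# Illustrative mapping only; real attribute names depend on the system assessed.
ATTRIBUTE_CATEGORIES = {
    "fingerprint_template": BiometricCategory.PHYSICAL,
    "face_geometry": BiometricCategory.PHYSICAL,
    "keystroke_dynamics": BiometricCategory.BEHAVIOURAL,
}

def category_of(attribute: str) -> Optional[BiometricCategory]:
    """Return the biometric category of an attribute, or None if it is not biometric."""
    return ATTRIBUTE_CATEGORIES.get(attribute)

print(category_of("face_geometry"))  # BiometricCategory.PHYSICAL
```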

The GDPR requires that any party wishing to use this type of data obtains consent from the data subject. Consent is defined in Article 4 of the GDPR as “any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her”. Although the GDPR imposes a restrictive set of obligations to protect the rights of the general public, facial recognition may still be used as long as at least one of the following conditions is met (a simplified check is sketched after the list):

The data subject has given their consent willingly
Biometric data is needed for carrying out employment, social security, or social protection obligations
Biometric data is needed to protect the vital interests of the individual and they are incapable of giving consent
It is needed for the establishment, exercise, or defence of legal claims
Biometric data is necessary for reasons of substantial public interest
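
As a rough, purely illustrative sketch of how such a check might look in practice (the flag names below are assumptions made for this example, not legal terminology or legal advice):

```python
from dataclasses import dataclass

# Hypothetical flags summarising the conditions listed above.
@dataclass
class ProcessingContext:
    explicit_consent: bool = False
    employment_or_social_security_obligation: bool = False
    vital_interests_and_subject_incapable: bool = False
    legal_claims: bool = False
    substantial_public_interest: bool = False

def biometric_processing_permitted(ctx: ProcessingContext) -> bool:
    """Return True if at least one of the listed conditions applies (simplified)."""
    return any((
        ctx.explicit_consent,
        ctx.employment_or_social_security_obligation,
        ctx.vital_interests_and_subject_incapable,
        ctx.legal_claims,
        ctx.substantial_public_interest,
    ))

# Example: consent given willingly, nothing else applies.
print(biometric_processing_permitted(ProcessingContext(explicit_consent=True)))  # True
```

In a real system, each flag would of course be the outcome of a documented legal assessment rather than a boolean set in code.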

In most cases, organisations must conduct a DPIA before implementing facial recognition technology. Article 35(3)(b) states that an assessment must be carried out in the case of large-scale processing of such special categories of personal data. The assessment also needs to make clear which alternatives to facial recognition are available.
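
In the same illustrative spirit, here is a minimal sketch of the DPIA trigger described above; the parameter names are assumptions for this example only:

```python
def dpia_required(processes_special_category_data: bool, is_large_scale: bool) -> bool:
    """Simplified reading of Article 35(3)(b): a DPIA is needed for large-scale
    processing of special categories of personal data, such as biometric data."""
    return processes_special_category_data and is_large_scale

# Facial recognition across a whole airport terminal would typically meet both criteria.
print(dpia_required(processes_special_category_data=True, is_large_scale=True))  # True
```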

The good, the bad, and the even more problematic

The good
Facial recognition technology comes with numerous benefits. At a governmental level, it could help identify potential threats to the public. It would also mean less time spent on manual identification, and a lower chance of inaccuracies such as misidentification. For the general public, the technology could offer more than just extra security. For example, the implementation of facial recognition in airports has become popular over the past year. Not only could the technology help airport security maximise each person’s safety, it could also mean more productivity and convenience for travellers. In September, London’s Gatwick airport became the UK’s first airport to use facial recognition on a permanent basis for ID checks before boarding. Ahead of the decision, the airport ran research which showed that “90% of the 20,000 passengers interviewed stated that they found the technology very useful and easy to understand. The trial run conducted showed faster boarding processes and an overall significant reduction in queue time”.

The bad
Privacy International, a civil liberties group, has expressed worries that travellers at Gatwick may not realise they can opt out of having their faces scanned. Its main concern is consent, and just how transparent airport officials will be with those boarding. The group said that merely placing general signs telling passengers that facial recognition technology is in use is “inadequate” to satisfy the requirements set by data protection laws, and that it creates risks associated with biometric data being collected by private entities at the airport.

The even more problematic
Also in September this year, a court in the UK ruled that police use of facial recognition technology does not violate privacy rights. The ruling came after an activist argued that, by having his face scanned, the police had violated his privacy and data protection rights. Ed Bridges brought the case after claiming that his face was scanned while he was shopping in 2017 and again while attending an anti-arms protest in 2018. He has since decided to appeal against the ruling. Following public concerns, the ICO has also called for a statutory code to govern how and when law enforcement may deploy facial recognition technology in public places. Elizabeth Denham, the UK’s Information Commissioner, further stated that the police will need to justify using such technology.

Where does that leave us?

Under the GDPR, the implementation of facial recognition must go through a process that safeguards the public’s data privacy rights. As large volumes of information are gathered at such a fast pace, the GDPR ensures that this sensitive data is handled with the utmost precaution. As we enter a new year, will more public concern arise, or will the benefits outweigh those concerns? And just how far can the GDPR stretch to safeguard privacy rights?
Still wondering about the grey area of facial recognition technology?
You might be interested in: Facial Recognition & Data Collection: Will you collect happy points for good citizenship in 2025?

Wanting to dig deeper into biometric data and what it means on a more general public level?
You might find Discounts & Data: GDPR for retailers interesting.