Facial recognition in public spaces

Live facial recognition technology, or automatic facial recognition (AFR), adds another dimension to CCTV monitoring and other surveillance methods. Using biometrics (certain physical, physiological or behavioural characteristics), the technology maps facial features and identifies particular individuals by matching them against a database of known faces. It has been in use for some years by certain public and government agencies but, with the advent of AI and machine learning, it has become more prevalent in the private sector.
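
In essence, such a system converts each detected face into a numeric template (an “embedding”) and compares it against the templates of known faces, flagging a match only where the similarity exceeds a threshold. The sketch below is purely illustrative: extract_embedding() stands in for the trained neural network that real systems use to produce the embedding, and the 0.6 threshold is an arbitrary placeholder.

    # Purely illustrative sketch of the matching step in an AFR system.
    import numpy as np

    def cosine_similarity(a, b):
        # Similarity of two embeddings; 1.0 means identical direction.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def match_face(probe_embedding, watchlist, threshold=0.6):
        # Return the identity of the closest watchlist entry, or None if no
        # candidate exceeds the similarity threshold.
        best_identity, best_score = None, threshold
        for identity, known_embedding in watchlist.items():
            score = cosine_similarity(probe_embedding, known_embedding)
            if score > best_score:
                best_identity, best_score = identity, score
        return best_identity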

The current debate

You are walking along a side street towards your office. Unbeknown to you, a closed-circuit television camera with facial recognition capabilities is tracking your movements. Is this lawful?

This question has been considered both by the UK’s Information Commissioner, Elizabeth Denham, and, more recently, by the High Court in R (Bridges) v CCSWP and SSHD [2019] EWHC 2341, the first judgment anywhere in the world on the use of facial recognition.

Ms Denham has also recently released statements addressing both police and private use of facial recognition. She was an intervener in the Bridges case and indicated in July 2019 that her office was investigating the trials undertaken by the police.

More recently, on 15 August 2019, she announced that she would be investigating the use of facial recognition technology at King’s Cross in London by a privately-owned development, stating that she is “deeply concerned about the growing use of facial recognition in public spaces, not only by law enforcement agencies but also increasingly by the private sector.”

The private development in question, Kings Cross Estate, released a statement on 2 September 2019 confirming that it had used facial recognition on the Estate until March 2018 and was co-operating with the ICO, but that, given the “broad debate now under way”, it had no plans to reintroduce any facial recognition technology.

Any concerns Kings Cross Estate had hoped to quell may now have been heightened by the Metropolitan Police’s admission on 6 September 2019 that it had supplied images to the Estate for use in its facial recognition scheme.

An international perspective

The UK is not the only jurisdiction where facial recognition has been seen as a threat to privacy.

California has recently passed a law banning state and local law enforcement from using body cameras with facial recognition. The law was backed by the American Civil Liberties Union, with the bill’s sponsor, Mr Ting, citing community trust in law enforcement as the basis for the ban and warning that, if the software were installed, body cameras would become “a tool of surveillance which was never the goal.” San Francisco and Oakland had previously banned use by local city agencies and police.

Unjustified surveillance by law enforcement was also a concern voiced by musicians calling for a ban on the use of facial recognition at music concerts by companies such as Ticketmaster. The musicians have launched a campaign against such use, which they say puts minority and undocumented individuals and those with criminal records at risk of being “unjustly detained, harassed or judged.”

The threat of state surveillance by facial recognition is of course heightened all the more in countries where privacy rights are limited, such as China, a leading innovator of AI-powered surveillance tools. Protesters involved in the recent Hong Kong demonstrations feared that data captured by facial recognition systems in Hong Kong was being exported to mainland China.

Meanwhile, Chinese companies are increasing their share of the global market for facial recognition software, with one Beijing-based company targeting an IPO raise of up to $1 billion.

Use by private companies

Private organisations which process personal data within the UK must comply with the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA 2018).

Personal data is a well-recognised concept. It is data relating to a living individual who can be identified, directly or indirectly, either from the data alone or from that data together with other information likely to come into the possession of the data controller.

In the Bridges case, the High Court found that members of the public whose images are caught on CCTV cameras and processed to extract biometric facial data are directly identified, or in the words of the court “sufficiently individuated”, so that the data becomes personal data within the meaning of the DPA 1998 (para 124).

The processing of personal data may only be undertaken if one of the lawful bases under Article 6 of the GDPR applies. Further, where the processing involves special category data, which includes biometric data, a further justification must be found within Article 9.

The lawful bases

Article 6(1) lists the lawful bases for processing. In the context of video surveillance, the applicable bases may include:

  • The consent of the individuals concerned (Article 6(1)(a)). For consent to be valid under the GDPR it must be freely given, specific, informed and unambiguous, and it must be given prior to the processing.
  • The processing is necessary for the legitimate interests pursued by the controller (Article 6(1)(f)). Processing for the purposes of the “legitimate interests of a controller” will be lawful unless those interests are overridden by the fundamental rights and freedoms of the individual.

Special category data

If the processing of personal data involves special category data, in addition to identifying a lawful basis under Article 6, an exemption must also be found in Article 9 to justify such processing.

Facial recognition technology will collect a type of special category data, ie biometric data, if it is capable of uniquely identifying an individual. Biometric data is data resulting from technical processing of the physical, physiological or behavioural characteristics of a person which allows that person to be uniquely identified.

Therefore, if facial recognition technology is used to identify a particular individual, as opposed to a category of persons (such as profiling customers by race, gender or age), it will be processing biometric data.

Article 9(2) of the GDPR lists a limited number of exemptions which may justify processing special category data. These include the explicit consent of the data subject, the vital interests of the data subject (eg an immediate medical emergency), processing necessary for the establishment, exercise or defence of legal claims, processing of personal data already made public by the individual, substantial public interest, various medical and public health purposes, and scientific research or statistics.
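
Put another way, biometric processing must clear two hurdles at once: an Article 6 lawful basis and an Article 9 exemption. The snippet below is a simplified illustration of that two-layer test only; the labels are shorthand rather than legal terms of art.

    # Simplified illustration of the two-layer test for biometric data:
    # an Article 6 lawful basis AND an Article 9 exemption are both required.
    ARTICLE_6_BASES = {
        "consent", "contract", "legal_obligation",
        "vital_interests", "public_task", "legitimate_interests",
    }
    ARTICLE_9_EXEMPTIONS = {
        "explicit_consent", "vital_interests", "legal_claims",
        "made_public_by_data_subject", "substantial_public_interest",
        "health", "research_or_statistics",
    }

    def biometric_processing_permitted(article_6_basis, article_9_exemption):
        # Both layers must be satisfied before biometric data may be processed.
        return (article_6_basis in ARTICLE_6_BASES
                and article_9_exemption in ARTICLE_9_EXEMPTIONS)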

The European Data Protection Board (EDPB) recently issued draft guidelines for public consultation: Guidelines 3/2019 on processing of personal data through video devices. These specifically address the use of facial recognition technology.

The EDPB is an independent European body established under the GDPR and publishes guidance on the application of European data protection laws.

The draft guidelines make some interesting observations about the use of both CCTV and facial recognition:

  1. Video surveillance may be necessary to protect the legitimate interests of a controller, such as the protection of property against burglary, theft or vandalism (para 19).
  2. Video surveillance measures should only be chosen if the purpose could not reasonably be fulfilled by other means which are less intrusive to fundamental rights and freedoms (para 24).
  3. It may be necessary to use video surveillance not just within the property boundaries but also in the immediate surroundings of the premises, in which case protective measures such as blocking out or pixelating the out-of-bounds areas could be employed (para 27); a minimal technical sketch of this measure appears after this list.
  4. In respect of facial recognition which involves processing of special category data, the draft guidelines voice caution. The EDPB appears to suggest that for private entities that install the technology for their own purposes such as for security, explicit consent will in most cases be required (para 76).
  5. Where explicit consent is required, an organisation cannot condition access to its services on consenting to the processing but must offer an alternative solution that does not involve facial recognition (para 85).
  6. In cases where the technology captures passers-by, an exemption under Article 9 will still be required for those individuals (para 83). The difficulty with passers-by is that consent must be obtained before processing begins, so either another exemption under Article 9 must apply or the processing may be unlawful.
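
By way of illustration of the pixelation measure referred to in point 3 above, the following sketch coarsens a region of a CCTV frame so that individuals within it can no longer be identified. It assumes the OpenCV library, and the file name and region coordinates are placeholders only.

    # Illustrative pixelation of an area outside the property boundary.
    import cv2

    def pixelate_region(frame, x, y, w, h, blocks=12):
        region = frame[y:y + h, x:x + w]
        # Shrink the region to a coarse grid, then enlarge it again without
        # smoothing, so that detail (including faces) is lost.
        small = cv2.resize(region, (blocks, blocks), interpolation=cv2.INTER_LINEAR)
        frame[y:y + h, x:x + w] = cv2.resize(small, (w, h), interpolation=cv2.INTER_NEAREST)
        return frame

    frame = cv2.imread("cctv_frame.jpg")
    frame = pixelate_region(frame, x=0, y=300, w=640, h=180)  # placeholder area
    cv2.imwrite("cctv_frame_masked.jpg", frame)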

Organisations like Kings Cross Estate that capture the images of passers-by will therefore need to find another exemption under Article 9 or risk infringing the DPA 2018.

The conclusions of the ICO’s investigation into the Kings Cross matter may provide some clarification on this point. However, if the ICO’s determination reflects the views of the EDPB (ie that explicit consent is likely to be required for the use of facial recognition in public), the flow-on effects for companies could be widespread. Certainly, companies which use facial recognition on individuals in public areas ought now to be reviewing their data protection compliance and procedures.

Use by law enforcement

As already mentioned, in the first case of its kind in the world, the High Court recently handed down its decision in the Bridges case. This concerned the targeted deployment of facial recognition in public areas on a trial basis by the South Wales Police (SWP).

The case related to a claim by a member of the public, Mr Edward Bridges, a civil liberties campaigner, who alleged that on two occasions he had been present when the SWP deployed AFR software and had been caught by its cameras.

Mr Bridges claimed that the use of AFR by the SWP breached the right to privacy contained in Article 8 of the European Convention on Human Rights (ECHR) and that the SWP had failed to comply with the Data Protection Act 1998 and its successor, the Data Protection Act 2018 (DPA 2018). He also complained that the SWP had failed to comply with its statutory duty under the Equality Act 2010 to have regard to the possible discriminatory impacts of the technology.

The High Court found that, on the facts of that case, the SWP had not unlawfully interfered with Mr Bridges’ right to privacy, nor had it failed to comply with the various provisions of the DPA 2018 concerning the processing of personal data by law enforcement. It also found in favour of the SWP on the Equality Act claim, holding that it had complied with its duty to have due regard to possible discriminatory impacts.

Some of the factors which led the Court to find that the processing was a permitted interference with Mr Bridges’ right to privacy included that the processing was undertaken in a fair and transparent manner (fair processing notices were issued, signs were located close to the cameras and the police van was marked), the processing was limited in time and carried out for a particular legitimate purpose, and biometric data which did not result in a match was immediately deleted. For further commentary on the decision, see the case analysis at http://bit.ly/clarkslegalbridges.
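
The immediate deletion of non-matching biometric data is, in essence, a data-minimisation design choice. The sketch below is illustrative only and is not a description of the SWP’s actual system; it reuses the hypothetical matching helpers from the earlier sketch.

    # Illustrative only: biometric data that does not produce a watchlist match
    # is discarded straight away, and a match is queued for human review rather
    # than acted on automatically.
    def process_frame(frame, watchlist, extract_embedding, match_face, review_queue):
        embedding = extract_embedding(frame)        # biometric, special category data
        identity = match_face(embedding, watchlist)
        if identity is None:
            del embedding                           # no match: delete immediately
            return None
        review_queue.append((identity, frame))      # match: an officer reviews before any action
        return identity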

Whilst the court’s findings are fact-specific, it provided some guidance which is equally applicable to both public and private use of facial recognition:

  1. The mere storing or initial gathering of biometric data is enough to interfere with an individual’s right to privacy under Article 8 of the ECHR (para 59). This view was apparently shared by the ICO, which considered that the automated capture of facial biometrics amounts to a “serious interference with privacy rights” (para 60).
  2. Images of members of the public caught on CCTV from which biometric facial data are extracted sufficiently identify those individuals and therefore constitute personal data for the purposes of the DPA 1998 (para 124).
  3. The taking of biometric information entails sensitive processing, for which cogent and robust reasons must be provided. In that regard, no distinction ought to be made between the levels of protection given by the Human Rights Act 1998 and the DPA 2018 (para 100).
  4. The inclusion by police of individuals on a watch list, and the consequent processing of those persons’ personal data, without sufficient reason (ie no reasonable suspicion that they have committed or are about to commit an offence) would most likely amount to an unlawful interference with their Article 8 rights (para 105).
  5. Whilst the court found no evidence of indirect discrimination in the particular case, it pointed out that the police may wish to investigate whether the software produces discriminatory impacts, and queried whether it would be appropriate for the SWP to employ a failsafe ensuring that no step is taken against a member of the public unless an officer has reviewed the potential match and is satisfied that there is such a match (para 156).

Conclusions

It is inevitable that facial recognition use will become more prevalent as the technology improves, notwithstanding concerns raised by privacy advocates and regulators such as the ICO.

With the global market for facial recognition predicted to grow to US$9 billion by 2024, further clarification by the ICO, and perhaps further regulation to address privacy concerns, may be warranted.

Facial recognition is likely to become cheaper and more accessible as more tech companies enter the market. Its use will therefore be on the rise, and regulators such as the ICO in the UK and supervisory authorities within the EU are likely to reassess whether specific laws targeting this technology are needed.

Companies are well advised to take a risk-averse approach to facial recognition and to deploy it only where strictly necessary and proportionate to the proposed purpose. A data protection impact assessment, as required by the GDPR prior to implementing the technology, should help companies assess whether their use is likely to infringe privacy rights.

Chrysilla de Vere is the Commercial partner at Clarkslegal, specialising in privacy and data protection. Email cdevere@clarkslegal.com. Twitter @Clarkslegal.

Image cc by Mike MacKenzie on Flickr, adapted.