Last week (9th July), the ICO announced that it would join forces with the Office of the Australian Information Commissioner (OAIC) to investigate the use of personal information, including biometric data, by Clearview AI, Inc. (Clearview). Limited information is available so far, but given the focus of the investigation, this is an important step in determining data protection rights and obligations where information is ‘scraped’ from ‘publicly available’ sources for the purpose of tackling crime.
Clearview’s mission (according to its website) is to provide a research tool for law enforcement agencies to identify perpetrators and victims of serious crime. It does so by collecting data from the open web and using facial recognition to compare images from crime scenes with those available online, enabling the police to identify the guilty and help exonerate the innocent. Clearview is said to have a database of more than 3 billion images, taken from social media platforms and other websites, which supports this technology. The criminal activities concerned are particularly prevalent in the online environment (child sex abuse, sex trafficking and terrorism), which makes Clearview’s artificial intelligence tool seem a promising and welcome development for any agencies involved in tackling these crimes. The OAIC has undertaken preliminary enquiries with Clearview and has decided to investigate further, with the ICO’s support, under the Global Privacy Assembly’s Global Cross Border Enforcement Cooperation Arrangement and an MOU between the two authorities.
It may be difficult to argue for the protection of potential criminals’ rights and freedoms, but that is what the law requires, whether the tool is for police use or not. Clearview, as a private company (and not a law enforcement agency), must comply with the GDPR if it operates in the EU or monitors the behaviour of EU data subjects within the EU. It is not clear whether Clearview considers itself a controller or a processor (providing services to law enforcement agencies). Assuming that it relies on legitimate interests as a lawful basis to collect personal data, this would have to be balanced against the fundamental rights and freedoms of the data subjects involved, and a legitimate interests assessment would be required. Given the potential high risk to data subjects, a data protection impact assessment would also be needed to support the processing.
The GDPR does not apply to law enforcement agencies, but the Law Enforcement Directive (EU) 2016/680
‘on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences…’
deals with personal data processing by the police and is transposed into UK law by Part 3 of the Data Protection Act 2018. The police are not free to act outside of the law, even when dealing with the most sickening of offences, and have obligations to comply with data protection principles and inform data subjects of their rights. If the police use Clearview or other companies involved in personal data processing, they must ensure that such personal data has been lawfully obtained and can be used in accordance with these laws.
The ICO also has powers to investigate police use of technology that has the potential to impact data protection. Further to its investigation last year into facial recognition software at King’s Cross station and other locations, the ICO recently produced a report on the results of its investigation into the extraction of personal data from mobile phones. The potential impact of such technology on victims of these crimes, and on other innocent individuals caught up in investigations, must also be considered.
This joint investigation by the ICO and OAIC is also a welcome sign of collaboration between two well-established data protection authorities on the controversial and much-queried practice of ‘scraping’ data from the web.