Head to Head Collision
Wednesday, September 18, 2019

Given trends toward digitization and remote transactions, senior security professionals have long expected biometric identification to be the clear future of security, obviating the need for account codes and Social Security numbers. Safely transferring and storing a retinal scan or voice print makes more sense than relying on memorized numbers or even chip cards.

Companies have invested billions of dollars in biometric-based systems. But the security tool of the future may have just met its match in jurisdictions gripped by biometric privacy panic. At first, people worried that losing a biometric file gave away irreplaceable information. In reality, nearly all professional biometric captures for security purposes are now immediately digitized into an algorithmic representation that cannot be reverse-engineered back into the original measurement. Yet the simple act of facial capture and measurement still causes fits among the global privacy bureaucracies.
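To illustrate the irreversibility point, here is a minimal sketch, not any vendor's actual pipeline, of how a face capture can be reduced to a stored template that cannot be inverted back into the original image. It assumes a hypothetical 128-dimensional face embedding as the starting point; the random-projection-and-binarize step shown is one common "cancelable biometrics" technique, chosen purely for illustration.

```python
# Minimal sketch (illustrative only, not any vendor's real pipeline) of an
# irreversible biometric template. Assumes a hypothetical 128-dim face
# embedding has already been extracted from the capture.
import numpy as np

def make_template(embedding: np.ndarray, seed: int, n_bits: int = 256) -> np.ndarray:
    """Project the embedding through a secret random matrix and keep only
    the signs. Binarization discards magnitudes, so many different
    embeddings map to the same bit string and the stored template cannot
    be reverse-engineered into the original face measurement."""
    rng = np.random.default_rng(seed)
    projection = rng.standard_normal((n_bits, embedding.shape[0]))
    return (projection @ embedding > 0).astype(np.uint8)

def match(a: np.ndarray, b: np.ndarray, threshold: float = 0.85) -> bool:
    """Two captures of the same face should agree on most template bits."""
    return float(np.mean(a == b)) >= threshold

# Usage: enroll once, then compare later captures against the stored bits.
rng = np.random.default_rng(0)
face = rng.standard_normal(128)                      # stand-in for a real embedding
enrolled = make_template(face, seed=42)
fresh = make_template(face + 0.05 * rng.standard_normal(128), seed=42)
imposter = make_template(rng.standard_normal(128), seed=42)
print(match(enrolled, fresh))     # True: same face, slightly noisy capture
print(match(enrolled, imposter))  # False: different face
```

In a scheme like this, what sits in the database is a string of bits, not a face: even a thief who steals the template learns nothing usable about the underlying measurement, which is the point the privacy panic tends to miss.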

We are entering a bizarro world where the best security and identity tools are scrapped because pencil-pushing privacy bureaucrats don't understand the tools' effectiveness, and because the tools are not natural fits for their narrow regulatory schema.

Case in point: the EU restricts the use of biometrics under the GDPR, and we have just seen one of the first enforcement cases in the area. The Swedish data protection authority, Datainspektionen, audited a public school system that was using facial-recognition capture to take attendance. The biometric system would have produced more accurate attendance records, made children safer, and saved the schools more than 17,000 hours of administrative work. Of course, the Datainspektionen could not tolerate such an outrage; it ordered the technology removed from the premises and fined the school nearly $30,000.

The Swedish DPA found that this school's use of biometrics constituted a three-fold violation of the GDPR:

  • violation of the fundamental principles of Article 5, by processing personal data in a more privacy-invasive manner than necessary to take attendance,

  • violation of Article 9, by processing sensitive personal data (biometric data) without a legal basis (apparently child safety and saving the public schools huge sums did not count as a legal basis), and

  • violations of Articles 35 and 36, by failing to meet the requirements for a data protection impact assessment and prior consultation (despite the fact that the children's guardians had consented to this use of the technology).

The DPA said that consent was not possible because, when a school asks for it, parents are obviously not free to say "no" to the school's request. Apparently, no evidence was cited for this last proposition; it seemed so obvious to the privacy bureaucrats. These bureaucrats also found that "managing attendance records" of children in a school is not necessary for reasons of substantial public interest. Although the school ran an impact assessment, the Swedish DPA found it inadequate.

When you give a DPA an enormous hammer, everything looks like a huge nail. You would expect an EU DPA to find privacy violations everywhere it looks; that is what it is trained to do. But if a school cannot use facial recognition software (which the Datainspektionen called a "new technology" despite 20 years of public use) to assure the location and safety of its students, then I worry that, for the EU and possibly California, no biometric technology will ever pass muster – ironically leaving us with subpar security and identification systems that put our privacy MORE at risk, rather than less.

For heaven’s sake, facial recognition software may be only 20 years old as a serviceable security system, but humans have been recognizing each other by their faces for our entire existence as a species – literally eons of use. Biometric capture and identification, especially of those biometric features we generally offer to the public in our daily existence, should clearly be the future of data security and identity management. Not only is biometric recognition of faces a natural act, but it saves us from relying on other technologies that are far easier to falsify or that do not work nearly as well. Stifling the biometric identity industry will make all of us less safe.

Our faces and our voices are the features we present to other people to prove who we are. If continuing this age-old practice violates the law, then something is seriously wrong with the law. But the Europeans seem to like this law nonetheless, so I foresee a head-on collision between the GDPR and common sense. I foresee a Kafkaesque Europe where we can no longer look at people's faces because their privacy would be at risk.

I see the EU DPAs behaving like the old (sad) joke about communist legislators: "Yes, the new rule works in practice, but we can't approve it unless it works in theory." And in this manner die the best security technologies.
