The Georgetown Law Center for Privacy & Technology released a report that takes a harsh look at the Department of Homeland Security (DHS)'s "Biometric Exit" program. The report, "Not Ready for Takeoff: Face Scans at Airport Departure Gates," highlights the myriad privacy and fairness issues associated with the use of biometric data for screening and other purposes. The Biometric Air Exit program, operated by DHS and deployed at Boston's Logan International Airport and eight other airports, uses photographs of passengers taken at the gate while boarding to verify travelers' identities as they leave the country. Prior to departure of an outbound international flight, DHS prepopulates the Traveler Verification Service (TVS) with biometric templates of the travelers expected on the flight. TVS either confirms a traveler's face against those templates or rejects it as a "non-match," in which case the traveler's credentials are checked manually.
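The matching workflow described above can be sketched roughly as follows. This is a hypothetical illustration, not DHS's actual system: the embedding vectors, the cosine-similarity comparison, and the `MATCH_THRESHOLD` value are all invented for the example, and real systems use proprietary algorithms and tuned thresholds.

```python
import math

MATCH_THRESHOLD = 0.85  # assumed similarity cutoff, not a published DHS value

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def verify_traveler(gate_embedding, flight_gallery):
    """Match a gate photo's embedding against the prepopulated flight gallery.

    Returns (traveler_id, score) for the best match, or (None, score)
    to signal a "non-match" requiring a manual credential check.
    """
    best_id, best_score = None, 0.0
    for traveler_id, template in flight_gallery.items():
        score = cosine_similarity(gate_embedding, template)
        if score > best_score:
            best_id, best_score = traveler_id, score
    if best_score >= MATCH_THRESHOLD:
        return best_id, best_score
    return None, best_score  # flag for manual review

# Toy usage with 3-dimensional "embeddings" standing in for real face vectors:
gallery = {"traveler_A": [0.9, 0.1, 0.2], "traveler_B": [0.1, 0.8, 0.5]}
matched_id, score = verify_traveler([0.88, 0.12, 0.21], gallery)
print(matched_id)  # the gate photo is closest to traveler_A's template
```

The key design point is that this is a one-to-many search against a small, flight-specific gallery rather than against a national database, which keeps the candidate pool limited to expected passengers.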
The Georgetown Law report takes issue with DHS's claim that the program is designed to detect "visa overstay travel fraud," noting that the scope of the problem has not been properly established and therefore does not necessitate such a resource-intensive solution. Visa overstay fraud occurs when a foreign national who wishes to remain in the U.S. past the expiration of their visa arranges for a conspirator to leave the country in their place using the visa holder's credentials, which creates an exit record suggesting the visa holder has departed. In addition, the report questions whether the program complies with federal law, because it has not been specifically authorized by Congress and DHS has not engaged in an appropriate rulemaking proceeding.
The challenges associated with measuring the effectiveness of facial scanning programs are highlighted in the report, which noted that since February 2017 the National Institute of Standards and Technology (NIST) has tested more than 35 different face recognition algorithms designed to verify identities. That research indicates that face recognition systems have a hard time distinguishing among people who look alike, which could lead to falsely matching individuals who are similar in appearance. This suggests that such programs may not perform well if the goal is to screen for imposters. Finally, the report notes that DHS relies heavily on airlines and technology vendors for the central components of the program and recommends that airlines and other vendors become aware of the potential privacy and fairness issues associated with biometric screening.
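The look-alike problem the report describes can be illustrated with a toy example: when two different people's face embeddings are nearly identical, any threshold-based matcher will confirm the imposter as a "match." The vectors and threshold below are invented for illustration and do not reflect any real system's parameters.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

THRESHOLD = 0.9  # assumed cutoff; raising it trades false matches for false rejections

enrolled = [0.70, 0.50, 0.30]   # template of the expected traveler
lookalike = [0.69, 0.52, 0.31]  # a different person who closely resembles them

score = cosine_similarity(enrolled, lookalike)
print(f"similarity = {score:.4f}, accepted = {score >= THRESHOLD}")
# The look-alike clears the threshold: a false match, i.e. an imposter
# would be waved through rather than flagged.
```

This is why screening for imposters is harder than ordinary identity verification: an imposter exploiting visa overstay fraud would deliberately resemble the enrolled traveler, which is exactly the case where these systems are weakest.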