ACLU Says Manufacturer is Blaming the Messenger for Failure of Face-Recognition Technology

May 29, 2002 12:00 am

Media Contact
125 Broad Street
18th Floor
New York, NY 10004
United States

FOR IMMEDIATE RELEASE

Statement of Barry Steinhardt, Director, Technology & Liberty Program, ACLU

NEW YORK – In an apparent attack on the American Civil Liberties Union, a manufacturer of face-recognition systems has issued a news release charging that “special interests” have made “misleading and incorrect” statements about the technology. In doing so, the manufacturer, Visionics Corporation, has resorted to blaming the messenger for the inadequacies of its product.

In a May 28 news release and in conversations with reporters, Visionics has suggested that the ACLU is selectively releasing information to create a misleading picture of the performance of face-recognition. In response, we ask Visionics how our distribution of information from places like Palm Beach demonstrates selectivity.

There is an irony in Visionics’ awkward attempt to cast the ACLU as a “special interest.” A special interest is one that pursues aims that are narrow and selfish in nature – such as a corporation hawking a particular product in the aftermath of a national tragedy. The defense of the Bill of Rights and the right to privacy, which is the ACLU’s mission, is not a special interest — it is the public interest.

We have posted on our Web site (at http://archive.aclu.org/issues/privacy/FaceRec_Feature.html ) all the data we have been able to acquire on the performance of face-recognition. In addition to several independent laboratory tests, we also obtained under Florida’s open-records or “Sunshine Laws” the results of face-recognition deployments both on the streets of Tampa and in the Palm Beach International Airport. The fact that none of these results speaks well for the technology cannot be blamed on the ACLU.

The ACLU has also filed requests for information about the performance of face-recognition at airports in Fresno and in Boston, and the ACLU’s Texas affiliate is preparing a similar request for the airport in Dallas-Fort Worth. If Visionics has documents reflecting the findings of independent security experts at those airports or elsewhere, the ACLU invites the company to make them all public in a non-selective manner.

Visionics’ statement that the false alarms are “easily cleared through visual inspection by a human operator” is a welcome admission that these systems are not better than humans at identifying faces. It is a clear rebuttal of the false impression held by many Americans that this technology can see through disguises to identify faces in a way that human observers cannot.

The statement also vindicates the ACLU’s effort to point out that the false alarms are often wildly false, as was found in Tampa when the system there mistook men for women and made other basic errors (see the ACLU’s report on the Tampa system, online at /Files/Files.cfm?ID=10415&c=184). Human visual inspection would not, however, permit a quick release of anyone who happens to resemble a subject in the terrorist database.

Visionics focuses on the error rates of its product because it cannot remedy or control what may be the biggest problem with the technology: the lack of recent quality photographs of most of the people on the planet who belong to Al Qaeda or other organizations likely to plan terrorist attacks. As the security experts at Palm Beach noted, quality photographs are key to achieving even the success rates obtained there — rates those experts judged to be inadequate.

The ACLU believes that Americans can be both safe and free – that we need not give up our privacy and other basic rights to protect against terrorism. That is accomplished by asking first whether intrusive new technologies are even effective. In the case of face-recognition, the answer appears to be no.
