Face the Facts: Facial Recognition Software Should be Strictly Monitored

JURIST Guest Columnist Corinne Maupin of Valparaiso University School of Law discusses advances in facial recognition technology and the corresponding need for regulations to keep pace …

A recent article in the ABA Journal noted that law enforcement agencies have proposed using facial recognition software within departments to aid in the capture of criminals. This software obtains photographs and catalogs them for later use. These could be mug shots, driver’s license photos, or photographs captured while someone is walking down the street. For example, pilot programs are already underway in airports to replace the driver’s license and passport with facial scans.

The widespread implementation of this type of software requires an analysis of its benefits and risks. Once the benefits and risks are understood, citizens will be able to make an informed decision about how and when this software should be utilized.

The potential benefits of facial recognition may be endless. Importantly, the software provides law enforcement with ways to manage records of people of interest. Facial recognition software in the criminal justice system utilizes two different biometric techniques to increase the efficiency and accuracy of the verification process, and it can identify individuals when it is not legally or physically possible to obtain fingerprint or DNA evidence. For example, the Pierce County Sheriff in Washington State has achieved 94% accuracy in the first candidate spot when using mug shot photographs for facial identification during booking. Although the quality of facial images varies, such as those frequently obtained from surveillance footage, it is still possible to compare them with photographs stored in the database and make a possible identification.
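The 94% figure above is a rank-1 metric: the share of searches in which the correct identity appears as the first candidate the system returns. A minimal sketch of how such a figure is computed, using entirely hypothetical search results:

```python
# Minimal sketch of computing rank-1 ("first candidate") accuracy for a
# facial identification system. All probe/candidate data are hypothetical.

def rank1_accuracy(results):
    """results: list of (true_identity, ranked_candidate_list) pairs."""
    hits = sum(1 for true_id, candidates in results
               if candidates and candidates[0] == true_id)
    return hits / len(results)

# Hypothetical search results: for each probe mug shot, the system returns
# a ranked list of candidate identities from the database.
results = [
    ("smith",  ["smith", "jones"]),   # correct at rank 1
    ("jones",  ["jones", "smith"]),   # correct at rank 1
    ("garcia", ["lee", "garcia"]),    # correct match only at rank 2
    ("lee",    ["lee"]),              # correct at rank 1
]

print(rank1_accuracy(results))  # 3 of 4 probes matched at rank 1 -> 0.75
```

A rank-1 rate of 94% thus means that in roughly 6 of every 100 searches, the top candidate the system returns is not the person being booked.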

This technology, however, does not come without attendant risks. Facial recognition software is currently an unregulated algorithm that uses facial features to identify people from previously stored photographs. This means that each individual state or department controls its own facial recognition software. Facial recognition software is also not as accurate in its identification methods as other, more established methods such as fingerprinting or DNA testing.

For instance, the Georgetown Law Center on Privacy & Technology conducted a study [PDF] on the accuracy and risks of the unregulated use of facial recognition software by law enforcement in America. The study noted several flaws with facial recognition software, most notably that it is least accurate when it comes to people with dark skin. When polled, the Seattle Police Department stated that its system does not see race. However, a co-authored study by the Federal Bureau of Investigation suggested that facial recognition software may be less accurate on black people. This is troubling because, as the study suggests, mug shot databases are likely to include a disproportionate number of African Americans, yet there is no testing method to discover racially biased errors. The potential for racial bias in facial recognition may be a sore spot for many police departments, and transparency with communities will be important as the software becomes more popular.

Given the inherent risk of racial bias, law enforcement agencies may be less likely to be transparent with their communities. A report written by Kimberly Del Greco, Deputy Assistant Director of the Criminal Justice Information Services Division of the Federal Bureau of Investigation, states that the facial recognition software used for identifying criminals is accurate only 85% of the time in a controlled environment. Compared to the roughly 92.5% accuracy of fingerprinting, there is still work to be done before facial recognition software becomes a reliable method of capturing criminals.

Additionally, facial recognition software raises important constitutional questions. A pertinent example involves the First Amendment, specifically the right of freedom of assembly. Citizens are granted the right to protest anonymously, yet the use of facial recognition software to identify anyone on the street potentially chills those freedoms. If this software is used at political protests, citizens lose their right to protest anonymously. To protect the right to protest and freedom of speech, there should be some form of regulation on the use of facial recognition software.

There is a wide array of benefits and risks associated with the use of facial recognition software in police departments. When that use is unregulated, criminal justice systems may deploy the software however they see fit. Regulating facial recognition software in law enforcement agencies would standardize the use of facial imaging and protect citizens' First Amendment rights.

To see how the technology has fared without regulation, it is helpful to examine recent implementations of it, such as the iPhone X. This phone allows a person to unlock the device simply by showing their face to the camera. The same software also authenticates Apple Pay and some applications on the device. Your face is now the key to opening and securing your phone.

Facial recognition software on the iPhone means easy accessibility for owners, along with added security features. Reviewers of the software note that unlocking the iPhone is not automatic: Face ID readies the phone for unlocking, and an additional icon then appears to unlock the phone. One user has noted that Face ID also works in an almost completely dark room, using only the light of the iPhone's screen. The features of the iPhone X make using a smartphone easy; forgotten passwords seem to be a thing of the past with this new technology.

However, as with any new technology, there are risks associated with a device that uses a face as a key. The iPhone X has been tested on identical twins, and it cannot tell the difference between them. In 2012, it was suggested that about one in 30 babies born is a twin, and the same article suggested that the rate of twins born in the United States has been increasing since 1980. Given such a small population of twins, one might think this risk does not apply to most people. The risk, in fact, may be more widespread than many think: the Twin Strangers Project, for example, showcases pairs of people who look almost exactly alike while sharing little to no relation to one another.

In fact, an article written for Women's Health Magazine explores why we may closely resemble complete strangers. The article explains that any two people chosen at random may share about 99.5% of their gene sequence. Even so, the risk remains relatively small, considering that Twin Strangers is global and not limited to the United States.

A Vietnamese cybersecurity firm has also taken steps to show how the device can be hacked. Ngo Tuan Anh demonstrated that, by using a 3D mask with paper-tape skin, a silicone nose, and paper eyes, he could unlock the iPhone X. The same article notes, however, that the process could take up to five hours. Apple's own technical support page states that the probability of a random person unlocking a phone that does not belong to them is approximately one in a million. Although the risk exists, it is relatively small and would require considerable effort from a person trying to hack into a phone.
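Apple's one-in-a-million figure is a per-attempt false-match probability. A short sketch (the attempt counts below are assumptions for illustration only) shows how the cumulative risk grows with repeated random attempts, yet remains small:

```python
# Sketch: cumulative chance that at least one random stranger's face unlocks
# a phone, given a stated per-attempt false-match rate of ~1 in 1,000,000.
# The attempt counts used below are illustrative assumptions.

p_match = 1 / 1_000_000   # probability a single random face unlocks the phone

def prob_at_least_one(p, attempts):
    # Complement rule: P(at least one success) = 1 - (1 - p)^attempts
    return 1 - (1 - p) ** attempts

for attempts in (1, 100, 10_000):
    print(attempts, prob_at_least_one(p_match, attempts))
```

Even over ten thousand independent random attempts, the cumulative probability of a false unlock stays under one percent, which is consistent with the article's conclusion that the risk is real but small.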

The potential future of facial recognition in smart devices is endless. The iPhone X seems user-friendly, the device uses facial features to secure the smartphone, and the increased security is appealing. However, with benefits come inherent risks. Like all technology, facial recognition software can be hacked. The device cannot yet differentiate identical twins, and, as noted, there is a convenient website for strangers to find people who match them almost identically. This opens the door to hacking and potential theft of devices.

Facial recognition software is set to take America by storm. New technology is often utilized in a variety of ways and incorporated into everyday life. Fingerprinting and password locks on phones may soon be a thing of the past. As with all new technology, there are risks. The government has not yet caught up to the speed at which facial recognition devices are spreading: there are no regulations on how the software is being used, and law enforcement agencies are deciding when and where to use it. We may soon see a future in which facial recognition technology in government agencies and phones is connected.

New technology that carries the aforementioned risks should be regulated, and its use should be standardized. Regulation would protect citizens from infringement of their rights, while also protecting law enforcement and government agencies from citizen outrage. The goal of the technology should not just be ease of use for phones or catching criminals; it should also be utilized in a way that protects the rights of all citizens.

Corinne Maupin is a second-year law student at Valparaiso University School of Law, and is also obtaining her Master's in Health Administration.
She earned her Bachelor’s Degree in Healthcare Leadership from Valparaiso University and plans to use her studies to change the way people look at healthcare.

Suggested citation: Corinne Maupin, Face the Facts: Facial Recognition Software Should be Strictly Monitored, JURIST - Student Commentary, Nov. 30, 2017, http://jurist.org/dateline/2017/07/Corinne Maupin-facial_recognition.php

This article was prepared for publication by Sean Merritt, an Assistant Editor for JURIST Commentary. Please direct any questions or comments to him at commentary@jurist.org

Opinions expressed in JURIST Commentary are the sole responsibility of the author and do not necessarily reflect the views of JURIST's editors, staff, donors or the University of Pittsburgh.