The Legal and Ethical Considerations of Facial Recognition Technology in the Business Sector

To the casual observer, facial recognition technology (FRT) is a gimmick. It allows consumers to unlock an iPhone with a glance and Facebook to tag an individual in a group photo. However, the technology behind these features has much broader implications. FRT is a powerful instrument that presents unique advantages and sobering drawbacks, raising questions that go to the foundation of privacy and freedom of expression.

Facial recognition belongs to a class of physiological identifiers known as biometrics, which also includes fingerprints, retina scans, voice recognition, and DNA matching. It can be used in many aspects of police work, security screenings, and computer access, but it also has consumer and business applications. Its main application in the US appears to be detecting characteristics such as age or gender for digital advertising. Office spaces, mobile phones, online platforms, airports, and shopping centers are all making greater use of sophisticated cameras linked to algorithmic software. FRT can identify problem gamblers in casinos, greet visitors by name at hotels, or connect individuals on dating websites. One of the more creative applications is Amazon’s new Go stores: cameras will capture shoppers’ images as they shop and send the feed to a “central processing unit” that will identify the customer and the merchandise being held. When the customer leaves, the purchases will be automatically charged to their credit card.

Biometric information can be linked to personal information of any type, such as a person’s tax records, political affiliations, or arrest records, and research indicates that Americans are concerned about private actors collecting data about them. Consumer sentiment about biometrics reflects particular discomfort with using the technology in business locations such as malls and airports. Despite negative consumer attitudes, facial recognition in the business sector is increasing: the facial recognition market reached $3.72 billion in 2020 and is expected to reach $11.62 billion by 2026.

Technology often outpaces the law, so it is not surprising that there is little federal regulation of biometric privacy. Some companies have recognized the technology’s flaws, including persistent questions about AI’s accuracy and built-in biases, and have stopped selling facial recognition software to law enforcement. As Black Lives Matter protests erupted across the country, IBM, Amazon, and Microsoft halted sales of FRT to the police and urged more government regulation. While most of the civil liberty concerns involving FRT have been directed at law enforcement, there is also apprehension about its application in the private sector. The safeguards of the Constitution may not offer consumers the same protections in the retail, medical, and banking markets.

Illinois was the first state to regulate biometric information through the Biometric Information Privacy Act (BIPA), which seeks to ensure transparency between private entities and consumers. The law restricts how private companies can use individuals’ biometric data. Businesses that collect such information must inform consumers, justify the collection, explain how long the information will be retained, and secure written authorization before sharing the data with another entity. The statute creates a private cause of action, permitting individuals to file a claim for a violation of the law, and the penalties are substantial.

The most well-known case involving FRT is Patel v. Facebook, Inc. Several Facebook users filed suit, alleging that Facebook used FRT without following BIPA’s mandates. The litigation centered on Facebook’s “Tag Suggestions” feature, which uses FRT to determine whether a user’s friends appear in that user’s pictures; if a match is found, the company prompts the user to tag them. The Ninth Circuit held that Facebook’s conduct was the exact injury BIPA anticipated. Facebook petitioned the Supreme Court, but the Court denied certiorari. The matter was eventually settled for $650 million, making it one of the largest privacy-related settlements in history. Facebook also had to turn the facial recognition setting off by default and delete face templates unless it obtained user permission.

As FRT catapults society into unexplored terrain, its benefits must be balanced against its impact on privacy, data protection, and other consumer concerns. FRT can obtain and process massive amounts of data for the police, businesses, and individuals, but that same power makes it a controversial topic that will remain in the news for years to come.

The full article in its original form can be found here.

 

Samuel D. Hodge, Jr. is an adjunct professor at the Temple Beasley School of Law where he has taught law, anatomy, and forensic courses for more than 30 years. He is an award-winning author with more than 700 publications, six medical/legal books, and two legal texts. 

 
