
Biometrics really brings out the double-edged nature of technology [Sponsored]

Whether people react positively or negatively to biometrics may depend on the stories they have heard.

Have you heard about the man in Russia who took out his rubbish during a pandemic lockdown, only to have the police show up at his door 30 minutes later? He had been identified using facial biometric technology. Needless to say, some people find the story alarming: a Big Brother scenario.

But it’s hardly that simple. What about the Las Vegas man who sexually abused a 7-year-old girl? He was arrested with the help of similar facial recognition technology and sentenced to 35 years in prison. Less Big Brother, more swift justice.

Opinions are likely to vary as widely as the types of information considered biometric. Merriam-Webster defines biometrics as “measuring and analyzing unique physical or behavioral characteristics (such as fingerprints or voice patterns), particularly as a means of verifying personal identity.” The laws governing biometric technology, however, each define biometrics differently.

Surveillance is one aspect of biometric technology, but identity verification is more common. Many people have used a fingerprint to unlock a smartphone or access a mobile banking application. A company that handles sensitive or confidential information may use biometric identity verification when its employees access a work platform. U.S. Customs and Border Protection already uses biometrics to move people through airport security.

“The pandemic accelerated an already booming biometric industry as the technology becomes more readily available, useful and popular,” said Nicola C. Menaldo, a partner in the Seattle office of Perkins Coie LLP, who litigates and advises clients in areas such as biometrics, scraping and web crawling, and artificial intelligence.

Biometrics has a number of pandemic-specific uses, Menaldo said. “For example, technologies that can potentially detect whether someone is wearing a face mask. … There is also a great need for contactless access to various locations. I think this has driven demand for products and services that rely on facial recognition and detection.”

Menaldo advises those who use these systems to carefully review the laws and regulations that apply to them, which plaintiffs and regulators may interpret in a very strict and unforgiving manner. She also notes that best practices when collecting personal information, such as biometrics, often come down to “providing appropriate and contextual notice and being transparent about data collection activities.”

Keeping up with the law is not without its challenges. Biometrics is at the center of many legal disputes. One problem is that the definition differs from law to law, according to James G. Snell, a partner in the Palo Alto office of Perkins Coie and faculty chair of the Practicing Law Institute’s twenty-second annual Institute on Privacy and Cybersecurity Law, to be held in May.

For example, Snell said, the definition of biometrics in the California Consumer Privacy Act differs from the definition in the state’s data breach law, which in turn differs from the definition in state regulations, though it is not yet clear whether, or to what extent, those inconsistencies will produce material differences in the law. Snell also noted that what is happening in California is nothing compared to Illinois, where the Illinois Biometric Information Privacy Act has sparked a wave of class action lawsuits.

Menaldo said more than a thousand class action lawsuits have been filed under the Illinois law since 2015, with publicly announced settlements reaching tens and even hundreds of millions of dollars. Most of the lawsuits have been filed against small businesses with relatively few employees that use fingerprint technology to track when workers clock in and out.

Big corporations haven’t been immune, however. The cases “range from issues with fingerprints to voiceprints and voice analysis to scans of facial geometry, and what the law means when the data is used to create AI models and algorithms,” Menaldo said. “It really runs the gamut. … All of these cases are in the early stages, so there is not much guidance from the courts yet.”

Similar disputes have arisen at the federal level. In January, the Federal Trade Commission announced a settlement with Everalbum Inc. over allegations that it deceived consumers about its use of facial recognition technology and its retention of the photos and videos of users who had deactivated their accounts.

The FTC press release announcing the settlement said the company “must obtain consumers’ express consent before using facial recognition technology on their photos and videos. The proposed order also requires the company to delete models and algorithms it developed using the photos and videos uploaded by its users.”

The Everalbum action did not involve a specific biometrics law, according to Rebecca S. Engrav, a partner in the Seattle office of Perkins Coie LLP and a faculty member of the upcoming annual PLI Institute on Privacy and Cybersecurity Law.

The law available to the FTC was Section 5 of the FTC Act. The “FTC’s consent order against Everalbum was therefore issued solely under its general authority over unfair and deceptive business practices,” Engrav said. She said that “it is normal for the FTC to use its general Section 5 authority when it encounters new technology. What is unusual, however, is that in the Everalbum consent order, the FTC created a definition of the term ‘biometric information’ out of whole cloth. It defined the concept as ‘data that depicts or describes the physical or biological traits of an identified or identifiable person, including images.’”

Engrav is concerned that, on its face, this definition sweeps any ordinary photo into the concept of “biometric information.” She believes these issues need to be considered much more carefully before “we go down this path, because photos as a technology have been around for decades and decades, and they are used by businesses, governments, law enforcement agencies and all kinds of people. And there are quite a number of laws and cultural norms that already exist in relation to photos, such as distinctions based on where the photo was taken.”

With the right care and caution, Engrav said, “we could probably get to better laws that are more nuanced and really address the harms appropriately without unnecessarily stopping the good.”

Federal law dealing with biometrics may be in sight. California updated the CCPA, and Virginia also passed a privacy law. Both largely go into effect in 2023, and other states, such as Washington, are also considering privacy laws. State measures often drive federal ones. “The current view is that there will likely be no federal privacy legislation this year. There could be one in 2022, although things are moving fast in this area,” Snell said.

He stressed the need for lawmakers not to overreach. One company got into trouble for using public photos of people, without their consent, to develop algorithms for facial recognition technology. A press account portrayed this as an invasion of privacy.

“I think reasonable minds can differ on what should have been done or what kind of disclosure should have been made, but the article suggested that these photos were all somehow embedded in the algorithm, and that’s just not the case.”

In reality, the final algorithms were entirely anonymized, nothing but records of features.

Engrav also stressed the need to be clear about the nature of particular biometric technologies before reaching any conclusions. She noted that video surveillance has been used in the United States for decades. While she understands concerns about “mass facial recognition on the scale we hear about in China, I think we owe it to ourselves and to lawmakers … to be precise about what’s different, and to articulate what we think the law should be.”

In essence, Engrav said, “Laws shouldn’t be technology-specific. Whatever laws govern the recording of images in public spaces, they should be technology-neutral.”

Menaldo agreed. “I think what privacy advocates see all the time is that the law lags far behind technological advances, which brings with it all sorts of problems and unintended consequences. And I think [a law] that targets a technology — as opposed to a problem or principle — will have that problem.”

Facial recognition today, Menaldo said, is nothing like what facial recognition will be 10 years from now. “It’s a fool’s errand to try to legislate based on today’s technology.”

Click here to learn more about the Practicing Law Institute’s twenty-second annual Institute on Privacy and Cybersecurity Law.

Elizabeth M. Bennett was a business reporter who turned to legal journalism while covering the Delaware courts. That beat inspired her to go to law school. After a few years practicing law in the Philadelphia area, she moved to the Pacific Northwest and returned to freelance reporting and editing.
