
Recent Case Highlights How Age Verification Laws May Directly Conflict With Biometric Privacy Laws

from the privacy-nightmare dept

California passed the California Age-Appropriate Design Code (AADC) nominally to protect children’s privacy, but at the same time, the AADC requires businesses to do an age “assurance” of all their users, children and adults alike. (Age “assurance” requires the business to distinguish children from adults, but the methodology to implement it has many of the same characteristics as age verification; it just needs to be less precise for anyone who isn’t near the age of majority. I’ll treat the two as equivalent.)

Doing age assurance/age verification raises substantial privacy risks. There are several ways of doing it, but the two primary options for quick results are (1) requiring consumers to submit government-issued documents, or (2) requiring consumers to submit to face scans that allow the algorithms to estimate the consumer’s age.

[Note: the differences between the two techniques may be legally inconsequential, because a service may want a confirmation that the person presenting the government documents is the person requesting access, which may essentially require a review of their face as well.]
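To make the two options concrete, here is a minimal sketch of what an age-assurance gate might look like. Everything in it is an illustrative assumption: the vendor-style helpers (estimate_age_from_face, verify_government_id) are hypothetical stand-ins, and the buffer logic (trusting a bare face-scan estimate only when it clears the age of majority by a wide margin, since estimation error matters most near the boundary) is one plausible design, not anything the AADC specifies:

```python
from dataclasses import dataclass
from typing import Optional

AGE_OF_MAJORITY = 18
ESTIMATE_BUFFER = 7  # hypothetical: only trust a bare face-scan estimate of 25+

@dataclass
class AssuranceResult:
    is_adult: bool
    method: str  # "face-estimate", "document", or "indeterminate"

def estimate_age_from_face(selfie: bytes) -> float:
    """Stand-in for a vendor's age-estimation model (option 2)."""
    raise NotImplementedError("hypothetical vendor call")

def verify_government_id(id_image: bytes, selfie: bytes) -> Optional[int]:
    """Stand-in for a vendor's document check (option 1). Returns the age from
    the document, or None if the document or the face match fails. Note that
    this path reviews the user's face too, which is why the two techniques
    may be legally indistinguishable."""
    raise NotImplementedError("hypothetical vendor call")

def assure_age(selfie: bytes, id_image: Optional[bytes] = None) -> AssuranceResult:
    # Try the face-scan estimate first; accept it only with a wide buffer,
    # because estimation error is largest near the age of majority.
    estimate = estimate_age_from_face(selfie)
    if estimate >= AGE_OF_MAJORITY + ESTIMATE_BUFFER:
        return AssuranceResult(is_adult=True, method="face-estimate")
    # Fall back to a document check for anyone near the boundary.
    if id_image is not None:
        documented_age = verify_government_id(id_image, selfie)
        if documented_age is not None:
            return AssuranceResult(documented_age >= AGE_OF_MAJORITY, "document")
    return AssuranceResult(is_adult=False, method="indeterminate")
```

Note that both branches end up handling a face image, which is exactly the kind of collection that biometric privacy statutes regulate.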

But are face scans really an option for age verification, or will they conflict with other privacy laws? In particular, face scanning seemingly conflicts directly with biometric privacy laws, such as Illinois’ BIPA, which imposes substantial restrictions on the collection, use, and retention of biometric information. (California’s Privacy Rights Act, the CPRA, which the AADC supplements, also provides substantial protections for biometric information, which is classified as “sensitive” information.) If a business purports to comply with the CA AADC by using face scans for age assurance, will that business simultaneously violate BIPA and other biometric privacy laws?

Today’s case doesn’t answer the question, but boy, it’s a red flag.

The court summarizes BIPA Sec. 15(b):

Section 15(b) of the Act deals with informed consent and prohibits private entities from collecting, capturing, or otherwise obtaining a person’s biometric identifiers or information without the person’s informed written consent. In other words, the collection of biometric identifiers or information is barred unless the collector first informs the person “in writing of the specific purpose and length of term for which the data is being collected, stored, and used” and “receives a written release” from the person or his legally authorized representative.
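Read as an engineering requirement, Section 15(b) works like a precondition that must hold before any capture code runs. The sketch below is only my gloss on the statutory elements the court quotes above; the record structure and names are hypothetical, and BIPA itself prescribes no implementation:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class BipaConsentRecord:
    subject_id: str
    disclosed_purpose: str         # the "specific purpose" disclosed in writing
    retention_term_ends: date      # the disclosed "length of term" for storage/use
    written_release_signed: bool   # the signed release from the person or representative

def require_bipa_consent(record: Optional[BipaConsentRecord]) -> None:
    """Refuse to proceed with any biometric capture unless all Sec. 15(b)
    elements are present. A gloss on the court's summary, not legal advice."""
    if record is None or not record.written_release_signed:
        raise PermissionError("no written release: collection is barred")
    if not record.disclosed_purpose:
        raise PermissionError("specific purpose was never disclosed in writing")
    if record.retention_term_ends < date.today():
        raise PermissionError("disclosed retention term has lapsed")
```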

Right away, you probably spotted three potential issues:

[Another possible tension is whether the business can retain face scans, even with BIPA consent, in order to show that each user was authenticated if challenged in the future, or if the face scans need to be deleted immediately, regardless of consent, to comply with privacy concerns in the age verification law.]

The primary defendant at issue, Binance, is a cryptocurrency exchange. (There are two Binance entities at issue here, BCM and BAM, but BCM drops out of the case for lack of jurisdiction). Users creating an account had to go through an identity verification process run by Jumio. The court describes the process:

Jumio’s software…required taking images of a user’s driver’s license or other photo identification, along with a “selfie” of the user to capture, analyze and compare biometric data of the user’s facial features….

During the account creation process, Kuklinski entered his personal information, including his name, birthdate and home address. He was also prompted to review and accept a “Self-Directed Custodial Account Agreement” for an entity known as Prime Trust, LLC that had no reference to collection of any biometric data. Kuklinski was then prompted to take a photograph of his driver’s license or other state identification card. After submitting his driver’s license photo, Kuklinski was prompted to take a photograph of his face with the language popping up “Capture your Face” and “Center your face in the frame and follow the on-screen instructions.” When his face was close enough and positioned correctly within the provided oval, the screen flashed “Scanning completed.” The next screen stated, “Analyzing biometric data,” “Uploading your documents”, and “This should only take a couple of seconds, depending on your network connectivity.”

Allegedly, none of the Binance or Jumio legal documents make the BIPA-required disclosures.
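Mapped onto code, the onboarding flow the court describes looks roughly like the sketch below. The function names are my illustrative stand-ins, not Jumio’s or Binance’s actual API; the point is where a Section 15(b) disclosure and release would have to occur, and where the complaint says nothing of the sort appeared:

```python
def accept_custodial_agreement() -> None:
    """Stand-in for the Prime Trust agreement screen, which allegedly
    contained no reference to biometric data."""

def capture_id_document() -> bytes:
    """Stand-in for the driver's-license photo step."""
    raise NotImplementedError("hypothetical capture step")

def capture_face_scan() -> bytes:
    """Stand-in for the guided selfie ('Center your face in the frame...')."""
    raise NotImplementedError("hypothetical capture step")

def analyze_biometric_data(id_photo: bytes, selfie: bytes) -> bool:
    """Stand-in for the 'Analyzing biometric data' comparison step."""
    raise NotImplementedError("hypothetical vendor call")

def onboard_user() -> bool:
    accept_custodial_agreement()
    id_photo = capture_id_document()
    # A BIPA Sec. 15(b) written disclosure and signed release would have to
    # happen here, before any face data is captured. Per the complaint, no
    # such step existed anywhere in the flow.
    selfie = capture_face_scan()
    return analyze_biometric_data(id_photo, selfie)
```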

The court rejects Binance’s (BAM) motion to dismiss:

Jumio’s motion to dismiss also goes nowhere:

[The Sosa v. Onfido case also involved face-scanning identity verification, for the service OfferUp. I wonder whether the court would conduct the constitutional analysis differently if the defendant argued it had to engage with biometric information in order to comply with a different law, like the AADC.]

The court properly notes that this was only a motion to dismiss; defendants could still win later. Yet, this ruling highlights a few key issues:

1. If California requires age assurance and Illinois bans the primary methods of age assurance, there may be an inter-state conflict of laws that ought to support a Dormant Commerce Clause challenge. Plus, other states beyond Illinois have adopted their own unique biometric privacy laws, so interstate businesses are going to run into a state patchwork problem where it may be difficult or impossible to comply with all of the different laws.

2. More states are imposing age assurance/age verification requirements, including Utah and likely Arkansas. Often, like the CA AADC, those laws don’t specify how the assurance/verification should be done, leaving it to businesses to figure it out. But the legislatures’ silence on the process truly reflects their ignorance–the legislatures have no idea what technology will work to satisfy their requirements. It seems obvious that legislatures shouldn’t adopt requirements when they don’t know if and how they can be satisfied–or if satisfying the law will cause a different legal violation. Adopting a requirement that may be unfulfillable is legislative malpractice, and it ought to be evidence that the legislature lacked a rational basis for the law because it didn’t do even minimal diligence.

3. The clear tension between the CA AADC and biometric privacy is another indicator that the CA legislature lied to the public when it claimed the law would enhance children’s privacy.

4. I remain shocked by how many privacy policy experts and lawyers remain publicly quiet about age verification laws, or even tacitly support them, despite the OBVIOUS and SIGNIFICANT privacy problems they create. If you care about privacy, you should be extremely worried about the tsunami of age verification requirements being embraced around the country/globe. The invasiveness of those requirements could overwhelm and functionally moot most other efforts to protect consumer privacy.

5. Mandatory online age verification laws were universally struck down as unconstitutional in the 1990s and early 2000s. Legislatures are adopting them anyway, essentially ignoring the significant adverse caselaw. We are about to have a high-stakes society-wide reconciliation about this tension. Are online age verification requirements still unconstitutional 25 years later, or has something changed in the interim that makes them newly constitutional? The answer to that question will have an enormous impact on the future of the Internet. If the age verification requirements are now constitutional despite the legacy caselaw, legislatures will ensure that we are exposed to major privacy invasions everywhere we go on the Internet–and the countermoves of consumers and businesses will radically reshape the Internet, almost certainly for the worse.

Reposted with permission from Eric Goldman’s Technology & Marketing Law Blog.

Filed Under: aadc, ab 2273, age assurance, age verification, biometric, biometric privacy, bipa, california, illinois, privacy
Companies: binance, jumio

Indian Supreme Court Rules Aadhaar Does Not Violate Privacy Rights, But Places Limits On Its Use

from the mixed-result dept

Techdirt wrote recently about what seems to be yet another problem with India’s massive Aadhaar biometric identity system. Alongside these specific security issues, there is the larger question of whether Aadhaar as a whole is a violation of Indian citizens’ fundamental privacy rights. That question was made all the more pertinent in the light of the country’s Supreme Court ruling last year that “Privacy is the constitutional core of human dignity.” It led many to hope that the same court would strike down Aadhaar completely following constitutional challenges to the project. However, in a mixed result for both privacy organizations and Aadhaar proponents, India’s Supreme Court has handed down a judgment that the identity system does not fundamentally violate privacy rights, but that its use must be strictly circumscribed. As The New York Times explains:

The five-judge panel limited the use of the program, called Aadhaar, to the distribution of certain benefits. It struck down the government’s use of the system for unrelated issues like identifying students taking school exams. The court also said that private companies like banks and cellphone providers could not require users to prove their identities with Aadhaar.

The majority opinion of the court said that an Indian’s Aadhaar identity was unique and “unparalleled” and empowered marginalized people, such as those who are illiterate.

The decision affects everything from government welfare programs, such as food aid and pensions, to private businesses, which have used the digital ID as a fast, efficient way to verify customers’ identities. Some states, such as Andhra Pradesh, had also planned to integrate the ID system into far-reaching surveillance programs, raising the specter of widespread government spying.

In essence, the Supreme Court seems to have felt that although Aadhaar’s problems were undeniable, its advantages, particularly for India’s poorest citizens, outweighed those concerns. However, its ruling also sought to limit function creep by stipulating that Aadhaar’s compulsory use had to be restricted to the original aim of distributing government benefits. Although that looks like a reasonable compromise, it may not be quite as clear-cut as it seems. The Guardian writes that it still may be possible to use Aadhaar for commercial purposes:

Sharad Sharma, the co-founder of a Bangalore-based technology think tank which has worked closely with Aadhaar’s administrators, said Wednesday’s judgment did not totally eliminate that vision for the future of the scheme, but that private use of Aadhaar details would now need to be voluntary.

“Nothing has been said [by the court] about voluntary usage and nothing has been said about regulating bodies mandating it for services,” Sharma said. “So access to private parties for voluntary use is permitted.”

That looks to be a potentially large loophole in the Supreme Court’s attempt to keep the benefits of Aadhaar while stopping it turning into a compulsory identity system for accessing all government and business services. No doubt in the coming years we will see companies exploring just how far they can go in demanding a “voluntary” use of Aadhaar, as well as legal action by privacy advocates trying to stop them from doing so.

Follow me @glynmoody on Twitter or identi.ca, and +glynmoody on Google+

Filed Under: aadhaar, biometric, id, identification, india, privacy

Videos From Wearable Cameras Contain Natural Biometric Markers That Can Eliminate Anonymity

from the motion-pictures dept

Video evidence figures quite frequently here on Techdirt, because moving pictures of incidents are generally compelling and incontrovertible. That’s true even if they are released anonymously to protect the person recording the event from retribution. But new research suggests that videos from wearable cameras have embedded within them natural biometric markers (via New Scientist):

> Egocentric cameras are being worn by an increasing number of users, among them many security forces worldwide. GoPro cameras already penetrated the mass market, and Google Glass may follow soon. As head-worn cameras do not capture the face and body of the wearer, it may seem that the anonymity of the wearer can be preserved even when the video is publicly distributed. We show that motion features in egocentric video provide biometric information, and the identity of the user can be determined quite reliably from a few seconds of video.
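The signal being exploited is the wearer’s own body motion: a head-worn camera bobs and sways with the wearer’s gait, so the optical flow of the footage carries a motion signature even though the wearer never appears in frame. The sketch below illustrates the general idea with OpenCV (dense Farneback flow averaged into a per-frame global motion, then summarized in the frequency domain); it is my simplification for illustration, not the authors’ actual features or pipeline:

```python
import cv2
import numpy as np

def motion_signature(video_path: str, max_frames: int = 120) -> np.ndarray:
    """Build a crude wearer signature from head-worn video: per-frame global
    camera motion, summarized by its low-frequency spectrum (gait is roughly
    periodic). A simplification of the paper's idea, not its method."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        raise ValueError("could not read video")
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    motions = []
    while len(motions) < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        motions.append(flow.mean(axis=(0, 1)))  # mean (dx, dy) ~ global camera motion
        prev = gray
    cap.release()
    series = np.asarray(motions)                    # shape: (n_frames, 2)
    spectrum = np.abs(np.fft.rfft(series, axis=0))  # periodicity of head bob and sway
    signature = spectrum[:16].ravel()               # keep the low frequencies
    return signature / (np.linalg.norm(signature) + 1e-9)

def same_wearer(a: np.ndarray, b: np.ndarray, threshold: float = 0.9) -> bool:
    """Cosine similarity of two signatures; the threshold here is arbitrary."""
    return a.shape == b.shape and float(a @ b) >= threshold
```

With descriptors along these lines, the paper’s claim of identification “from a few seconds of video” is plausible: at 30 frames per second, the 120 frames above amount to only four seconds of footage.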

The paper describing the work also points out some consequences of this result:

> Egocentric video biometrics can prevent theft of wearable cameras by locking the camera when worn by people other than the owner. In video sharing services, this Biometric measure can help to locate automatically all videos shot by the same user. An important message in this paper is that people should be aware that sharing egocentric video will compromise their anonymity.

On the plus side, this also means that videos from police body cameras can be tied to particular officers, which may help make such evidence less vulnerable to tampering.

Follow me @glynmoody on Twitter or identi.ca, and +glynmoody on Google+

Filed Under: anonymity, biometric, body cameras, wearable cameras

Awesome Stuff: Little Devices That Help You Out

from the make-it-work dept

For this week’s “Awesome Stuff” post I wasn’t necessarily planning a “theme,” but it seemed to mostly work out as one anyway: it’s about three “little” devices that enable you to do more, by changing the way we deal with information in one way or another. This is a pretty exciting space in general, and it’s cool to see projects popping up that explore certain areas that make you wonder why no one had done this before — and then you realize that what’s being done wasn’t really possible until the tech caught up.

There you go. Three interesting new projects that are showing new ways to do more via little devices and information, enabling things that really weren’t possible until just recently — at least not in these kinds of packages.

Filed Under: awesome stuff, biometric, car computers, driving, heating, id, passwords, sensors
Companies: automatic, heatmeter, mysecureid

Israel Trying To Build Biometric Database

from the privacy? dept

Reader Ido alerts us to the news coming out of Israel, that the Knesset has moved forward on a bill that would create a huge biometric database including data on all Israelis, and refusing to provide such data could land anyone a year in jail. As the article notes, there’s a rather loud uproar about this, as many Israelis fear not only for their own privacy and civil liberties, but wonder just how such a database will be abused — either by gov’t officials or by hackers. It sounds like the bill still has a ways to go before becoming law, but this appears to be yet another move by a government to mistakenly assert that taking away people’s privacy somehow makes them more secure.

Filed Under: biometric, database, israel, privacy