face scanning – Techdirt

Stories filed under: "face scanning"

Judge OKs Class Action Status For Illinoisans Claiming Facebook Violated State Privacy Law

from the face-off dept

The last time we discussed Illinois’ Biometric Information Privacy Act, a 2008 law that gives the state’s residents rights governing how companies collect and protect their biometric data, a brother/sister pair had attempted to use the law to pull cash from Take-Two Interactive over the face-scanning app for its NBA2K series. In that case, the court ruled that the two could not claim to have suffered any actual harm from the use of their avatars, with their real faces attached, in the game’s online play. One of the chief protections of BIPA is that a service must not use biometric data in ways its users did not intend. Here, online play with these avatars was precisely the stated purpose of uploading their faces in the first place.

But now the law has found itself in the news again, with a federal court ruling that millions of Facebook users can proceed under a class action with claims that Facebook’s face-tagging database violates BIPA. Perhaps importantly, Facebook’s recent and very public privacy issues may make a difference compared with the Take-Two case.

A federal judge ruled Monday that millions of the social network’s users can proceed as a group with claims that its photo-scanning technology violated an Illinois law by gathering and storing biometric data without their consent. Damages could be steep — a fact that wasn’t lost on the judge, who was unsympathetic to Facebook’s arguments for limiting its legal exposure.

Facebook has for years encouraged users to tag people in photographs they upload in their personal posts and the social network stores the collected information. The company has used a program it calls DeepFace to match other photos of a person. Alphabet’s cloud-based Google Photos service uses similar technology and Google faces a lawsuit in Chicago like the one against Facebook in San Francisco federal court.

Both companies have argued that none of this violates BIPA, even when the face-data database is generated without users’ permission. That reading seems to contradict BIPA, under which fines of between $1,000 and $5,000 can be assessed for every use of a person’s image without their permission. Again, recent news may come into play here, as noted by the lawyer for the Facebook users.
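Those per-violation figures are why the judge noted the damages could be steep. A back-of-the-envelope calculation shows the scale (the class size below is a hypothetical round number for illustration; only the $1,000/$5,000 statutory amounts come from the law):

```python
# Rough sketch of potential BIPA statutory-damages exposure.
# The class size is a hypothetical figure, not from the case record.
NEGLIGENT_PER_VIOLATION = 1_000   # USD, per negligent violation
RECKLESS_PER_VIOLATION = 5_000    # USD, per intentional/reckless violation

class_size = 6_000_000  # assumed number of Illinois class members

low_end = class_size * NEGLIGENT_PER_VIOLATION
high_end = class_size * RECKLESS_PER_VIOLATION

# Even a single violation per user runs into the billions.
print(f"${low_end:,} to ${high_end:,}")
```

Even under the assumption of one violation per class member, the math makes clear why Facebook fought class certification so hard.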

“As more people become aware of the scope of Facebook’s data collection and as consequences begin to attach to that data collection, whether economic or regulatory, Facebook will have to take a long look at its privacy practices and make changes consistent with user expectations and regulatory requirements,” he said.

Now, Facebook has argued in court against this moving forward as a class by pointing out that different users could make different claims of harm, impacting both the fines and outcomes of their claims. While there is some merit to that, the court looked at those arguments almost purely as a way for Facebook to try to get away from the enormous damages that could potentially be levied under a class action suit, and rejected them.

As in the Take-Two case, Facebook is doing everything it can to raise the bar for any judgment on the reality of the actual harm suffered by these users, harm the company claims doesn’t exist.

The Illinois residents who sued argued the 2008 law gives them a “property interest” in the algorithms that constitute their digital identities. Judge James Donato agreed that this gives them grounds to accuse Facebook of real harm, and has ruled that the Illinois law is clear: Facebook has collected a “wealth of data on its users, including self-reported residency and IP addresses.” Facebook has acknowledged that it can identify which users living in Illinois have face templates, he wrote.

We’ve had our problems with class action suits in the past, but it shouldn’t be overlooked that this case carries the potential for huge damages to be assessed against Facebook. It’s also another reminder that federal privacy laws are in sore need of modernization, if for no other reason than to harmonize how companies can treat users throughout the United States.

Filed Under: bipa, class action, face recognition, face scanning, illinois
Companies: facebook

DHS's New Airport Face-Scanning Program Is Expensive, Flawed, And Illegal

from the 3-out-of-3.-nice-job,-fellas. dept

We, the people, are going to shell out $1 billion for the DHS to scan our faces into possibly illegal biometric systems. Those are the conclusions reached by the Georgetown Law Center on Privacy and Technology. A close examination of the face-scanning system the DHS plans to shove in front of passengers on international flights shows it to be a waste of money with limited utility.

DHS’ biometric exit program… stands on shaky legal ground. Congress has repeatedly ordered the collection of biometrics from foreign nationals at the border, but has never clearly authorized the border collection of biometrics from American citizens using face recognition technology. Without explicit authorization, DHS should not be scanning the faces of Americans as they depart on international flights—but DHS is doing it anyway. DHS also is failing to comply with a federal law requiring it to conduct a rulemaking process to implement the airport face scanning program—a process that DHS has not even started.

But American citizens will be included, according to the DHS. Its response to US travelers wondering why they’re being treated like terrorism suspects is that they’re welcome to opt out of the collection. All they have to do is not fly. The DHS insists it’s only targeting foreign visitors, but the system will scan everyone. The agency also promises not to retain face scans of US citizens, but it’s highly doubtful it will keep that promise. The government has rolled out a variety of biometric collections, each one intermingled with existing law enforcement and terrorism databases. Collect it all and let the courts sort it out: that’s the government’s motto.

On top of the illegality and lack of proper deployment paperwork, there’s the fact the program really just doesn’t do anything useful. As the Center points out in its thorough report, there was originally a point to scanning incoming foreign visitors and comparing them to government databases: catching incoming criminals and members of terrorism watchlists. But there’s no solid rationale behind the push to scan faces of foreigners as they leave the country.

The DHS has a theory, but it’s not a good one.

DHS, for its part, has never studied whether there is a problem that necessitates a change in its approach to tracking travelers’ departures. DHS claims that the aim of the program is to detect visa overstay travel fraud and to improve DHS’ data on the departure of foreign nationals by “biometrically verifying” the exit records it already creates for those leaving the country.

Visa overstay travel fraud could—in theory—be a problem worth solving. Foreign nationals who wish to remain in the country undetected past the expiration of their visas could be arranging to have others leave the country in their place using fraudulent credentials. But DHS has only ever published limited and anecdotal evidence of this.

The DHS, despite rolling this out, still has no idea whether the program will do anything more than stock its database of human faces. Five years after being asked to demonstrate how biometric exit scans would improve on the status quo, the DHS has yet to provide answers. In fact, it hasn’t even been able to deliver an estimate of when its report answering these questions will arrive.

This dovetails right into the DHS’s lackadaisical rollout of its biometric program. So far, the tech has been installed in only a few airports, but even in this limited trial run, the agency seems uninterested in ensuring the system’s accuracy. The DHS claims the program is doing great because it isn’t returning a lot of false positives. But that’s the wrong metric if you’re hoping to catch people on the way out of the country.

DHS currently measures performance based on how often the system correctly accepts travelers who are using true credentials. But if the aim of this system is to detect and stop visa overstay travel fraud—as DHS suggests—it is critical and perhaps more important to assess how well it performs at correctly rejecting travelers who are using fraudulent credentials. Yet DHS is not measuring that.
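The distinction matters because the two rates are computed on entirely different populations. A toy sketch (all traveler counts are invented for illustration) shows a system that aces the DHS’s chosen metric while catching zero fraud:

```python
# Toy illustration: why measuring only the true-accept rate is misleading.
# All numbers below are hypothetical.
genuine_travelers = 10_000   # travelers presenting their own credentials
genuine_accepted = 9_990     # correctly matched and waved through

impostors = 50               # travelers using fraudulent credentials
impostors_rejected = 0       # fraud the system actually caught

true_accept_rate = genuine_accepted / genuine_travelers    # DHS's metric
fraud_detection_rate = impostors_rejected / impostors      # the one that matters

print(f"true-accept rate:     {true_accept_rate:.1%}")     # looks great
print(f"fraud detection rate: {fraud_detection_rate:.1%}") # useless
```

A turnstile that accepts everyone would score 100% on the first number and 0% on the second, which is exactly the gap the Georgetown report is pointing at.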

The Center recommends the DHS suspend the program indefinitely. It should not be put back into place until the agency has clear legal authorization and all of the required privacy impact paperwork has been filed. It should also spend more time studying the tech to see if it can actually perform the job the DHS wants it to. The end goal, catching overstay travel fraud, seems like a spurious reason for expanded surveillance in US airports, especially when the agency isn’t interested in limiting this biometric collection to foreign citizens only. But chances are none of these recommendations will be followed by the DHS, not while it answers to a presidential administration that has done its best to portray most foreigners as inherent threats to the US way of life.

Filed Under: airports, dhs, face scanning, homeland security, privacy

DHS Goes Biometric, Says Travelers Can Opt Out Of Face Scans By Not Traveling

from the driving:-dangerous-and-unpatriotic dept

The DHS has decided air travel is the unsafest thing of all. In the wake of multiple fearmongering presidential directives, including a travel ban currently being contested in federal courts, the DHS has introduced several measures ostensibly meant to make flying safer, but which in reality only make flying more of a pain in the ass.

The government has argued in court that flying is a privilege, not a right, and the DHS seems hellbent on making fliers pay for every bit of that privilege. We’ve seen laptop bans introduced as a stick to push foreign airports to engage in more security theater and a threat to rifle through all travelers’ books and papers to ensure nobody’s reading explosive devices.

Now, the DHS is going to be scanning everyone’s faces as they board/disembark international flights.

The Department of Homeland Security says it’s the only way to successfully expand a program that tracks nonimmigrant foreigners. They have been required by law since 2004 to submit to biometric identity scans — but to date have only had their fingerprints and photos collected prior to entry.

Now, DHS says it’s finally ready to implement face scans on departure — aimed mainly at better tracking visa overstays but also at tightening security.

The DHS swears it won’t be retaining face scans of US persons, but apparently never considered limiting the collection to foreign travelers. Instead, the DHS will “collect them all” and supposedly toss out US citizens’ scans later.

John Wagner, the Customs deputy executive assistant commissioner in charge of the program, confirmed in an interview that U.S. citizens departing on international flights will submit to face scans.

Wagner says the agency has no plans to retain the biometric data of U.S. citizens and will delete all scans of them within 14 days.

This sounds good (other than the collect-them-all approach) but Wagner’s not done talking. The DHS is obviously hoping to make use of US persons’ scans at some point.

However, [Wagner] doesn’t rule out CBP keeping them in the future after going “through the appropriate privacy reviews and approvals.”

This makes the promise of a 14-day deletion period dubious. The DHS would seemingly prefer to keep everything it collects, so this deletion promise may morph into data segregation, with the government keeping domestic scans in their own silo for possible use later.

The program is already being deployed at a handful of major airports. During the trial run, passengers will be able to opt out of the collection. But the DHS’s own Privacy Impact Assessment [PDF] makes it clear it won’t be optional for long.

Privacy Risk: There is a risk to individual participation because individuals may be denied boarding if they refuse to submit to biometric identity verification under the TVS.

Mitigation: This privacy risk is partially mitigated. Although the redress and access procedures above provide for an individual’s ability to correct his or her information, the only way for an individual to ensure he or she is not subject to collection of biometric information when traveling internationally is to refrain from traveling. [emphasis added] Individuals seeking to travel internationally are subject to the laws and rules enforced by CBP and are subject to inspection.

To opt out is to not travel. And given that this affects international flights, the DHS has a very good chance of achieving 100% compliance.

But there are other percentages to be concerned about, like accuracy. The DHS has a 96% accuracy requirement for face-scanning tech (but, oddly, not for its TSA employees…), yet its Privacy Impact Assessment doesn’t actually say whether vendors have been able to hit that mark. In practical terms, what’s being deployed could still be well under that percentage. Considering the number of things that need to go right to obtain a useful face scan, the error rate could be far above 4% once less-than-ideal capture conditions are factored in.
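Even if vendors hit the 96% mark exactly, a 4% error rate at airport scale is an enormous absolute number. A rough calculation (the annual traveler volume is an assumed figure, not from the DHS report):

```python
# At a 4% error rate, scale turns a small percentage into a lot of people.
# The annual departure count is an assumption for illustration.
error_rate = 0.04                      # 96% accuracy requirement -> 4% errors
annual_intl_departures = 100_000_000   # assumed US international departures/year

misidentified = int(annual_intl_departures * error_rate)
print(f"{misidentified:,} travelers misidentified per year")
```

Millions of bad matches a year, before accounting for lighting, camera angle, and crowding degrading accuracy further.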

Whatever privacy assurances are being given now, expect them to be whittled down in the future, especially if the government continues to engage in reactionary, fear-based lawmaking. With the exception of some post-Snowden surveillance reforms, the government’s desire to collect databases full of US persons’ info has only steadily increased since September 11, 2001.

Filed Under: dhs, face scanning, facial recognition, homeland security, opt-out, privacy, tsa

UK Police Carry Out Facial Scans Of 100,000 People Attending Music Festival

from the yes,-we-scan dept

Last year, Techdirt wrote about Boston Police performing a test run of its facial recognition software on those attending a local music festival. Perhaps unsurprisingly, in the UK, land of a million CCTV cameras, the police have taken things even further. As this story in Noisey explains, drawing on a report on the Police Oracle site (registration required):

> This weekend’s Download Festival will be subjected to strategic facial recognition technology by Leicestershire Police, making those 100,000 plus attendees the first music fans to ever be monitored to this extent at a UK music festival…
>
> The announcement article on Police Oracle reads, “the strategically placed cameras will scan faces at the Download Festival site in Donington before comparing it with a database of custody images from across Europe.”

The ostensible reason for this massive surveillance is to catch people who steal mobile phones, but that really doesn’t stand up to scrutiny. The database that the 100,000 faces were matched against was “custody images from across Europe”, but it seems improbable that criminals would travel all the way across Europe to this particular music festival in the hope that they might be able to relieve a few spaced-out musicgoers of their phones. Nor was general criminal behavior an issue: apparently, last year there were just 91 arrests with 120,000 people attending. It’s more likely that the facial scans were born of a desire to see if the hardware and software were capable of capturing such large numbers and comparing them with the pan-European database. Worryingly, the Download Festival may be just the start:

> According to the Police Oracle article previously cited, other festival organisers have expressed widespread interest in the technology, pending a successful trial. DC Kevin Walker told the Oracle, “It is one of the first times it has been trialled outside, normally it is done in a controlled environment. There has also been a lot of interest from other festivals and they are saying: ‘If it works, can we borrow it?’”

It’s easy to see this kind of technology being rolled out ever-more widely. First at other music festivals — purely for safety reasons, you understand — and then, once people have started to get used to that, elsewhere too. Eventually, of course, it will become routine to scan everyone, everywhere, all the time, offering a perfect analog complement to the non-stop, pervasive surveillance that we now know takes place in the digital world.

Follow me @glynmoody on Twitter or identi.ca, and +glynmoody on Google+

Filed Under: face scanning, facial recognition, police, surveillance, uk
Companies: download music festival