contact tracing – Techdirt

German Police Caught Using COVID-Tracing Data To Search For Crime Witnesses

from the law-enforcement-means-applying-enforcement-in-one-direction-only dept

Multiple governments have been relying on contact-tracing apps to limit the spread of COVID. This has gone on nearly uninterrupted for the last couple of years in more than a few countries. Given the type of data collected — contact information and location data — it was only a matter of time before some government decided to abuse this new information source for reasons unrelated to tracking COVID infections.

I guess the only surprise is that it took this long for someone to abuse it.

Authorities in Germany faced increasing criticism on Tuesday over their misuse of a COVID contact tracing app to investigate a case.

[…]

The incident concerns authorities in the city of Mainz. At the end of November, a man fell to his death after leaving a restaurant in the city, prompting police to open a case.

While trying to track down witnesses, police and prosecutors managed to successfully petition local health authorities to release data from the Luca app, which logs how long people stayed at an establishment.

Authorities then reached out to 21 potential witnesses based on the data they had unlawfully acquired from the app.

The Luca app used in Germany collects data on visitors to public places. Users enter their contact info into the app and scan QR codes posted at restaurants, bars, and public events. When they leave the venue, Luca users sign out of the location.

This app has proven very useful in Germany, mostly due to it automating the mandatory paperwork required of restaurant and venue owners, who were required to gather contact information on patrons and log the time they spent in their businesses. The Luca app does this automatically and encrypts the info, protecting it from the prying eyes of malicious outsiders.

Both the venue and the health department have to agree to decrypt the data and, once decrypted, it remains solely in the hands of the health department. It is only supposed to be used to track potential infections, hence the backlash against police and prosecutors in Mainz.
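To illustrate that dual-consent design, here's a minimal sketch in Python using layered symmetric encryption, where neither the venue nor the health department can read a check-in alone. This is an illustrative stand-in for the idea, not Luca's actual cryptography:

```python
from cryptography.fernet import Fernet

# Each party holds its own key; neither can decrypt a check-in alone.
venue_key = Fernet.generate_key()
health_dept_key = Fernet.generate_key()

def encrypt_checkin(contact_info: bytes) -> bytes:
    """Inner layer: health department's key. Outer layer: venue's key."""
    inner = Fernet(health_dept_key).encrypt(contact_info)
    return Fernet(venue_key).encrypt(inner)

def venue_consent(token: bytes) -> bytes:
    """The venue strips only its outer layer; the payload stays encrypted."""
    return Fernet(venue_key).decrypt(token)

def health_dept_decrypt(inner: bytes) -> bytes:
    """Only the health department can recover the plaintext, so the
    decrypted data ends up solely in its hands."""
    return Fernet(health_dept_key).decrypt(inner)

record = encrypt_checkin(b"Jane Doe, +49 151 0000000, 19:05-20:40")
plaintext = health_dept_decrypt(venue_consent(record))
```

The point of the layering is that a request to either party alone yields nothing readable; only the cooperation of both, as the rules require, produces the contact data.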

Following the backlash, prosecutors are now promising to never do this again. But that pledge only applies to these law enforcement officials. According to Luca’s developers, lots of cops are asking for this data.

The app’s developers, culture4life, sharply criticized the actions of authorities in Mainz.

“We condemn the abuse of Luca data collected to protect against infections,” the company said in a statement.

Culture4life added that it receives frequent requests for its data from law enforcement — but those requests are routinely denied.

This may be Germany’s first scandal related to misuse of COVID-tracking data. Hopefully, the public response to this news will help make it the last. But if the rules that have been in place since the app went into use aren’t sufficient to deter law enforcement from seeking data that’s clearly illegal for it to obtain, a little bad press targeting another agency is unlikely to have much of an effect on investigators who think they’ve found a better way to round up suspects or witnesses.

Filed Under: contact tracing, covid tracking, germany, luca, surveillance

No Good Deed Goes Unpunished: Google/Apple Criticized… For Seeking To Protect Privacy In UK Gov't Covid Contact Tracing

from the always-a-complaint dept

There are plenty of legitimate things to complain about regarding some of the big internet companies — but so many people these days view things through a weird prism in which every single action absolutely must be for evil intent, even when it’s actually for a good reason. Sometimes this leads to crazy reactions in which the companies are criticized for doing the exact opposite things, with both approaches being framed as nefarious.

The latest is a very odd piece by the BBC’s Rory Cellan-Jones. The UK’s National Health Service (NHS) had its own contact tracing app early in the pandemic but, recognizing the limitations of its system, switched last summer to the framework developed by Apple and Google. As you may recall, Google and Apple (somewhat surprisingly) came together early on to set up a framework for contact tracing — and the two companies put privacy front and center in the development of the system, both recognizing (1) the inherent privacy concerns of medical information, and (2) the fact that many people were already skeptical of the two companies.

And, pretty quickly we saw some weird pushback, like the Washington Post whining that the app was too protective of privacy, keeping your health information out of the hands of government officials.

When the UK decided to switch over to Apple/Google’s system, it agreed to abide by the privacy rules that Apple and Google established. But, it appears the NHS tried to push the boundaries and go beyond the privacy framework. Specifically, under the updated version, if a user tested positive for COVID, the app asked the user to upload their “venue” history (all the places they had “checked in” to according to the app). But a core part of the privacy setup was that your location info was designed to be kept decentralized and on your phone. The fear being that if you’re uploading your locations it becomes a prime surveillance tool. Thus, Google and Apple rejected the updated app.

And that leads to the BBC piece, which explains all of this but then concludes by complaining about Google and Apple’s ability to block these privacy-invasive features:

What this underlines is that governments around the world have been forced to frame part of their response to the global pandemic according to rules set down by giant unelected corporations.

At a time when the power of the tech giants is under the microscope as never before, that will leave many people feeling uncomfortable.

Really? It seems odd that this should be the point that leaves people feeling uncomfortable. The companies set up rules to help keep everyone’s data private. The government tried to violate those rules. Google and Apple said no. If we should feel uncomfortable about anything, it’s the government trying to sneak around the clearly established privacy framework.

And, no, governments are not being “forced” to frame part of their response according to the rules set down by “giant unelected corporations” (I’m separately unclear who elected the NHS officials working on this app, but alas…). After all, the NHS had its own app before, but decided that the Google/Apple framework was a better one to adapt.

What a bizarre stance: arguing that this effort to better protect privacy somehow makes those two companies look bad.

The thing that gets me the most about stories like this is that they undermine stories in which real concerns and real bad behavior are called out. When you automatically lump every action into the “ooooh, evil big company” pile, without determining whether there are legitimate, non-nefarious reasons for it (or, as in this case, concepts designed to better protect end-user privacy), it becomes that much harder to focus on the real concerns.

Filed Under: contact tracing, covid, privacy, uk
Companies: apple, google

Surprise! Singapore Backtracks On Privacy Pledge And Opens Contact Tracing Data To Police

from the making-contact dept

Singapore has a relatively long history of using modern technology to create a surveillance state within its borders. Monitoring of the internet and other digital services goes back to 2002, sold to the citizenry as both an anti-terrorism bulwark and a tool to keep hate speech at bay. And though the populace as a whole seemed to accept the government’s use of surveillance for a variety of reasons, Singapore also has a history of clamping down on any speech it simply doesn’t like.

At present, of course, surveillance of populations has increased worldwide, though in the form of contact tracing to combat the COVID-19 pandemic. All sorts of technology and tools have been rolled out to accomplish effective contact tracing, with unfortunately far less emphasis put on securing the data of participants. It should go without saying that if contact tracing is going to be effective, it needs to be widely trusted and adopted. Any breaks in the links of the contact chain render it worthless. Which is probably why Singapore had assured its citizenry, when rolling out its plan for contact tracing using the TraceTogether app, that any data collected from it would be secured and used only for tracing purposes.

In its efforts to ease privacy concerns, the Singapore government had stressed repeatedly that COVID-19 data would “never be accessed unless the user tests positive” for the virus and was contacted by the contact tracing team. Personal data such as unique identification number and mobile number also would be substituted by a random permanent ID and stored on a secured server.

Minister-in-Charge of the Smart Nation Initiative and Minister for Foreign Affairs, Vivian Balakrishnan, also had insisted the TraceTogether token was not a tracking device, since it did not contain a GPS chip and could not connect to the internet. He further noted that all TraceTogether data would be encrypted and stored for up to 25 days, after which it would be automatically deleted, adding that the information would be uploaded to the Health Ministry only when an individual tested positive for COVID-19, and that this could be carried out only by physically handing over the wearable device to the ministry.

The promises went on, including assurances that a very small number of contact tracers would have access to the data. This, again, was done specifically to increase the adoption in use of the app in order to get the pandemic in Singapore under control. The interests of public health ruled supreme, said the government.

Those interests lasted mere months, however. The Singapore government has now announced that law enforcement will get access to the data for any number of reasons, including for use in open investigations.

However, the Singapore government now has confirmed local law enforcement will be able to access the data for criminal investigations. Under the Criminal Procedure Code, the Singapore Police Force can obtain any data and this includes TraceTogether data, according to Minister of State for Home Affairs, Desmond Tan. He was responding to a question posed during parliament Monday on whether the TraceTogether data would be used for criminal probes and the safeguards governing the use of such data.

He noted that “authorised police officers” may invoke the Criminal Procedure Code to access TraceTogether data for such purposes as well as for criminal investigation, but this data would, otherwise, be used only for contact tracing and to combat the spread of COVID-19.

It’s hard to imagine any such assurances finding much purchase given the one-eighty the government already performed on its previous promises. Privacy advocates are crying foul, with ProPrivacy’s Ray Walsh noting that the Singapore government appears poised to mandate the use of TraceTogether while also opening that data up to law enforcement, a scenario sure to breed distrust of the app during a global pandemic.

“This is extremely concerning considering that the government is planning to make the use of the TraceTogether app mandatory for all citizens,” he said. “Test and trace systems forced on the general public for the purposes of preventing the spread of the pandemic have no right being used to create an extensive surveillance network, and it is extremely unnerving to see a soon-to-be mandatory app being exploited in this way.”

As Churchill said, “Never let a good crisis go to waste.” One hopes that, when he said it, Churchill didn’t have the creation of a mass surveillance tool excused by a pandemic in mind.

Filed Under: contact tracing, covid, law enforcement, privacy, singapore, surveillance

If A College Is Going To Make COVID-19 Contact Tracing Apps Mandatory, They Should At Least Be Secure

from the tracer-round dept

One of the more frustrating aspects of the ongoing COVID-19 pandemic has been the frankly haphazard manner in which too many folks are tossing around ideas for bringing it all under control without fully thinking things through. I’m as guilty of this as anyone, desperate as I am for life to return to normal. “Give me the option to get a vaccine candidate even though it’s in phase 3 trials,” I have found myself saying more than once, each time immediately realizing how stupid and selfish it would be to not let the scientific community do its work and do it right. Challenge trials, some people say, should be considered. There’s a reason we don’t do that, actually.

And contact tracing. While contact tracing can be a key part of siloing the spread of a virus as infectious as COVID-19, how we contact trace is immensely important. Like many problems we encounter these days, there is this sense that we should just throw technology at the problem. We can contact trace through our connected phones, after all. Except there are privacy concerns. We can use dedicated apps on our phones for this as well, except this is all happening so fast that it’s a damn-near certainty that there are going to be mistakes made in those apps.

This is what Albion College in Michigan found out recently. Albion told students two weeks prior to on-campus classes resuming that they would be required to use Aura, a contact tracing app. The app collects a ton of real-time and personal data on students in order to pull off the tracing.

Aura, however, goes all in on real-time location-tracking instead, as TechCrunch reports. The app collects students’ names, location, and COVID-19 status, then generates a QR code containing that information. The code either comes up “certified” if the data indicates a student has tested negative, or “denied” if the student has a positive test or no test data. In addition to tracking students’ COVID-19 status, the app will also lock a student’s ID card and revoke access to campus buildings if it detects that a student has left campus “without permission.”

TechCrunch used a network analysis tool to discover that the code was not generated on a device but rather on a hidden Aura website—and that TechCrunch could then easily change the account number in the URL to generate new QR codes for other accounts and receive access to other individuals’ personal data.

It gets worse. One Albion student was able to discover that the app’s source code also included security keys for Albion’s servers. Using those, other researchers into the app found that they could gain access to all kinds of data from the app’s users, including test results and personal identifying information.
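Both problems described above are textbook flaw classes: an insecure direct object reference (IDOR) in the QR-generating endpoint, and secrets shipped in client source code. Here is a minimal sketch of the standard server-side fixes, with hypothetical names; this is not Aura’s actual code:

```python
import hashlib, hmac, json, os

# Fix for the leaked-keys flaw: secrets stay on the server (here, in an
# environment variable) and are never embedded in the app's source code.
QR_SIGNING_KEY = os.environ["QR_SIGNING_KEY"].encode()

def issue_qr_payload(authenticated_user: str, requested_account: str,
                     status: str) -> str:
    # Fix for the IDOR flaw: verify that the logged-in user owns the
    # requested account instead of trusting a number in the URL.
    if authenticated_user != requested_account:
        raise PermissionError("not authorized for this account")
    body = json.dumps({"account": requested_account, "status": status})
    sig = hmac.new(QR_SIGNING_KEY, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}|{sig}"  # tamper-evident: any edit invalidates the signature

def verify_qr_payload(payload: str) -> dict:
    body, _, sig = payload.rpartition("|")
    expected = hmac.new(QR_SIGNING_KEY, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("signature mismatch")
    return json.loads(body)
```

Neither fix is exotic; both are the kind of thing a basic security review catches before an app is made mandatory.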

Now, Aura’s developers fixed these security flaws…after the researchers brought them to light and after the school had made the use of the app mandatory. If anyone would like to place a bet that these are the only two privacy and security flaws in this app, then they must certainly not like having money very much.

To be clear, plenty of other schools are trying to figure out how to use technology to contact trace as well. And there’s probably a use for technology in all of this, with an acceptable level of risk versus the benefit of bringing this awful pandemic under control.

But going off half-cocked isn’t going to help. In fact, it’s only going to make the public less trustful of contact tracing attempts in the future, which is the last thing we need.

Filed Under: academia, contact tracing, covid-19, mandatory, security, students
Companies: albion college, aura

Privacy Concerns Lead To Deletion Of All Data Collected By Norway's Contact Tracing App

from the not-enough-infections-is-a-nice-problem-to-have dept

In the early days of the coronavirus outbreak — a few months ago, in other words — there was a flurry of activity around contact tracing apps. Desperate to be seen to be doing something — anything — governments around the world rushed to announce their own digital solutions for tracing people who have been in the proximity of infected individuals. There are now over 40 in various stages of development. After the initial excitement, it’s striking how quiet things have gone on the contact tracing front, as projects struggle to turn politicians’ promises into useful programs. Some of the apps are beginning to emerge now, and we’re likely to hear more about them over the next few weeks and months. For example, there’s been an interesting development in Norway, one of the first countries to release its smartphone app, Smittestopp (“infection stop”), back in April. As the Guardian reports:

On Friday, the [Norwegian] data agency Datatilsynet issued a warning that it would stop the Norwegian Institute of Public Health from handling data collected via Smittestopp.

Datatilsynet said the restricted spread of coronavirus in Norway, as well as the app’s limited effectiveness due to the small number of people using it, meant the invasion of privacy resulting from its use was disproportionate.

There are two important points there. One is about the balance between tackling COVID-19, and protecting privacy. In this case, the Norwegian data protection authority (NDPA) believes that the benefit is so small that the harm to privacy is unjustified. The other is that when the infection rate is low, as is the case in Norway, which has reported fewer than 250 deaths from coronavirus so far, people may not see much point in using it. Professor Camilla Stoltenberg, Director-General at the Norwegian Institute of Public Health, is unhappy with Datatilsynet’s move:

We do not agree with the NDPA’s evaluation, but now we will delete all data and put work on hold following the notification. This will weaken an important part of our preparedness for increased transmission because we are losing time in developing and testing the app. Meanwhile, we have a reduced ability to combat ongoing transmission. The pandemic is not over. There is no immunity in the population, no vaccine, and no effective treatment. Without the Smittestopp app, we will be less equipped to prevent new local or national outbreaks.

It’s worth noting that Stoltenberg admits that “the work involved in getting the app to work optimally has taken longer than planned, partly because there are few people who are infected”. As the number of COVID-19 cases continues to fall in some countries, those developing contact tracing apps there may encounter similar problems.

Follow me @glynmoody on Twitter, Diaspora, or Mastodon.

Filed Under: contact tracing, covid-19, data, deletion, norway, privacy

Why Using Cellphones To Trace The Pandemic Won't Save Black Lives

from the connecting-the-pandemic-to-the-protests dept

Caught between COVID-19 and cop violence, and now risking their health to protest these conditions, Black communities need a comprehensive anti-racist public health response to this growing pandemic. Instead, some U.S. states are investing in apps to contain the virus.

George Floyd died, unable to breathe, with Officer Derek Chauvin’s knee pressed into his neck for almost nine minutes. According to the full autopsy report released by the victim’s family, Mr. Floyd had also tested positive for COVID-19 weeks prior to his death. While Mr. Floyd was asymptomatic and the virus was not a contributing factor in his death, thousands of Black people throughout the United States have died, breathless—collateral damage in a double pandemic that has taken too many Black lives. COVID-19 has sickened at least 1.9 million people and killed more than 100,000 in the U.S., the highest death toll across the globe. Black people are disproportionately represented in both the number of infections and the number of deaths.

To contain the viral pandemic, cities and counties across the U.S. are adopting something called Technology Assisted Contact Tracing (TACT), using cell phone apps as a means of identifying people diagnosed with COVID-19, notifying people who have been in contact with them of possible exposure to the disease, and advising them of protocols to limit the spread of the virus.

I talked to a Bay Area nurse who conducts coronavirus contact tracing, and she described how understaffed, committed teams of medical professionals and others are working tirelessly to find sick and potentially sick people. As the President quickens the pace of “re-opening” the economy, public health experts have insisted that dramatically increasing U.S. capacity to implement contact tracing is a necessary step. But the costs and dangers, especially to Black and other underserved communities, may be greater than the benefits.

In many cities, massive manual contact tracing efforts are underway, engaging nurses and other medical staff, as well as laypeople, in making thousands of phone calls to identify, notify and educate people who may be infected with COVID-19. Contact tracing is considered a basic practice in epidemiology. Digital contact tracing is intended to augment these sorely needed manual contact tracing efforts, making them less expensive, more effective and more efficient.

But digital contact tracing or ‘TACT’ is not just a neutral scientific project. There are some real limitations to the technology that render it potentially ineffective and possibly harmful in Black, Latino and Native communities, and amongst low-income or homeless people. Some experts say at least 60% of a nation’s population must participate as app users for it to work. But, those hardest hit by the virus may also be least likely to have the access and means to use the apps.

What Exactly is Digital Contact Tracing?

Here’s the simple version for non-techies like me: Proximity tracing apps are a way to figure out if two people were in the same location at the same time. Proximity tracing uses Bluetooth detection, GPS tracking, or a hybrid of the two, powered either by the Bluetooth detection framework recently added to smartphones by Google and Apple or by existing GPS tracking.

Cell phone users download an app with a GPS-based location logger or proximity detection capacity. When a person is confirmed by a medical professional as having COVID-19, the provider certifies the diagnosis, and the infected person’s GPS data or proximity tokens are uploaded to the app’s server. Using an app provided by public health officials, people could indicate, on an opt-in basis, that they’d been infected. Those who’d been nearby would receive a notification so that they could self-quarantine or, ideally, seek a diagnosis.
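For the technically curious, here’s a minimal Python sketch of the token-based variant described above. The class and the in-memory “server” are hypothetical simplifications; real systems add encryption, certification of diagnoses, and much more:

```python
import os
from datetime import datetime, timedelta

class ProximityApp:
    """Toy model of a proximity-tracing app running on one phone."""

    def __init__(self):
        self.my_tokens = []    # random tokens this phone has broadcast
        self.seen_tokens = {}  # token -> when it was observed nearby

    def broadcast_token(self) -> bytes:
        token = os.urandom(16)  # random 128-bit value; no identity, no location
        self.my_tokens.append(token)
        return token

    def observe(self, token: bytes):
        self.seen_tokens[token] = datetime.now()

    def report_positive(self, server: set):
        # After a certified diagnosis, upload only the tokens we broadcast.
        server.update(self.my_tokens)

    def check_exposure(self, server: set) -> bool:
        cutoff = datetime.now() - timedelta(days=14)
        return any(seen >= cutoff and token in server
                   for token, seen in self.seen_tokens.items())

# Two phones near each other exchange tokens; one later tests positive.
server, alice, bob = set(), ProximityApp(), ProximityApp()
bob.observe(alice.broadcast_token())
alice.report_positive(server)
assert bob.check_exposure(server)  # Bob is notified, without learning by whom
```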

On its face, this kind of cell phone based proximity detection and notification system sounds really good, simple and necessary. As a person who grew up watching my mother suffer and die from sickle cell anemia and then my wife and friends suffer and die from cancer, I know what it is to be desperate for a way to avoid the tragedy of dying in a hospital alone. But, as global climate change and inequality exacerbate existing societal fractures, we have to decide as a nation whether our use of digital technologies will expand government surveillance, as digital contact tracing apps do, or shrink it.

App Inequity

A Pew Research Center survey from 2019 found that Blacks and Latinos are more likely than whites to use a smartphone for a broader range of activities, but these communities and lower-income smartphone users are about twice as likely as whites to have canceled or cut off service because of the expense. That same study found that some 19 percent of U.S. adults do not have smartphones—a proportion that rises to 29 percent among those in rural areas, 34 percent among people who did not graduate from high school, and a staggering 47 percent among people over the age of 65.

These proximity notification apps work on smartphones and could consume a significant share of someone’s data plan. Signal strength and the availability of WiFi are another major challenge, since these apps could require a fairly robust data connection, yet more than 21 million Americans don’t have access to high-speed Internet.

What’s more, the apps can recognize that two people are close to each other, but cannot determine with enough specificity whether there is a wall or PPE between them. This can lead to false positives, especially in densely populated areas. In addition, though one person may have the app, if the person beside them does not, the app can produce false negatives by failing to identify a potentially infected person. In both of these scenarios of dense population and sporadic app availability, we can imagine how Black communities would be disproportionately impacted by the technology’s limitations.

Inequity = Ineffectiveness

These equity barriers mean the apps just won’t be as effective as they are designed to be.

For these apps to work, there are multiple decision points that create a barrier to use. A person has to download the app. Then they must disclose medical information. Finally, they must choose to quarantine and follow public health recommendations provided by the app or a follow up health provider.

At each point, equity challenges — from the disproportionate representation of Black and Latino workers performing front line essential services to crowded and substandard housing — lessen the likelihood of Blacks and Latinos voluntarily using and therefore gaining benefit from proximity notification apps.

Suspicion of public health workers and the medical establishment as a whole may be one of the most significant barriers to Black and Latino participation in digital contact tracing, despite the disparate impact of COVID-19 on these communities.

African Americans represent about a third of deaths from the COVID-19 pandemic, and 30% of known coronavirus cases, though they comprise only 13% of the U.S. population. Black people are hospitalized for coronavirus complications at three times the rate of white people. In 21 states, Blacks are dying at substantially higher rates. This trend of disproportionate Black dying, while associated with a new virus, has a long and painful history. Since the trans-Atlantic slave trade began in the 1600s, Black people in the Americas have been deprived of quality food, housing, health care, clothing and all the things a person needs to remain healthy.

At the same time, Black people live under extraordinarily stressful conditions: recurring police violence and over-policed communities, mass incarceration, employment and educational discrimination, and so much more. As a result, Black bodies have become more susceptible to conditions like heart disease and diabetes. Black people are more likely to die from all types of cancers.

Despite evidence that generational deprivation caused by systemic white supremacy has had a powerful hand in creating these racial health disparities, Black people are consistently blamed for their poor health outcomes and for their disparate rate of death from coronavirus.

From U.S. Surgeon General Jerome Adams’ recent remarks in the press that encouraged Black people to “take better care of themselves” to prevent COVID-19 deaths, to President Trump’s recent suggestion that protests for Black civil rights caused a spike in coronavirus rates, the White House is pointing fingers at Black communities that should be pointed at America’s broken health care system. Latinos also comprise a greater share of confirmed cases than their share of the population, and share similar health disparities. The Navajo Nation has been hit especially hard.

Black people are not only blamed for the durability of this virus, they are also punished for it. Ninety percent of people arrested in encounters related to social distancing in New York City have been Black or Latino, and some of the arrests captured on video have been extremely brutal. Arrests are conducted by officers without adequate personal protective equipment, and those arrested on a charge of failing to adequately social distance end up in jails where social distancing is impossible. Though it’s clear we can’t police our way out of this pandemic, it seems some cities are committed to continuing to use excessive, brutal and discriminatory policing to enforce social codes.

Blamed for the underlying conditions that make Black people more likely to die from COVID-19, criminalized, left behind by government relief funds and steeped in a history of medical bias and government misuse of personal data — Black communities, alongside Latino, Native American, lower-income, undocumented and some Asian communities, have little reason to trust government contact tracing apps.

The technology only works if federal and state dollars are subsequently invested in impacted communities, with appropriate interventions in place. For example, in New Orleans, health officials identified equity barriers in their drive-through testing strategy for the coronavirus. Census tract data revealed hot spots for the virus were located in predominantly low-income African-American neighborhoods where many residents lacked cars. So they changed their strategy and sent mobile testing vans into the community, instead of having the community come to them. But these interventions are few and far between.

Perhaps the biggest effectiveness barrier for Black and other underserved communities is inequity in testing. Testing generates data, and Technology Assisted Contact Tracing apps rely on robust data generated by robust testing. So what happens when Black communities are denied robust testing due to structural inequalities and implicit bias in the medical system?

In April 2020, 30-year-old Rana Zoe Mungin, a beloved Black teacher in Brooklyn, New York, died from COVID-19 after twice being denied testing. Under pressure from civil rights groups and public health advocates, cities and states have begun to collect race-based data on who is getting sick and dying from this virus. Right now, race or ethnicity is known in about half of all cases and in 90 percent of deaths. Without adequate testing, there is insufficient data for the apps to track contacts. Without adequate testing, appropriate protection and accessible health care, contact tracing apps do not ease the burden of hardworking public health officials. In fact, under-utilized contact tracing apps may misdirect health care workers and make the job of containing the virus even harder.

Privacy Challenges

While early exposure notification using proximity detection apps could be minimally useful, its potential usefulness does not outweigh another very real threat: spying. These apps are likely to be created by developers who have limited experience with managing privacy concerns and sensitive medical data, yet the privacy parameters will be left to each app developer. An inexperienced app developer with a limited privacy and contact tracing background could cause real harm. Though Google and Apple have gone a long way toward managing privacy concerns, not every state is going to use the Google/Apple API.

In the rush to a technological fix, cities and states may overlook what’s needed to protect highly sensitive data being collected and placed in centralized, government-run databases. This data needs to be locked down with clear use agreements when governments are entering into contracts with private companies, including app developers and database developers like Salesforce. A patchwork state-to-state, city-to-city approach with no federal privacy standards or use agreements could be a significant threat to user data. It can be really hard to genuinely protect privacy, especially if the political will to protect all users equally isn’t there.

For these reasons alone, Black people have good reason to be skeptical of contact tracing apps. But history provides an even better reason. After the terrorist attacks of 9/11, Americans were told that the Patriot Act, a new Department of Homeland Security, and Immigration and Customs Enforcement (ICE) were critically necessary. All three came into being in the direct aftermath of those attacks and were deployed domestically during the U.S. invasion of Afghanistan. The Patriot Act installed new powers that were supposed to target terrorism, but have since been used to fuel racial profiling, while DHS has fueled a burgeoning system of digital surveillance and multi-state cooperation through the use of fusion centers. We’ve already experienced the way expanded surveillance powers for one purpose can be transferred and used for another.

Right now, the pandemic is already providing a distraction from dramatic expansions of existing surveillance powers. The US Senate recently failed in its attempt to limit law enforcement agencies’ access to web browsing data without a warrant, which reinforced the government’s expansive surveillance powers. Though it’s clear that facial recognition is not the solution to what is now a public health crisis, controversial tech company Clearview says it’s in talks with federal and state agencies to track COVID-19 using facial recognition. Face-scanning systems are already in use or under consideration in the coronavirus response. Tampa General Hospital in Florida recently implemented a screening system that includes thermal-scanning face cameras that look for fevers, sweating, and discoloration. Texas-based Athena Security has been pitching a similar product to grocery stores, hospitals, and voting locations.

The COVID-19 Consumer Data Protection Act would require companies to get consent from individuals to collect health information, device information, geolocation, or proximity information. It would also make companies disclose why the data is being collected, how it will be handled, who it will be transferred to, and how long it will be retained. But advocates are concerned it doesn’t go far enough. It certainly would not protect Black app users from the potential pivot from government-sponsored pandemic surveillance to police surveillance.

Proximity Apps Must Be Coupled With Comprehensive Public Health Strategies

It’s clear that inequality, not ignorance, is fueling COVID-19 infection rates in America. When racial disparities are coded as biological medical problems caused by the patient, rather than political problems caused by long-standing structural inequalities, it creates a wall of mistrust between patients and providers that is already limiting the success of manual contact tracers. Louisiana has invested millions of dollars in manual tracing, but fewer than half of the people called are answering the phone. The same barriers will exist when trying to get people to use an app that wants to track your location and share your medical information.

To contain the virus in Black communities, contact tracing apps have a role to play, but they can never function successfully as a primary solution. It’s clear some states will build their own apps and, if they do, they should follow these principles for Technology Assisted Contact Tracing. But instead, cities should strengthen and invest in the human infrastructure for contact tracing and public health that is rooted in relationships and trust. Build small, nurse-led medical teams, and invest in technological solutions that can help bring medicine into people’s homes, reach people on the street, and speak to those for whom English is not their first language. Connect people to testing and real services in real time. And if they end up sick, alone and dying — advocate for them in hospitals.

Over-reliance on technology cannot solve massive social problems, but it can create them. An app cannot bridge structural inequalities baked into American healthcare. The effectiveness of the proposed contact tracing apps seems limited compared to their potential negative impacts. If we skip over these disparities, if we pour resources and direct investment toward technical fixes without also repairing what has prevented Black patients from getting the best medical care available, it will be a huge, ineffective and expensive distraction. This may be at least one of the reasons contact tracing apps haven’t really taken off in the U.S. thus far. Some states have invested in the apps, but most have not, and many have no immediate plans to adopt one.

Contact tracing or proximity apps could help limit the spread of the virus among some populations, but without a comprehensive public health agenda, it won’t be enough to save Black lives.

Malkia Devich Cyril is an award-winning activist, writer and public speaker on issues of digital rights, narrative power, Black liberation and collective grief, as well as the lead founder and former Executive Director of MediaJustice — a national hub boldly advancing racial justice, rights and dignity in a digital age. After more than 20 years of leadership, Devich-Cyril now serves as a Senior Fellow at MediaJustice.

Filed Under: black lives matter, contact tracing, covid-19, george floyd, inequality, pandemic, privacy, racism

Coronavirus Surveillance Is Far Too Important, And Far Too Dangerous, To Be Left Up To The Private Sector

from the who-do-you-trust dept

Months into the global pandemic, governments, think tanks, and companies have begun releasing comprehensive plans to reopen the economy, while the world will have to wait a year or longer for the universal deployment of an effective vaccine.

Digital tools, apps, and public-health surveillance projects that could be used to contain the spread of COVID-19 are a big part of many of these plans. But even if they’re effective, these tools must be subject to rigorous oversight and laws preventing their abuse. Corporate America is already contemplating mandatory worker testing and tracking. Digital COVID passports that could grant those with immunity or an all-clear from a COVID test the right to enter stores, malls, hotels, and other spaces may well be on the way.

We must be ready to watch the watchers and guard against civil rights violations.

Many governments and pundits are turning to tech companies that are promising digital contact tracing applications and services to augment the capacity of manual contact tracers, as they work to identify transmission chains and isolate people exposed to the virus. Yet civil society groups are already highlighting the serious privacy implications of such tools, underscoring the need for robust privacy protections.

The potential for law enforcement and corporate actors alike to abuse these tracking systems is just too great to ignore. For their part, most democratic governments have largely recognized that the principle of voluntary adoption of this technology — rather than attempts at state coercion — is more likely to encourage use of these apps.

But these applications are not useful unless significant percentages of cellphone users use them. An Oxford University study suggests that for a similar app to successfully suppress the epidemic in the United Kingdom, 80 percent of British cellphone users would have to use it, which equates to 56 percent of the overall UK population. If the numbers for a digital contact tracing program to succeed stateside were similar, that would mean activating more than 100 million users.
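To unpack those numbers (my arithmetic, not the study’s wording): 80 percent of cellphone users equating to 56 percent of the population implies cellphone penetration of roughly 70 percent, since 0.80 × 0.70 = 0.56. Applying that same 56 percent share to the roughly 330 million people in the United States yields about 185 million, which is why “more than 100 million users” is, if anything, a conservative figure.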

The level of adoption will dictate just how well these technologies prevent the spread of the virus, but no matter how widespread such voluntary adoption may be, there is still potential for coercion, abuse, and targeting of specific users and communities without their consent. Some companies and universities are already planning to develop their own contact tracing systems and require their employees or students to participate. The consulting firm PricewaterhouseCoopers is advising companies on how to create these systems, and other smaller tech firms are designing Bluetooth beacons to facilitate the tracking of workers without smartphones.

An unaccountable regime of COVID surveillance could represent a great near-term threat to civil rights and privacy. Already marginalized communities suffering most from this crisis are the most exposed to the capricious whims of corporate leaders eager to restart supply chains and keep the manufacturing and service sector operating.

Essential workers are subject to serious health risks while doing their jobs during a pandemic, and employers mandating use of these technologies without public oversight creates another risk to worker rights. This paints a particularly tragic picture for the Black community which has been disproportionately affected by the pandemic in terms of sickness, death, and unemployment.

Black and Latinx people are more likely to work as cashiers in grocery stores, in nursing homes, or in other service-industry jobs that make infection far more likely. Many such workers are already subject to pervasive and punitive workplace surveillance regimes. But now, there may be real public-health equities at play. When these workers go to work, they have to do so in close proximity to others. Employers must protect them and digital tracking tools may well be part of saving lives. But that balance ought to be struck by public-health officials and worker-safety authorities in consultation with affected employees.

This system of private-health surveillance may not just affect workers. Grocery store, retail, and restaurant owners, eager to deploy this kind of technology to regain the confidence of shoppers, may well see the logic in incentivizing widespread public deployment as well.

Those same stores could offer a financial incentive to customers who can prove they have a contact-tracing app installed on their phone, or they could integrate it into already existing customer loyalty apps. Coordinated efforts from businesses to mitigate losses due to sick workers or the threat of repeated government shutdowns could make incentivizing or demanding COVID-passports worth the investment to them. We may well find ourselves in a situation where a digitally checkpointed mall, Whole Foods, or Walmart feels like an oasis — the safest place in the world outside our homes.

Unaccountable deployment of these systems threatens to create further divides between workers and consumers, the tracked and untracked, or perilous division between those who can afford repeated testing and those who can’t.

So far, few officials have weighed these tradeoffs. As of yet, the only federal legal guidance on these questions has come from the Equal Employment Opportunity Commission, which has ruled that employers can legally institute mandatory temperature checks and other medical exams as conditions of continued employment.

Lawmakers have to do more. They must provide protections against the unauthorized use of this information and not allow access to places of public accommodation – a core civil right – to be determined by a mere app. We must seriously consider what it would mean for a free society, should businesses find it makes financial sense to invest in their own health-surveillance systems or deny people access to corner markets or grocery stores if they aren’t carrying the right pass on their person.

We do not have to be resigned to the deployment of a permanent state surveillance apparatus or the capriciousness of the private sector. If our post-9/11 experience is a guide, then we know that unaccountable surveillance infrastructure implemented during a crisis is wildly difficult to dismantle.

We must not construct a recovery that casts a needless decades-long shadow over our society, entrenches the power of large corporations, and further exacerbates class and racial divides. Governments must proactively decide the permissible uses and limits of this technology and the data it collects, and they must demand that these surveillance systems, private or otherwise, be dismantled at the end of the crisis.

Gaurav Laroia is Senior Policy Counsel at the consumer group Free Press, working alongside the policy team on topics ranging from internet-freedom issues like Net Neutrality and media ownership to consumer privacy and government surveillance.

Filed Under: contact tracing, covid-19, privacy, private sector, public sector, surveillance

The Case For Contact Tracing Apps Built On Apple And Google's Exposure Notification System

from the tradeoffs dept

Apple and Google have now released updates to their mobile operating systems that include a new capability for COVID-19 exposure notification. This new technology, which will support contact tracing apps developed by public health agencies, is technically impressive: it enables notifications of possible contact with COVID-positive individuals without leaking any sensitive personal data. The only data exchanged by users are rotating random keys (i.e., unique 128-digit strings of 0s and 1s) and encrypted metadata (i.e., the protocol version in use and transmitted power levels). Keys of infected individuals, but not their identities or locations, are published to the network upon a positive test, with the approval of a government-sanctioned public health app.

Despite being a useful tool in the pandemic arsenal and adopting state-of-the-art techniques to protect privacy, the Apple-Google system has drawn criticism from several quarters. Privacy advocates are dreaming up ways the system could be abused. Anti-tech campaigners are decrying “tech solutionism.” None of these critiques stands up to scrutiny.

How the exposure notification API works

To get a sense for how the Apple-Google exposure notification system works, it is useful to consider a hypothetical system involving raffle tickets instead of Bluetooth beacons. Imagine you were given a roll of two-part raffle tickets to carry around with you wherever you go. Each ticket has two copies of a randomly-generated 128-digit number (with no relationship to your identity, your location, or any other ticket; there is no central record of ticket numbers). As you go about your normal life, if you happen to come within six feet of another person, you exchange a raffle ticket, keeping both the ticket they gave you and the copy of the one you gave them. You do this regularly and keep all the tickets you’ve exchanged for the most recent two weeks.

If you get infected with the virus, you notify the public health authority and share only the copies of the tickets you’ve given out — the public health officials never see the raffle tickets you’ve received. Each night, on every TV and radio station, a public health official reads the numbers of the raffle tickets it has collected from infected patients (it is a very long broadcast). Everyone listening to the broadcast checks the tickets they’ve received in the last two weeks to see if they’ve “won.” Upon confirming a match, an individual has the choice of doing nothing or seeking out a diagnostic test. If they test positive, then the copies of the tickets they’ve given out are announced in the broadcast the next night. The more people who collect and hand out raffle tickets everywhere they go, and the more people who voluntarily announce themselves after hearing a match in the broadcast, the better the system works for tracking, tracing, and isolating the virus.

The Apple-Google exposure notification system works similarly, but instead of raffle tickets, it uses low-power Bluetooth signals. Every modern phone comes with a Bluetooth radio that is capable of transmitting and receiving data over short distances, typically up to around 30 feet. Under the design agreed to by Apple and Google, iOS and Android phones updated to the new OS, that have their Bluetooth radios on, and that have a public health contact tracing app installed will broadcast a randomized number that changes every 10 minutes. In addition, phones with contact tracing apps installed on them will record any keys they encounter that meet criteria set by app developers (public health agencies) on exposure time and signal strength (say, a signal strength correlating with a distance up to around six feet away). These parameters can change with new versions of the app to reflect growing understanding of COVID-19 and the levels of exposure that will generate the most value to the network. All of the keys that are broadcast or received and retained are stored on the device in a secure database.
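As a rough illustration of that on-device behavior, here is a minimal Python sketch. It mirrors the flow as described, with hypothetical names and thresholds; the real API derives identifiers cryptographically from daily keys rather than drawing fresh random values:

```python
import os, time

ROTATION_SECONDS = 10 * 60       # broadcast value changes every 10 minutes
RETENTION_SECONDS = 14 * 86400   # keep roughly two weeks of observations
MIN_EXPOSURE_SECONDS = 5 * 60    # hypothetical app-defined exposure threshold
MIN_SIGNAL_DBM = -70             # hypothetical proxy for "within ~6 feet"

class Beacon:
    """Rotating random identifier broadcast over Bluetooth (simplified)."""
    def __init__(self):
        self.key, self.rotated_at = os.urandom(16), time.time()

    def current_key(self) -> bytes:
        if time.time() - self.rotated_at >= ROTATION_SECONDS:
            self.key, self.rotated_at = os.urandom(16), time.time()  # unlinkable
        return self.key

class ObservationStore:
    """On-device record of nearby keys that met the exposure criteria."""
    def __init__(self):
        self.observed = {}  # key -> (first_seen, cumulative exposure seconds)

    def record(self, key: bytes, duration_s: float, signal_dbm: float):
        if duration_s >= MIN_EXPOSURE_SECONDS and signal_dbm >= MIN_SIGNAL_DBM:
            first_seen, total = self.observed.get(key, (time.time(), 0.0))
            self.observed[key] = (first_seen, total + duration_s)

    def prune(self):
        cutoff = time.time() - RETENTION_SECONDS
        self.observed = {k: v for k, v in self.observed.items() if v[0] >= cutoff}
```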

When an individual receives a positive COVID-19 diagnosis, she can alert the network to her positive status. Using the app provided by the public health authority, and with the authority’s approval, she broadcasts her recent keys to the network. Phones download the list of positive keys and check to see if they have any of them in their on-device databases. If so, they display a notification to the user of possible COVID-19 exposure, reported in five-minute intervals up to 30 minutes. The notified user, who still does not know the name or any other data about the person who may have exposed her to COVID-19, can then decide whether or not to get tested or self-isolate. No data about the notified user leaves the phone, and authorities are unable to force her to take any follow-up action.
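Continuing the sketch above, the matching step runs entirely on the phone. Again, this is a simplification of the described behavior, not the actual implementation:

```python
def check_exposures(store: ObservationStore, positive_keys: set) -> list:
    """Compare downloaded positive keys against the on-device database.

    Returns exposure durations rounded to 5-minute intervals and capped at
    30 minutes, mirroring how matches are reported. Nothing leaves the phone.
    """
    matches = []
    for key, (first_seen, duration_s) in store.observed.items():
        if key in positive_keys:
            minutes = min(30, max(5, 5 * round(duration_s / 300)))
            matches.append(minutes)
    return matches
```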

Risks to privacy and abuse are extremely low

As global companies, Google and Apple have to operate in nearly every country around the world, and they need to set policies that are robust to the worst civil liberties environments. This decentralized notification system is exactly what you would design if you needed to implement a contact tracing system but were concerned about adversarial behavior from authoritarian governments. No sensitive data ever leaves the phone without the user’s express permission. The broadcast keys themselves are worthless, and cannot be tied back to a user’s identity or location unless the user declares herself COVID-positive through the public health app.

Some European governments think Apple and Google’s approach goes too far in preserving user privacy, saying they need more data and control. For example, France has indicated that it will not use Apple and Google’s API and has asked Apple to disable other OS-level privacy protections to let the French contact tracing app be more invasive (Apple has refused). The UK has also said it will not use Apple and Google’s exposure notification solution. The French and British approach creates a single point of failure ripe for exploitation by bad actors. Furthermore, when the government has access to all that data, it is much more likely to be tempted to use it for law enforcement or other non-public health-related purposes, risking civil liberties and uptake of the app.

Despite the tremendous effort the tech companies exerted to bake privacy into their API as a fundamental value, it is not enough for some privacy advocates. At Wired, Ashkan Soltani speculates about a hypothetical avenue for abuse. Suppose someone set up a video camera to record the faces of people who passed by, while also running a rooted phone — one where the user has circumvented controls installed by the manufacturer — that gave the perpetrator direct access to the keys involved. Then, argues Soltani, when a COVID-positive key was broadcast over the network, the snoop might be able to correlate it with the face of a person captured on camera and use that to identify the COVID-positive individual.

While it is appropriate for security researchers like Soltani to think about such hypothetical attacks, the real-world damage from such an inefficient possible exploit seems dubious. Is a privacy attacker going to place cameras and rooted iPhones every 30 feet? And how accurate would this attack even be in crowded areas? In a piece for the Brookings Institution with Ryan Calo and Carl Bergstrom, Soltani doubles down, pointing out that “this ‘decentralized’ architecture isn’t completely free of privacy and security concerns” and “opens apps based on these APIs to new and different classes of privacy and security vulnerabilities.”

Yet if “completely free of privacy and security concerns” is the standard, then any form of contact tracing is impossible. Traditional physical contact tracing involves public health officials interviewing infected patients and their recent contacts, collecting that information in centralized government databases, and connecting real identities to contacts. The Google-Apple exposure notification system clearly outperforms traditional approaches on privacy grounds. Soltani and his collaborators raise specious problems and offer no solution other than privacy fundamentalism.

Skeptics of the Apple-Google exposure notification system point to a recent poll by the Washington Post that found “nearly 3 in 5 Americans say they are either unable or unwilling to use the infection-alert system.” About 20% of Americans don’t own a smartphone, and of those who do, around 50% said they definitely or probably would not use the system (0.20 + 0.80 × 0.50 = 0.60, hence “nearly 3 in 5”). While it’s too early to know how much each component of coronavirus response contributes to suppression, evidence from Singapore and South Korea suggests that technology can augment the traditional public health toolbox (even with low adoption rates). In addition, there are other surveys with contradictory results. According to a survey by Harris Poll, “71% of Americans would be willing to share their own mobile location data with authorities to receive alerts about their potential exposure to the virus.” Notably, cell phone location data is much more sensitive than the encrypted Bluetooth tokens in the Apple-Google exposure notification system.

Any reasonable assessment of the tradeoff between privacy and effectiveness for contact tracing apps will conclude that if the apps are at all effective, they are overwhelmingly beneficial. For cost-benefit analysis of regulations, the Environmental Protection Agency has established a benchmark of about $9.5 million per life saved (other government agencies use similar values). By comparison, the value of privacy varies depending on context, but the range is orders of magnitude lower than the value of saving a life, according to a literature review by Will Rinehart.

If we have any privacy-related criticism of the tech companies’ exposure notification API, it is that it requires the user to opt in by downloading a public health contact tracing app before it starts exchanging keys with other users. This is a mistake for two reasons. First, it signals that there is a privacy cost to the mere exchange of keys, which there is not. Even the wildest scenarios concocted by security researchers entail privacy risks from the API only when a user declares herself COVID-positive. Second, it means that the value of the entire contact tracing system is dependent on uptake of the app at all points in time. If the keys were exchanged all along, then even gradual uptake of the app would unlock value in the network that had built up even before users installed the app.

The exposure notification API is part of a portfolio of responses to the pandemic

Soltani, Calo, and Bergstrom raise other problems with contact tracing apps. They will result in false positives (notifications about exposures that didn’t result in transmission of the disease) and false negatives (failures to notify about exposure because not everyone has a phone or will install the app). If poorly designed (without verification from the public health authority), apps could allow individuals who are not COVID-positive to “cry wolf” and frighten a bunch of innocent people, a practice known in the security community as “griefing.” They want their readers to understand that the rollout of a contact tracing app using this API will not magically solve the coronavirus crisis.

Well, no shit. No one is claiming that these apps are a panacea. Rather, the apps are part of a portfolio of responses that can together reduce the spread of COVID and potentially avoid the need for rolling lockdowns until a cure or vaccine is found (think of how many more false negatives there would be in a world without any contact tracing apps). We will still need to wear masks, supplement phone-based tracing methods with traditional contact tracing, and continue some level of distancing until the virus is brought fully under control. (For a point-by-point rebuttal of the Brookings article, see here from Joshua B. Miller).

The exposure notification API developed by Google and Apple is a genuine achievement: it will enable the most privacy-respecting approach to contact tracing in history. It was developed astonishingly quickly, at a time when the world is in desperate need of additional tools to address a rapidly spreading disease. The engineers at Google and Apple who developed this API deserve our applause, not armchair second-guessing from unpleasable privacy activists.

Under ordinary circumstances, we might have the luxury of interminable debates as developers and engineers tweaked the system to respond to every objection. However, in a pandemic, the tradeoff between speed and perfection shifts radically. In a viral video in March, Dr. Michael J. Ryan, the executive director of the WHO Health Emergencies Programme, was asked what he's learned from previous epidemics and he left no doubt with his answer:

Be fast, have no regrets. You must be the first mover. The virus will always get you if you don't move quickly. […] If you need to be right before you move, you will never win. Perfection is the enemy of the good when it comes to emergency management. Speed trumps perfection. And the problem in society we have at the moment is that everyone is afraid of making a mistake. Everyone is afraid of the consequence of error. But the greatest error is not to move. The greatest error is to be paralysed by the fear of failure.

We must move forward. We should not be paralyzed by the fear that somewhere someone might lose an iota of privacy.

Filed Under: apis, contact tracing, privacy
Companies: apple, google

Now The Washington Post Misleadingly Complains About Google & Apple Protecting Your Privacy Too Much

from the oh-come-on-guys dept

Both the NY Times and the Washington Post have been among the most vocal in attacking internet companies like Google and Facebook, claiming that they're bad for your privacy. Yet, as with France (which fined Google over its privacy practices, but then got mad at the company over the privacy-protecting features of its COVID contact tracing API), the Washington Post has a very, very weird article complaining about Google and Apple's project because it's too protective of people's privacy. We've talked in the past about how the API (jointly developed between Apple and Google) was designed from the ground up to be privacy protective. And you know damn well that if the API wasn't developed as such there would be huge articles in the Washington Post and elsewhere decrying this API as a threat to everyone's privacy. Yet here, the complaint is that it's too protective, because these companies simply can't win.

John Gruber, over at Daring Fireball, has an excellent post explaining just how spectacularly bad the Washington Post article is, but we’ll do our own treatment as well.

The crux of the article is that some "health officials" are annoyed that the API won't share data with them directly, but will instead alert individuals themselves if they may have come into contact with someone who turns out to be COVID-19 positive.

But as the tech giants have revealed more details, officials now say the software will be of little use. Due to strict rules imposed by the companies, the system will notify smartphone users if they've potentially come into contact with an infected person, but it won't share any data with health officials or reveal where those meetings took place.

Local health authorities in states like North Dakota, as well as in countries such as Canada and the United Kingdom, say they've pleaded with the companies to give them more control over the kinds of information their apps can collect. Without the companies' help, some worry their contact tracing systems will remain dangerously strained.

But Apple and Google have refused, arguing that letting the apps collect location data or loosening other smartphone rules would undermine people's privacy.

Now, a good news report would explain why it's important for Google and Apple's API to protect people's privacy — and maybe even highlight how lots of people, including some at the Washington Post, have frequently hammered Google and Apple over privacy concerns. Hell, just a few weeks ago, one of the very same reporters on this article, Drew Harwell, was bylined on an article saying that most Americans wouldn't want to use apps based on the API because they don't trust Google and Apple's privacy protections. Though, of course, even that headline was misleading. That headline said "Most Americans are not willing or able to use an app tracking coronavirus infections. That's a problem for Big Tech's plan to slow the pandemic." Yet, they have to do some funny math to make that "most" work, because the actual data showed that 50% said they would use it, and among those who had phones, 59% said they'd be comfortable with the app informing others they had COVID-19.

So just a few weeks earlier, the same Washington Post, and one of the same reporters, was crowing about how people wouldn’t trust Apple and Google with their contact tracing apps due to privacy concerns. Then they publish this other piece saying that health officials are steamed that the companies are doing too much to protect people’s privacy. They can’t win.

The article then quotes a very confused professor (tragically, from my own alma mater):

But Helen Nissenbaum, a professor of information science and director of the Digital Life Initiative at Cornell University, called Apple and Google's use of privacy to defend their refusal to allow public health officials access to smartphone technology a "flamboyant smokescreen." She said it was ironic that the two companies had for years tolerated the mass collection of people's data but were now preventing its use for a purpose that is "critical to public health."

"If it's between Google and Apple having the data, I would far prefer my physician and the public health authorities to have the data about my health status," she said. "At least they're constrained by laws."

Basically all of this is wrong or bullshit, and a good reporter would have either (a) immediately pointed out that this is bullshit or (b) not published the bullshit. First off, it’s not a “flamboyant smokescreen.” Google and Apple very clearly put a lot of thought into the privacy features of this API. Second, they’re not “preventing its use” for something “critical to public health”. The entire point of the API is to make use of this data in a way that helps deal with the crisis. And this is new data, not the data they’ve “mass collected.” On top of that, despite Nissenbaum’s insinuations to the contrary, none of this is new. It’s how Google and Apple work. Both have long histories of fighting back to make sure that government agencies can’t access your private data without a very clear legal basis to do so.

Most importantly, though, Google and Apple don't have the data. That's part of the "privacy protection" here — and anyone would know that if they looked at anything that Google and Apple have put out about this API. The data stays on your phone, and the individual gets to make the choice of whether or not to share it. The FAQ from Apple and Google makes this all very clear:

In keeping with our privacy guidelines, Apple and Google will not receive identifying information about the user, location data, or information about any other devices the user has been in proximity of.

And, if the users decide, then the necessary information can be shared with public health officials:

If a user chooses to report a positive diagnosis of COVID-19 to their contact tracing app, the user's most recent keys to their Bluetooth beacons will be added to the positive diagnosis list shared by the public health authority so that other users who came in contact with those beacons can be alerted.
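To make the mechanics of that concrete, here's a simplified sketch (again, toy code, not the real spec) of the matching step: the phone downloads the published diagnosis list and does the comparison locally, so only the user's own phone learns the result:

```python
import hmac, hashlib

def proximity_id(daily_key: bytes, interval: int) -> bytes:
    """Toy stand-in for the spec's identifier derivation (simplified)."""
    return hmac.new(daily_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

def check_exposure(diagnosis_keys: list[bytes],
                   observed_ids: set[bytes],
                   intervals: range) -> bool:
    """Re-derive each reported key's rotating IDs and look for a match among
    the IDs this phone overheard. Neither Apple, Google, nor the health
    authority ever sees the local contact log."""
    return any(proximity_id(key, t) in observed_ids
               for key in diagnosis_keys for t in intervals)
```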

It seems like both Nissenbaum and the Washington Post owe people a rather large apology.

Next up, there’s a quote from Matt Stoller, who has built up a cottage industry making ignorant statements about big internet companies (and cheering on Senator Josh Hawley’s anti-internet nonsense). While I’ve come to expect nonsense from Stoller, the quote he gives the Post is beyond the pale:

"They are exercising sovereign power. It's just crazy," said Matt Stoller, the director of research at the American Economic Liberties Project, a Washington think tank devoted to reducing the power of monopolies. Apple and Google have "decided for the whole world," he added, "that it's not a decision for the public to make. … You have a private government that is making choices over your society instead of democratic governments being able to make those choices."

Again, nearly everything Stoller says here is wrong. Gruber’s summary of it covers this better than anything I would say:

This quote is what's crazy. Again, this guy Stoller clearly has no idea what he's talking about. Apple and Google deciding how their operating systems work, in compliance with all existing laws, all around the world, is not "exercising sovereign power". No one here is alleging that Apple or Google are doing anything even vaguely illegal. They're not toeing some sort of line, they're not taking advantage of any sort of loopholes.

And if Apple and Google did what Stoller and Nissenbaum seem to want them to do — track location data of every person you're in contact with and report that data automatically to government health officials — they almost certainly would be breaking all sorts of laws around the world. The whole point of Europe's well-intentioned but overzealous GDPR law — 88 dense pages in PDF — is, quoting from its preamble, "Natural persons should have control of their own personal data." That's exactly the point of Apple and Google's system — and seemingly exactly the opposite of what every source in this Post story thinks Apple and Google should do.

Honestly, it’s not at all difficult to imagine that if Google and Apple’s API was automatically handing data over to the government, you’d see Stoller and Nissenbaum still complaining and suddenly they’d be all concerned about the companies “exercising sovereign power” to “mass collect people’s data” and just “handing it over to the government.” Again, this is a no win situation, in which the companies are being shat upon as if they’re doing the wrong thing when it’s clear they’ve bent over backwards to make sure they were doing the right thing and giving as much power and control as possible to the end user.

Basically every quote in this piece is utter nonsense — the kind of nonsense a reporter should either debunk or decline to publish. But here it is all published as if these people are making good points. Here's the next one:

"Every minute that ticks by, maybe someone else is getting infected, so we want to be able to use everything we can," said Vern Dosch, the contact-tracing liaison for North Dakota. "I get it. They have a brand to protect. I just wish they would have led with their jaw."

Huh? What "brand" are they protecting here? The brand that says… they're going to help out by building a big system that others can build on and use for free, and that actually protects people's privacy? I honestly don't get what Dosch is even saying here. Of course government officials want every piece of data, and this system is designed to help them get more data, but also to protect privacy. And I honestly have no clue what "led with their jaw" even means here.

It’s only twenty six paragraphs in that the article mentions how “some privacy advocates have applauded the companies? stance around anonymity and security concerns,” but then it shits on that almost immediately:

But some parts of the U.S., including Apple and Google's home state, say the restrictions have rendered the apps effectively useless. In California, epidemiologists in charge of contact tracing are ignoring the Apple-Google approach and have decided the best course for contact tracing is to train thousands of people to do the work.

Which “apps” are they even talking about? This is an API, not an app, and it’s not even out yet, so these “apps” can’t be useless yet. They don’t exist. And the fact that California epidemiologists are focusing on training human contact tracers is… meaningless? No one has said that this API should replace human contact tracers. The idea has always been that it’s another tool — not a replacement. And then we get another ridiculous quote:

"The limitations of those kind of apps are extensive," said Mike Reid, an assistant professor of medicine at the University of California at San Francisco, who is leading the effort to train contact tracers in the state. "I don't think they have an important role to play for most of the population."

The contact tracers, he said, will be using software made by Salesforce and Accenture to help reach patients by phone and are trained on how to protect sensitive patient information.

"We go to pains to minimize the amount of data we take from people and we ask consent from people we're talking to on the phone. We go to considerable lengths to ensure there are strong technical controls to ensure the anonymization of our platforms," he said. "Can you say the same thing about these big tech companies? I'm not sure."

Um. Dude. Did you not even bother to read the details of the API that you’re commenting on, about which you say you’re “not sure” if the data is minimized or that there are strong technical controls to make sure the data remains anonymous? Because half of this very article is all about how other health professionals are annoyed that the apps are doing too much to protect the data.

And, honestly, how is it that these reporters are using quotes that are in direct conflict with other quotes in the article (the API keeps things too private; who knows if the big companies will keep things private…) as if they're making the same argument?

This is just bad, bad reporting.

With the Apple and Google approach, "We've overcompensated for privacy and still created other risks and not solved the problem," said Ashkan Soltani, the former chief technologist of the Federal Trade Commission. "I'd personally be more comfortable if it were a health agency that I trusted and there were legal protections in place over the use of the data and I knew it was operated by a dedicated security team."

I know and like Ashkan, and have quoted him in the past, but this… is just a bizarre quote in its own right. Google and Apple have two of the best “dedicated security teams” around. Meanwhile the federal health agency, Health & Human Services, has a history of getting hacked, including some sort of hack as the pandemic began (exactly what happened still has not been made clear).

But some public health experts believe the push toward unproven virus-tracing apps has wasted time and missed the point. Tom Frieden, the former director of the Centers for Disease Control and Prevention now working with the health organization Vital Strategies, said the proximity-tracing system as proposed by Apple and Google has "been largely a distraction."

"There are very serious questions about its feasibility and its ability to be done with adequate respect for privacy, and it has muddied the water for what actually needs to happen," Frieden said in an interview Wednesday. "This was an approach that was done with not much understanding and a lot of overpromising."

This quote may be the most accurate of the bunch, but it's misleading in its own way. I haven't seen anyone "overpromising." The people involved in the project and supportive of it have argued that it might be an additional useful tool beyond everything that everyone else is doing. And I haven't seen how it's "muddied the water" for what others need to do.

Honestly, we don’t really know how useful the apps built on this API will or won’t be. There are reasons to be skeptical of their usefulness, but if you wanted to understand why, you wouldn’t get help from this article, which really just seemed like an attempt by the reporters to collect as many disjointed anti-Google and Apple quotes as possible and put them all together in an article that is incredibly misleading and not even internally consistent.

Filed Under: api, contact tracing, data, helen nissenbaum, journalism, matt stoller, privacy, tradeoffs
Companies: apple, google

As Some Are Requiring People To Give Up Their Info To Dine, Stories Of Creeps Abusing That Info Come Out

from the the-privacy-conundrum dept

I think many of us are going to avoid eating at sit-down restaurants for the foreseeable future, even if governments deem them to be “safe.” However, I find it at least somewhat unnerving to see Governor Jay Inslee in Washington say that in order for a restaurant to offer dine-in services, it will need to keep a log of all diners for 30 days, including their telephone and email contact info.

Under Gov. Jay Inslee’s new statewide orders, Washington restaurants that offer sit-down service will be required to create a daily log of all customers.

The restaurants must maintain that log for 30 days, including telephone and email contact information and the time they were in the restaurant. The state wants this information to facilitate any contact tracing that might need to occur.

I fully understand why this requirement is there. Contact tracing is important, and it's much more difficult to do in situations like these, where there's no way to tell who else was in the same small space where a COVID-positive person dined. But… at the same time, it seems to raise a number of privacy questions.

When I tweeted about this, some pushed back and said it wasn’t much different from ordering online or from an app (or even, potentially, paying with a credit card). All of those give up some level of privacy. Yet, as I’ve been saying for years, privacy is about trade-offs and a big part of that is understanding the benefits and the risks. And when we’re ordering with an app or using a credit card, there are reasonable systems in place that make it unlikely that your info will be abused. These are not perfect, and there are some cases where there are risks. But, for most people, the “threat model” suggests it’s not that risky.

Yet, it’s unclear if that’s the case with something like a “restaurant log,” like the one that Washington State is requiring. As an example of why that might be problematic, we can just head down to New Zealand (which appears to have almost entirely contained COVID-19) to hear of a story about a restaurant worker using the contact tracing info a customer left to hit on her:

“I had to put my details on their contact tracing form which I didn’t think anything of. It asked for my name, home address, email address and phone number so I put all those details down,” she tells Newshub.

Except in Jess’s case she didn’t just take away a sandwich from the Subway restaurant she was at. She also got a Facebook request, Instagram request, Facebook messenger approach and a text from the guy who served her, using her contact tracing details.

“I felt pretty gross, he made me feel really uncomfortable,” she says.

“He’s contacting me, I didn’t ask him to do that, I don’t want that.

“I’m lucky that I live with quite a few people because if that was me by myself at home – he knows my address you know – I’d feel really, really scared. Even now I feel a bit creeped out and vulnerable.”

The article does note that the Subway employee who did that digital stalking “has now been suspended” (is that New Zealand for fired?), but it can’t make anyone very comfortable.

And that’s a much bigger issue than just for that woman. If people are afraid that their private info can be misused, they’re less likely to give it. In other words, the nature of the privacy trade-offs are vastly different than they might otherwise be. Not understanding that leads to bad results, and yet that seems to be what’s happening in Washington.

After receiving some pushback, Inslee is now saying that the logs should only be kept for 14 days and that privacy "protocols" will be developed. But that's the kind of thing that needs to be worked out at the outset, not after such a plan is announced:

"This is something that we have to make sure that we build protocols around privacy so that any of this information can only be used for this purpose, can be expunged after 14 days so that this is only a minor inconvenience. No one is looking to make this a federal crime. We're trying to save some lives here," Inslee added.

Again, he means well, and there's obvious value in contact tracing done correctly. But you can't ignore the privacy issues, and you can't tack them on after things are already messed up. Any such system needs privacy built in from the very start — and there's no indication that Washington state has done so.
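For what it's worth, "privacy built in from the very start" isn't exotic. Here's a hypothetical sketch of what a purpose-bound diner log with automatic expungement could look like (none of this reflects Washington's actual system; the design is invented for illustration):

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=14)  # retention window baked into the design

class DinerLog:
    """Purpose-bound log: write visits, expunge on schedule, and expose
    exactly one read path scoped to a contact-tracing time window."""

    def __init__(self):
        self._entries = []  # (timestamp, contact_info) pairs, nothing more

    def record_visit(self, contact_info: str, when: datetime) -> None:
        self._entries.append((when, contact_info))

    def expunge_expired(self, now: datetime) -> None:
        """Delete anything older than the retention window; run daily."""
        cutoff = now - RETENTION
        self._entries = [(t, c) for (t, c) in self._entries if t >= cutoff]

    def release_for_tracing(self, start: datetime, end: datetime) -> list:
        """The only read path: a health-authority request scoped to when an
        infected person was on the premises."""
        return [c for (t, c) in self._entries if start <= t <= end]
```

The point isn't this particular design; it's that retention limits and purpose limits can be properties of the system itself rather than promises bolted on after the backlash.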

Update: Late this evening Governor Inslee announced that this would no longer be a requirement, though suggested that restaurants set up a voluntary system. It appears he listened to some of the criticism.

Filed Under: contact tracing, dining, pandemic, privacy, restaurants, tradeoffs, washington