whatsapp – Techdirt

NSO Group Asked Israeli Government To Help It Hide Malware Docs From WhatsApp

from the surely-something-only-an-honest-company-would-do dept

Before the news had broken that NSO Group’s clients were utilizing its powerful spyware to target journalists, dissidents, activists, religious leaders, opposition party members, and anyone else that might have irritated the autocrats and human rights abusers that made up a disproportionate percentage of its customer list, NSO was sued by Meta and WhatsApp.

That lawsuit alleged NSO Group had illegally accessed and utilized WhatsApp’s software and servers to distribute malware to surveillance targets. It’s a problematic lawsuit — one that seeks to see the CFAA (which has been abused perpetually since its inception) read as outlawing any access that might violate terms of service, including access that simply allowed NSO software to reach targets using WhatsApp.

NSO has since tried multiple times to have the lawsuit thrown out. One of its more creative efforts tried to portray NSO Group as nothing more than a stand-in for the governments it sold to. By portraying itself this way, NSO hoped to invoke sovereign immunity. That argument was rejected by two consecutive levels of the judiciary. NSO would have been better served by sticking to its first argument: that it could not be held directly accountable for actions performed by its customers, especially since that’s pretty much the only argument it’s left with at this point in time.

Having failed to get the lawsuit dismissed, the litigation moved forward. Finally, it reached a point NSO hoped it never would: discovery. Earlier this year, the court ordered NSO to turn over a bunch of info, including the source code of the malware that traveled through Meta’s servers to infect WhatsApp users.

The source code has yet to be delivered to the court and WhatsApp. It may never get there. As Harry Davies and Stephanie Kirchgaessner report for The Guardian, NSO Group called on a higher power to help it dodge its courtroom obligations:

Israeli officials seized documents about Pegasus spyware from its manufacturer, NSO Group, in an effort to prevent the company from being able to comply with demands made by WhatsApp in a US court to hand over information about the invasive technology.

Documents suggest the seizures were part of an unusual legal manoeuvre created by Israel to block the disclosure of information about Pegasus, which the government believed would cause “serious diplomatic and security damage” to the country.

Neat! And it comes with a form of plausible deniability built in: the Israeli government could claim it seized this information as part of its own investigation of NSO Group. Of course, that investigation is already closed and it wasn’t publicly announced until long after NSO was in (international) hot water. The government concluded it did nothing wrong when it used NSO spyware. It didn’t have much to say about NSO itself, although it did (very belatedly) limit the countries NSO could sell to.

But this is just a weird form of regulatory capture. NSO Group was formed by former Israeli intelligence officers. For years, Israel’s government helped broker deals for NSO with nearby nations, engaging in a malware-powered form of diplomacy.

The last thing NSO wanted was for this lawsuit to move to the point where it might need to start producing documents. The outstanding order for code production posed a threat to NSO’s secrecy, even if there’s almost zero chance it would be denied any request to seal these documents. With NSO being composed mostly of former government employees and the Israeli government being composed of current government employees, NSO asked and received. With this move, a sovereign that is not party to this lawsuit has done what NSO couldn’t on its own: prevent an American entity from obtaining its source code.

The origin of this information isn’t NSO or the Israeli government. It’s the product of leaks and hacking. And it shows NSO knew this reckoning was coming, long before it became somewhat of a household name following the leak of targeting data. This appears to have happened not long after WhatsApp filed its lawsuit against NSO in late 2019.

Israel’s hidden intervention in the case can be revealed after a consortium of media organisations led by the Paris-based non-profit Forbidden Stories, and including the Guardian and Israeli media partners, obtained a copy of a secret court order relating to the 2020 seizure of NSO’s internal files.

Details of the seizures and Israel’s contacts with NSO regarding the WhatsApp case are laid bare in a separate cache of emails and documents reviewed by the Guardian. They originate from a hack of data from Israel’s ministry of justice obtained by the transparency group Distributed Denial of Secrets and shared with Forbidden Stories.

According to the documents, NSO first approached the Israeli government in the early months of 2020, asking for a “blocking order” that would hopefully prevent it from having to hand over anything to WhatsApp. When WhatsApp served its discovery request in June 2020, NSO Group and government officials met to “discuss issues related to disclosure.” After some back-and-forth between NSO’s legal reps and government officials, the government performed a perfunctory raid of NSO offices for the sole purpose of leaving it with almost nothing to turn over in response to the US court order.

Three days later, in mid-July 2020, Israel made a significant but secret intervention. At an urgent meeting with NSO, Israeli officials presented the company with an order issued by a Tel Aviv court granting the government powers to execute a search warrant at its office, access its internal computer systems and seize files.

This subterfuge appears to have worked, at least so far. According to WhatsApp’s lawyers, NSO has only turned over 17 pages of documents in response to its discovery requests. Obviously, none of these documents are responsive to the court order demanding NSO turn over its software to WhatsApp.

On the surface, it might not look any more unusual than, say, the Justice Department filing a motion to keep documents from being produced by one of its contractors in the interest of public safety, operational secrecy, or whatever other excuse it might use. But it’s nowhere near comparable. NSO Group never informed the US court that these documents had been seized. And it appears its lawyers — some of whom are US-based — never informed the court that NSO was seeking the assistance of the Israeli government to keep these documents from being produced.

It will certainly be interesting to see how the court responds to these revelations. However, sanctions can’t make NSO Group turn over information now in the hands of its own personal Jesus: the Israeli government. And it’s unlikely any US court has the power to pierce the sovereign immunity that controls this action, no matter how transparent the self-interest.

Filed Under: distributed denial of secrets, israel, lawsuit, malware, pegasus, privacy, source code, spyware, surveillance
Companies: nso group, whatsapp

NSO Group Continues To Use The Lawsuit Filed Against It By WhatsApp To Harass Canadian Security Researchers

from the if-you-can't-beat-'em,-fuck-with-'em dept

Israeli malware manufacturer NSO Group spent years making good money selling to bad people. Its only concern for the longest time was how long it would take nearby autocrats and totalitarians to start targeting Israeli citizens.

To be fair, the Israeli government shares at least some of the blame. Surrounded by entities that would love to see it erased from the earth, the government helped broker deals with unfriendly countries — a perverse form of diplomacy that allowed some of its worst enemies to gain access to extremely powerful spyware.

NSO is no longer the local darling in Israel. In fact, none of its competitors are either. The country achieved terminal embarrassment velocity following the leak of documents that appeared to show many of NSO’s customers were abusing access to its Pegasus spyware to target journalists, dissidents, human rights lawyers, political opponents, and even the occasional ex-wife and her lawyer.

NSO has also been sued multiple times. The first tech firm to sue NSO was WhatsApp. Backed by Meta, WhatsApp took NSO to court for using WhatsApp’s US-based servers to deliver malware packages to users targeted by NSO’s absolute shitlist of customers.

Some of what WhatsApp observed might have been due to the FBI taking a bespoke version of NSO’s Pegasus for a spin before deciding it would be pretty much impossible to use it without doing a ton of damage to the Fourth Amendment.

This lawsuit has not gone well for NSO. It invoked a variety of defenses, including sovereign immunity, reasoning that it was a stand-in for the governments it sold to. And, as such, it was entitled to the same immunity often granted foreign governments by US courts.

This tactic didn’t work. Not only did multiple courts (district, appellate, the Top Court in the Land) reject NSO immunity overtures, but the original court handling this lawsuit ordered the company to turn over its code to WhatsApp. And that order meant all the code, not just the stuff involving NSO’s flagship spyware, Pegasus.

Far from the nation’s courts, Canadians have been giving NSO (and its competitors) fits for years. Citizen Lab — a group of Canadian malware researchers linked to the University of Toronto — has long been examining NSO’s malware. More importantly, it’s been detecting infections and allowing those targeted by NSO spyware to rid themselves of these infections. In every case, Citizen Lab has exposed the targeting of the usual people: dissidents, opposition leaders, journalists, lawyers, diplomats, etc. NSO continues to pretend this malware is sold to target the most dangerous criminals despite all evidence to the contrary.

With NSO now being asked to turn over its source code, it has decided to drag a non-party into the mix by going after Citizen Lab repeatedly during this lawsuit. (This is something its financial backers did years before NSO was a defendant in multiple lawsuits and an international pariah.)

As Shawn Musgrave reports for The Intercept, NSO appears to be engaged in a campaign of harassment against Citizen Lab… presumably because it has run out of believable defenses and/or solid litigation strategies.

For years, cybersecurity researchers at Citizen Lab have monitored Israeli spyware firm NSO Group and its banner product, Pegasus. In 2019, Citizen Lab reported finding dozens of cases in which Pegasus was used to target the phones of journalists and human rights defenders via a WhatsApp security vulnerability.

Now NSO, which is blacklisted by the U.S. government for selling spyware to repressive regimes, is trying to use a lawsuit over the WhatsApp exploit to learn “how Citizen Lab conducted its analysis.”

[…]

With the lawsuit now moving forward, NSO is trying a different tactic: demanding repeatedly that Citizen Lab, which is based in Canada, hand over every single document about its Pegasus investigation. A judge denied NSO’s latest attempt to get access to Citizen Lab’s materials last week.

While it’s good to see a court shut down this obvious attempt to turn Citizen Lab into a co-litigant, the fact remains that Citizen Lab has never been a party to this lawsuit. This is nothing more than NSO attempting to obtain information it has no legal reason to request, possibly because it’s still aching from being ordered to turn over its own information: i.e., its source code.

It also may be even more petty than the previous hypothetical: it may be trying to get Citizen Lab to burn up some of its limited resources fighting stupid requests for stuff NSO shouldn’t even be asking for, much less expecting a judge to sign off on.

Whatever it is, it certainly isn’t good litigation. This reeks of desperation. These are the acts of a litigant that has run out of options. NSO is just flailing, hoping to drag down a non-party with it as it heads towards a seemingly inevitable loss.

And this certainly isn’t a winning strategy. It’s not even capable of maintaining the miserable status quo NSO Group is currently mired in. Citizen Lab (obviously) refused these demands for information (justifiably!) and the judge handling the case has made it clear there’s almost zero chance of NSO being able to drag anything out of this particular thorn in its side.

Citizen Lab opposed NSO’s demands on numerous grounds, particularly given “NSO’s animosity” toward its research.

In the latest order, Hamilton concluded that NSO’s demand was “plainly overbroad.” She left open the possibility for NSO to try again, but only if it can point to evidence that specific individuals that Citizen Lab categorized as “civil society” targets were actually involved in “criminal/terrorist activity.”

lol at that last sentence. Does anyone think anyone, much less an aggrieved NSO Group, has any evidence that the people Citizen Lab categorized as “civil society” targets were actually involved in “criminal/terrorist activity?” All Citizen Lab has done is expose abuse of malware sold by NSO Group to governments with long histories of corruption and/or human rights abuses.

NSO is just going to keep on losing. Reap/sow. Lie down with dogs. The foreseeable consequences of actions. Etc. Etc. Etc. Citizen Lab will keep performing its important work. And, with any luck, NSO will soon collapse under the weight of its hubris. Hope the (temporary) shekels were worth it.

Filed Under: canada, discovery, harassment, source code, spyware, surveillance
Companies: citizen lab, meta, nso group, whatsapp

NSO Group Ordered To Turn Over Spyware Code To WhatsApp

from the UNDERSEAL.EXE dept

The time has come to pay the discovery piper for NSO Group. The phone exploit firm formed by former Israeli spies was backed unreservedly by the Israeli government as it courted human rights abusers and autocrats. The Israeli government apparently felt selling powerful phone exploits to its enemies was a shrewd form of diplomacy. NSO got caught with its third-party pants down when numerous news agencies exposed just how often NSO’s customers abused its powerful spyware to target journalists, activists, lawyers, dissidents, religious leaders, and anyone else who annoyed its customers.

NSO Group has been sued multiple times. One of the first lawsuits filed in the US featured Meta (formerly Facebook) as a plaintiff, suing on behalf of WhatsApp, its encrypted communications acquisition. NSO tried multiple times to escape this lawsuit. It claimed it was a private sector equivalent of a government agency and, therefore, should be protected by sovereign immunity. This argument was rejected, leaving NSO with the option of arguing its actions (or, rather, the actions of its customers, which it claimed it couldn’t control) weren’t subject to US law.

That other argument might have worked if NSO Group’s customers weren’t using WhatsApp’s US-based servers to deliver malware payloads. Once something like this happens, US law comes into play and, without the protective cover of sovereign immunity, NSO Group must continue to respond to lawsuits filed by US tech companies.

Everything NSO tried in hopes of earning an early exit from US lawsuits was aimed at preventing the very thing that’s happening now. NSO and its (few remaining) backers can probably survive an expensive settlement. What the company is unlikely to survive is a (possibly) public outing of its malware code.

As Stephanie Kirchgaessner reports for The Guardian, NSO has been ordered to turn over the source code for pretty much all of its malware to Meta/WhatsApp.

NSO Group, the maker of one the world’s most sophisticated cyber weapons, has been ordered by a US court to hand its code for Pegasus and other spyware products to WhatsApp as part of the company’s ongoing litigation.

[…]

In reaching her decision, Hamilton considered a plea by NSO to excuse it of all its discovery obligations in the case due to “various US and Israeli restrictions”.

Ultimately, however, [Judge Phyllis Hamilton] sided with WhatsApp in ordering the company to produce “all relevant spyware” for a period of one year before and after the two weeks in which WhatsApp users were allegedly attacked: from 29 April 2018 to 10 May 2020. NSO must also give WhatsApp information “concerning the full functionality of the relevant spyware”.

WhatsApp already has a pretty good idea how NSO Group malware operates. It has already managed to detect actual deployments via its servers. The irony here, of course, is that the incidents that most likely exposed NSO’s exploitation of WhatsApp servers were trial runs of a US-oriented version of NSO’s Pegasus phone exploit by the FBI. (The FBI ultimately decided it couldn’t deploy this malware constitutionally.) A months-long investigation by the FBI into the “mysterious” NSO purchase by a supposedly “unknown” government agency ultimately revealed that it was the FBI itself shelling out bucks for malware it couldn’t deploy without violating the Constitution.

The order [PDF] issued by Judge Hamilton makes it clear NSO has to hand over more than just its Pegasus code to WhatsApp.

As to category (1), as stated at the hearing, the court adopts plaintiffs’ definition of “all relevant spyware” as set forth in their motion: “any NSO spyware targeting or directed at Whatsapp servers, or using Whatsapp in any way to access Target Devices.” As also stated at the hearing, defendants have not identified a basis for limiting its production to the Pegasus program, or to any particular single operating system.

[…]

As to the timeframe of documents that must be produced, the court concludes that, at this stage of the case, the Richmark factors weigh in favor of production for “all relevant spyware” for a period of one year before the alleged attack to one year after the alleged attack; in other words, from April 29, 2018 to May 10, 2020. If, after reviewing the relevant spyware from that timeframe, plaintiffs are able to provide evidence that any attack lasted beyond that timeframe, plaintiffs may seek further discovery at that time.

hahahahaaaaaaaaaa

We can be sure NSO’s lawyers are now busy crafting extremely restrictive proposed protective orders to prevent WhatsApp/Meta from making this information available to the public via court filings, blog posts, transparency reports, or any other options the company has at its disposal.

I imagine these motions (along with other efforts to seal docket entries) will be granted, since NSO has continually claimed its customers use its malware to target high-value targets like suspected terrorists and other violent criminals. But this court remains free to weigh NSO’s CYA statements against the brutal reality: that its malware is often used to target people governments don’t like, rather than the “terrorists” and “violent criminals” governments claim they’re interested in apprehending.

Equally amusing is the fact that the same court has denied NSO’s demands for any communications between WhatsApp/Meta and Toronto’s Citizen Lab that were initiated following the filing of this lawsuit. It’s easy to see why NSO would love access to these communications, considering Citizen Lab has constantly and continually exposed abusive NSO malware deployments over the past several years while also publishing whatever exploit code it’s been able to extract during these investigations.

But, as the court notes, NSO has already undercut its own argument for additional discovery on its end by attempting to move the goalposts to cover only perceived misuses against “civil society” by its customers. This attempt to obtain further communications is backed only by NSO’s perception of the tone of WhatsApp’s lawsuit, rather than its listed causes of action — allegations that cover not only “abusive” deployments of malware but also “legitimate” deployments that, nonetheless, occurred without the platform’s permission and definitely violated WhatsApp’s terms of service.

So, the lawsuit will move forward. And it’s NSO that’s obligated to start explaining itself — not just to Meta/WhatsApp, but to the court itself. Now that there’s source code on the line, NSO Group might start examining its other options, the most likely of which would be paying WhatsApp a considerable sum of money while promising not to use the company’s US servers to deploy malware. Most entities, at worst, have to deal with the consequences often expressed as having to lie in a bed they’ve made. But NSO’s actions exceed this idiom. NSO, for all intents and purposes, shat the bed before making it, which makes lying in it feel that much worse.

Filed Under: malware, pegasus, source code, spyware, surveillance
Companies: meta, nso group, whatsapp

WhatsApp Tells UK Government It’s Still Not Willing To Undermine Its Encryption

from the don't-make-me-tap-the-sign dept

The UK government is entertaining even more plans to undermine (or actually outlaw) end-to-end encryption. And it’s not gaining any support from the multiple services (and multiple people) these efforts would harm.

Both Signal and Proton have made it clear they’ll pull their services rather than weaken their encryption to comply with UK government demands. WhatsApp is saying the same thing — telling the UK government something it has already told it at least twice.

In 2017, WhatsApp made an unofficial announcement of its policies when UK law enforcement showed up with a demand to compel decryption of a targeted account. WhatsApp refused to comply and the UK government apparently decided not to press the issue. At least not directly.

Five years later, the UK government is still hammering away at encryption, adding more mandates to its steadily simmering Online Safety Bill. And WhatsApp told the UK government what it told it back in 2017: breaking encryption just isn’t an option. (In the form of a lawsuit challenging an Indian law, WhatsApp said the same thing to the Modi administration and its series of rights-violating internet-related laws.)

Another year has passed and the UK government still wants to get the Online Safety Bill passed. And, once again, Meta has surfaced to tell the government that it can pass all the laws it wants, but none of them will force WhatsApp to undermine its encryption.

WhatsApp would refuse to comply with requirements in the online safety bill that attempted to outlaw end-to-end encryption, the chat app’s boss has said, casting the future of the service in the UK in doubt.

Speaking during a UK visit in which he will meet legislators to discuss the government’s flagship internet regulation, Will Cathcart, Meta’s head of WhatsApp, described the bill as the most concerning piece of legislation currently being discussed in the western world.

The UK government doesn’t have any leverage here. WhatsApp will simply stop offering its service in the UK. As Cathcart points out, 98% of its users reside in other countries. And there’s no reason it should put all of its users at risk, just because the home to 2% of its user base is being stupid about end-to-end encryption.

Now, that 2% would probably like to have access to an encrypted messaging service, whether it’s WhatsApp, Signal, or Proton’s offering. Unfortunately for them, supporters of the bill don’t want them to have these options. But that’s not going to work out well for the government. Angering constituents tends to shift the leverage back their way, which means legislators are pushing a terrible bill from a position of weakness.

The potential for hefty fines only makes it more likely service providers will exit this market rather than give the government what it wants.

Under the bill, the government or Ofcom could require WhatsApp to apply content moderation policies that would be impossible to comply with without removing end-to-end encryption. If the company refused to do so, it could face fines of up to 4% of its parent company Meta’s annual turnover – unless it pulled out of the UK market entirely.

If the options are providing a weakened service that harms all users or shelling out 4% of its income on a regular basis, the option these legislators failed to consider is really the only intelligent option: exiting the market.

And when that starts happening, the government is going to get an earful from the people it never bothered to listen to in the first place: domestic users of services these legislators are actively trying to destroy.

Filed Under: encryption, online safety bill, uk
Companies: meta, whatsapp

Supreme Court Denies NSO Group’s Attempt To Avoid Lawsuit Filed By WhatsApp

from the better-prep-for-a-settlement,-NSO dept

A couple of years before criticism of Israel-based NSO Group reached critical mass, the malware merchant was sued by WhatsApp. According to the messaging service (now owned by Meta), its servers were used (without its permission and in violation of the terms of service) to deliver powerful spyware to targets of NSO Group customers (which included a disturbingly large number of habitual human rights abusers).

As the lawsuit moved forward, things got interesting. Court filings revealed NSO’s malware had been delivered via WhatsApp servers located in California. (Much later, it was discovered this was the result of the FBI performing a test drive of a Pegasus variant offered by NSO that would allow the targeting of US phone numbers — something that isn’t an option with the standard spyware.) Filings also showed current FBI director Chris Wray (who won’t shut the fuck up about encryption despite his deliberate refusal to be intellectually honest about his proposed “solutions”) was a defender of encryption when he was still in the private sector, advocating on WhatsApp’s behalf during a legal battle with the DOJ, which hoped to force WhatsApp to weaken encryption to facilitate DOJ wiretap orders.

NSO Group claimed it was immune from this lawsuit for a couple of reasons. First, it said it could not be held directly responsible for the actions of its customers. If courts decided it could be held responsible for irresponsible malware sales to questionable governments, the company raised a secondary defense: it was entitled to sovereign immunity if the court decided NSO was a suitable litigation stand-in for its foreign customers.

Neither argument worked. In November 2021, the Ninth Circuit Appeals Court denied sovereign immunity to NSO Group, pointing out very reasonably that NSO is not a “foreign state.” It is a foreign company, but that’s not nearly the same thing as being a foreign entity worthy of immunity. The appeal was denied, preventing NSO Group from escaping this lawsuit.

Another appeal followed. NSO Group asked the Supreme Court to review this denial by the Ninth Circuit. The Supreme Court, in its most recent cert order [PDF], has decided NSO Group hasn’t raised an issue it feels like addressing. (h/t The Register)

NSO Group will have to continue facing WhatsApp’s lawsuit. Adding 18 months of disturbing revelations, sanctions, investigations, additional lawsuits, and negative press to the proceedings definitely isn’t helping NSO’s case. It made poor decisions about who to sell to, something that may have been aggravated by the Israeli government’s attempts to convert a private company into a tool of international diplomacy.

The downside here is that WhatsApp is using the CFAA to pursue its claims against NSO. While it would seem obvious that utilizing WhatsApp’s servers and service to deliver malware violates terms of use agreements, this lawsuit asks courts to broadly define “unauthorized access” to include merely unexpected uses of WhatsApp. WhatsApp has the ability to shutter accounts that spread malware, including dummy accounts run by foreign government agencies. What it shouldn’t be doing is asking federal courts to expand already broad definitions of unauthorized access — something that has the potential to harm security researchers and their invaluable work.

Filed Under: liability, malware, pegasus, supreme court, whatsapp
Companies: meta, nso group, whatsapp

WhatsApp Again Affirms It Will Not Break Encryption To Appease Government Entities

from the governments-invited-to-go-fuck-themselves dept

The debate over end-to-end encryption continues in the UK. It’s really not much of a debate, though. Government officials continue to claim the only way to prevent the spread of child sexual abuse material (CSAM) is by breaking or removing encryption. Companies providing encrypted communications have repeatedly pointed out the obvious: encryption protects all users, even if it makes it more difficult to detect illicit activity by certain users. It’s impossible to break encryption to detect criminal activity without breaking it for every innocent user as well.
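That last point is mechanical, not rhetorical. A toy sketch makes it concrete — this is not real cryptography (a SHA-256-derived XOR keystream stands in for an actual E2E protocol like WhatsApp’s, and all names and keys are invented for illustration), but it shows why a “lawful access” escrow key that can read one conversation can read every conversation:

```python
import hashlib
import secrets

def keystream_cipher(key: bytes, message: bytes) -> bytes:
    """Toy stream cipher: XOR the message with a SHA-256-derived keystream.
    NOT real cryptography -- a stand-in for an actual E2E scheme."""
    stream = b""
    counter = 0
    while len(stream) < len(message):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    # XOR is its own inverse, so the same function encrypts and decrypts.
    return bytes(m ^ s for m, s in zip(message, stream))

# True end-to-end: each conversation has its own key, known only to its ends.
alice_bob_key = secrets.token_bytes(32)
carol_dan_key = secrets.token_bytes(32)

ct1 = keystream_cipher(alice_bob_key, b"meet at noon")
ct2 = keystream_cipher(carol_dan_key, b"source confirmed")

# Bob can read his own conversation; his key is useless against Carol's.
assert keystream_cipher(alice_bob_key, ct1) == b"meet at noon"
assert keystream_cipher(alice_bob_key, ct2) != b"source confirmed"

# "Lawful access": every message is also encrypted to one master escrow key.
escrow_key = secrets.token_bytes(32)
escrow_copies = [keystream_cipher(escrow_key, b"meet at noon"),
                 keystream_cipher(escrow_key, b"source confirmed")]

# Whoever holds (or steals, or subpoenas) that single key reads everything.
for copy in escrow_copies:
    print(keystream_cipher(escrow_key, copy))
```

The sketch is the companies’ argument in miniature: the escrow key is one secret that unlocks every user’s traffic, so a backdoor built for “certain users” is, by construction, a backdoor for all of them.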

Sometimes the UK government argues with itself. The Information Commissioner’s Office put out a report earlier this year that stated encryption was essential to children’s online safety, directly contradicting assertions by other UK government entities which claimed breaking encryption was the only way to protect children.

At the center of this debate is WhatsApp, the popular messaging service that has provided end-to-end encrypted messaging since early 2016. And since that point, multiple governments have tried to get WhatsApp to ditch encryption or, at the very least, provide them with backdoors. That includes the UK government, which made its request only a few months after WhatsApp finished rolling out its end-to-end encryption.

WhatsApp rejected the UK government’s request in 2017. That hasn’t stopped the UK government from repeatedly approaching the company in hopes of talking it out of its encryption. And nothing has changed for WhatsApp, which has again made it clear it’s not interested in compromising user security on a country-by-country basis.

Will Cathcart, who has been at parent company Meta for more than 12 years and head of WhatsApp since 2019, told the BBC that the popular communications service wouldn’t downgrade or bypass its end-to-end encryption (E2EE) just for British snoops, saying it would be “foolish” to do so and that WhatsApp needs to offer a consistent set of standards around the globe.

“If we had to lower security for the world, to accommodate the requirement in one country, that … would be very foolish for us to accept, making our product less desirable to 98 percent of our users because of the requirements from 2 percent,” Cathcart told the broadcaster. “What’s being proposed is that we – either directly or indirectly through software – read everyone’s messages. I don’t think people want that.”

It’s good to see WhatsApp take this stand (again), even as the voices clamoring for the end of encryption are now claiming its primary purpose is to allow distributors of CSAM to escape justice. It’s pretty tough to take a principled stand when opponents are accusing you of siding with child molesters.

And the pressure isn’t going to let up. The UK government still believes it is entitled to encryption backdoors. The European Union, which the UK recently exited, has expressed the same desire for broken encryption, using the same disingenuous phrase trotted out so often by the likes of FBI Director Chris Wray: “lawful access.”

But simple refusals like these allow companies to call governments’ bluffs. If governments can’t get the backdoors they want, they’ll have to decide whether they want their citizens to have access to encrypted communications. And while it may seem some governments don’t want their citizens to enjoy this protection, very few have been willing to eject popular services that won’t comply with their demands.

Filed Under: csam, encryption, surveillance, uk
Companies: meta, whatsapp

Social Responsibility Organization Says Meta’s Embrace Of Encryption Is Important For Human Rights

from the encryption-protects-human-rights dept

Encryption is under attack from all over the world. Australia already has a law on the books trying to force companies to backdoor encryption. The UK is pushing its Online Safety Bill, which would be an attack on encryption (the UK government has made it clear it wants an end to encryption). In the US, we have the EARN IT Act, whose author, Senator Richard Blumenthal, has admitted he sees it as a necessary attack on companies who “hide behind” encryption.

All over the world, politicians and law enforcement officials insist that they need to break encryption to “protect” people. This has always been false. If you want to protect people, you want them to have (and use) encryption.

Against this backdrop, we have Meta/Facebook. While the company has long supported end-to-end encryption in WhatsApp, it’s been rolling it out on the company’s other messaging apps as well. Even if part of the reason for enabling encryption is about protecting itself, getting more encryption out to more people is clearly a good thing.

And now there’s more proof of that. Business for Social Responsibility (BSR) is a well-respected organization that Meta asked to perform a “human rights assessment” of its expanded use of end-to-end encryption. While Meta paid for the report, BSR’s reputation is unimpeachable. It’s not the kind of organization that throws away its reputation because a company paid for some research. The end result is well worth reading, but, in short, BSR finds that the expansion of end-to-end encryption is an important step in protecting human rights.

The paper is thorough and careful, details its methodology, and basically proves what many of us have been saying all along: if you’re pushing to end or diminish end-to-end encryption, you are attacking human rights. The key point:

Privacy and security while using online platforms should not only be the preserve of the technically savvy and those able to make proactive choices to opt into end-to-end encrypted services, but should be democratized and available for all.

The report notes that we’re living in a time of rising authoritarianism, and end-to-end encryption is crucial in protecting people fighting back against such efforts. The report is careful and nuanced, and isn’t just a one-sided “all encryption must be good” kind of thing. It does note that there are competing interests.

The reality is much more nuanced. There are privacy and security concerns on both sides, and there are many other human rights that are impacted by end-to-end encrypted messaging, both positively and negatively, and in ways that are interconnected. It would be easy, for example, to frame the encryption debate not only as “privacy vs. security” but also as “security vs. security,” because the privacy protections of encryption also protect the bodily security of vulnerable users. End-to-end encryption can make it more challenging for law enforcement agencies to access the communications of criminals, but end-to-end encryption also makes it more challenging for criminals to access the communications of law-abiding citizens.

As such, the report highlights the various tradeoffs involved in encrypting more communications, but notes:

Meta’s expansion of end-to-end encrypted messaging will directly result in the increased realization of a range of human rights, and will address many human rights risks associated with the absence of ubiquitous end-to-end encryption on messaging platforms today. The provision of end-to-end encrypted messaging by Meta directly enables the right to privacy, which in turn enables other rights such as freedom of expression, association, opinion, religion, and movement, and bodily security. By contrast, the human rights harms associated with end-to-end encrypted messaging are largely caused by individuals abusing messaging platforms in ways that harm the rights of others—often violating the service terms that they have agreed to. However, this does not mean that Meta should not address these harms; rather, Meta’s relationship to these harms can help identify the types of leverage Meta has available to address them.

The report notes that worries about end-to-end encryption enabling more bad actors do not seem to be supported by evidence, since bad actors already have a plethora of encrypted communications channels at their disposal:

If Meta decided not to implement end-to-end encryption, the most sophisticated bad actors would likely choose other end-to-end encrypted communications platforms. Sophisticated tech use is increasingly part of criminal tradecraft, and the percentage of criminals without the knowledge and skills to use end-to-end encryption will continue to decrease over time. For this reason, if Meta chose not to provide end-to-end encryption, this choice would likely not improve the company’s ability to help law enforcement identify the most sophisticated and motivated bad actors, who can choose to use other end-to-end encrypted messaging products.

While the report notes that things like child sexual abuse material (CSAM) are a serious issue, it finds that scanning everything in an attempt to block it is not the only (or even the best) way of addressing the problem. Someone should send this to the backers of the EARN IT Act, which is predicated on forcing more companies to scan more communications.

Content removal is just one way of addressing harms. Prevention methods are feasible in an end-to-end encrypted environment, and are essential for achieving better human rights outcomes over time. The public policy debate about end-to-end encryption often focuses heavily or exclusively on the importance of detecting and removing problematic, often illegal content from platforms, whether that be CSAM or terrorist content. Content removal is important, but harm can also be prevented from occurring in end-to-end encrypted messaging through the use of behavioral signals, public platform information, user reports, and metadata to identify and interrupt problematic behavior before it occurs.

The report also, correctly, calls out how the “victims” in this debate are most often vulnerable groups — the kind of people who really could use much more access to private communications. It also notes that while some have suggested “technical mitigations” that can be used to identify illegal content in encrypted communications, these mitigations are “not technically feasible today.” This includes the much discussed “client-side” scanning idea that Apple has toyed with.

Methods such as client-side scanning of a hash corpus, trained neural networks, and multiparty computation including partial or fully homomorphic encryption have all been suggested by some as solutions to enable messaging apps to identify, remove, and report content such as CSAM. They are often collectively referred to as “perceptual hashing” or “client-side scanning,” even though they can also be server-side. Nearly all proposed client-side scanning approaches would undermine the cryptographic integrity of end-to-end encryption, which because it is so fundamental to privacy would constitute significant, disproportionate restrictions on a range of rights, and should therefore not be pursued.
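To make the report’s concern concrete, here is a minimal, hypothetical sketch of the hash-corpus variant of client-side scanning. Real proposals use perceptual hashes (such as PhotoDNA) that survive re-encoding, not the exact SHA-256 match used here; the point is where the check runs, not which hash function it uses.

```python
import hashlib

# Hypothetical corpus of hashes of known-illegal content, pushed to the
# client. (A single sample hash is used here; this is sha256 of b"foo",
# standing in for real corpus entries purely for illustration.)
HASH_CORPUS = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def client_side_scan(attachment: bytes) -> bool:
    """Runs on the device *before* encryption -- the step that breaks the
    end-to-end guarantee, since a match is reported to a party other than
    the message's intended recipients."""
    digest = hashlib.sha256(attachment).hexdigest()
    return digest in HASH_CORPUS

assert client_side_scan(b"foo") is True    # flagged before it is ever encrypted
assert client_side_scan(b"hello") is False
```

Because the match happens on-device before encryption, and a hit is reported outside the conversation, the “only the endpoints can learn the contents” guarantee no longer holds, regardless of how strong the encryption applied afterwards is.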

The report also notes that even if someone came up with a backdoor technology that allowed Meta to scan encrypted communications, the risks to human rights would be great, given that such technology could be repurposed in dangerous ways.

For example, if Meta starts detecting and reporting universally illegal content like CSAM, some governments are likely to exploit this capability by requiring Meta to block and report legitimate content they find objectionable, thereby infringing on the privacy and freedom of expression rights of users. It is noteworthy that even some prior proponents of homomorphic encryption have subsequently altered their perspective for this reason, concluding that their proposals would be too easily repurposed for surveillance and censorship. In addition, these solutions are not foolproof; matching errors can occur, and bad actors may take advantage of the technical vulnerabilities of these solutions to circumvent or game the system.

The report notes that there are still ways that encrypted communications can be at risk, even name-checking NSO Group’s infamous Pegasus spyware.

How about all the usual complaints from law enforcement about how greater use of encryption will destroy their ability to solve crimes? BSR says “not so fast…”

While a shift to end-to-end encryption may reduce law enforcement agency access to the content of some communications, it would be wrong to conclude that law enforcement agencies are faced with a net loss in capability overall. Trends such as the collection and analysis of significantly increased volumes of metadata, the value of behavioral signals, and the increasing availability of artificial intelligence-based solutions run counter to the suggestion that law enforcement agencies will necessarily have less insight into the activities of bad actors than they did in the past. Innovative approaches can be deployed that may deliver similar or improved outcomes for law enforcement agencies, even in the context of end-to-end encryption. However, many law enforcement entities today lack the knowledge or the resources to take advantage of these approaches and are still relying on more traditional techniques.

Still, the report does note that Meta should take responsibility in dealing with some of the second- and third-order impacts of ramping up encryption. To that end, it does suggest some “mitigation measures” Meta should explore — though noting that a decision not to implement end-to-end encryption “would also more closely connect Meta to human rights harm.” In other words, if you want to protect human rights, you should encrypt. In fact, the report is pretty bluntly direct on this point:

If Meta were to choose not to implement end-to-end encryption across its messaging platforms in the emerging era of increased surveillance, hacking, and cyberattacks, then it could be considered to be “contributing to” many adverse human rights impacts due to a failure to protect the privacy of user communications.

Finally, the paper concludes with a series of recommendations for Meta on how to “avoid, prevent, and mitigate the potential adverse human rights impacts from the expansion of end-to-end encryption, while also maximizing the beneficial impact end-to-end encryption will have on human rights.”

The report makes 45 specific (detailed and thoughtful) recommendations to that end. Meta has already committed to fully implementing 34 of them, partly implementing four more, and assessing six others. It rejected just one: a suggestion that it “continue investigating” client-side scanning techniques in case a method is eventually developed without the problems detailed above (a technology the report itself was already nervous about, as noted). Meta says it sees no reason to keep exploring it. From Meta’s response:

As the HRIA highlights, technical experts and human rights stakeholders alike have raised significant concerns about such client-side scanning systems, including impacts on privacy, technical and security risks, and fears that governments could mandate they be used for surveillance and censorship in ways that restrict legitimate expression, opinion, and political participation that is clearly protected under international human rights law.

Meta shares these concerns. Meta believes that any form of client-side scanning that exposes information about the content of a message without the consent and control of the sender or intended recipients is fundamentally incompatible with an E2EE messaging service. This would be the case even with theoretical approaches that could maintain “cryptographic integrity” such as via a technology like homomorphic encryption—which the HRIA rightly notes is a nascent technology whose feasibility in this context is still speculative.

People who use E2EE messaging services rely on a basic premise: that only the sender and intended recipients of a message can know or infer the contents of that message. As a result, Meta does not plan to actively pursue any such client-side scanning technologies that are inconsistent with this user expectation.

We spend a lot of time criticizing Facebook/Meta around these parts, as the company often seems to trip over itself in trying to do the absolutely wrongest thing over and over again. But on this it’s doing a very important and good thing. The BSR report confirms that.

Filed Under: client-side scanning, csam, encryption, end to end encryption, human rights, messenger
Companies: facebook, instagram, meta, whatsapp

Because The Defense Department's Secure Communications Options Don't Work For Everyone, Soldiers Are Turning To Signal And WhatsApp

from the breaking-the-rules-to-stay-in-touch dept

The military has an obvious need for secure communications. It offered its support of encryption even as the NSA tried to find ways to undercut it to make its own surveillance ends easier to achieve.

The problem is the military doesn’t have a great plan for securing communications between personnel. Due to tech limitations the Defense Department has yet to overcome (despite billions in annual funding), soldiers are turning to third-party messaging services to communicate orders and disseminate information.

The use of the encrypted messaging app Signal is ubiquitous within the Department of Defense. Service members have received briefings about operational security (OPSEC) and information security (INFOSEC) and have taken the dangers of living in a digital world seriously by making sure that the work-related text messages they send on their cell phones are encrypted. The contradiction is that using Signal for official military business is against regulations.

Securing communications apparently means breaking the rules. The DoD forbids the use of non-DoD-controlled messaging services to handle the distribution of nonpublic DoD information. The Defense Department insists personnel use its services, but those services can’t be accessed by employees who don’t have military-issued cell phones. And everyone has a cell phone, so it’s often easier to use third-party platforms to communicate.

When this happens, it raises the risk that unauthorized access or sharing of information could occur. It also puts many communications beyond the reach of public records requests, which often cannot access communications between privately owned devices.

And there appears to be no fix on the immediate horizon. The Defense Department is quick to point out the use of Signal and WhatsApp violates regulations. But it has nothing in place that would allow the many military members not in possession of government-issued cell phones to communicate when out in the field.

This is what the Secretary of Defense’s Public Affairs Officer (Russell Goemaere) told Audacy when asked about how military members were expected to use DoD-approved communications platforms they didn’t actually have access to on their personal devices.

“DoD365 provides a messaging capability that is approved for CUI and use on DoD mobile devices. The Services are in the final stages of testing Bring Your Own Approved Device (BYOAD) and Bring Your Own Device (BYOD) solutions that provide access to the DoD365 collaboration capability on service member’s personal devices,” Goemaere said.

It’s 2022 and the Defense Department is only in the “final stages of testing” for solutions it needed years ago. Cell phone usage has been ubiquitous for nearly two decades at this point. For the Department to still be weeks or months away from a solution should be considered unacceptable. Denying soldiers access to third-party options means cutting them off from communications that can often have life-or-death implications.

This also means the Defense Department is still weeks or months away from ensuring communications subject to FOIA law are being captured and retained. The priority should still be personnel safety, but this is another downside of the Defense Department’s slow roll into the 21st century.

Filed Under: dod, encrypted messaging, encryption, military, soldiers
Companies: meta, signal, whatsapp

Document Shows Just How Much The FBI Can Obtain From Encrypted Communication Services

from the plenty-of-data-but-content-not-so-much dept

There is no “going dark.” Consecutive FBI heads may insist there is, but a document created by their own agency contradicts their dire claims that end-to-end encryption lets the criminals and terrorists win.

Andy Kroll has the document and the details for Rolling Stone:

[I]n a previously unreported FBI document obtained by Rolling Stone, the bureau claims that it’s particularly easy to harvest data from Facebook’s WhatsApp and Apple’s iMessage services, as long as the FBI has a warrant or subpoena. Judging by this document, “the most popular encrypted messaging apps iMessage and WhatsApp are also the most permissive,” according to Mallory Knodel, the chief technology officer at the Center for Democracy and Technology.

The document [PDF] shows what can be obtained from which messaging service, with the FBI noting WhatsApp has plenty of information investigators can obtain, including near-real-time collection of communications metadata.

WhatsApp will produce certain user metadata, though not actual message content, every 15 minutes in response to a pen register, the FBI says. The FBI guide explains that most messaging services do not or cannot do this and instead provide data with a lag and not in anything close to real time: “Return data provided by the companies listed below, with the exception of WhatsApp, are actually logs of latent data that are provided to law enforcement in a non-real-time manner and may impact investigations due to delivery delays.”

The FBI can obtain this info with a pen register order — the legal request used for years to obtain ongoing call data on targeted numbers, including numbers called and length of conversations. With a warrant, the FBI can get even more information. A surprising amount, actually. According to the document, WhatsApp turns over address book contacts for targeted users as well as other WhatsApp users who happen to have the targeted person in their address books.
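A rough sketch of how contact chaining over this kind of metadata could work (the names and address-book data below are invented for illustration; nothing here reflects WhatsApp’s actual systems):

```python
# Hypothetical address-book metadata of the sort the document describes:
# for a target, both the target's own contacts and the users who have
# the target in *their* address books.
address_books = {
    "target": {"alice", "bob"},
    "alice":  {"target", "carol"},
    "bob":    {"dave"},
    "carol":  {"eve"},
}

def contact_chain(start: str, hops: int) -> set[str]:
    """Expand outward: each hop adds everyone a known user lists,
    plus everyone who lists a known user."""
    known = {start}
    for _ in range(hops):
        nxt = set(known)
        for user, contacts in address_books.items():
            if user in known:
                nxt |= contacts        # contacts the known user holds
            if contacts & known:
                nxt.add(user)          # users who hold someone known
        known = nxt
    return known

# One hop from the target already sweeps in the target's whole contact list.
assert contact_chain("target", 1) == {"target", "alice", "bob"}
```

Each hop sweeps in both a user’s own contacts and anyone who lists that user, which is why a handful of pen register orders can fan out to cover so many conversations so quickly.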

Combine this form of contact chaining with a few pen register orders, and the FBI can basically eavesdrop on hundreds of conversations in near-real time. The caveat, of course, is that the FBI has no access to the content of the conversations. That remains locked up by WhatsApp’s encryption. Communications remain “warrant-proof,” to use a phrase bandied about by FBI directors. But is it really?

If investigators are able to access the contents of a phone (by seizing the phone or receiving permission from someone to view their end of conversations), encryption is no longer a problem. That’s one way to get past the going darkness. Then there’s stuff stored in the cloud, which can give law enforcement access to communications despite the presence of end-to-end encryption. Backups of messages might not be encrypted and — as the document points out — a warrant will put those in the hands of law enforcement.

If target is using an iPhone and iCloud backups enabled, iCloud returns may contain WhatsApp data, to include message content.

This is a feature of cloud backups — a way to retrieve messages if something goes wrong with someone’s phone or their WhatsApp account. It’s also a bug that makes encryption irrelevant. The same goes for Apple’s iMessage service. Encryption or no, backups are not encrypted by service providers. In the case of Apple’s iMessage, warrants for iCloud backups will give law enforcement the encryption key needed to decrypt the stashed messages.

On the other side, there are truly secure options that the FBI considers dead ends, starting with Signal. Signal retains no user info, which means there’s nothing to be had no matter what paperwork the feds produce. But, for the most part, even encrypted messaging and email services generate metadata that can be obtained without a warrant. If investigators want more, warrants can actually result in investigators obtaining a great deal of information about users, their interactions, and their communications. And, as is noted directly above, it can also grant access to communications users mistakenly believed were beyond the reach of law enforcement.

But not everyone using encrypted services is a criminal, no matter what FBI directors say in public. Communications metadata being only a subpoena or pen register order away is concerning, especially for those who use encrypted services not only to maintain their own privacy, but to protect those they communicate with.

“WhatsApp offering all of this information is devastating to a reporter communicating with a confidential source,” says Daniel Kahn Gillmor, a senior staff technologist at the ACLU.

Those who truly understand the protocols and platforms they use for communications will understand the tradeoffs. For everyone else, there’s this handy tip sheet, compiled by none other than the FBI, which explains exactly what each service retains and what each service will hand over in response to government paperwork.

It also shows that encryption isn’t keeping law enforcement from pursuing investigations. In rare cases, investigators may have zero access to communications. But every communications platform or service creates a digital paper trail investigators can follow until they find something that breaks the case open. “Going dark” — the idea that law enforcement is helpless in the face of increased use of encryption — is a lie. And the FBI knows it.

Filed Under: 4th amendment, encryption, fbi, going dark, lawful access, subpoena, warrant
Companies: apple, facebook, meta, whatsapp

Ninth Circuit Tells NSO Group It Isn't A Government, Has No Immunity From WhatsApp's Lawsuit

from the law-harder! dept

Long before its current run of Very Bad News, Israeli malware purveyor NSO Group was already controversial. Investigations had shown its exploits were being used to target journalists and activists and its customer list included governments known mostly for their human rights abuses.

Facebook and WhatsApp sued NSO in November 2019, alleging — among other things — that NSO had violated WhatsApp’s terms of use by deploying malware via the chat service. The arguments made by Facebook/WhatsApp aren’t the best and they could allow the CFAA to be abused even more by expanding the definition of “unauthorized access.”

Then there’s the question of standing, which NSO raised in one motion to dismiss. The alleged harms were to users of the service, not to the service itself. While suing on behalf of violated users is a nice gesture, it’s pretty difficult to talk a court into granting your requests for injunctions or damages if you’re not the target of the alleged abuse.

NSO also pointed out it didn’t actually violate anyone’s terms of service. Its customers did when they used WhatsApp to deliver malware to targets. NSO said WhatsApp was welcome to sue any of its customers, but was unlikely to get anywhere with that either, given the immunity from lawsuits generally handed to foreign governments.

Then NSO made a ridiculous claim of its own: it said it was immune from lawsuits since it provided this malware to foreign governments. By extension, it argued, the same immunity protecting foreign sovereigns (i.e., its customers) should be extended to the private company that sold them phone exploits. That argument was rejected by the district court. And the Ninth Circuit Appeals Court has just affirmed [PDF] that rejection, which means NSO will have to continue to fight what is now one of several damaging fires.

The Appeals Court says no reasonable reading of the Foreign Sovereign Immunities Act (FSIA) supports NSO’s argument in favor of it taking no responsibility for its actions or the actions of its customers.

Whether such entity can sidestep the FSIA hinges on whether the Act took the entire field of foreign sovereign immunity as applied to entities, or whether it took the field only as applied to foreign state entities, as NSO suggests. The answer lies in the question. The idea that foreign sovereign immunity could apply to non-state entities is contrary to the originating and foundational premise of this immunity doctrine.

[…]

Thus, we hold that an entity is entitled to foreign sovereign immunity, if at all, only under the FSIA. If an entity does not fall within the Act’s definition of “foreign state,” it cannot claim foreign sovereign immunity. Period.

NSO isn’t a “foreign state.” It is not operated by a foreign state. It simply sells products to foreign states, which legally makes it no different than the company that supplies toilet paper to the White House. The fact that its product is used almost exclusively by governments does not make NSO a state actor or a foreign state proxy.

NSO does not contend that it meets the FSIA’s definition of “foreign state,” and, of course, it cannot. It is not itself a sovereign. 28 U.S.C. § 1603(a). It is not “an organ . . . or political subdivision” of a sovereign. Id. § 1603(b)(2). Nor is a foreign sovereign its majority owner. NSO is a private corporation that provides products and services to sovereigns—several of them. NSO claims that it should enjoy the immunity extended to sovereigns because it provides technology used for law-enforcement purposes and law enforcement is an inherently sovereign function. Whatever NSO’s government customers do with its technology and services does not render NSO an “agency or instrumentality of a foreign state,” as Congress has defined that term. Thus, NSO is not entitled to the protection of foreign sovereign immunity. And that is the end of our task.

This heads back to the district court — the court where NSO first argued that the court had no jurisdiction to handle the lawsuit since it was pretty much just a foreign government d/b/a a private malware manufacturer. It will have to continue facing WhatsApp’s lawsuit over its alleged terms of service violations. And the longer the suit runs, the greater the chance NSO might have to divulge some more details on its dirty work and its even dirtier customers.

Filed Under: 9th circuit, immunity, malware, spyware
Companies: facebook, nso group, whatsapp