
Governments Continue Losing Efforts To Gain Backdoor Access To Secure Communications

from the always-searching-for-the-backdoor dept

Reports that prominent American national security officials used a freely available encrypted messaging app, coupled with the rise of authoritarian policies around the world, have led to a surge in interest in encrypted apps like Signal and WhatsApp. These apps prevent anyone, including the government and the app companies themselves, from reading messages they intercept.

The spotlight on encrypted apps is also a reminder of the complex debate pitting government interests against individual liberties. Governments want to monitor everyday communications for law enforcement, national security and sometimes darker purposes. Citizens and businesses, on the other hand, claim the right to private digital conversations in today’s online world.

The positions governments take often are framed as a “war on encryption” by technology policy experts and civil liberties advocates. As a cybersecurity researcher, I’ve followed the debate for nearly 30 years and remain convinced that this is not a fight that governments can easily win.

Understanding the ‘golden key’

Traditionally, strong encryption capabilities were considered military technologies crucial to national security and not available to the public. However, in 1991, computer scientist Phil Zimmermann released a new type of encryption software called Pretty Good Privacy (PGP). It was free, open-source software available on the internet that anyone could download. PGP allowed people to exchange email and files securely, accessible only to those with the shared decryption key, in ways similar to highly secured government systems.
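The shared-key idea at the heart of this can be sketched with a toy cipher. This is an illustration only, not real cryptography — PGP uses vetted public-key and symmetric algorithms, while the XOR scheme and every name below are hypothetical simplifications:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR cipher: applying it with the same key encrypts or decrypts."""
    if len(key) < len(data):
        raise ValueError("key must be at least as long as the message")
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet me at noon"
shared_key = secrets.token_bytes(len(message))  # known only to both parties

ciphertext = xor_cipher(message, shared_key)
assert ciphertext != message  # an interceptor sees only gibberish

# Only someone holding the shared key recovers the plaintext.
assert xor_cipher(ciphertext, shared_key) == message
```

The essential property is the same one PGP gave the public: without the shared decryption key, an intercepted message is unreadable, regardless of who intercepts it.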

Following an investigation into Zimmermann, the U.S. government came to realize that technology develops faster than law and began to explore remedies. It also began to understand that once something is placed on the internet, neither laws nor policy can control its global availability.

Fearing that terrorists or criminals might use such technology to plan attacks, arrange financing or recruit members, the Clinton administration advocated a system called the Clipper Chip, based on the concept of key escrow. The idea was to give a trusted third party access to the encryption system, access the government could use when it demonstrated a law enforcement or national security need.

Clipper was based on the idea of a “golden key,” namely, a way for those with good intentions – intelligence services, police – to access encrypted data, while keeping people with bad intentions – criminals, terrorists – out.
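A toy sketch of the escrow idea shows both its appeal and its risk. All names and the XOR cipher here are hypothetical simplifications — Clipper’s actual scheme used a classified algorithm and split the escrowed keys between government agencies:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR cipher standing in for a real encryption algorithm."""
    return bytes(b ^ k for b, k in zip(data, key))

escrow_vault = {}  # the "trusted third party" holding a copy of every key

def register_user(user: str, msg_len: int) -> bytes:
    """Issue a key; the escrow condition is that a copy goes to the vault."""
    key = secrets.token_bytes(msg_len)
    escrow_vault[user] = key
    return key

alice_key = register_user("alice", 16)
ciphertext = xor_cipher(b"wire $5k friday.", alice_key)

# The "golden key" path: the vault's copy decrypts Alice's traffic...
assert xor_cipher(ciphertext, escrow_vault["alice"]) == b"wire $5k friday."

# ...but so can anyone who obtains the vault's contents. One breach
# exposes every registered user, which is why escrow is a single point
# of failure.
stolen_vault = dict(escrow_vault)
assert xor_cipher(ciphertext, stolen_vault["alice"]) == b"wire $5k friday."
```

The sketch makes the trade-off concrete: the same escrowed copy that enables lawful access works identically for whoever steals it.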

Clipper Chip devices never gained traction outside the U.S. government, in part because its encryption algorithm was classified and couldn’t be publicly peer-reviewed. However, in the years since, governments around the world have continued to embrace the golden key concept as they grapple with the constant stream of technology developments reshaping how people access and share information.

Following Edward Snowden’s disclosures about global surveillance of digital communications in 2013, Google and Apple took steps to make it virtually impossible for anyone but an authorized user to access data on a smartphone. Even a court order was ineffective, much to the chagrin of law enforcement. In Apple’s case, the company’s approach to privacy and security was tested in 2016 when the company refused to build a mechanism to help the FBI break into an encrypted iPhone owned by a suspect in the San Bernardino terrorist attack.

At its core, encryption is very complicated math. And while the golden key concept continues to hold allure for governments, it is mathematically difficult to achieve with an acceptable degree of trust. Even if it were viable, implementing it in practice would make the internet less safe. Security experts agree that any backdoor access, even if hidden or controlled by a trusted entity, is vulnerable to hacking.

Competing justifications and tech realities

Governments around the world continue to wrestle with the proliferation of strong encryption in messaging tools, social media and virtual private networks.

For example, rather than embrace a technical golden key, a recent proposal in France would have provided the government the ability to add a hidden “ghost” participant to any encrypted chat for surveillance purposes. However, legislators removed this from the final proposal after civil liberties and cybersecurity experts warned that such an approach would undermine basic cybersecurity practices and trust in secure systems.

In 2025, the U.K. government secretly ordered Apple to add a backdoor to its encryption services worldwide. Rather than comply, Apple removed the ability for its iPhone and iCloud customers in the U.K. to use its Advanced Data Protection encryption features. In this case, Apple chose to defend its users’ security in the face of government mandates, which ironically now means that users in the U.K. may be less secure.

In the United States, provisions removed from the 2020 EARN IT bill would have forced companies to scan online messages and photos to guard against child exploitation by creating a golden-key-type hidden backdoor. Opponents viewed this as a stealth way of bypassing end-to-end encryption. The bill did not advance to a full vote when it was last reintroduced in the 2023-2024 legislative session.

Scanning for child sexual abuse material is especially contentious when encryption is involved: Apple received significant public backlash over its plans to scan user devices for such material in ways that users said violated Apple’s privacy stance, yet victims of child abuse have sued the company for not doing more to protect children.

Even privacy-centric Switzerland and the European Union are exploring ways of dealing with digital surveillance and privacy in an encrypted world.

The laws of math and physics, not politics

Governments usually claim that weakening encryption is necessary to fight crime and protect the nation – and there is a valid concern there. However, when that argument fails to win the day, they often turn to claiming to need backdoors to protect children from exploitation.

From a cybersecurity perspective, it is nearly impossible to create a backdoor to a communications product that is only accessible for certain purposes or under certain conditions. If a passageway exists, it’s only a matter of time before it is exploited for nefarious purposes. In other words, creating what is essentially a software vulnerability to help the good guys will inevitably end up helping the bad guys, too.

Often overlooked in this debate is that if encryption is weakened to improve surveillance for governmental purposes, it will drive criminals and terrorists further underground. Using different or homegrown technologies, they will still be able to exchange information in ways that governments can’t readily access. But everyone else’s digital security will be needlessly diminished.

This lack of online privacy and security is especially dangerous for journalists, activists, domestic violence survivors and other at-risk communities around the world.

Encryption obeys the laws of math and physics, not politics. Once invented, it can’t be un-invented, even if it frustrates governments. Along those lines, if governments are struggling with strong encryption now, how will they contend with a world where everyone is using significantly more complex techniques like quantum cryptography?

Governments remain in an unenviable position regarding strong encryption. Ironically, one of the countermeasures the government recommended in response to China’s hacking of global telephone systems in the Salt Typhoon attacks was to use strong encryption in messaging apps such as Signal or iMessage.

Reconciling that with their ongoing quest to weaken or restrict strong encryption for their own surveillance interests will be a difficult challenge to overcome.

Richard Forno is Teaching Professor of Computer Science and Electrical Engineering, and Assistant Director, UMBC Cybersecurity Institute, University of Maryland, Baltimore County. This article is republished from The Conversation under a Creative Commons license. Read the original article.


Filed Under: backdoors, encryption, golden keys, privacy, security, surveillance

Florida’s New Social Media Bill Says The Quiet Part Out Loud And Demands An Encryption Backdoor

from the seems-bad dept

At least Florida’s SB 868/HB 743, “Social Media Use By Minors” bill isn’t beating around the bush when it states that it would require “social media platforms to provide a mechanism to decrypt end-to-end encryption when law enforcement obtains a subpoena.” Usually these sorts of sweeping mandates are hidden behind smoke and mirrors, but this time it’s out in the open: Florida wants a backdoor into any end-to-end encrypted social media platforms that allow accounts for minors. This would likely lead to companies not offering end-to-end encryption to minors at all, making them less safe online.

Encryption is the best tool we have to protect our communication online. It’s just as important for young people as it is for everyone else, and the idea that Florida can “protect” minors by making them less safe is dangerous and dumb.

The bill is not only privacy-invasive, it’s also asking for the impossible. As breaches like Salt Typhoon demonstrate, you cannot provide a backdoor for just the “good guys,” and you certainly cannot do so for just a subset of users under a specific age. After all, minors are likely speaking to their parents and other family members and friends, and they deserve the same sorts of privacy for those conversations as anyone else. Whether social media companies provide “a mechanism to decrypt end-to-end encryption” or choose not to provide end-to-end encryption to minors at all, there’s no way that doesn’t harm the privacy of everyone.

If this all sounds familiar, that’s because we saw a similar attempt from an Attorney General in Nevada last year. Then, like now, the reasoning is that law enforcement needs access to these messages during criminal investigations. But this doesn’t hold true in practice.

In our amicus brief in Nevada, we point out that there are solid arguments that “content oblivious” investigation methods—like user reporting—are “considered more useful than monitoring the contents of users’ communications when it comes to detecting nearly every kind of online abuse.” That remains just as true in Florida today.

Law enforcement can and does already conduct plenty of investigations involving encrypted messages, and even with end-to-end encryption, law enforcement can potentially access the contents of most messages on the sender or receiver’s devices, particularly when they have access to the physical device. The bill also includes measures prohibiting minors from accessing any sort of ephemeral messaging features, like view once options or disappearing messages. But even with those features, users can still report messages or save them. Targeting specific features does nothing to protect the security of minors, but it would potentially harm the privacy of everyone.

SB 868/HB 743 radically expands the scope of Florida’s social media law HB 3, which passed last year and itself has not yet been fully implemented as it currently faces lawsuits challenging its constitutionality. The state was immediately sued after the law’s passage, with challengers arguing the law is an unconstitutional restriction of protected free speech. That lawsuit is ongoing—and it should be a warning sign. Florida should stop coming up with bad ideas that can’t be implemented.

Weakening encryption to the point of being useless is not an option. Minors, as well as those around them, deserve the right to speak privately without law enforcement listening in. Florida lawmakers must reject this bill. Instead of playing politics with kids’ privacy, they should focus on real, workable protections—like improving consumer privacy laws to protect young people and adults alike, and improving digital literacy in schools.

Reposted from the EFF’s Deeplinks blog.

Filed Under: encryption, florida, hb 743, privacy, sb 868, security, social media, social media use by minors

EU Commission Kicks Off 2025 With Yet Another Plea For Backdoored Encryption

from the evergreen-bullshit dept

The EU Commission spent most of 2024 getting knocked around by opponents of its anti-encryption efforts. While it did find some support from countries with, shall we say, more authoritarian urges, most countries that still actually cared about security and privacy pushed back, resulting in the Commission putting encryption backdoors on the back burner until the next legislative session.

But there was never any reason to believe the EU Commission wouldn’t make another effort to push this past the member nations of the EU Council. Now that nearly half a year has passed, the EU Commission is getting back to basics: ensuring public safety by actively undermining public safety.

Here’s Iain Thomson, reporting for The Register:

The EU has shared its plans to ostensibly keep the continent’s denizens secure – and among the pages of bureaucratese are a few worrying sections that indicate the political union wants to backdoor encryption by 2026, or even sooner.

[…]

“We are working on a roadmap now, and we will look at what is technically also possible,” said Henna Virkkunen, executive vice-president of the EC for tech sovereignty, security and democracy. “The problem is now that our law enforcement, they have been losing ground on criminals because our police investigators, they don’t have access to data,” she added.

“Of course, we want to protect the privacy and cyber security at the same time; and that’s why we have said here that now we have to prepare a technical roadmap to watch for that, but it’s something that we can’t tolerate, that we can’t take care of the security because we don’t have tools to work in this digital world.”

This all sounds somewhat reasonable if you take VP Virkkunen at her word. Of course, the claim that police investigators are routinely stymied by a lack of “access to data” demands a citation, but there’s nothing in Virkkunen’s statement that clarifies how often this is actually a problem. Obviously, it happens in more than 0% of cases. But is it actually happening so often the only solution is breaking encryption for everyone, not just the criminal suspects law enforcement officers are interested in?

The report [PDF] doesn’t offer much clarification either. It does, however, open with a statement that apparently expects EU residents to believe that their personal security is far less important than (multi)national security that apparently can only be obtained by undermining the security of millions of people. (Emphasis in the original.)

Security is the bedrock upon which all our freedoms are built. Democracy, the rule of law, fundamental rights, the wellbeing of Europeans, competitiveness and prosperity – all hinge on our ability to provide a basic security guarantee. In the new era of security threats that we now live in, EU Member States’ ability to guarantee security for their citizens is more than ever contingent on a unified, European approach to protecting our internal security. In an evolving geopolitical landscape, Europe must continue to make good on its enduring promise of peace.

What the EU Commission would like readers to believe is that their opinions (and their personal security) matter.

We need a whole-of-society approach involving all citizens and stakeholders, including civil society, research, academia and private entities. The actions under the strategy therefore take an integrated, multi-stakeholder approach wherever possible.

“Wherever possible.” There’s the carve-out. Since most of this has to do with national security, it will be explained to stakeholders refused entry to the discussion that the issues are far too sensitive to be observed and discussed by mere members of the public, no matter how well-qualified they are to discuss these issues.

It takes a few more pages before the EU Commission finally lays out its anti-encryption goal. (Emphasis in the original.)

[T]he Commission will present in the first half of 2025 a roadmap setting out the legal and practical measures it proposes to take to ensure lawful and effective access to data. In the follow-up to this Roadmap, the Commission will prioritise an assessment of the impact of data retention rules at EU level and the preparation of a Technology Roadmap on encryption, to identify and assess technological solutions that would enable law enforcement authorities to access encrypted data in a lawful manner, safeguarding cybersecurity and fundamental rights.

This is all about giving law enforcement encryption backdoors. Any pretense of involving all stakeholders has pretty much been dismissed at this point. On top of that, the vague assertion by VP Virkkunen about cops “losing ground” to criminals due to a “lack of access” to device and communications content isn’t actually backed by the contents of this report. All it has to say on the subject is an absurdly obvious statement of fact that doesn’t actually state one way or the other whether investigators are having problems accessing this information without encryption-breaking assistance:

Around 85% of criminal investigations now rely on law enforcement authorities’ ability to access digital information.

The report from EU law enforcement complaining (equally vaguely) about the same issue is similarly devoid of hard data detailing encryption’s deleterious effect on criminal investigations. Instead, it provides a long list of options currently available to law enforcement before somehow arriving at the conclusion that all of these options simply aren’t enough. What law enforcement wants is instant access to whatever it wants to access. But that was never the reality even back in the good old pre-digital days.

It’s the height of entitlement to claim you deserve full access to anything cops come across simply because it can now be held entirely in a device that fits into someone’s pocket. Criminals have always tried to hide or destroy evidence, but no one has suggested it should be illegal for people to own paper shredders, fire pits, shovels, or to have access to nearby bodies of water. And certainly no one ever suggested it should be up to the government to decide whether people should have access to any of these things.

The EU Commission wants the impossible: “secure” backdoors that only the good guys can access. And if it can’t have that (and it can’t), it’s more than happy to have the next best thing: backdoors that can be exploited by criminals, so long as they can also be exploited by cops.

Filed Under: encryption, encryption backdoors, eu commission, privacy, security

A Win For Encryption: France Rejects Backdoor Mandate

from the a-rare-win dept

In a moment of clarity after initially moving forward a deeply flawed piece of legislation, the French National Assembly has done the right thing: it rejected a dangerous proposal that would have gutted end-to-end encryption in the name of fighting drug trafficking. Despite heavy pressure from the Interior Ministry, lawmakers voted Thursday night (article in French) to strike down a provision that would have forced messaging platforms like Signal and WhatsApp to allow hidden access to private conversations.

The vote is a victory for digital rights, for privacy and security, and for common sense.

The proposed law was a surveillance wish list disguised as anti-drug legislation. Tucked into its text was a resurrection of the widely discredited “ghost” participant model—a backdoor that pretends not to be one. Under this scheme, law enforcement could silently join encrypted chats, undermining the very idea of private communication. Security experts have condemned the approach, warning it would introduce systemic vulnerabilities, damage trust in secure communication platforms, and create tools ripe for abuse.

The French lawmakers who voted this provision down deserve credit. They listened—not only to French digital rights organizations and technologists, but also to basic principles of cybersecurity and civil liberties. They understood that encryption protects everyone, not just activists and dissidents, but also journalists, medical professionals, abuse survivors, and ordinary citizens trying to live private lives in an increasingly surveilled world.

A Global Signal

France’s rejection of the backdoor provision should send a message to legislatures around the world: you don’t have to sacrifice fundamental rights in the name of public safety. Encryption is not the enemy of justice; it’s a tool that supports our fundamental human rights, including the right to have a private conversation. It is a pillar of modern democracy and cybersecurity.

As governments in the U.S., U.K., Australia, and elsewhere continue to flirt with anti-encryption laws, this decision should serve as a model—and a warning. Undermining encryption doesn’t make society safer. It makes everyone more vulnerable.

This victory was not inevitable. It came after sustained public pressure, expert input, and tireless advocacy from civil society. It shows that pushing back works. But for the foreseeable future, misguided lobbyists for police and national security agencies will continue to push similar proposals—perhaps repackaged, or rushed through quieter legislative moments.

Supporters of privacy should celebrate this win today. Tomorrow, we will continue to keep watch.

Republished from the EFF’s Deeplinks blog.

Filed Under: backdoors, encryption, france, ghost participant

from the but-her-emails dept

Wed, Mar 19th 2025 05:26am - Karl Bode

It’s best to view Elon Musk’s DOGE as an attack. While right wing propaganda (and gullible media outlets and politicians) frame DOGE as a “cost saving” effort at “improving government efficiency,” that’s just flimsy-ass cover for its real purpose: the dismantling of corporate oversight, environmental guard rails, consumer protection, civil rights, and the social safety net by weird zealots.

But DOGE is also just an incompetently run clown show.

There were already widespread concerns about Musk’s tween 4chan brats having broad access to sensitive public information with no real oversight. But the randos that make up Trump and Musk’s rotating orbit of drooling sycophants also appear to be accessing this data using all manner of unsecured personal devices. They couldn’t even launch the DOGE website competently with proper security.

Now there’s reporting out of the New York Times suggesting that Musk is casually integrating Starlink systems into the White House telecom network for no coherent reason outside of the fact it gives the illusion that it’s helping:

“Starlink, the satellite internet service operated by Elon Musk’s SpaceX, is now accessible across the White House campus. It is the latest installation of the Wi-Fi network across the government since Mr. Musk joined the Trump administration as an unpaid adviser.”

The New York Times falsely calls this a “Wi-Fi” network, when Starlink is a Low Earth Orbit (LEO) satellite network. And in a complex as wired as the White House, there’s really no coherent reason to install it. The White House network is rife with gigabit-capable fiber and gigabit-capable Wi-Fi that can far exceed anything Starlink delivers. Starlink would be a clearly inferior, slower, connectivity option.

According to the NY Times, one of Musk’s DOGE brats from X just decided one day to install a Starlink terminal on the White House roof, tripping security alarms and setting off a confrontation with Secret Service. All, purportedly, to “improve internet access” at probably one of the most well-connected buildings in the world.

There are only a few reasons to do this. One, is as a marketing stunt to help advertise Starlink as a miracle fix to a nonexistent problem. Two is to have a communications backchannel for stuff you don’t want tracked by any sort of White House network logging technologies. But even then, there are suggestions the Starlink traffic isn’t encrypted, creating a huge security risk:

“It was also unclear if Starlink communications were encrypted. At a minimum, the system allows for a network separate from existing White House servers that people on the grounds are able to use, keeping that data separate.”

It’s very rare, weird, and very dangerous to just mindlessly intermingle a private, and potentially unencrypted, telecom connectivity option with existing White House systems and workflows, as numerous IT folks on Bluesky were quick to note.

And slapping a nontransparent comms channel on the roof of the White House so you and your weird authoritarian buddies can giggle about your illegal and unpopular dismantling of government functions is pretty far afield from all the “full transparency” they promised.

Again, if you don’t have any respect for the function of governance, you’re not going to be particularly careful as you and your earlobe nibbling tweens go about dismantling it. And if you have no shame or ethics, you also think nothing of leveraging your unelected influence to use the White House as a glorified marketing stunt. And if you’re incompetent, you’re going to be incompetent.

All very much in character for the fake government agency run by the fake super-genius engineer tasked with fake innovation and efficiency improvements.

Filed Under: doge, elon musk, encryption, national security, privacy, satellite, security, starlink, white house, wireless
Companies: spacex, starlink

The UK Government Just Made Everyone Less Safe As Apple Shuts Down iCloud Encryption

from the shortsighted-dangerous-nonsense dept

In a stunning display of government overreach, the UK has effectively forced Apple to disable its iCloud encryption for British users. Earlier this month, we wrote about the UK wielding the Investigatory Powers Act — aka “The Snooper’s Charter” — to demand Apple create a backdoor in its iCloud encryption for all users globally. Despite Apple’s long-standing warnings that it would rather exit the UK market than compromise encryption, the UK government doubled down.

The ensuing public outcry and warnings of “serious consequences” from US politicians fell on deaf ears. While the government’s exact demands remain secret (because of course they do), Apple’s response speaks volumes: they’re shutting down iCloud encryption for UK users entirely rather than create a global backdoor.

Apple disabled its most secure data storage offering for new customers in Britain on Friday rather than comply with a secret government order that would have allowed police and intelligence agencies to access the encrypted content.

That sounds like the UK isn’t backing down.

This is a terrible result for everyone, making Apple users globally (but especially in the UK) more vulnerable. Law enforcement’s tired narrative frames this as a trade-off between privacy and safety, but that’s dangerously wrong. Encryption isn’t just about privacy — it’s a fundamental security mechanism that protects against identity theft, financial fraud, corporate espionage and much more. This move effectively dismantles both privacy and safety, not because law enforcement lacks investigative tools, but because they’re really just lazy and demanding a “convenient” backdoor that inevitably creates new security risks.

While this compromise gives UK law enforcement their coveted access to British users’ iCloud data, it creates a dangerous precedent and leaves user data vulnerable to bad actors ranging from cybercriminals to hostile nation-states. Even worse, this “solution” likely falls short of the government’s reported demands for global backdoor access — suggesting this might just be round one of a longer fight.

As the folks at EFF note, this would have been a disaster:

Had Apple complied with the U.K.’s original demands, they would have been required to create a backdoor not just for users in the U.K., but for people around the world, regardless of where they were or what citizenship they had. As we’ve said time and time again, any backdoor built for the government puts everyone at greater risk of hacking, identity theft, and fraud.

This blanket, worldwide demand put Apple in an untenable position. Apple has long claimed it wouldn’t create a backdoor, and in filings to the U.K. government in 2023, the company specifically raised the possibility of disabling features like Advanced Data Protection as an alternative. Apple’s decision to disable the feature for U.K. users could well be the only reasonable response at this point, but it leaves those people at the mercy of bad actors and deprives them of a key privacy-preserving technology. The U.K. has chosen to make its own citizens less safe and less free.

It’s not just EFF folks saying this. Lots of security experts are horrified.

Mike Salem, UK country associate for the Consumer Choice Center, called on opposition parties to voice their discontent and demand the government outlines its reasoning.

“The UK government has set a precedent, and cast a new reputation that underscores the erosion of personal liberties and privacy in a digital age where these values are needed more than ever,” he said.

“This marks a very sad day for the basic principle of consumer privacy in the 21st century, depriving users of the tools that leave UK citizens exposed to governments, criminals and malicious hackers. The fact this has been done without debate, oversight or advance warning to UK Apple users is extremely concerning,” Salem said.

David Ruiz, senior privacy advocate at Malwarebytes, described the news as a “disaster” for the UK and one with potential global consequences.

“To demand access to the world’s data is such a brazen, imperialist manoeuvre that I’m surprised it hasn’t come from the US. This may embolden other countries, particularly those in the Five Eyes, to make a similar demand of Apple,” he argued.

Others have pointed out that if Apple had caved to the UK’s stupid demand, they would have almost immediately faced identical demands from other countries, including Russia, Turkey, Iran… you name it.

It is difficult to think of a more shortsighted move than what the UK has done here. It has put its own citizenry at greater risk, while threatening some of the basic fundamentals of private storage.

It’s good that Apple is taking a stand, but it feels like this is just one battle in a war that is far from over.

Filed Under: backdoors, encryption, icloud, investigatory powers act, snoopers charter, uk
Companies: apple

Ctrl-Alt-Speech: Backdoors And Backsteps

from the ctrl-alt-speech dept

Ctrl-Alt-Speech is a weekly podcast about the latest news in online speech, from Mike Masnick and Everything in Moderation‘s Ben Whitelaw.

Subscribe now on Apple Podcasts, Overcast, Spotify, Pocket Casts, YouTube, or your podcast app of choice — or go straight to the RSS feed.

In this week’s round-up of the latest news in online speech, content moderation and internet regulation, Mike and Ben are joined by a group of students from the Media Law and Policy class at the American University School of Communication to cover the week’s top stories.

This episode is brought to you with financial support from the Future of Online Trust & Safety Fund.

Filed Under: ai, artificial intelligence, content moderation, encryption, eu, first amendment, free speech, kanye west, uk
Companies: apple, google, meta, shopify, tiktok

UK Orders Apple To Break Encryption Worldwide While World Is Distracted

from the the-death-of-privacy dept

In a stunning escalation that confirms our worst fears, the UK government has finally shown its true hand on encryption — and it’s even worse than we predicted. According to a bombshell report from Joseph Menn at the Washington Post, British officials have ordered Apple to create a backdoor that would allow them to access encrypted content from any Apple user worldwide.

This comes after years of the UK government’s steadily mounting assault on encryption, from the Investigatory Powers Act to the Online Safety Act. While officials repeatedly insisted they weren’t trying to break encryption entirely, those of us following closely saw this coming. Apple even warned it might have to exit the UK market if pushed too far.

Security officials in the United Kingdom have demanded that Apple create a back door allowing them to retrieve all the content any Apple user worldwide has uploaded to the cloud, people familiar with the matter told The Washington Post.

The British government’s undisclosed order, issued last month, requires blanket capability to view fully encrypted material, not merely assistance in cracking a specific account, and has no known precedent in major democracies.

Let’s be super clear here: The UK government is demanding that Apple fundamentally compromise the security architecture of its products for every user worldwide. This isn’t just about giving British authorities access to British users’ data — it’s about creating a master key that would unlock everyone’s encrypted data, everywhere.

This is literally breaking the fundamental tool that protects our privacy and security. Backdoored encryption is not encryption at all.

The technical reality is stark: You can’t create a backdoor that only works for “good guys.” Any vulnerability built into the system becomes a vulnerability for everyone — state actors, cybercriminals, and hostile nations alike. And right now, it’s worth recognizing that any government (including our own) can be seen as a “hostile nation” to many.

Even if Apple withdraws from the UK market entirely, as the Post reports they’re considering, it won’t satisfy the UK’s demands:

Rather than break the security promises it made to its users everywhere, Apple is likely to stop offering encrypted storage in the U.K., the people said. Yet that concession would not fulfill the U.K. demand for backdoor access to the service in other countries, including the United States.

This global reach is particularly concerning given the UK’s membership in the Five Eyes intelligence alliance. Any backdoor created for British authorities would inevitably become a tool for intelligence and law enforcement agencies across the US, Australia, Canada, and New Zealand — effectively creating a global surveillance capability without any democratic debate or oversight in those countries.

If the UK does this, it means that the FBI will be able to use it to read anyone’s data.

The UK government’s approach here is particularly insidious. While Apple can appeal the order, their appeal rights are bizarrely limited: They can only argue about the cost of implementing the backdoor, not the catastrophic privacy and security implications for billions of users worldwide. This reveals the UK government’s complete indifference to the fundamental right to privacy.

Even more alarming is the forced secrecy component.

One of the people briefed on the situation, a consultant advising the United States on encryption matters, said Apple would be barred from warning its users that its most advanced encryption no longer provided full security. The person deemed it shocking that the U.K. government was demanding Apple’s help to spy on non-British users without their governments’ knowledge. A former White House security adviser confirmed the existence of the British order.

This gag order component is particularly chilling — the UK isn’t just demanding the power to break encryption globally, they’re demanding the right to force Apple to actively deceive its users about the security of their data. After years of dismissing concerns about the Investigatory Powers Act as “exaggerated,” the UK government is now proving its critics right in the most dramatic way possible.

The implications here cannot be overstated. This would represent the single largest coordinated attack on private communications in the digital age. It’s not just about government surveillance — it’s about deliberately introducing vulnerabilities that would be exploitable by anyone who discovers them, from hostile nation-states to criminal organizations.

The timing of this demand is nothing short of breathtaking in its recklessness. We are quite literally in the midst of dealing with the catastrophic fallout from the Chinese Salt Typhoon hack — where state-sponsored hackers exploited a government-mandated backdoor in our telephone infrastructure to conduct widespread surveillance. This hack alone should have permanently ended any discussion of intentionally weakening encryption. It’s a real-world demonstration of exactly what security experts have been warning about for decades: backdoors will inevitably be discovered and exploited by bad actors.

The irony here is almost painful: The FBI itself has been actively encouraging Americans to use encrypted communications specifically because our telephone infrastructure remains compromised by Chinese hackers. Yet at this precise moment — when we’re witnessing firsthand the devastating consequences of compromised security — the UK government is demanding we create an even bigger, more dangerous, more consequential backdoor?

This is beyond dangerous. There is no reasonable rationale for this.

There’s a good chance that the UK is doing this right now knowing that the US is totally distracted by everything that Musk and Trump are doing to dismantle the US government. But given how much Trump seems to hate the FBI right now, that seems like all the more reason for him to call this out as an attack on Americans and our privacy. Does he want the FBI reading his data as well?

Senator Ron Wyden, who has been a tireless champion of encryption, is reasonably angry about this and is calling on both Apple and Trump to “tell the UK to go to hell.”

Trump and Apple better tell the UK to go to hell with its demand to access Americans’ private, encrypted texts and files. Trump and American tech companies letting foreign governments secretly spy on Americans would be an unmitigated privacy and national security disaster.

Senator Ron Wyden (@wyden.senate.gov) 2025-02-07T17:15:45.189Z


Wyden calling out Trump here actually makes a lot of sense. Given Trump’s current antagonistic relationship with federal law enforcement, he might be uniquely positioned to recognize this for what it is — a foreign government demanding the power to spy on Americans, including him personally. The FBI, which would inevitably gain access to this backdoor through Five Eyes sharing agreements, would have unprecedented access to everyone’s communications — a scenario that should alarm privacy advocates across the political spectrum.

This is, without hyperbole, a five-alarm fire for digital privacy and security. The UK government is attempting to fundamentally reshape global digital security through a secretive demand, hoping the world is too distracted to notice or resist. They’re not just asking for a key to their own citizens’ data — they’re demanding the power to unlock everyone’s digital life, everywhere, while forcing Apple to lie about it.

The stakes couldn’t be higher. This isn’t just about privacy — it’s about the future of secure communication itself. Don’t let this slip by in the chaos of the moment. The UK government is betting on our distraction and apathy. Let’s prove them wrong.

Filed Under: backdoors, encryption, investigatory powers act, privacy, security, spying, surveillance, uk
Companies: apple

FBI Official Reluctantly Touts Encryption Since US Telecom Providers Are Still Compromised By Chinese Hackers

from the oh-the-delightful-irony dept

Thanks to government-mandated backdoors in US telecom/broadband services, the FBI — at least in the form of an official who refused to identify themself — has had to recommend (albeit extremely half-heartedly) that encrypted communications are perhaps the only thing keeping phone owners from being actively surveilled by Chinese hackers.

The news of a massive breach linked to “Salt Typhoon,” a Chinese state-sponsored hacking group, made at least one thing perfectly clear: the sort of encryption the FBI approves of — the one with all the holes in it — is a terrible idea. What was leveraged here were the backdoors created for law enforcement access. To facilitate wiretaps, telcos and broadband providers were required by CALEA (the Communications Assistance for Law Enforcement Act) to proactively make surveillance easier for law enforcement. The law, passed in 1994, originally targeted phone companies. It was amended in 2006 to cover broadband providers.

There’s no such thing as a “safe” encryption backdoor. That much has been made obvious by this hack, along with the disturbing fact that it appears — months after discovery — these systems are still very much compromised.

If there’s any good that might come of this, it’s that the FBI might finally stop bitching so much about what it calls “warrant-proof” encryption. That’s just encryption to the rest of us: encryption without government-mandated backdoors that a government — whether it’s ours or China’s — can exploit at will.

With no end in sight, government officials — including one representing the FBI — are telling people to keep their devices and software updated, to set up multi-factor authentication wherever possible, and, believe it or not, to utilize encrypted services.

In the call Tuesday, two officials — a senior FBI official who asked not to be named and Jeff Greene, executive assistant director for cybersecurity at the Cybersecurity and Infrastructure Security Agency — both recommended using encrypted messaging apps to Americans who want to minimize the chances of China’s intercepting their communications.

“Our suggestion, what we have told folks internally, is not new here: Encryption is your friend, whether it’s on text messaging or if you have the capacity to use encrypted voice communication. Even if the adversary is able to intercept the data, if it is encrypted, it will make it impossible,” Greene said.

It’s no surprise a CISA rep would encourage the use of encrypted services. No one actually involved in cybersecurity would ever say otherwise. The FBI — personified here by a nameless official — says pretty much the same thing, although it’s not quite as enthusiastic about recommending encryption.

The FBI official said, “People looking to further protect their mobile device communications would benefit from considering using a cellphone that automatically receives timely operating system updates, responsibly managed encryption and phishing resistant” multi-factor authentication for email, social media and collaboration tool accounts.

I would love to know what this person’s definition of “responsibly managed encryption” is. For those of us who aren’t on board with the FBI’s anti-encryption plans, that would be any encrypted service that hasn’t been deliberately weakened by service providers to serve government interests. For the FBI, I would imagine it means the opposite. Or, at the very least, “responsibly managing” encryption means willingly handing over passcodes to any law enforcement investigator that asks for them prior to performing a device search.

But even if the FBI can’t bring itself to wholeheartedly recommend strong encryption, this massive breach undercuts any arguments it might attempt to make in the near future in favor of weakened encryption, a.k.a., the “lawful access” it has tried to convince legislators for years would never result in EXACTLY THE SORT OF THING WE’RE SEEING RIGHT NOW.

Hopefully, this will bring a swift — if temporary — end to the FBI’s anti-encryption agitating. But with a new(ish) boss coming to town early next year, all the logic in the world likely won’t make much of a difference if the returning president decides encryption is just another obstacle (you know, like civil rights) law enforcement shouldn’t have to deal with when going after the baddies.

Filed Under: breach, calea, china, encryption, encryption backdoors, lawful access, salt typhoon, wiretaps
Companies: at&t, verizon

Apple Snuck In Code That Automatically Reboots Idle iPhones And Cops Are Not Happy About It

from the phone-cracking-now-has-a-countdown-timer dept

Detroit law enforcement officials got a bit of a shock last week when some seized iPhones rebooted themselves, despite being in airplane mode and, in one case, stored inside a Faraday bag. Panic — albeit highly localized — ensued. It was covered by Joseph Cox for 404 Media, who detailed not only the initial panic, but the subsequent responses to this unexpected development.

Law enforcement officers are warning other officials and forensic experts that iPhones which have been stored securely for forensic examination are somehow rebooting themselves, returning the devices to a state that makes them much harder to unlock, according to a law enforcement document obtained by 404 Media.

The exact reason for the reboots is unclear, but the document authors, who appear to be law enforcement officials in Detroit, Michigan, hypothesize that Apple may have introduced a new security feature in iOS 18 that tells nearby iPhones to reboot if they have been disconnected from a cellular network for some time. After being rebooted, iPhones are generally more secure against tools that aim to crack the password of and take data from the phone.

The problem (for the cops, not iPhone owners) is that the reboot takes the phone out of After First Unlock (AFU) state — a state where current phone-cracking tech can still be effective — and places it back into Before First Unlock (BFU) state, which pretty much renders phone-cracking tech entirely useless.
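The reboot behavior described above can be sketched as a simple state machine. This is a toy model, not Apple's implementation: the class name, methods, and the configurable timeout are all illustrative (the actual iOS timeout value isn't stated in the article), but the AFU-to-BFU transition on inactivity is the reported behavior.

```python
import time
from enum import Enum

class LockState(Enum):
    BFU = "Before First Unlock"   # keys not in memory; cracking tools largely useless
    AFU = "After First Unlock"    # keys held in memory; extraction may still succeed

class ToyIPhone:
    """Toy model of the iOS 18 inactivity reboot (illustrative only)."""

    def __init__(self, inactivity_limit_s: float):
        # The real timeout is a fixed value Apple chose; it's a parameter here.
        self.state = LockState.BFU
        self.inactivity_limit_s = inactivity_limit_s
        self.last_activity = time.monotonic()

    def unlock(self) -> None:
        # A successful passcode entry moves the device into AFU.
        self.state = LockState.AFU
        self.last_activity = time.monotonic()

    def touch(self) -> None:
        # Any user activity resets the idle clock.
        self.last_activity = time.monotonic()

    def tick(self) -> None:
        # Per the later forensic analysis, the timer ignores network and
        # charging state; it depends only on inactivity since last use.
        idle = time.monotonic() - self.last_activity
        if self.state is LockState.AFU and idle >= self.inactivity_limit_s:
            self.reboot()

    def reboot(self) -> None:
        # Rebooting evicts memory-resident keys: back to BFU.
        self.state = LockState.BFU
```

Note that nothing in the model consults airplane mode, Faraday bags, or nearby devices, which matches why the Detroit examiners' shielding precautions made no difference.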

The speculation as to the source of these unexpected reboots was both logical and illogical. The logical assumption was that Apple had, at some point, added new code to the latest iOS version without telling the public about it.

The other guesses were just kind of terrible and, frankly, a bit worrying, considering their source: law enforcement professionals tasked with finding technical solutions to technical problems.

The law enforcement officials’ hypothesis is that “the iPhone devices with iOS 18.0 brought into the lab, if conditions were available, communicated with the other iPhone devices that were powered on in the vault in AFU. That communication sent a signal to devices to reboot after so much time had transpired since device activity or being off network.” They believe this could apply to iOS 18.0 devices that are not just entered as evidence, but also personal devices belonging to forensic examiners.

These are phones, not Furbies. There needs to be some avenue for phone-to-phone communication, which can’t be achieved if the phones are not connected to any networks and/or stored in Faraday cages/bags. The advisory tells investigators to “take action to isolate” iOS 18 devices to keep them from infecting (I guess?) other seized phones currently awaiting cracking.

Fortunately, a day later, most of this advisory was rendered obsolete after actual experts took a look at iOS 18’s code. Some of those experts work for Magnet Forensics, which now owns Grayshift, the developer of the GrayKey phone cracker. This was also covered by Joseph Cox and 404 Media.

In a law enforcement and forensic expert only group chat, Christopher Vance, a forensic specialist at Magnet Forensics, said “We have identified code within iOS 18 and higher that is an inactivity timer. This timer will cause devices in an AFU state to reboot to a BFU state after a set period of time which we have also identified.”

[…]

“The reboot timer is not tied to any network or charging functions and only tied to inactivity of the device since last lock [sic],” he wrote.

It’s an undocumented feature in the latest version of iOS, apparently. And it isn’t a bug dressed in “feature” clothing. This was intentional, as, presumably, was Apple’s decision to keep anyone from knowing about it until it was discovered. Apple has issued no statement confirming or denying the stealthy insertion of this feature.

Law enforcement officials and the tech contractors they work with aren’t saying much either. Everything published by 404 Media was based on screenshots taken from a law enforcement-only group chat or secured from a source in the phone forensics field. Magnet Forensics has only offered a “no comment,” along with the acknowledgement that the company is aware this problem now exists.

This means iPhones running the latest iOS version will need to be treated like time bombs by investigators. The clock starts running the moment a seized phone is last locked and left idle; no amount of network isolation will stop it.

This isn’t great news for cops, but it’s definitely great news for iPhone owners. And not just the small percentage who are accused criminals. Everyone benefits from this. And the feature will deter targeting of iPhones by criminals, who are even less likely to be able to beat the clock with their phone-cracking tech. Anything that makes electronic devices less attractive to criminals is generally going to cause additional problems for law enforcement because both entities — to one degree or another — know the true value of a seized/stolen phone isn’t so much the phone itself as it is the wealth of information those phones contain.

Filed Under: device cracking, device encryption, device security, encryption, law enforcement, security
Companies: apple, grayshift, magnet forensics