ssl – Techdirt

Stories filed under: "ssl"

from the taking-down-security-certs-is-bad dept

A little over a year ago, Matt Holt, who created the Caddy Server that helps make it easier to protect websites with HTTPS encryption, posted a hypothetical blog post, written as if from the year 2022, in which he worried that enterprising and persistent copyright lawyers would have continued moving up the stack with their DMCA notices and started using the process to get HTTPS security certificates removed.

A lawyer need only be successful in convincing one of those four “choke points” by threatening legal action in order to suffocate the site. (There are others, like ISPs, which operate more generally, and we skip them for brevity.) These entities totally control the site’s availability, which is one crucial dimension of secure systems. Here they are again:

* Site owner. He or she can voluntarily remove the site/content.
* Web host. They can destroy the site owner’s account or files.
* Domain registrar. They can cancel or transfer ownership of the domain name.
* DNS provider. They can make the site inaccessible via hostname.

Now that it’s 2022, a site needs HTTPS in order to be trusted by browsers. At very least, this means they show an indicator above the page. Maybe it even means the browser shows a warning before navigating to the site. Either way, HTTPS is critical to a site’s availability and integrity.

DMCA lawyers are clever, and they realize this emerging trend. They contact a site’s CA and demand the site be disconnected for violating the law (despite lack of a court case). The CA, operating without policy for such requests and afraid of legal ramifications, revokes the site’s certificate.

Within hours, browsers begin to refuse connecting to the site on port 443 and warning flags fly instead, scaring users away. Browsers don’t revert to port 80 anymore because HTTPS is expected and using HTTP is effectively a downgrade attack. Visitors aren’t sure what to do, and the site goes offline around the globe.

We’ve raised some questions in the past about this process of copyright holders moving up the stack — and not just targeting the content hosts, but companies further upstream, including ad providers, domain registries and registrars, and the like. There are serious issues with each of these, but going after security certificates seems especially pernicious.

But Matt was a bit off in his predicted timing on this. After his article ran, we learned of at least a few examples of copyright holders going after security certificate providers. Take, for example, this copyright notice that was sent to Squarespace (the host), Tucows (the domain registrar), and Let’s Encrypt (the security certificate provider).

And now TorrentFreak notes that Comodo has revoked Sci-Hub’s HTTPS certificate.

“In response to a court order against Sci-Hub, Comodo CA has revoked four certificates for the site,” Jonathan Skinner, Director, Global Channel Programs at Comodo CA informed TorrentFreak.

“By policy Comodo CA obeys court orders and the law to the full extent of its ability.”

Comodo refused to confirm any additional details, including whether these revocations had anything to do with the current ACS injunction. However, Susan R. Morrissey, Director of Communications at ACS, told TorrentFreak that the revocations were indeed part of ACS’ legal action against Sci-Hub.

“[T]he action is related to our continuing efforts to protect ACS’ intellectual property,” Morrissey confirmed.

We’ve obviously covered a lot about the Sci-Hub story over the years, and the strange, quixotic focus by some on taking down a site devoted to (of all things) better sharing of academic knowledge (especially with academics in the developing world). The lengths to which some copyright holders have gone to effectively shut down a library are already sickening enough, but going after the security certificate is beyond the pale.

The DMCA allows for approaching a variety of different intermediaries, from network communications, to hosts, to caching, to “information location tools” (i.e. search engines), but I have a very difficult time seeing how any of that applies to security certificate providers (or, for that matter, to domain registrars).

Even more bizarre is that going after the security certificate doesn’t stop any actual infringement — it just makes users a lot less safe. And yet, it’s coming from the very same copyright holders who keep trying to tell people they shouldn’t pirate content because it exposes them to malware and viruses and dangerous computers and the like. But removing security certificates would make that a much more serious problem. And yet, here we have a case where ACS went after a security certificate, a judge okayed it, and Comodo played along. That’s dangerous for the way the internet works and is kept secure. If they want to go after the hosts, go after the hosts. Destroying the ability to protect users by encrypting the traffic is just evil.

Filed Under: choke points, copyright, dmca, https, security certificates, ssl
Companies: acs, american chemical society, comodo, sci-hub

Certificate Authority Gave Out Certs For GitHub To Someone Who Just Had A GitHub Account

from the oops dept

For many years now, we’ve talked about the many problems with today’s web security system, which is based on the model of security certificates issued by Certificate Authorities. All it takes is one bad Certificate Authority being trusted, and a lot of bad stuff can happen. And it appears we’ve got yet another example.

A message on Mozilla’s security policy mailing list notes that a free certificate authority named WoSign appeared to be doing some pretty bad stuff, including handing out certificates for a base domain if someone merely had control over a subdomain. This was discovered by accident, but then tested on GitHub… and it worked.

In June 2015, an applicant found a problem with WoSign’s free certificate service, which allowed them to get a certificate for the base domain if they were able to prove control of a subdomain.

The reporter proved the problem in two ways. They accidentally discovered it when trying to get a certificate for med.ucf.edu and mistakenly also applied for www.ucf.edu, which was approved. They then confirmed the problem by using their control of theiraccount.github.com/theiraccount.github.io to get a cert for github.com, github.io, and www.github.io.

They reported this to WoSign, giving only the Github certificate as an example. That cert was revoked and the vulnerability was fixed. However recently, they got in touch with Google to note that the ucf.edu cert still had not been revoked almost a year later.

As you can imagine, this should be a cause for quite some concern:

The lack of revocation of the ucf.edu certificate (still unrevoked at time of writing, although it may have been by time of posting) strongly suggests that WoSign either did not or could not search their issuance databases for other occurrences of the same problem. Mozilla considers such a search a basic part of the response to disclosure of a vulnerability which causes misissuance, and expects CAs to keep records detailed enough to make it possible.

Mozilla also noted that WoSign never informed it of the earlier misissuance either. This is a pretty big mistake. The Mozilla post also calls out some questionable activity by WoSign in backdating certificates, but this first point is the really troubling one.
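To make the failure mode concrete, here is a minimal hypothetical sketch (in Python, and emphatically not WoSign's actual code) of how a validation pipeline that derives a "base domain" from a proven subdomain ends up issuing for names the applicant never controlled:

```python
# Hypothetical sketch of the flawed logic: deriving a "base domain" from a
# validated subdomain and treating it as validated too.

def flawed_issuable_names(proven_name: str) -> set:
    """What a buggy CA might be willing to put on a certificate."""
    labels = proven_name.split(".")
    base = ".".join(labels[-2:])   # naive "base domain" derivation
    return {proven_name, base}     # proving theiraccount.github.com unlocks github.com

def safer_issuable_names(proven_name: str) -> set:
    """A CA should only issue for names the applicant actually proved control
    over, and must respect public-suffix boundaries (github.io, for instance,
    is itself a public suffix)."""
    return {proven_name}

print(flawed_issuable_names("theiraccount.github.com"))
# {'theiraccount.github.com', 'github.com'}  <- the misissuance described above
print(safer_issuable_names("theiraccount.github.com"))
# {'theiraccount.github.com'}
```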

I recognize that until a better system is found, certificate authorities issuing certificates is about all we have right now for web security — but, once again, it really seems like we need to be moving to a better solution.

Filed Under: certificate authority, https, security, ssl, subdomains
Companies: github, mozilla, wosign

Russian Censor Bans Comodo… Doesn't Realize Its Own Security Certificate Is From Comodo

from the ow!-my-foot!-shot-it-right-off! dept

The Russian government’s state censorship organization, Roskomnadzor (technically its telecom regulator), has been especially busy lately as the government has continued to crack down on websites it doesn’t like. However, as pointed out by Fight Copyright Trolls, it appears that Roskomnadzor may have gone a bit overboard recently, in response to a court ruling that came with a massive list of sites to be banned (over a thousand pages). Apparently, as part of that, various sites associated with Comodo were all banned. That’s pretty bad for a variety of reasons, starting with the fact that Comodo remains one of the most popular issuers of secure certificates for HTTPS.

In fact, as many quickly noted, Roskomnadzor’s own website happens to be secured with a certificate from… Comodo:

The impact of this isn’t entirely clear, but the Rublacklist site appears to be implying (via my attempt at understanding Google Translate’s translation…) that those who rely on Roskomnadzor’s registry of sites to block may themselves be blocked from accessing the list, because the agency’s own site is effectively blocked by that list. Oops.

Filed Under: ban list, censorship, https, roskomnadzor, russia, security certificate, ssl
Companies: comodo

Ted Cruz's New Presidential Campaign Donation Website Shares Security Certificate With Nigerian-Prince.com

from the feeling-safe-donating-yet? dept

Update: Yes, as lots of angry people are screaming at us (including with detailed explanations of how incredibly, unbelievably, astoundingly stupid I must be), this is a result of Cruz using Cloudflare, which lumps unrelated domains onto the same HTTPS certificate. And yes, Techdirt.com’s certificate is hosted by Cloudflare also, and we share it with other domains as well. Feel free to continue to read the original story below and contribute to how stupid you think I am in the comments…

We’re big believers in using HTTPS to secure websites (even if HTTPS certificates have their problems — it’s still better than the alternative). But there are pitfalls in setting up your certificate correctly, as newly announced presidential candidate Senator Ted Cruz apparently discovered this morning. Because along with his campaign launch speech (which was widely mocked by the Liberty University students who were forced to attend), he put up a website for donations. And that website didn’t default to HTTPS and also listed nigerian-prince.com as an alternate domain on the security certificate, as first noted by the Twitter feed @PwnAllTheThings:

A few hours after this was first noticed, the Cruz campaign appears to have removed nigerian-prince.com from its certificate, but it still raises some questions about just who he has hired to build his websites. I guess that’s what happens when even the technologists in your own party openly mock Ted Cruz’s ignorance when it comes to technology issues like net neutrality.
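For the curious, those "alternate domains" live in the certificate's subjectAltName extension, which is exactly where shared CDN certificates pile up unrelated names. Here is a minimal sketch of listing them, assuming Python's standard ssl module (the hostname below is just an illustrative target):

```python
import socket
import ssl

def subject_alt_names(host: str, port: int = 443) -> list:
    """Return the DNS names listed in the served certificate's subjectAltName."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    return [value for kind, value in cert.get("subjectAltName", ()) if kind == "DNS"]

if __name__ == "__main__":
    # A shared CDN certificate routinely lists dozens of unrelated domains here.
    for name in subject_alt_names("www.cloudflare.com"):
        print(name)
```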

Filed Under: donations, https, internet security, nigerian prince, presidential campaign, security, security certificates, ssl, ted cruz, tls

Chief Information Officers Council Proposes HTTPS By Default For All Federal Government Websites

from the being-the-change-people-have-been-waiting-for dept

In a long-overdue nod to both privacy and security, the administration finally moved Whitehouse.gov to HTTPS on March 9th. This followed the FTC’s March 6th move to do the same. And yet, far too many government websites operate without the additional security this provides. But that’s about to change. According to a recent post by the US government’s Chief Information Officers Council, HTTPS will (hopefully) be the new default for federal websites.

The American people expect government websites to be secure and their interactions with those websites to be private. Hypertext Transfer Protocol Secure (HTTPS) offers the strongest privacy protection available for public web connections with today’s internet technology. The use of HTTPS reduces the risk of interception or modification of user interactions with government online services.

This proposed initiative, “The HTTPS-Only Standard,” would require the use of HTTPS on all publicly accessible Federal websites and web services.

In a statement that clashes with the NSA’s activities and the FBI’s push for pre-compromised encryption, the CIO asserts that when people engage with government websites, these interactions should be no one’s business but their own.

All browsing activity should be considered private and sensitive.

The proposed standard would eliminate agencies’ options, forcing them to move to HTTPS, both for their own safety and the safety of their sites’ visitors. To be sure, many cats will still need to be herded if this goes into effect, but hopefully there won’t be too many details to quibble over. HTTPS or else is the CIO Council’s goal — something that shouldn’t be open to too much interpretation.
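In practice, "HTTPS or else" mostly comes down to two server-side behaviors: redirect every plain-HTTP request to the HTTPS origin, and tell browsers (via a Strict-Transport-Security header on the HTTPS side) to stop trying plain HTTP at all. Here is a hypothetical sketch of the redirect half, with an illustrative hostname and port rather than anything taken from the standard itself:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectToHTTPS(BaseHTTPRequestHandler):
    """Answer every plain-HTTP request with a permanent redirect to HTTPS."""

    def do_GET(self):
        host = self.headers.get("Host", "agency.example.gov").split(":")[0]
        self.send_response(301)
        self.send_header("Location", f"https://{host}{self.path}")
        self.end_headers()

    do_HEAD = do_GET

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectToHTTPS).serve_forever()
```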

As the Council points out, failing to do so places both ends of the interaction at risk. If government sites are thought to be unsafe, it has the potential to harm citizens along with the government’s reputation.

Federal websites that do not use HTTPS will not keep pace with privacy and security practices used by commercial organizations, or with current and upcoming Internet standards. This leaves Americans vulnerable to known threats, and reduces their confidence in their government. Although some Federal websites currently use HTTPS, there has not been a consistent policy in this area. The proposed HTTPS-only standard will provide the public with a consistent, private browsing experience and position the Federal government as a leader in Internet security.

The CIO’s short but informative explanatory page lists the pros of this proposed move, as well as spelling out what HTTPS doesn’t protect against. It also notes that while most sites should actually see a performance boost from switching to HTTPS, sites that pull in elements from other parties will be the most difficult to migrate. And, it notes, the move won’t necessarily be inexpensive.

The administrative and financial burden of universal HTTPS adoption on all Federal websites includes development time, the financial cost of procuring a certificate and the administrative burden of maintenance over time. The development burden will vary substantially based on the size and technical infrastructure of a site. The proposed compliance timeline provides sufficient flexibility for project planning and resource alignment.

But, it assures us (at least as much as any government entity can…), the money will be well-spent.

The tangible benefits to the American public outweigh the cost to the taxpayer. Even a small number of unofficial or malicious websites claiming to be Federal services, or a small amount of eavesdropping on communication with official US government sites could result in substantial losses to citizens.

The CIO is also taking input from the public, at GitHub no less.

A very encouraging — if rather belated — sign that the government is still making an effort to take privacy and security seriously, rather than placing those two things on the scales for intelligence and law enforcement agencies to shift around as they see fit when weighing their desires against Americans’ rights and privileges.

Filed Under: encryption, federal government, ftc, https, ssl, tls, websites, white house

Encryption Backdoors Will Always Turn Around And Bite You In The Ass

from the golden-keys dept

As you may have heard, the law enforcement and intelligence communities have been pushing strongly for backdoors in encryption. They talk about ridiculous things like “golden keys,” pretending that it’s somehow possible to create something that only the good guys can use. Many in the security community have been pointing out that this is flat-out impossible. The second you introduce a backdoor, there is no way to say that only “the good guys” can use it.

As if to prove that, an old “golden key” from the ’90s came back to bite a whole bunch of the internet this week… including the NSA. Some researchers discovered a problem being called FREAK, for “Factoring RSA Export Keys.” The background story is fairly involved and complex, but here’s a short version (that leaves out a lot of details): back during the first “cryptowars,” when Netscape was creating SSL (mainly to protect the early e-commerce market), the US still considered exporting strong crypto to be a crime. To deal with this, RSA offered deliberately weakened (very, very weak) “export grade encryption” that could be used abroad. As security researcher Matthew Green explains, because SSL-enabled websites had to handle both strong crypto and the weak “export grade” crypto (the “golden key” of its era), there was a system that would try to determine which type of encryption to use on each connection. If you were in the US, it would go to strong encryption. Outside the US? Downgrade to “export grade.”

In theory, this became obsolete at the end of the first cryptowars, when the US government backed down for the most part and stronger crypto spread around the world. But, as Green notes, the mechanism behind that old “negotiation” over which crypto to use, the “EXPORT ciphersuites,” stuck around. Like zombies. We’ll skip over a bunch of details to get to the point: the newly discovered attack involves abusing this fact to force many, many clients to accept “export grade” encryption, even if they didn’t ask for it. And it appears that more than a third of websites out there (many served through Akamai’s content delivery network, which many large organizations use) are vulnerable.

And that includes the NSA’s own website. Seriously. Now, hacking the NSA’s website isn’t the same as hacking the NSA itself, but it still seems notable just for the irony of it all (obligatory xkcd).

But the lesson of the story: backdoors, golden keys, magic surveillance leprechauns, or whatever you want to call them, create vulnerabilities that will be exploited, and not just by the good guys. As Green summarizes:

There’s a much more important moral to this story.

The export-grade RSA ciphers are the remains of a 1980s-vintage effort to weaken cryptography so that intelligence agencies would be able to monitor. This was done badly. So badly, that while the policies were ultimately scrapped, they’re still hurting us today.

This might be academic if it was just a history lesson — but for the past several months, U.S. and European politicians have been publicly mooting the notion of a new set of cryptographic backdoors in systems we use today. This would involve deliberately weakening technology so that governments can intercept and read our conversations. While officials are carefully avoiding the term “back door” — or any suggestion of weakening our encryption systems — this is wishful thinking. Our systems are already so complex that even normal issues stress them to the breaking point. There’s no room for new backdoors.

To be blunt about it, the moral of this story is pretty simple:

> Encryption backdoors will always turn around and bite you in the ass. They are never worth it.

Let’s repeat that last line, because it still seems that the powers that be don’t get it:

Encryption backdoors will always turn around and bite you in the ass. They are never worth it.

Whether it’s creating vulnerabilities that come back to undermine security on the internet decades later, or merely giving cover to foreign nations to undermine strong encryption, backdoors are a terrible idea which should be relegated to the dustbin of history.
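For readers curious about what their own client actually negotiates with a given server, here is a minimal sketch using Python's standard ssl module. One caveat: a modern client no longer offers RSA_EXPORT suites at all, so truly probing a server for export-cipher support requires a deliberately weakened client; this just reports the cipher a sane client ends up with:

```python
import socket
import ssl

def negotiated_cipher(host: str, port: int = 443):
    """Return (cipher_name, protocol_version, secret_bits) negotiated by a
    default, modern client."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.cipher()

if __name__ == "__main__":
    name, proto, bits = negotiated_cipher("example.com")
    print(f"{proto} {name} ({bits}-bit)")
    if "EXPORT" in name or (bits or 0) < 128:
        print("Warning: weak or export-grade cipher negotiated")
```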

Filed Under: backdoors, encryption, export encryption, export keys, freak, nsa, openssl, ssl, tls, vulnerability

Gogo Inflight Wifi Service Goes Man-In-The-Middle, Issues Fake Google SSL Certificates

from the 'trusted-partner,'-my-ass dept

When you’re flying, your internet connection is completely in the hands of a single company. There’s no searching around for another signal. So, however the provider decides to handle your connection, that’s what you’re stuck with. A captive audience usually results in fun things like high prices and connection throttling. And, if you’re Gogo Inflight, it means compromising the security of every traveler who chooses to use the service, just because you can.

Gogo Inflight Internet seems to believe that they are justified in performing a man-in-the-middle attack on their users. Adrienne Porter Felt, an engineer that is a part of the Google Chrome security team, discovered while on a flight that she was being served SSL certificates from Gogo when she was requesting Google sites. Looking at the issuer of the certificate, rather than being issued by Google, it was being issued by Gogo.

The bogus certificate was captured in a screenshot tweeted out by Felt.

hey @Gogo, why are you issuing *.google.com certificates on your planes? pic.twitter.com/UmpIQ2pDaU

— Adrienne Porter Felt (@__apf__) January 2, 2015
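Spotting this kind of interception doesn't require being on the Chrome security team. Here is a minimal sketch of checking who actually issued the certificate you are being served, assuming Python's standard ssl module (the hostname is illustrative):

```python
import socket
import ssl

def certificate_issuer(host: str, port: int = 443) -> dict:
    """Fetch the server's leaf certificate (after normal verification) and
    return its issuer fields."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # 'issuer' is a sequence of RDNs, each a sequence of (field, value) pairs.
    return {field: value for rdn in cert["issuer"] for field, value in rdn}

if __name__ == "__main__":
    issuer = certificate_issuer("www.google.com")
    print(issuer.get("organizationName"), "/", issuer.get("commonName"))
    # On a clean connection this names a recognized CA; behind an interception
    # proxy it names whoever minted the substitute certificate. If that
    # substitute isn't trusted, the handshake above simply fails with a
    # verification error, which is its own giveaway.
```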

Now, Gogo Inflight likely has several reasons why it would perform a MITM attack on its users, but none of them justify stripping away previously existing security layers. The company loves to datamine and it definitely makes an effort to “shape” traffic by curtailing use of data-heavy sites. It also, as Steven Johns at Neowin points out, is an enthusiastic participant in law enforcement and investigative activities, going above and beyond what’s actually required of service providers.

In designing its existing network, Gogo worked closely with law enforcement to incorporate functionalities and protections that would serve public safety and national security interests. Gogo’s network is fully compliant with the Communications Assistance for Law Enforcement Act (“CALEA”). The Commission’s ATG rules do not require licensees to implement capabilities to support law enforcement beyond those outlined in CALEA. Nevertheless, Gogo worked with federal agencies to reach agreement regarding a set of additional capabilities to accommodate law enforcement interests. Gogo then implemented those functionalities into its system design.

So, whatever its myriad reasons for compromising the security of travelers, it’s likely the law enforcement angle that has the most to do with its fake SSL certificates. Every communication utilizing its service is fully exposed. Gogo keeping tabs on its users for itself (data mining) and law enforcement also exposes them to anyone else on the plane who wishes to do the same. Nowhere has it stated upfront that it will remove the security from previously secure websites and services. In fact, it says exactly the opposite in its Privacy Policy.

The airlines on whose planes the Services are available do not collect any information through your use of the Services, but we may share certain types of information with such airlines, as described below. Please remember that this policy only covers your activities while on the Gogo Domains; to the extent you visit third party websites, including the websites of our airline partners, the privacy policies of those websites will govern.

Except that those policies can’t govern, not when their underlying security has been compromised by fake Gogo SSL certificates.

The solution for travelers is to skip the service entirely, or run everything through a VPN. Gogo welcomes the use of VPNs for greater security, but even this wording is at odds with what it’s actually doing.

Gogo does support secure Virtual Private Network (VPN) and Secure Shell (SSH) access. If you have VPN, Gogo recommends that you use secure VPN protocols for greater security. SSL-encrypted websites or pages, typically indicated by “https” in the address field and a “lock” icon, can also generally be accessed through the Gogo Services. You should be aware, however, that data packets from un-encrypted Wi-Fi connections can be captured by technically advanced means when they are transmitted between a user’s Device and the Wi-Fi access point. You should therefore take precautions to lower your security risks.

Again, precautions are moot if Gogo deliberately inserts itself into the transmission with bogus certificates.

Gogo has yet to respond to this, but I would imagine its answer will involve pointing to the mess of contradictions it calls a Privacy Policy. Gogo can run its service however it wants to, but with its upcoming move into providing text messaging and voicemail access, it should really revamp the way it handles its customers’ connections.

Filed Under: fake certs, mitm, security, ssl, wifi in the sky
Companies: gogo

How The NSA Works Hard To Break Encryption Any Way It Can

from the brute-force dept

Spiegel has published a detailed article, relying mostly on documents that Ed Snowden leaked, looking at the many ways in which the NSA breaks encryption (and the few situations where it still has not been able to do so). As we’ve seen from previous leaks, the NSA stupidly treats encryption as a “threat.”

And, sure, it is a “threat” to the way in which the NSA snoops on everything, but for the vast majority of users, it’s a way to protect their privacy from snooping eyes. The report does reveal that certain encryption standards appear to still cause problems for the NSA, including PGP (which you already use for email, right?), OTR (used in some secure chat systems) and the VoIP cryptography system ZRTP. Phil Zimmermann, who helped develop both PGP and ZRTP, should be pretty damn proud of his achievements here.

As the report notes, the NSA has the most trouble around open source programs, because it’s much more difficult to insert helpful backdoors:

Experts agree it is far more difficult for intelligence agencies to manipulate open source software programs than many of the closed systems developed by companies like Apple and Microsoft. Since anyone can view free and open source software, it becomes difficult to insert secret back doors without it being noticed. Transcripts of intercepted chats using OTR encryption handed over to the intelligence agency by a partner in Prism — an NSA program that accesses data from at least nine American internet companies such as Google, Facebook and Apple — show that the NSA’s efforts appear to have been thwarted in these cases: “No decrypt available for this OTR message.” This shows that OTR at least sometimes makes communications impossible to read for the NSA.

When it comes to non-open source systems, well, there the NSA has its ways in. In fact, the NSA seems rather proud of the fact that it can make “cryptographic modifications to commercial or indigenous cryptographic information security devices or systems in order to make them exploitable.”

The report also shows that VPNs are targeted by the NSA, and it has had a fair bit of luck in breaking many of them (especially those that rely on PPTP — which has long been recognized as being insecure, but is still widely used by some VPN providers). However, it also shows that the NSA has been able to crack IPsec VPN connections as well. In short: your VPN probably isn’t secure from the NSA if it wants in.

The NSA also has apparently been able to crack HTTPS connections, and does so regularly:

The NSA and its allies routinely intercept such connections — by the millions. According to an NSA document, the agency intended to crack 10 million intercepted https connections a day by late 2012. The intelligence services are particularly interested in the moment when a user types his or her password. By the end of 2012, the system was supposed to be able to “detect the presence of at least 100 password based encryption applications” in each instance some 20,000 times a month.

HTTPS is still a lot more secure against non-NSA-level hackers, but it certainly shows that it’s not a perfect solution.

Another big reveal: the NSA has the ability (at least some of the time) to decrypt SSH (Secure Shell) which many of us use to access computers/servers remotely.

There’s lots more in the article and in the many, many included documents (just a few of which are shown below). It’s well worth reading.

However, the key point is that the NSA is working very, very hard to undermine key encryption systems used around the internet to keep people safe. And rather than sharing when those systems are cracked and helping to make them stronger, the NSA is exploiting those cracks to its own advantage. That may not be a surprise, but for years the NSA has insisted that it is helping to make encryption stronger to better protect the public. The revelations from this article suggest that isn’t even remotely close to true.

Filed Under: encryption, gchq, nsa, otr, pgp, ssh, ssl, surveillance, zrtp

Chrome Security Team Considers Marking All HTTP Pages As 'Non-Secure'

from the moving-towards-encryption dept

Back in August, we noted that Google had started adjusting its search algorithm to give a slight boost to sites that are encrypted. That is, all else equal, sites that use HTTPS will get a slight ranking boost. The company made it clear that the weight of this signal will increase over time, and this is a way of encouraging more websites to go to HTTPS by default (something that we’ve done, but very few other sites have done).

Now it appears that the Chrome Security Team is taking things even further: suggesting that all HTTP sites be marked as non-secure:

We, the Chrome Security Team, propose that user agents (UAs) gradually change their UX to display non-secure origins as affirmatively non-secure. We intend to devise and begin deploying a transition plan for Chrome in 2015.

The goal of this proposal is to more clearly display to users that HTTP provides no data security.

More specifically:

UA vendors who agree with this proposal should decide how best to phase in the UX changes given the needs of their users and their product design constraints. Generally, we suggest a phased approach to marking non-secure origins as non-secure. For example, a UA vendor might decide that in the medium term, they will represent non-secure origins in the same way that they represent Dubious origins. Then, in the long term, the vendor might decide to represent non-secure origins in the same way that they represent Bad origins.

This seems like it could have quite an impact in driving more sites to finally realize that they should start going to HTTPS by default. There’s really no excuse not to do so these days, and it’s good to see the Chrome Security Team make this push. The more encrypted traffic there is, the better.

Filed Under: chrome, chromium, encryption, https, non-secure, privacy, security, ssl
Companies: google

Thanks To Namecheap For Sponsoring Techdirt's Switch To SSL

from the for-a-secure-internet dept

Post sponsored by Namecheap

As some of you know, Techdirt recently completed the process of protecting all Techdirt traffic with full SSL encryption — something we believe every internet company should do. Part of this process involved seeking a sponsor to help us offset the money and time spent getting everything switched over, and today we’re happy to announce that Namecheap has stepped up to that role.

We’re very happy to work with Namecheap, as the company has established itself as a defender of user rights and an open and secure internet, sharing many of the same values that we espouse here at Techdirt. They were among the first domain registrars to speak up against SOPA, they contributed heavily to the matching funds in our Beacon campaign to raise money for net neutrality reporting, and they do frequent fundraising for groups like the EFF and Fight For The Future.

As part of our sponsorship deal, you’ll notice a message from Namecheap at the top of Techdirt, and see a couple more posts highlighting work the company has done and the services it offers — including SSLs.com, Namecheap’s SSL certificate shop. We’re grateful to Namecheap for its support, which helps our small team keep turning out quality content while juggling important technical upgrades like this one. We hope our readers will take a moment to support Namecheap in return, and check out its services for your needs when it comes to domain names, hosting and security certificates.

Filed Under: sponsorship, ssl, techdirt
Companies: namecheap