Europeans Club Google Over The Head At A Rate Of 1,000 Requests Per Hour After Its Search Engine Amnesia Tool Goes Live

from the Google's-search-results-soon-to-be-as-reliable-as-a-retired-boxer's-memory dept

Apparently, European citizens prefer a sanitized web — one that won’t clutter up their vanity searches with embarrassing results. Julia Fioretti’s report for Reuters on the new “forget me now” web form Google recently deployed contains this impressive fact:

After putting up the online form in the early hours of Friday, Google received 12,000 requests across Europe, sometimes averaging 20 per minute, by late in the day, the company said.

Now, Google will have to sift through these entries to determine which requests exceed the bar set by the EU’s data protection law. Despite the EU’s insistence that European citizens have the right to be “forgotten,” very little has been determined about where the bright line falls between citizens’ privacy and the public’s right to know. Data protection authorities are supposed to meet next week to attempt to reach some sort of consensus. Meanwhile, the requests continue to pour in.

The web form is very straightforward, asking for country of origin, as well as a brief statement as to why the complainant feels each listed link should be removed. Those making requests are required to upload a copy of documents proving their identity, a safeguard against abuse and one that might generate second thoughts in a few requesters (especially if the request fails to meet the eventual applicable standards).

Americans who want certain things to be de-listed are still out of luck… for now. As Eric Goldman points out, US courts are still very hesitant to hold Google accountable for the content it indexes and aren’t in a big hurry to carve holes in Section 230 protections. This holds true even if it’s an algorithm that’s somehow managed to cobble together something unflattering from a massive pile of indexed text.

Today’s case gives us a good example of the growing divergence between US and EU search results. O’Kroley did a vanity search and got the following search results snippet:

Texas Advance Sheet March 2012–Google Books Result books.google.com/books?id=kO1rxn9COwsC …

Fastcase—2012

… indecency with a child in Trial Court Cause N … Colin O’Kroley v. Pringle. (Tex.App., 2012). MEMORANDUM OPINION On February 9, 2012, Colin O’Kroley filed in.

The plaintiff argued that the text snippet was clearly defamatory, even if the document in full wasn’t. The court didn’t buy it.

the undersigned Magistrate Judge has found no case that makes the precise claim that O’Kroley makes here—that the underlying information, viewed in its entirety, is not defamatory, but that it has been rendered defamatory by Google’s automated editing process that juxtaposed two sentence fragments in the snippet. Nevertheless, based upon the “robust” immunity afforded under Section 230, the undersigned Magistrate Judge finds that the editorial acts of Google creating the offensive search result are subject to statutory immunity. For the foregoing reasons, the undersigned Magistrate Judge finds that Google is immune from all claims in the complaint, and that Google’s motion to dismiss must be granted.

If you’re looking to cleanse the web stateside, you don’t really have many options beyond the court system, and success isn’t guaranteed there, even if you have a more solid claim than O’Kroley’s. But I’m not sure there’s any reason to clamor for a “right to be forgotten” anyway. Google’s new service doesn’t actually make content disappear; it simply removes links from Google’s own index. Other search engines will still be able to locate the content, at least for the time being.

Beyond that, you have to consider the implications of putting the “keep/remove” decision in the hands of politicians and tech giants. Both can be incredibly self-serving, and neither truly has the best interests of the public in mind. The EU may be able to dictate Google’s delistings, but at this point, it’s not operating on anything more concrete than a gut feeling that there’s something wrong with good people being linked to bad stuff. That’s an unrealistic goal: good people will still be wrongly linked with bad stuff, and bad people will still get away with hiding evidence of their wrongdoing.

A lot of bad precedent has led to this decision, like superinjunctions and defamation laws so easily abused that certain countries have become temporary “homes” for libel tourists. This push for Google specifically to operate a deliberately faulty search engine has been years in the making, starting with cries about Google “enabling” piracy (and child pornography, etc.) by returning the search results it was asked to fetch, and culminating in this exercise in symbolic gestures: Google whitewashing its search results while the troubling content itself remains undisturbed.

Filed Under: europe, right to be forgotten, search
Companies: google