Lots Of Big Media Companies Had Access To The Facebook Files; Only Gizmodo Decided To Put In The Work To Make Them Public
from the good-for-them dept
Over the last month or so, you’ve probably heard a lot about the Facebook Files or the Facebook Papers, which are the documents shared by former Facebook employee and whistleblower Frances Haugen with the media, starting with the Wall Street Journal, and then a rather reluctant “consortium” of seventeen big name US-based news organizations. The reluctance was apparent in the name of the Slack group created for all of the reporters working on the project: “Apparently We’re A Consortium Now.”
While I’ve been skeptical of some of the framing of the reporting on the papers, I still do generally believe it was a good thing to get this research out to the world — even if I have little confidence that the media could ever do a good job conveying the story.
As news of the consortium broke, many people called out the fact that all of these big journalism organizations weren’t actually releasing the documents they were going through themselves, often only describing them or quoting parts of them. Given that in a few cases where we’ve been able to see the full documents, it has appeared that some of the reporting was misleading or confused, this was a concern. And, of course, there were other concerns about the makeup of the consortium, and the fact that it was entirely based in the US.
That doesn’t mean that it made sense to freely release all the documents to the public. There are plenty of reasonable concerns about privacy when you have a giant cache of internal documents. That’s why it’s a good thing to find out that Gizmodo has now taken on the task of making the Facebook Papers public, and doing so in partnership with a bunch of independent experts who will help Gizmodo’s reporters sift through the documents and make sure that they’re okay to be released:
Today, we see a strong public need can be served by making as many of the documents public as possible, as quickly as possible. To that end, we've partnered with a small group of independent monitors who are joining us to guide our work in preparing the papers for public release. The mission is to minimize any costs to individuals' privacy or the furtherance of other harms while ensuring the responsible disclosure of the greatest amount of information in the public interest.
As Gizmodo notes, there are many reasons to carefully review the documents before releasing them:
More than for privacy, the documents require extra review to ensure we aren't just handing groups of criminals and spies a roadmap to undermining the controls Facebook does have in place to detect propaganda aimed at spreading lies, hate, and fear. That would undermine any benefit the world stands to reap from this act of whistleblower justice.
The work is just beginning but we're eager to start releasing documents as soon as possible. The first batch will likely consist of documents that warrant the least amount of redactions, just to get the ball rolling.
This is all good news. But it’s a bit crazy that it’s Gizmodo doing all this work. Gizmodo wasn’t even a member of the original consortium and only joined after the first batch of stories went out. Also, Gizmodo is way smaller and with way fewer resources than many of the other members of the consortium, which includes the flush NY Times, the Washington Post, NBC, CNN, the Associated Press, Politico, Wired and more.
The fact that it took a month for any of the members, let alone one of the smaller ones, to actually put together the effort to release the papers is a damning statement on how many members of the consortium see their role in the media as gatekeepers of information, rather than as providers of public access to it.
Filed Under: facebook files, facebook papers, reporting, transparency
Companies: facebook, gizmodo
It's Ridiculous The 'Developing World' Wasn't Given Access To The Facebook Files
from the do-this-the-right-way dept
Mon, Nov 1st 2021 05:53am - Karl Bode
By now it’s fairly clear the Facebook leaks showcase a company that prioritized near-mindless international growth over the warnings of their own experts. They also show a company that continues to painfully struggle to be even marginally competent at scale, whether we’re talking about content moderation or rudimentary customer service. While this has become an all-encompassing media spectacle, the real underlying story isn’t particularly unique. It’s just a “growth for growth’s sake” mindset, where profit and expansion trumped all reason. It just happens to be online, and at unprecedented international scale.
One thing I haven’t seen talked about much is the fact that if you look back a few years, an awful lot of folks in developing nations saw these problems coming a mile away, long before their Western counterparts. For a decade, international activists warned repeatedly about the perils of Facebook’s total failure to understand the culture/regulations/language/norms of the countries they rapidly flooded into. Yet bizarrely, Frances Haugen’s PR team somehow excluded most of these countries when it recently came time to release access to the Facebook files:
re: #facebookpapers — we're hearing from * a ton * of reporters in latin america / india and other regions outside of the west that aren't getting access. please, please dm or email dell/myself and we'll do what we can to help https://t.co/S3Vlw9w19A
— shoshana wodinsky (she/her) (@swodinsky) October 29, 2021
The “consortium” handling release of the files is basically a loose collaboration between 17 U.S. news orgs and a handful of European outlets. They’re all being given access to redacted versions of documents Haugen provided to the Securities and Exchange Commission, showing Facebook repeatedly prioritized growth and profit over, well, everything else. The whole thing was handled by Haugen’s PR team via an ordinary embargo, which some oddly saw as itself somehow nefarious (it’s not: embargoes, though often kind of stupid, are commonly used to maximize impact).
The real problem was who was included in that consortium. And a bigger problem, oddly not talked about until a month into the Facebook leak news cycle, was that much of the developing world was just… excluded… from the coalition by her PR reps. The exclusion of academics and researchers who could make the most sense of the data was a problem. But restricting analysis to mostly white, Western newsrooms (despite Haugen’s very clear understanding that most of Facebook’s impact problems disproportionately harmed developing nations) is particularly odd and tasteless.
For all the problems Facebook (sorry, Meta) has had in the United States in regards to managing the company’s platform at scale, those problems have been dramatically worse internationally. Facebook was so excited to flood into dozens of international locations to wow investors that they didn’t dedicate the time, resources, or attention needed to actually understand what they were doing. Even if they had (and the whistleblowers keep showing they absolutely didn’t), the sheer scale of the expansion made it impossible to do well. That Facebook did so anyway despite being warned about it is an act of greed and hubris.
It was actually a net neutrality debate that keyed many overseas activists into Facebook’s problems more than a decade ago. Activists in India were particularly sensitive to Facebook’s attempts to conflate “the internet” with Facebook in developing nations. If you recall, activists in India successfully derailed Facebook’s Free Basics program, which was Facebook’s attempt to corner developing nation ad markets under the banner of altruism.
Basically, it involved Facebook striking deals with local wireless companies to offer discounted access to Facebook, under the claim that an aggressively curated version of “online” was better than no online access at all. It was a bastardized, walled-garden variation of AOL or CompuServe, in which Facebook decided what information, services, and resources mattered (they initially even banned encrypted services). But activists and international experts were quick to see the problem with giving Facebook this kind of power, especially in countries they didn’t take the time to understand.
We’ve seen repeatedly how conflating “Facebook” with “the internet” (or Whatsapp with “the internet”) has created a laundry list of problems that are especially pronounced in developing nations. The centralized approach of programs like Free Basics defeated the purpose of the open internet, reduced transparency, was a big boon for authoritarian governments, and helped create an even more potent funnel for propaganda. One recurring theme in whistleblower accounts is that Facebook’s own researchers generally warned about all of this, repeatedly, but were ignored for profit and growth’s sake.
As early as 2015 organizations like Mozilla were busy arguing that if Facebook genuinely cared about information access in developing nations, they should simply fund access to the internet itself. Facebook was ultimately forced to back off its plan in some countries like India and Egypt, but if you were a reporter or activist in these countries who pointed out the problems with Facebook’s ambition, you were accused of being an enemy of the poor. When the Free Basics brand became toxic, Facebook just named Free Basics something else (sound familiar?).
There has been some valid and not so valid criticism of the way Haugen handled these latest revelations. Some have tried to argue that because she was smart enough to hire lawyers and a PR team to maximize impact she can’t possibly be technically seen as a “whistleblower.” There was also some brief hyperventilation over a Politico report, with some trying to claim that because she had received some money from investor Pierre Omidyar, she shouldn’t be taken seriously. But a shrewd, organized whistleblower is still a whistleblower, and Omidyar proxy groups actually just funded whistleblower orgs Haugen was part of after she went public. It’s actually a good thing to see a whistleblower do the right thing and not be economically and reputationally devastated for once.
But it’s both weird and telling that people freaked out about these perceived injustices, but didn’t notice that the whistleblower’s PR coalition apparently just forgot the developing world existed:
I have massive respect for a lot of the reporters working on the files but whoever is keeping the document access limited to Americans is in the absolute wrong.
— Chillian J. Yikes! (@jilliancyork) October 29, 2021
Nobody I’ve talked to so far at news organizations seems clear on why this happened (a strange decision for an effort geared toward greater transparency). It’s not like it would be particularly difficult to coordinate the release via the same organizations in places like India that warned about Facebook’s consolidated power almost a decade ago (see: IFEX). Some outlets, like Gizmodo, have been trying to expand access to the source documents to everyone. That’s apparently to the chagrin of Haugen’s PR team, who seems to think they can put the genie back in the bottle.
There was a certain hubris in Facebook stumbling its way across the developing world in a quest for growth without bothering to understand the impact their platform would have on foreign cultures. But there’s a fairly substantial amount of hubris in excluding these developing nations from accessing raw data on a problem they’ve disproportionately been harmed by.
Filed Under: bias, developing nations, facebook files, facebook papers, frances haugen, india, journalists
Companies: facebook, meta
When Facebook Turned Off Its News Feed Algorithm, It Made Everyone's Experience Worse… But Made Facebook More Money
from the oh,-look-at-that dept
For reasons I don’t fully understand, over the last few months, many critics of “big tech” and Facebook, in particular, have latched onto the idea that “the algorithm” is the problem. It’s been almost weird how frequently people insist to me that if only social media got rid of algorithmically recommending stuff, and went back to the old fashioned chronological news feed order, all would be good in the world again. Some of this seems based on the idea that algorithms are primed to lead people down a garden path from one type of video to ever more extreme videos (which certainly has happened, though how often is never made clear). Some of it seems to be a bit of a kneejerk reaction to simply disliking the fact that these companies (which many people don’t really trust) are making decisions about what you may and may not like — and that feels kinda creepy.
In the past few weeks, there’s been a bit of a fever pitch on this topic, partly in response to whistleblower Frances Haugen’s leak of documents, in which she argues that Facebook’s algorithm is a big part of the problem. And then there’s the recent attempt by some Democrats in Congress to take away Section 230 from algorithmically recommended information. As I noted, the bill is so problematic that it’s not clear what it’s actually solving.
But underlying all of this is a general opinion that “algorithms” and “algorithmic recommendations” are inherently bad and problematic. And, frankly, I’m confused by this. At a personal level, the tools I’ve used that do algorithmic recommendations (mainly: Google News, Twitter, and YouTube) have been… really, really useful? And also pretty accurate over time in learning what I want, and thus providing me more useful content in a more efficient manner, which has been pretty good for me, personally. I recognize that not everyone has that experience, but at the very least, before we unilaterally declare algorithms and recommendation engines as bad, it might help to understand how often they’re recommending stuff that’s useful and helpful, as compared to how often they’re causing problems.
And, for all the talk about how Haugen’s leaking has shone a light on the “dangers” of algorithms, the actual documents that she’s leaked might suggest something else entirely. Reporter Alex Kantrowitz has reported on one of the leaked documents, regarding a study Facebook did on what happens when Facebook turns off the algorithmic rankings and… it was not pretty. But, contrary to common belief, Facebook actually made more money without the News Feed algorithm.
In February 2018, a Facebook researcher all but shut off the News Feed ranking algorithm for .05% of Facebook users. “What happens if we delete ranked News Feed?” they asked in an internal report summing up the experiment. Their findings: Without a News Feed algorithm, engagement on Facebook drops significantly, people hide 50% more posts, content from Facebook Groups rises to the top, and, surprisingly, Facebook makes even more money from users scrolling through the News Feed.
Considering how often we’ve heard, including from Haugen herself, that Facebook’s decision-making is almost always driven by what will beneficially impact the bottom line the most, this deserves some consideration. Because the document… suggests something quite different. In fact, what the researchers seemed to find was that people hated it, but it made them spend more time on the site and see more ads because they had to poke around to try to find the interesting stuff they wanted to see, and that drove up ad rates. If Facebook were truly focused on just the bottom line, then, they should consider turning off the news feed algorithm — or, just supporting the awful JAMA bill in Congress which will create incentives for the same result:
Turning off the News Feed ranking algorithm, the researcher found, led to a worse experience almost across the board. People spent more time scrolling through the News Feed searching for interesting stuff, and saw more advertisements as they went (hence the revenue spike). They hid 50% more posts, indicating they weren’t thrilled with what they were seeing. They saw more Groups content, because Groups is one of the few places on Facebook that remains vibrant. And they saw double the amount of posts from public pages they don’t follow, often because friends commented on those pages. “We reduce the distribution of these posts massively as they seem to be a constant quality complaint,” the researcher said of the public pages.
As always, there are lots of factors that go into this, and one experiment may not be enough to tell us much. Also, it’s entirely possible that over time, the long term result would be less revenue because the increasing annoyances of not finding the more interesting stuff causes people to leave the platform entirely. But, at the very least, this leaked research pokes a pretty big hole in the idea that getting rid of algorithmic recommendations does anything particularly useful.
Filed Under: algorithms, chronological, facebook files, facebook papers, frances haugen, jama, leaks, news feed, section 230, user experience
Companies: facebook