In the hours following the Jan. 6 attack on the Capitol, employees at Facebook tasked with preventing “potential offline harm” found themselves under siege by a mob of a different sort. Reports of abusive content from users were flooding in. As one employee put it in an internal forum, many of the flagged posts “called for violence, suggested the overthrow of the government would be desirable, or otherwise voiced support for the protests.” The same day, Instagram employees reported that there were “no existing” protections against an onslaught of inciting content in places like the app’s list of most widely used hashtags.
Facebook CTO Mike Schroepfer called on his staff to “Hang in there.” In response, employees began to openly accuse the company of fomenting the insurrection. One wrote, “We’ve been fuelling this fire for a long time and we shouldn’t be surprised it’s now out of control.”
“Schrep, employees are tired of ‘thoughts and prayers’ from leadership,” another response read. “We want action.”
Screenshots of Meta employees’ reactions to the Jan. 6 Capitol riot were part of the Facebook Papers, a trove of documents that offer an unprecedented look inside the most powerful social media company in the world. The records were first provided to Congress last year by Frances Haugen, a Facebook product manager-turned-whistleblower, and later obtained by hundreds of journalists, including those at Gizmodo. Haugen testified before Congress about Facebook’s harms in October 2021.
As part of an ongoing project to make these once-confidential records accessible to the general public, Gizmodo is today — for the first time — publishing 28 of the documents previously exclusively shared with Congress and the media. We have undertaken this project to help better inform the public about Facebook’s role in a wide range of controversies, as well as to provide researchers with access to materials that we hope will advance general knowledge of social media’s role in modern history’s most troubling crises. Less than two weeks after Donald Trump’s mob attacked the Capitol, the results of a poll commissioned by Facebook itself showed what already felt anecdotally true to many: that a majority of Americans believed Facebook at least partly responsible for the events of Jan. 6.
The documents will reveal to you, for instance, an internal analysis of the many groups that Facebook knew to be prolific sources of both voter suppression efforts and hate speech targeting its most marginalised users. The records show the company was privately aware of the growing fears among users of being exposed to election-related falsehoods. The papers show that Meta’s own data pinpointed the account of then-President Trump as being principally responsible for a surge in reports concerning violations of its violence and incitement rules.
Today’s release is the first of a series of posts from Gizmodo to be published in tandem with legal and academic partners. Our goal is to minimise any costs to individuals’ privacy and any furtherance of other harms while ensuring the responsible disclosure of the greatest amount of information in the public interest possible. You can read more about our methodology and what we have redacted here.
Future releases will be added to this page, a directory, that will eventually offer our readers links to all of the leaked internal documents we have published.
Documents About the Jan. 6 Capitol Attack
This document offers a top-level overview of steps taken by the Jan. 6 Integrity Product Operations Centre (IPOC), a working group formed after the 2020 election, largely responsible for investigating and mitigating threats that surfaced on-platform in its aftermath.
An internal memo from Meta CTO Mike Schroepfer about how “saddened” he is by the insurrection, followed by dozens of comments by employees who are equally “saddened” by the company’s inarguable role in making it happen.
Results of a platform-wide survey intended to gauge how Facebook users felt about the company’s response to the insurrection, and what they felt Facebook’s “role/responsibility” should be “in light of recent events.”
- Facebook Insurrection Redacted for Congress enclosures (Part 1)
- Facebook Insurrection Redacted for Congress enclosures (Part 2)
Part 1 is from Jan. 7, showing the spike in users reporting content for being “violent and inciting,” along with a list of the top posts, users, and hashtags being reported. Part 2 describes some of the “break the glass” measures proposed internally to reduce the likelihood of this kind of content cropping up on Instagram feeds.
An internal post from an Integrity team member linking out to a Vox video explainer on the “rhetoric of violence” that pervaded social media before the riot.
Papers Describing the Election-Related Task Force Monitoring “Complex Financial Organisations”
This document describes what the Task Force was responsible for over the course of the election, and offers reasons why that work should be generalised beyond the election.
A document offering examples of what the Task Force was responsible for investigating.
Papers Describing 2020 Election-Related Pages, Posts, Etc.
A 2018 document describing some early ideas to address fake engagement and misinformation in the run-up to the election.
An internal study trying to figure out where partisan pages get their massive follower counts.
- Understanding the Impact of Political Content on Facebook Experience and Sentiment
- Political Content on Facebook (Part 1): Understanding Consumer Experiences and How Facebook Can help
Two documents detailing research into how the average Facebook user feels about political content on their newsfeed, including proposals for company actions to ease negative sentiment.
Internal Election-Related Research
A document describing the results of a platform-wide survey asking thousands of Facebook users what the company can — and should — do about content related to voter suppression.
The results of an internal review identifying which election-related pages and people were, and were not, flagged for extra protections against reporting using Facebook’s internal “XCheck” program.
Internal Election-Related Proposals
A proposal outlining ideas for how Meta can handle “newsworthy” (but potentially false) claims about the covid-19 pandemic and politics circulating the platform in mid-2020.
An internal proposal explaining how the company might expand its definition of “voter suppression.”
An internal analysis proposing a new method for uncovering which users are exposed to outsized quantities of political hate speech.
Internal Election-Related Explainers
An internal document outlining a “conceptual framework” that could potentially be used to uncover voter disenfranchisement targeting at-risk groups across Meta’s platforms.
Election-Related Platform and Product Updates
A post by Meta’s VP of Integrity, Guy Rosen, announcing “lockdown” efforts toward the end of 2019, in advance of the upcoming election.
An internal progress report (and resulting write-up) describing how the company planned to stop political groups from being recommended to users across the platform following the 2020 election, after failing to do so in the months prior.
A high-level overview describing how the company’s “crisis detection” strategies evolved in advance of the 2020 election.
A document describing the new “Civic Targeting Risk Scores” (CTRS) used by the company to suss out users at high risk for being targeted with disenfranchisement and political misinformation.
An announcement describing a new effort to conduct daily reviews of the most popular content across people’s Feeds, Stories, Pages and more; an attempt to uncover what kinds of content were gaining traction across the platform in advance of the then-upcoming election.
A document describing the platform’s rationale for not interfering with “political publishers” recommended in feed until after the election was over.
- Proposal to reset White House Instagram over hostile followers

An internal memo flagging the “overwhelmingly” hostile comments on the White House Instagram account after Biden was sworn into office.
- Groups before election
A retrospective from one product manager about lessons learned while handling problematic Facebook Groups during the 2020 election cycle.
A miscellaneous agenda from a 2019 “Election Research Workshop.”