Facebook Bans Vaccine Hoax Researchers, Blames FTC

Members of a New York University research team studying political advertising and covid-19 misinformation on Facebook found their accounts suspended Tuesday night, with Facebook pinning the action on the privacy settlement it reached with federal regulators in 2019.

Laura Edelson, a Ph.D. candidate at the Tandon School of Engineering, said that Facebook had suspended accounts connected to the NYU project, Cybersecurity for Democracy, severing the access of more than two dozen researchers and journalists examining the spread of misinformation online.

Facebook suspended the accounts of three researchers as well as the team’s access to Facebook’s Ad Library and CrowdTangle, tools that provide data on how often particular posts are viewed, liked, and shared.

Among the projects affected by Facebook’s decision is the Virality Project, launched by the Stanford Internet Observatory, which works to understand disinformation about covid-19 spreading on Facebook. The effort is backed by NYU and the University of Washington.

Damon McCoy, an associate professor of computer science and engineering at the NYU school, blasted the decision, accusing Facebook of attempting to “squash legitimate research” into the company’s role in spreading harmful content.

“With its platform awash in vaccine disinformation and partisan campaigns to manipulate the public, Facebook should be welcoming independent research, not shutting it down,” McCoy said.

Facebook’s director of product management, Mike Clark, wrote in a blog post on Tuesday that the decision was prompted by an agreement with the Federal Trade Commission to protect user privacy. (The FTC did not respond when asked for comment.)

Jonathan Mayer, an assistant professor at Princeton University, said Facebook’s justification was merely a pretext to obstruct academic research that would hold Facebook accountable.

“Facebook’s legal argument is bogus,” Mayer tweeted.

A browser extension used by the research team, called Ad Observer, allows individual users to participate in the research by automatically copying any ads they encounter on Facebook. By design, users must provide their consent to participate, and none of their personal information is collected through the tool.

“Facebook is silencing us because our work often calls attention to problems on its platform,” Edelson said. “Worst of all, Facebook is using user privacy, a core belief that we have always put first in our work, as a pretext for doing this.”

Facebook claims the Ad Observer extension “collected data about Facebook users who did not install it or consent to the collection.” But what the company is actually referring to is the names of pages that advertisers have paid Facebook to inject into users’ feeds.

Ironically, the browser extension Facebook claims is the problem was unaffected by the suspensions: the thousands of users who’ve downloaded Ad Observer are continuing to capture that data. One of the suspended accounts also belongs to a researcher whose work is unaffiliated with the tool.

Facebook went on to say it had tried to offer up a more “privacy-protective” tool, but failed to mention that the tool, known as FORT, has significant limitations. Its dataset, for instance, includes only ads that Facebook itself identified as political, and only those shown in the three months leading up to the 2020 election, making it useless for most research.

Glaringly, Facebook makes no mention of the fact that the accounts it targeted for suspension are involved in research into covid-19 conspiracies, which remain rampant on the platform.

Sen. Mark Warner, chairman of the Senate Intelligence Committee, blasted Facebook over its decision, saying the NYU team had exposed fraud and predatory financial schemes on the platform, and repeatedly unearthed proof of Facebook running ads that violate its own terms of service.

“For several years now, I have called on social media platforms like Facebook to work with, and better empower, independent researchers, whose efforts consistently improve the integrity and safety of social media platforms by exposing harmful and exploitative activity,” Warner said. “Instead, Facebook has seemingly done the opposite.”