Facebook Here, May We Interest You in Some Transparency?

Image: Graeme Jennings, Getty Images

If you haven’t been on the receiving end of updates from Facebook comms, then you’re lucky to have avoided the equivalent of a weekly robocall pitching you new and exciting offerings. We got another one today, this time on the subject of “Recommendation Guidelines.”

These crumbs always seem to appear right around when Facebook is implicated in something awful. In this case, it was leaving up the “Kenosha Guard” vigilante page that 17-year-old Kyle Rittenhouse was part of, despite over 450 user reports. Rittenhouse is charged with two counts of homicide and one count of attempted homicide after opening fire on a crowd of protesters with an illegally owned gun. Leaving the page up was an “operational mistake,” according to Mark Zuckerberg.

Anyway, here’s some transparency!

Today, Facebook’s VP of Integrity announced that Facebook upholds “certain standards” for the content it recommends in your feed and has now published the guidelines. Apparently the release of the guidelines has nothing to do with the shooting of three protesters in Kenosha or a recent report finding 3.8 billion views of health-related misinformation pages; Facebook just thinks you should know. A Facebook spokesperson confirmed to Gizmodo that the release was precipitated only by overall transparency efforts, and that the guidelines are nothing new. It’s also a tacit admission that Facebook has, in fact, been driving clicks to this stuff at some point.

Facebook signal-boosts pages, groups, and events you don’t follow by recommending them in your News Feed and Instagram feed. Pages that “may not be eligible” for recommendations, per the guidelines, include the usual: content that discusses self-harm, is sexually “suggestive,” promotes vaping, etc. Facebook also doesn’t recommend “low-quality” content, a term it declined to define in an email to Gizmodo but which covers products like cosmetic procedures, “miracle cures,” weight loss supplements, and payday loans.

Notably, Facebook doesn’t recommend content such as “clickbait,” false or misleading content, vaccine-related misinformation, and articles that lift directly from another source. According to findings from the activist group Avaaz, that didn’t stop a single article on a debunked conspiracy involving Bill Gates and vaccines from reaching a total of 8.4 million views, via pages that reposted it or shared excerpts “to avoid Facebook’s fact-checking process.”

Under Facebook’s guidelines, the Kenosha Guard and its pre-shooting “call to arms” event wouldn’t have been recommended to non-followers, because Facebook “tr[ies]” not to promote pages and events that are “associated with offline movements or organisations that are tied to violence,” which explicitly includes U.S.-based militia groups (a recent policy). It’s unclear whether Facebook actively recommended the page (Facebook declined to confirm or deny this to Gizmodo), but as mentioned, the company had declined to act on user reports that should have gotten the page shut down for discussing potential violence well in advance of last week’s tragic events.

Mark Zuckerberg clarified that the moderation team in charge of the Kenosha Guard hadn’t acted on user reports because moderators didn’t fully understand the workings of “certain militias.”

Facebook also told Gizmodo that it relies on AI to suppress recommendations, and that reducing recommendations won’t prevent the algorithm from promoting a page your friend has shared (unless the page has violated Facebook’s terms, which are reportedly loose for conservative pages). This would probably be bigger news if recommendations played as big a role on Facebook as they do on YouTube.

All of this is to confirm that Facebook has been doing the bare minimum of what we thought it was already doing, because it told us it would, and we’ve seen it in action. But Facebook is known more for blame avoidance than for apologies. And anyway, a little more transparency is almost as good.