Facebook And Instagram Now Encouraging Moderators To Dob On Underage Users

It just became more difficult for children under the age of 13 to use Facebook and Instagram.

Facebook already doesn’t technically allow children under the age of 13 to create accounts. But until recently, Facebook only looked into children’s accounts if someone reported them for being underage.

Now, whenever reviewers come across an account that appears to belong to a child under 13, regardless of why it was flagged, they are supposed to lock it, according to a report from TechCrunch and a recent Facebook newsroom post.

If a user is blocked for this reason but is actually 13 or older, they will have to present Facebook with a government-issued ID to prove their age and have their account unlocked.

Facebook’s age policy was established in line with the US Children’s Online Privacy Protection Act, which states that sites can only collect data from children under the age of 13 with parental consent. The age policy itself has not changed — just the guidance that Facebook provides to reviewers.

A Facebook spokesperson confirmed the “operational” change to TechCrunch, explaining that reviewers will be trained to implement the age-restriction policy for both Instagram and Facebook. In response to Gizmodo’s request for comment, a Facebook spokesperson referred Gizmodo to the recent newsroom post about the change.

In the post, Facebook’s head of global policy management Monika Bickert says the new approach to regulating underage users is a response to the UK’s Channel 4 documentary, in which an undercover reporter trained to become a Facebook moderator through a Dublin-based firm, CPL Resources.

“Since the program, we have been working to update the guidance for reviewers to put a hold on any account they encounter if they have a strong indication it is underage, even if the report was for something else,” Bickert writes.

The documentary showed examples of reviewers declining to remove posts containing racist, violent and abusive content, seemingly at odds with Facebook’s standards. It also included a statement from a reviewer that moderators did not take action on underage accounts unless a user admitted to being underage.

“If not, we just like pretend that we are blind and that we don’t know what underage looks like,” the reviewer said.