Here's What Happens When You Report Something On Facebook

Some Facebook users are bullies who post things that are racist, sexist and otherwise offensive. So what happens when you report them? Facebook Security has posted an explanation that shows you the process.

The guide above (click to enlarge) looks a little bit like the game Mouse Trap, and it's not all that straightforward, though the company breaks down the process a bit, explaining that there are four teams (Safety, Hate and Harassment, Access, and Abusive Content) made up of hundreds of employees that are monitoring complaints 24/7. Facebook says:

If one of these teams determines that a reported piece of content violates our policies or our statement of rights and responsibilities, we will remove it and warn the person who posted it. In addition, we may also revoke a user's ability to share particular types of content or use certain features, disable a user's account, or if need be, refer issues to law enforcement. We also have special teams just to handle user appeals for the instances when we might have made a mistake.

In a community of 900 million users, it's good to know that Facebook is sorting through all the nonsense on the site, even if it doesn't always get it right. [TheNextWeb]



    I always just imagined it as a very large, well-used recycle bin. Given all the stupid like-spam I've reported from compromised accounts and dubious marketers scraping people's Facebooks after someone was dumb enough to want 4000+ "friends," I wonder whether Facebook cares much at all beyond its legal requirement to.

    Cool. Would be easier to read if the image hadn't been shrunk though :/

      The social reporting option is an interesting one

    Considering studies have claimed Facebook users have narcissistic, attention-seeking personalities retained from childhood, one has to ask: why use such a thing at all?

    The whole thing is bloody annoying. It only takes one report to take something down, even if it's within the terms and conditions. I've always thought that when something is wrongly reported and then successfully appealed, the reporter should take the ban the other person would have received. Serial reporters whose reports keep getting overturned on appeal should be blocked entirely.

      You sir, sound like a spammer.

    I'm happy they have something in place for someone reporting self harm/suicidal content. Though on the flip side, it's a bit tempting now to change my friend's status to super emo stuff and report him, then wait for a phone call saying "it's ok, we care" (please do not mistake me for not being very serious about self harm/suicide. It is not funny when a person is *actually* having issues with it)

    This must be new. I did a test several months ago. I reported an "explicit" photo on a friend's FB account, which he was going to close anyway. The "explicit" photo, two puppies running in a field, was removed, and my friend was sent an email. Who decided that two puppies running in a field is explicit?

    I am assuming that it is easier for FB to remove a photo and send an email, rather than have someone from the FB team look at the photo and make a judgment on whether it should stay or not. Imagine if that photo was of something truly wrong. It would probably be a very bad idea to purposefully have someone look at it first to determine if it was 'explicit' or not.
