Your next job application could require a social media background check. Odds are, you have no clue what that means. Nobody does. It's new and scary and probably scours the web for pictures of you puking on the beach. And as you'll discover, one Gizmodo US editor flunked...big time.
But screw speculation. We wanted to know. So we ran background checks on six Gizmodo employees.
Here's what we found, and why you should both freak out about it and embrace it.
First, some context: In May, the FTC gave a company called Social Intelligence the green light to run background checks of your internet and social media history. The media made a big hullabaloo out of the ruling. And it largely got two important facts wrong.
First, contrary to initial reports, the company doesn't store seven years' worth of your social data. Rather, it looks at up to seven years of your history and stores nothing.
The second was the idea that it was looking for boozy or embarrassing photos of you to pass along to your employer. In fact it screens for just a handful of things: aggressive or violent acts or assertions, unlawful activity, discriminatory activity (for example, making racist statements), and sexually explicit activity. And it doesn't pass on identifiable photos of you at all. In other words, your drunken kegstand photos are probably fine as long as you're not wearing a T-shirt with a swastika or naked from the waist down.
Basically, it just wants to know if you're the kind of arsehole who will cause legal hassles for an employer. Which brings us back to my report.
We ran background checks on six Gizmodo employees, including our editor in chief Joe Brown, and all but one came back clean. When the company doesn't find anything incriminating on a potential employee, it simply issues a notice that the employee passed (see below) and doesn't generate a file.
And then there's me. I flunked hard. When that happens, Social Intelligence creates a report, which it would then send to an employer. And if you don't get a job because of your social media report, you can request a copy. Mine's filled with delightful details, like "subject admits to use of cocaine as well as LSD", and "subject references use of Ketamine".
Basically, I may never work again.
Yet the report is fascinating to look at. So privacy be damned, we've posted the entire thing online. We've also annotated it and called out some interesting highlights below.
More importantly, we learned a few things about how it works, and what you can do if you've got to have one of these reports run. And you will.
For starters, what it doesn't include in the report is nearly as interesting as what it does. Every image of me that might identify my ethnicity is blacked out, even my hands. On my homepage, a line that reads "I drink too much beer" has been obscured because it's ultimately irrelevant. Screw you, boss man. I love my beer. (Joe: please do not fire me.)
And then there's the stuff it didn't find. For example, our editor in chief, Joe Brown, has a Facebook account under a different name he uses for close friends who do not want to be subjected to his work-related posts. (And, you know, to avoid annoying publicists who try to friend him.) It's easily findable if you know his personal email address. We gave that address to Social Intelligence, but it didn't dig up his aliased account, just his main profile.
It also seems to help to have a large web footprint. Yes, the check found some negative hits, but they were the tip of the iceberg, my man. There was much more buried deep in my Google search results that could have been just as incriminating, sometimes on even more than one level. One tweet, although written in jest, was pretty much a twofer.
Another interesting tidbit: It only uses the data an employer gives it to run a search. This tends to be standard-issue information from your resume: your name, your university, your email address and your physical location. Which means that, ultimately, you are the one supplying all the data for a background check, because you are the one who supplies that data to your employer. And that means you should be smart about what kinds of contact information you put on your resume.
Your personal email address, especially if you've had it for a long time, could have all kinds of things tied to it that you'd rather an employer not see. Spend the nothing it costs to set up a dedicated job search email account, and list that one on your c.v.
And then there's sex. One of the things that Social Intelligence scans for is "sexually-explicit content for the purposes of sexual excitement and/or erotic satisfaction". That can be photos, video or even text. And there's no clear-cut rule as to what's explicit. "Since our team are in fact human beings," says CEO Max Drucker, "they are able to discern to the best degree possible what 'explicit' means."
Maybe so. And in fairness, the explicit example he sent me did make me want to bleach my eyeballs. But what's sexually explicit in one place, like where I grew up in Alabama, may not be in another, like where I live now in San Francisco. None of my Folsom Street Fair photos turned up in the report. Nor did any of the Bay to Breakers cock shots that I've published on both my blog and in my Flickr stream. I wouldn't consider those explicit (after all, they were taken on the streets of San Francisco) but would I want this extremely NSFW photo going to a potential employer? Probably not.
But ultimately the bottom line, and my takeaway, is that these kinds of services actually make a lot of sense. Employers would have to be stupid not to Google job candidates. Yet it's better for both the employer and the candidate to have a disinterested third party do full-scrape background checks.
We now routinely bandy about the kind of information online that employers are legally prohibited from asking. Your average Facebook profile can reveal an entire litany of details like your race, sexual orientation, national origin, or religious affiliation that are off-limits in the hiring process.
As an employee, you don't want potential employers knowing certain things about you that might make you a less attractive candidate due to their personal biases. As an employer, even if none of those things matter, just accidentally finding them out can be a problem.
For example, consider the following scenario. Let's say you're a California-based employer and you do a basic background check on a job candidate. In scouring the web, you discover a brand new Tumblr update that says "I'm pregnant!" Holy impending mandatory paid time off! But you're a good corporate citizen. That doesn't matter to you. Yet for unrelated reasons, you hire a different candidate. Meanwhile, the rejected candidate sees your company's IP address in her analytics program. She assumes you didn't hire her because she's pregnant. She sues. Now what?
If Social Intelligence finds out you're pregnant, or gay, or a Muslim, or newly married, or newly gay married to a pregnant Muslim, it leaves that out of its report. All an employer sees is, basically, that you passed or failed. And it won't flunk you for getting drunk or knocked up.
Even if you do both things at the same time. Party on.