Last week, I ran an ad on Facebook that was targeted at a computer science professor named Alan Mislove. Mislove studies how privacy works on social networks and had a theory that Facebook is letting advertisers reach users with contact information collected in surprising ways. I was helping him test the theory by targeting him in a way Facebook had previously told me wouldn’t work. I directed the ad to display to a Facebook account connected to the landline number for Alan Mislove’s office, a number Mislove has never provided to Facebook. He saw the ad within hours.
One of the many ways that ads get in front of your eyeballs on Facebook and Instagram is that the social networking giant lets an advertiser upload a list of phone numbers or email addresses it has on file; it will then put an ad in front of accounts associated with that contact information. A clothing retailer can put an ad for a dress in the Instagram feeds of women who have purchased from them before, a politician can place Facebook ads in front of anyone on his mailing list, or a casino can offer deals to the email addresses of people suspected of having a gambling addiction. Facebook calls this a “custom audience.”
You might assume that you could go to your Facebook profile and look at your “contact and basic info” page to see what email addresses and phone numbers are associated with your account, and thus what advertisers can use to target you. But as is so often the case with this highly efficient data-miner posing as a way to keep in contact with your friends, Facebook is going about it in a less transparent and more invasive way.
Facebook is not content to use the contact information you willingly put into your Facebook profile for advertising. It is also using contact information you handed over for security purposes and contact information you didn’t hand over at all, but that was collected from other people’s contact books, a hidden layer of details Facebook has about you that I’ve come to call “shadow contact information.” I managed to place an ad in front of Alan Mislove by targeting his shadow profile. This means that the junk email address that you hand over for discounts or for shady online shopping is likely associated with your account and being used to target you with ads.
Facebook is not upfront about this practice. In fact, when I asked its PR team last year whether it was using shadow contact information for ads, they denied it. Luckily for those of us obsessed with the uncannily accurate nature of ads on Facebook platforms, a group of academic researchers decided to do a deep dive into how Facebook custom audiences work to find out how users’ phone numbers and email addresses get sucked into the advertising ecosystem.
Giridhari Venkatadri, Piotr Sapiezynski, and Alan Mislove of Northeastern University, along with Elena Lucherini of Princeton University, did a series of tests that involved handing contact information over to Facebook for a group of test accounts in different ways and then seeing whether that information could be used by an advertiser. They came up with a novel way to detect whether that information became available to advertisers: watching the stats Facebook provides about the size of an audience after contact information is uploaded. They go into this at greater length and in more technical detail in their paper.
They found that when a user gives Facebook a phone number for two-factor authentication or in order to receive alerts about new log-ins to their account, that phone number becomes targetable by an advertiser within a couple of weeks. So users who want their accounts to be more secure are forced to make a privacy trade-off and allow advertisers to more easily find them on the social network. When asked about this, a Facebook spokesperson said that “we use the information people provide to offer a more personalised experience, including showing more relevant ads.” She said users bothered by this can set up two-factor authentication without using their phone numbers; Facebook stopped making a phone number mandatory for two-factor authentication four months ago.
The researchers also found that if User A, whom we’ll call Anna, shares her contacts with Facebook, including a previously unknown phone number for User B, whom we’ll call Ben, advertisers will be able to target Ben with an ad using that phone number, which I call “shadow contact information,” about a month later. Ben can’t access his shadow contact information, because that would violate Anna’s privacy, according to Facebook, so he can’t see it or delete it, and he can’t keep advertisers from using it either.
The lead author on the paper, Giridhari Venkatadri, said this was the most surprising finding: that Facebook was targeting ads using information “that was not directly provided by the user, or even revealed to the user.”
I’ve been trying to get Facebook to disclose shadow contact information to users for almost a year now. But it has even refused to disclose these shadow details to users in Europe, where privacy law is stronger and explicitly requires companies to tell users what data they have on them. A UK resident named Rob Blackie has been asking Facebook to hand over his shadow contact information for months, but Facebook told him it’s part of “confidential” algorithms, and “we are not in a position to provide you the precise details of our algorithms.”
“People own their address books,” a Facebook spokesperson said by email. “We understand that in some cases this may mean that another person may not be able to control the contact information someone else uploads about them.”
To verify the shadow contact information finding, the researchers ran a real-world experiment. They uploaded a list of hundreds of landline numbers from Northeastern University. These are numbers that people who work for Northeastern are unlikely to have added to their accounts, though it’s very likely that the numbers would be in the address books of people who know them and who might have uploaded them to Facebook in order to “find friends.” The researchers found that many of these numbers could be targeted with ads, and when they ran an ad campaign, the ad turned up in the Facebook news feed of Mislove, whose landline had been included in the file; I confirmed this with my own test targeting his landline number.
“It’s likely that he was shown the ad because someone else uploaded his contact information via contact importer,” a Facebook spokesperson confirmed when I told the company about the experiment.
Facebook did not dispute any of the researchers’ findings. “We outline the information we receive and use for ads in our data policy, and give people control over their ads experience including custom audiences, via their ad preferences,” said a spokesperson by email. “For more information about how to manage your preferences and the type of data we use to show people ads see this post.”
In that post, “Hard Questions: What Information Do Facebook Advertisers Know About Me?”, Facebook’s vice president of ads Rob Goldman discusses how advertising works on Facebook and what you can do if “I don’t want my data used to show me ads.” The post doesn’t mention the surprising collection or use of contact information for targeted advertising that the researchers discovered.
“I think that many users don’t fully understand how ad targeting works today: that advertisers can literally specify exactly which users should see their ads by uploading the users’ email addresses, phone numbers, names+dates of birth, etc,” said Mislove. “In describing this work to colleagues, many computer scientists were surprised by this, and were even more surprised to learn that not only Facebook, but also Google, Pinterest, and Twitter all offer related services. Thus, we think there is a significant need to educate users about how exactly targeted advertising on such platforms works today.”
While Facebook isn’t upfront about which of your contact information it uses for ads, it is upfront about which advertisers are reaching you with it. Facebook’s “ad preferences” page has a section devoted to “advertisers you’ve interacted with” where it will show you which advertisers have you in their contact lists. My own list has over 300 advertisers on it, very few of which I remember consciously giving my contact information to; when I did, it would likely have been a junk email address so that I never had to hear from them again. Mislove says Facebook could be far more transparent here:
“Facebook could also reveal to users which [personal information] was used to target the delivered ad, helping users understand how their [information] is used by advertisers,” said Mislove by email. In other words, Facebook could tell me which email address or phone number all these advertisers have on me. Given the involvement of shadow contact information, though, Facebook may be avoiding that level of transparency because it doesn’t want users to know what personal information it actually holds about them.
There are certainly creepier practices happening in the advertising industry, but it’s troubling that this is happening at Facebook, given its representations about letting you control your ad experience. It’s disturbing that Facebook is reducing the privacy of people who want their accounts to be more secure by taking the information they provide for that purpose and data-mining it for ads. And it’s also troubling to discover another way in which shadow contact information is used, beyond friend recommendations, given that Facebook doesn’t let users see this information about themselves or let them delete it.
Mislove thinks Facebook can make its platform more transparent by telling users everything it knows about them, including all the contact information it’s gathered from various sources, and how that information gets used. He suggests that Facebook let users see all the data it has on them and then let them specify whether it is correct and whether advertisers can use it.
Facebook has claimed that users already have extensive control over what information is made available to advertisers, but that’s not entirely true. When I asked the company last year whether it used shadow contact information for ads, it gave me inaccurate information, and it hadn’t made the practice clear in its extensive messaging to users about ads. It took academic researchers performing tests for months to unearth the truth. People are increasingly paranoid about the creepy accuracy of the ads they see online and don’t understand where the information that produces that accuracy comes from. It seems that, when it came to this particular practice, Facebook wanted to keep its users in the dark.