Can I Stop Big Data Companies From Getting My Personal Information?

I am going to answer this one right here in the intro: no, you can’t. In 2020, it is hard just to go to the grocery store without inadvertently surrendering 40 or 50 highly personal data points on the walk over. Go ahead, delete your Facebook — it makes no difference. It wouldn’t make a difference if you’d never had one in the first place — as we know, Facebook has enough data to build “shadow profiles” for those who, somehow, have never joined the site. We’re pretty much at the stage of harm reduction — trying, at least, to limit Big Data’s file on us. For this week’s Giz Asks, we reached out to a number of experts for advice on how we might go about doing that.

Scott Shackelford

Associate Professor of Business Law and Ethics, Cybersecurity Program Chair, and Executive Director of the Ostrom Workshop on Cybersecurity and Internet Governance at Indiana University

The problem is that things aren’t set to private by default, so we have to take some affirmative steps to take ownership of our cyber-hygiene. There are some basic precautions: using tools like DuckDuckGo that don’t track you as you’re navigating around (at least, not as much); various privacy extensions; a decent VPN. It’s also important not to reuse passwords, and to think critically about using platforms like Facebook or Google to log into tons of sites. In other words, maybe think twice before clicking that ‘link my account’ button, because the more you do that, the more of your data they’re able to accumulate. At the end of the day, be mindful of the stuff you’re putting out there, because one way or the other — even if you use a service like LastPass — there can still be a breach, and your information can still get out there.
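On that last point about breaches, one concrete check you can run yourself: the free Pwned Passwords range API from Have I Been Pwned will tell you whether a password has already shown up in a known breach, without the password itself ever leaving your machine. The sketch below is purely illustrative (it isn’t something the experts here specifically recommend); it hashes the password locally and sends only the first five characters of the SHA-1 hash.

```python
# Illustrative sketch: check a password against the Pwned Passwords range API.
# Only the first five hex characters of the SHA-1 hash are sent; the match is
# done locally against the returned list of hash suffixes.
import hashlib
import urllib.request

def breach_count(password: str) -> int:
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode("utf-8")
    # Each response line is "<hash suffix>:<count>"; look for ours.
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

if __name__ == "__main__":
    # Example only; never paste a real password you intend to keep into scripts you don't trust.
    print(breach_count("correct horse battery staple"))
```

A non-zero count means that exact password has appeared in at least one public breach and shouldn’t be reused anywhere.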

What’s always fun, if you haven’t done it for a while, is to download all the information that Google or Facebook has on you — that can be very illuminating. It’s a worthwhile wake-up call that shows why it’s so important to take this stuff seriously. Ultimately, we can all do a better job with our cyber-hygiene. Until the incentive structure is better aligned to make companies take this stuff more seriously — through regulation or market forces or otherwise — this problem is going to keep getting worse.
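If you do pull one of those archives, even a rough inventory of what’s inside can serve as that wake-up call. Here is a minimal sketch, assuming you have already unzipped a Google Takeout or Facebook “Download Your Information” export into a local folder; the folder path is hypothetical, and counting top-level JSON entries is only a crude proxy for how many data points are in there.

```python
# Rough inventory of an unzipped data export (e.g. Google Takeout or Facebook's
# "Download Your Information" archive). The path is hypothetical; point
# EXPORT_DIR at wherever you extracted your own archive.
import json
from pathlib import Path

EXPORT_DIR = Path("~/Downloads/my-data-export").expanduser()  # hypothetical location

json_files = 0
top_level_records = 0
for path in EXPORT_DIR.rglob("*.json"):
    json_files += 1
    try:
        data = json.loads(path.read_text(encoding="utf-8"))
    except (json.JSONDecodeError, UnicodeDecodeError, OSError):
        continue  # skip anything that isn't valid JSON
    # Count top-level entries as a crude proxy for stored "data points".
    top_level_records += len(data) if isinstance(data, (list, dict)) else 1

print(f"{json_files} JSON files, roughly {top_level_records} top-level records")
```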

But the big tech companies are only part of the problem. There are hundreds and hundreds of these data aggregators, and they’re typically collecting on the order of thousands of data points on each of us, even if you don’t have a Gmail account or a Facebook profile. Frankly, there’s no real way to opt out of that — that’s part of the problem. It’s up to the FTC to go after companies that use unfair and deceptive trade practices. They’re doing some of that, but there’s so much of it out there that they’re pretty tapped out — there’s only so much they can do.

Meg Leta Jones

Associate Professor, Communication, Culture & Technology, Georgetown University

Yes! You can write to your congressperson to get a law passed. The letter could say something like this:

Dear Representative,

I would like to know why I am asked to manage the absurd amount of personal data generated about me. I do not create it, I do not use it, and I do not profit from it. Please stop trying to make it easier for me, in the name of privacy, to manage this mess — I did not make it and I should not have to clean it up. You should pass laws that limit the creation of such data, its use, and its life cycle.

You could move to California and vote for the California Privacy Rights Act, which includes a requirement that some businesses include a DO NOT SELL MY DATA button and adds data minimization obligations.

Without data protection laws, users and data subjects cannot effectively perform the amount of “privacy work” (as Alice Marwick calls it) necessary to be left alone by data companies (who isn’t a data company these days?). Privacy journalists Julia Angwin and Kashmir Hill have gone above and beyond what any “average” user might be expected or willing to do to avoid being tracked by big data companies — with little success.

Privacy is networked and social. Individual choice won’t protect your privacy any more than recycling will solve climate change.

Gus Hurwitz

Associate Professor of Law, University of Nebraska, where he directs the Nebraska Governance and Technology Center; Director of Law & Economics Programs at the International Center for Law & Economics

The simple answer is “not really,” and the longer answer is more complicated.

There’s an entire ecosystem of what we may think of as “Big Data companies.” The most obvious are companies that users interact with directly, like large social media platforms, retailers, and media platforms. These companies can directly see a lot about what users do online. But there are also data aggregators and data brokers, which may not interact with consumers directly but instead get data about consumers from companies or other sources that do.

Many of these companies do have mechanisms to let consumers see what information they collect and also to correct or request the deletion of that data. But the reality is that there are so many of these companies and this is such a cumbersome process that it really isn’t practical for consumers to prevent this information from being collected, shared, or used. This is especially true when we’re talking about data aggregators, which collect data from all sorts of sources. For instance, aggregators may get information about you from your local DMV, government and other public records, or your grocery store.

It is also important to ask why this data is being collected in the first place. Some companies unquestionably misuse or mishandle the consumer data they collect. But most companies collect data to offer new or better products that consumers value. Understanding consumer interests allows companies to develop content that is engaging and products that fill previously unserved demand. It can be used to tailor products to individual types of users, or to craft a user experience that is more enjoyable (or less frustrating).

This is all to say that we should be cautious to not throw the baby out with the bathwater. Consumers absolutely can be harmed by companies’ data collection practices. But just as it is difficult to see all of the players in the big data ecosystem, it can also be difficult to see the various beneficial uses this data can be put to. Any regulation of “big data” should focus on actual harm, not just generalized concern about data collection.

The more likely, and more useful, approach to addressing concerns about data collection is to focus less on the fact of collection and instead to put in place “rules of the road” for how that data can be used, including strict rules that allow consumers to sue when their data is used (or misused) in ways that harm them. This could include clear penalties for companies with lax security practices. It could mean that firms must disclose what data about consumers they have collected or are using, and where they get that data. Or it could mean prohibiting the use of data for specific purposes, such as marketing certain types of products or services.

Sandra Wachter

Associate Professor and Senior Research Fellow in the Law and Ethics of AI, Big Data, and Internet Regulation at the Oxford Internet Institute, University of Oxford

I think the more important question has to do with what happens after your data is obtained. In most cases, the way this works is that you visit a website, or install an app, or rent a movie — whatever it might be — and are asked for your personal information. They might collect something completely neutral or uninteresting — your postcode, your email address, your age — in exchange for some free service. In that situation, it feels like you have control; you know what’s going on, and you know what you’re giving up in order to receive whatever good you’re interested in. But that’s not where the story ends. The interesting part happens after the data is collected — the inferences that are drawn about you from that data. Very often, I think we’re not actually aware of how seemingly banal data can paint a very intimate picture of us. Three clicks on Facebook can reveal my sexual orientation; other platforms can determine, based on the way I engage with them, whether or not I have Alzheimer’s or Parkinson’s disease. My tweets allow you to infer whether I’m depressed. I think we have to be very aware that we’re leaving a trail behind that reveals very sensitive details about us.

One of my current research projects, running for the next three years, is called AI and the Right to Reasonable Inferences. In Europe, there are solid legal frameworks around data that is volunteered — data explicitly asked for and surrendered. But these frameworks don’t cover inferences made from that data, and who has the right to those inferences. It could be that it’s not considered personal data because it’s technically created by somebody else. So if the law doesn’t account for it, we need to figure out what kind of reasonable inferences we should actually allow in our society. Because if all those algorithms are making life-changing decisions about personal and private life, then you should have a right to be reasonably assessed, and that means at least understanding what’s going on and having a say in the matter — some sort of recourse mechanism.

Fred H. Cate

Vice President for Research and Professor of Law at Indiana University, and Senior Fellow at its Center for Applied Cybersecurity Research

There are many little things you can do to try to chip away at third-party access to your data — but it’s not going to make a difference. If you take out a teaspoon of the ocean and throw it on the beach, are you reducing the volume of the ocean? Well, yes, but I doubt anyone would notice.

That said, the most important thing you can do is to stop volunteering data. That includes not only the things you upload onto Facebook, but also clicking ‘No’ whenever a platform asks if they can share information across devices. That information is going to the cloud, and other people are going to have access to it.

You can also use a VPN, and search engines like DuckDuckGo, which don’t collect your data. You can put yourself on Do Not Call or Do Not Share lists — there are lots of those for data brokers and credit bureaus. You can opt out of financial information sharing at your bank. Really, you could spend all day every day taking advantage of these various things, and you might feel better that you’re doing something, and that’s not unimportant, but is there going to be any less data floating around about you? Probably not.

It’s important to separate these companies’ simply having the data from all the other stuff around it — is the data accurate? Is it relevant to the purpose it’s being used for? Is it being used fairly? Section 5 of the FTC Act, which prohibits unfair and deceptive trade practices, helps, but, again, if there are a million violations a day, they’re probably investigating two of them. And if there are a billion a day, they’re still only investigating two.

Do you have a burning question for Giz Asks? Email us at tipbox@gizmodo.com.

