Meanwhile In The Future: Everybody Is Reviewed In A Reputation Database

Recently, an app called Peeple got a whole lot of attention for trying to be the Yelp for Humans. But it’s just one in a really long line of apps that try to apply consumer reviews to people.

The app LuLu lets women leave reviews for the men they have dated. The app SketchFactor invited people to leave reviews for neighbourhoods. And on some level, apps like AirBnB, Rate My Professor, Uber and ZocDoc are all places where people can review other people.

But what would it be like if we lived in a world where everything you do is subject to a rating doled out by a combination of machines and other people? Other drivers can rate and review your road etiquette. Your coworkers can review your work and your personality. Your teammates can review your performance on the soccer field. Your partners can review your, ahem, performance.

There are a bunch of science fiction stories about this kind of future, where reputation is something that’s extremely valuable. The book Super Sad True Love Story by Gary Shteyngart was brought up by two of the three guests this week. And futurists actually have a name for this idea: the “reputation economy.” And it’s already happening.

Michael Fertik, the founder of Reputation.com and the author of the book The Reputation Economy, talks on the episode about all the ways that brands and companies are already compiling your information into a profile that helps them make decisions about you. LinkedIn, AirBnB, Uber: they’re all gathering what Fertik calls your “digital exhaust” to learn more about you.

But there was something about the app Peeple that really made people angry. The founders of Peeple gave an interview to the Washington Post, and once that story went live they were inundated with really intense criticism. One founder said she even got death threats. People really did not like Peeple. So what makes Peeple different from, say, AirBnB, where you rate your guests? Jeff Hancock, a professor of communications at Stanford, says it comes down to turning your interpersonal relationships into transactions.

He gives this example: If your friend came over for dinner and brought a $US20 bottle of wine, you’d thank her. If your friend came over and handed you four $US5 bills, you’d be a little confused. They’re the same, kind of, but one makes it feel like being friends, and hanging out, is a business transaction.

When we see things as businesses or products, we’re ok with reviewing them. That’s what makes Yelp and Amazon ok. And it’s also what makes people ok with things like Rate My Professor and ZocDoc. Your professor and doctor are providing you with services. And there are all kinds of rating systems like this, for babysitters and Uber drivers and plumbers and more. Fertik says that will just keep happening, each in their own little space.

But in 15 or 20 years, all those reputation systems might be combined. And they might totally dictate your life: what jobs you get, what insurance you’re offered, who you date, where you live. And that’s where people start to worry.

In Super Sad True Love Story, this future is not a utopia. In fact, in most science fiction that involves any kind of reputation system like this, it is quickly subverted and used to control people. Alison Hearn, a professor of media studies at Western University in Canada, says that often these kinds of reputation systems operate on behalf of brands, not people. And as such they’re biased towards a certain kind of review and outcome.

In some cases, people feel grossed out by the idea of rating other people. The app Peeple is a good example of that. But there are plenty of cases where you could imagine folks readily reviewing each other in ways that aren’t all that positive. Take driving for example. If you had a method of rating other drivers around you, would you take it? If someone cut you off, would you review them?

You probably would. Everybody thinks they’re a better driver than everybody else, even though that’s impossible. And there’s lots of research to show that people do things in cars that they would never do in another setting. Road rage is a very real thing. So, do you want to be known for your driving habits? Do you want that one time you cut in front of someone to stay, forever, on your reputation profile?

The driving example gets at another problem a lot of people see with these kinds of reputation-driven systems: they’re more likely to log bad behaviour than good behaviour. People leave reviews when they have a complaint far more often than when they have a compliment.

But even if people aren’t being malicious — and of course there will be people who game these systems to attack people they disagree with — there’s a very real problem of codifying the kinds of implicit and explicit biases people have.

Fertik predicts that in just five years, companies won’t post jobs, but rather plug their requirements into a database to find the right person. Jobs will come to you, he says. But part of that selection process will probably include parameters outside someone’s direct qualifications: things that try to get at a candidate’s personality, their “cultural match” with the company.

But often the idea of “fit” or “culture match” in hiring is used to not hire people who don’t look like the rest of the company. To put it more bluntly, companies use those words to not hire diverse candidates. To not hire a person who is disabled or transgender or black. If companies are plugging specific “cultural” parameters into an algorithm to pull up matches from possible candidates, those candidates are all going to look really similar. And the people who don’t have the right keywords attached to their profiles are out of luck.
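To make that mechanism concrete, here’s a deliberately crude sketch. This is purely a toy example of my own, not anything described in the episode, and the keywords, names and profiles are all hypothetical. The point is only that whoever lacks the chosen keywords never surfaces in the results, no matter how qualified they are.

```python
# Toy illustration of a "cultural fit" filter that screens candidates by keyword matching.
# Everything here (keywords, candidates, profiles) is made up for the example.

CULTURE_KEYWORDS = {"ping pong", "craft beer", "hustle"}  # hypothetical "fit" parameters

candidates = [
    {"name": "Candidate A",
     "skills": {"python", "sql"},
     "profile": {"ping pong", "craft beer", "hustle"}},
    {"name": "Candidate B",
     "skills": {"python", "sql", "statistics", "accessibility"},
     "profile": {"chess", "volunteering"}},
]

def culture_match(candidate, keywords=CULTURE_KEYWORDS):
    """True only if the candidate's profile contains every keyword the company plugged in."""
    return keywords <= candidate["profile"]

# Candidate B is the stronger hire on skills, but only Candidate A ever shows up.
print([c["name"] for c in candidates if culture_match(c)])  # ['Candidate A']
```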

If financial success, personal success, housing and even food options are all tied into this reputation system, the people who have the understanding and the money to make that system work for them will succeed. The people who don’t, won’t. And those people are sometimes our most valuable humans. Hearn worries that people who might be grating or rebellious or creative, people who don’t conform, will be punished with a “bad reputation” and shut out of everything from housing to schooling to job opportunities. Of course, this already happens, but it’s worth wondering whether we want to make it even easier.

A quick programming note: This is our 23rd episode of Meanwhile in the Future, and it’s actually the very last episode in the first season of the show. I know you probably had no idea there even was a first season happening, but there was! And this is the end of it. We’re going to take a little break before season two starts up again, but I promise it will be worth the wait. I’ve got tons of great episodes planned for Season Two.

Illustration by Tara Jacoby

