Mark Zuckerberg: OK, Fine, Regulate Facebook

Facebook CEO Mark Zuckerberg—whose company has lately blundered its way into controversies over everything from user privacy and data breaches to the amplification of extremist content and literal genocide—responded to growing criticism of the tech sector by calling for more outside regulation in an op-ed in the Washington Post (and on his own personal Facebook page) on Saturday.

Zuckerberg broke down the areas where he now says regulation could be helpful into four sections: harmful content, election integrity, privacy, and data portability. Somewhat more surprisingly, he also offered specifics about what that regulation might look like.

On the first, harmful content, Zuckerberg wrote that platforms have a “responsibility to keep people safe on our services” and that “internet companies should be accountable for enforcing standards on harmful content.” He said that to do this effectively, Facebook needs to be able to identify and eliminate violent or hateful speech, but he also called for “a more standardised approach” across the industry that includes third-party oversight:

One idea is for third-party bodies to set standards governing the distribution of harmful content and to measure companies against those standards. Regulation could set baselines for what’s prohibited and require companies to build systems for keeping harmful content to a bare minimum.

Seeing as it took Facebook years to concede that white nationalism and white separatism are actually the same thing as white supremacy, independent oversight of content decisions is probably not that bad an idea.

As to election integrity, Zuckerberg said the company has already taken steps such as forcing political ad buyers to verify their real-life identities and creating a political ad database, but suggested that what is really needed is a… total overhaul of the campaign finance environment:

… Deciding whether an ad is political isn’t always straightforward. Our systems would be more effective if regulation created common standards for verifying political actors.

Online political advertising laws primarily focus on candidates and elections, rather than divisive political issues where we’ve seen more attempted interference. Some laws only apply during elections, although information campaigns are nonstop. And there are also important questions about how political campaigns use data and targeting. We believe legislation should be updated to reflect the reality of the threats and set standards for the whole industry.

These are all things that are probably true, but they also dodge the question of why Facebook is vulnerable to political machinations in the first place, as well as whether the whole Facebook information economy is in fact the problem. Notably, at least in the U.S., these kinds of changes would entail a massive overhaul of campaign finance and disclosure laws that is unlikely to emerge for years, if it does at all in the foreseeable future.

There’s also the fact that the company has historically tried its best to stay exempt from ad disclosure rules, and has generally been mired in ethical issues around ads, like the charges of enabling housing discrimination that the Department of Housing and Urban Development just slapped it with.

As to privacy, Zuckerberg called for the U.S. to pass legislation similar to the European Union’s sweeping General Data Protection Regulation, which he said he would prefer to become a “common global framework” (as opposed to a patchwork of laws in each nation). He also called for data portability, which he described as the free flow of information between services—though he alluded to Facebook Login as an example, which is really more a way the company has extended its tracking tendrils across the web than a safeguard for user rights:

If you share data with one service, you should be able to move it to another. This gives people choice and enables developers to innovate and compete.

This is important for the Internet — and for creating services people want. It’s why we built our development platform. True data portability should look more like the way people use our platform to sign into an app than the existing ways you can download an archive of your information. But this requires clear rules about who’s responsible for protecting information when it moves between services.

(As TechCrunch noted, Facebook itself has been dragging its feet on data portability, only allowing users to export friends lists in a way that makes it difficult to find them on other social networks.)

Still, this is a big switch from a year ago, when Zuckerberg was publicly on the fence about whether regulation was necessary at all, described the GDPR as good in principle but only for Europe, and suggested self-regulation was the better approach. What seems to have changed in the meantime is that the external political pressure on Facebook has continued to mount: It and other tech companies have faced an increasingly hostile reception from the public and elected officials, including threats of regulation and talk of antitrust action. One example: The Australian government is threatening to pass laws in the wake of the Facebook-livestreamed Christchurch massacre that would land platform execs in jail and impose significant fines if they fail to act quickly to remove terroristic content.

In other words, Zuckerberg et al. perhaps now believe that GDPR-like regulations, as well as others on topics like content moderation, are inevitable, and that it’s best for Facebook to get out in front of them.

For example, take Philip Morris, the cigarette titan that came out in favour of regulation in the tobacco industry: One paper in BMJ’s Tobacco Control described the intent as being “to enhance its legitimacy, redefine itself as socially responsible, and alter the litigation environment.” Facebook is clearly on a far lower tier of evil than the cigarette industry, but speaking generally, industries don’t support regulation unless it either helps their bottom line or helps them avoid harsher regulations. Note that Facebook has reportedly been beefing up its army of lobbyists in DC, who may come in quite handy when legislators decide what is to be done about it. As Bloomberg noted:

Facebook has an incentive to play a strong role in the debate around technology companies’ data regulation. The company’s rapid revenue growth and billions of dollars in profits are fuelled by collecting numerous data points around its customers and making that easily available to advertisers.

… Zuckerberg this year has worked to frame Facebook’s more critical problems as broader issues for the internet at large, not just affecting his company. His willingness to embrace regulation could take the harder questions out of Facebook’s hands, or at least give the company more time to solve them.

Zuckerberg also sidestepped what is probably the most potent criticism of Facebook: That it and its Silicon Valley cousins like Google and Amazon are so big, so powerful, and so entrenched that the proper regulatory response is to carve off chunks of them. On that, not a peep.

Mark Zuckerberg’s op-ed in full below:

Technology is a major part of our lives, and companies such as Facebook have immense responsibilities. Every day we make decisions about what speech is harmful, what constitutes political advertising, and how to prevent sophisticated cyberattacks. These are important for keeping our community safe. But if we were starting from scratch, we wouldn’t ask companies to make these judgments alone.

I believe we need a more active role for governments and regulators. By updating the rules for the internet, we can preserve what’s best about it — the freedom for people to express themselves and for entrepreneurs to build new things — while also protecting society from broader harms.

From what I’ve learned, I believe we need new regulation in four areas: harmful content, election integrity, privacy and data portability.

First, harmful content. Facebook gives everyone a way to use their voice, and that creates real benefits — from sharing experiences to growing movements. As part of this, we have a responsibility to keep people safe on our services. That means deciding what counts as terrorist propaganda, hate speech and more. We continually review our policies with experts, but at our scale we’ll always make mistakes and decisions that people disagree with.

Lawmakers often tell me we have too much power over speech, and frankly I agree. I’ve come to believe that we shouldn’t make so many important decisions about speech on our own. So we’re creating an independent body so people can appeal our decisions. We’re also working with governments, including French officials, on ensuring the effectiveness of content review systems.

Internet companies should be accountable for enforcing standards on harmful content. It’s impossible to remove all harmful content from the internet, but when people use dozens of different sharing services — all with their own policies and processes — we need a more standardised approach.

One idea is for third-party bodies to set standards governing the distribution of harmful content and measure companies against those standards. Regulation could set baselines for what’s prohibited and require companies to build systems for keeping harmful content to a bare minimum.

Facebook already publishes transparency reports on how effectively we’re removing harmful content. I believe every major internet service should do this quarterly, because it’s just as important as financial reporting. Once we understand the prevalence of harmful content, we can see which companies are improving and where we should set the baselines.

Second, legislation is important for protecting elections. Facebook has already made significant changes around political ads: Advertisers in many countries must verify their identities before purchasing political ads. We built a searchable archive that shows who pays for ads, what other ads they ran and what audiences saw the ads. However, deciding whether an ad is political isn’t always straightforward. Our systems would be more effective if regulation created common standards for verifying political actors.

Online political advertising laws primarily focus on candidates and elections, rather than divisive political issues where we’ve seen more attempted interference. Some laws only apply during elections, although information campaigns are nonstop. And there are also important questions about how political campaigns use data and targeting. We believe legislation should be updated to reflect the reality of the threats and set standards for the whole industry.

Third, effective privacy and data protection needs a globally harmonized framework. People around the world have called for comprehensive privacy regulation in line with the European Union’s General Data Protection Regulation, and I agree. I believe it would be good for the internet if more countries adopted regulation such as GDPR as a common framework.

New privacy regulation in the United States and around the world should build on the protections GDPR provides. It should protect your right to choose how your information is used — while enabling companies to use information for safety purposes and to provide services. It shouldn’t require data to be stored locally, which would make it more vulnerable to unwarranted access. And it should establish a way to hold companies such as Facebook accountable by imposing sanctions when we make mistakes.

I also believe a common global framework — rather than regulation that varies significantly by country and state — will ensure that the internet does not get fractured, entrepreneurs can build products that serve everyone, and everyone gets the same protections.

As lawmakers adopt new privacy regulations, I hope they can help answer some of the questions GDPR leaves open. We need clear rules on when information can be used to serve the public interest and how it should apply to new technologies such as artificial intelligence.

Finally, regulation should guarantee the principle of data portability. If you share data with one service, you should be able to move it to another. This gives people choice and enables developers to innovate and compete.

This is important for the internet — and for creating services people want. It’s why we built our development platform. True data portability should look more like the way people use our platform to sign into an app than the existing ways you can download an archive of your information. But this requires clear rules about who’s responsible for protecting information when it moves between services.

This also needs common standards, which is why we support a standard data transfer format and the open source Data Transfer Project.

I believe Facebook has a responsibility to help address these issues, and I’m looking forward to discussing them with lawmakers around the world. We’ve built advanced systems for finding harmful content, stopping election interference and making ads more transparent. But people shouldn’t have to rely on individual companies addressing these issues by themselves. We should have a broader debate about what we want as a society and how regulation can help. These four areas are important, but, of course, there’s more to discuss.

The rules governing the internet allowed a generation of entrepreneurs to build services that changed the world and created a lot of value in people’s lives. It’s time to update these rules to define clear responsibilities for people, companies and governments going forward.

[Washington Post]

