UK’s Online Safety Bill Suggests Locking Up Tech Execs For Their Platform’s Crimes

Photo: Joe Raedle, Getty Images

After spending close to half a year combing through the UK government’s proposed Online Safety Bill, Ofcom — the UK’s independent regulator responsible for overseeing broadcasting and telecom businesses across the country — finally dropped its suggestions to tighten the drafted legislation. The 194-page report is chock-full of ideas for the flavours of “online harm” that could be encompassed under the bill, and the grisly consequences that could potentially face tech companies that don’t play by Parliament’s proposed rules.

For folks who are unfamiliar, the UK’s bill — which was first properly announced this past May — is broadly meant to impose rules that would force tech firms to better regulate all of the content that “harms” users of all ages. The bill doesn’t only cover the Big Tech Baddies, like Facebook, Google, and Snapchat, but really any site where users can post original content and interact with each other: that includes search engines, porn sites, Fanfiction.net, you name it.

The bill holds these companies (and other similar services) responsible for protecting users from so-called “harmful content.” This ranges from stuff that’s downright illegal, like child pornography and terrorist material, to content that’s legal, but “harmful” to children or adults in some way. The bill’s initial definition of “harm” was ridiculously vague — so much so that critics rightfully pointed out that any content on any platform could theoretically fall under that umbrella.

Ofcom’s recommendations thankfully include some narrower definitions for what “online harm” looks like. According to the regulator, platforms should be responsible for users who send unsolicited nudes to others (a practice Ofcom creatively calls “cyberflashing”). Platforms should also be held responsible for hosting content that could encourage users to self-harm, which is shockingly easy to find across the web right now. Ofcom also suggests creating a new legal obligation to age-gate porn sites in order to keep those pesky pre-teens from accidentally (or purposefully) stumbling onto adult content.

Of course, this is a UK bill, and it largely applies to tech companies’ operations… in the UK. Even if some version of these final rules comes to pass down the road, users who are stateside are unlikely to notice any changes to their favourite (or least favourite) sites.

But if this bill passes, and Ofcom flags certain practices that the platforms should change, and those platforms just ignore those recommendations, things change a bit. In cases where platforms host any of the above offences — cyberflashing, non-age gated porn, self-harm content — and those platforms refuse to comply with Ofcom’s rules, the bill suggests criminal charges for the senior managers of the companies involved.

In other words: if Facebook or YouTube doesn’t abide by the rules that Parliament puts down, it could spell jail time for Mark Zuckerberg or Susan Wojcicki or any other named executive.

In its current state, the bill gives platforms a two-year grace period to comply with these suggestions before criminal charges start raining down. But some UK authorities — like Nadine Dorries, the country’s recently appointed Secretary of State for all things digital — have called for a shorter moratorium.

“I think it’s nonsense that platforms have been given two years to get themselves ready for what would be criminal action. They know what they are doing now,” she previously told reporters. “They actually have the ability to put right what they’re doing wrong now. They have the ability now to abide by their own terms and conditions. They could remove harmful algorithms tomorrow.”

Members of the UK’s Parliament will have two months to mull over Ofcom’s report.