All of a sudden, YouTube can’t escape the spotlight. Normally the spotlight is exactly what the video streaming site wants, but lately the temperature has been rising.
Earlier this week, a devastating report was published examining how YouTube executives seem to have continuously abdicated their responsibility to deal with toxic videos spreading on the platform while chasing the magic dragon of increased viewer engagement.
Today, Congress is looking at the draft of a new bill aiming to regulate YouTube more like television. That could mean tackling the platform’s recommendation and search algorithms, code that seems to be incredibly good at delivering extreme content — the kind that drives more engagement despite YouTube’s protestations to the contrary — to YouTube’s nearly 2 billion monthly users.
The Kids Internet Design and Safety (KIDS) Act, a new draft bill from Massachusetts Democratic Senator Ed Markey, proposes new rules on how advertisements are served, how data is collected and how algorithms recommend and push media to children.
The bill is backed by the Silicon Valley-based advocacy organization Common Sense. It’s meant to update the 1990 Children’s Television Act to cover the 2019 landscape. That means covering YouTube, which, Common Sense argues, operates as an unregulated platform that delivers “kids and teens disturbing content with violence, self-harm, profanity, hate speech, and mass shootings,” the group said in a Thursday press release.
The Federal Trade Commission would enforce the law.
The bill, which hasn’t been released to the public because it’s still being drafted, could have serious implications, and the public has only just begun wrestling with it.
One point of interest if your interest is money: YouTube’s unboxing videos are absolute blockbusters. Just about every toy, gadget and physical product in existence has scores of unboxing videos, many of which target kids. It’s a huge moneymaker for everyone involved: the creators, the manufacturers and, of course, YouTube itself. How will this new legislation impact lucrative genres of videos that are, if we’re being honest, most often just thinly veiled ads targeted at kids?
YouTube did not respond to a request for comment on the bill.
“Comprehensive legislation on children’s media is a step toward addressing the much larger and more pressing reality we all face today: the growing influence of tech on our kids and its unintended consequences,” said Jim Steyer, CEO of Common Sense. “We need comprehensive and enforceable rules that reflect the current media landscape to safeguard children’s programming and ensure the well-being of kids and generations to come.”
Steyer’s group, which is helping draft the bill, promises the bill will “enshrine rules to address the use of algorithms that push extreme content in front of kids.”
Here’s a full list of what the bill’s authors are promising:
Stop manipulative and damaging design features that keep kids glued to the screen.
Limit marketing and commercialization; create rules to limit the method and the content of ads that appear in front of kids.
Prevent the amplification of harmful content; enshrine rules to address the use of algorithms that push extreme content in front of kids.
Require platforms to provide parents with clear guidance on kid-healthy content.
Create incentives for positive content creation.
Require transparency and strong enforcement; designate the Federal Trade Commission (FTC) to enforce the law.
This is obviously about a lot more than toy unboxing videos. YouTube’s recommendation algorithm has come under intense scrutiny of late for putting extreme videos in front of people of all ages.
Regulation of Silicon Valley’s world-spanning platforms isn’t just an abstraction anymore, and this bill sits at the start of that conversation. Plenty of questions remain: What would the transparency requirements look like? And how exactly would the site be required to prevent the amplification of harmful content, as Common Sense promises the bill will deliver?
However this turns out, the next point in this conversation is obvious: If they can do that for kids, what about the rest of us?