When somebody wants to silence speech, they often use the quickest method available. When the speech is hosted on a major online platform, that method is usually a copyright or trademark complaint.
For many years, EFF has worked with people whose lawful speech has been unfairly targeted by these sorts of complaints. We’ve observed that some approaches tend to work better than others in preventing that sort of deliberate abuse, as well as the casual censorship that comes from haphazard and dragnet approaches to policing online infringement.
In the copyright context, the contours of service provider policies are generally set by the safe harbour provisions of the Digital Millennium Copyright Act. Those provisions outline the practices to which online service providers must adhere in order to avoid copyright liability for the actions of their users. But services have some flexibility in how they implement those requirements, and can make decisions that optimise for defending user speech — or instead for minimising their own legal costs, reducing engineering requirements, or building relationships with rightsholder groups, for example.
When it comes to trademarks, the absence of a detailed statutory safe harbour can mean more uncertainty for service providers, but also more flexibility. Some service providers are very conservative in their response to trademark complaints, taking down content quickly when there’s a complaint, even where there’s little real risk of liability. But other service providers choose to adopt policies that accept that a small degree of legal risk is worthwhile to protect their users. For example, they can require trademark complaints to be complete and valid, can make sure that content is only taken down after human review and consideration, and can give users a chance to challenge those complaints.
And with respect to both copyright and trademark, services can work to ensure that their policies are exercised in an open and transparent manner, so that users can better understand the scope and scale of copyright and trademark complaints and company responses.
Major online platforms have become the hubs for so much of our speech. The result is that their policy decisions can have an outsized impact on what speech enters the public discourse, and what gets silenced or relegated to secondary status. As users choose which platforms will host their updates, writing, images, and videos, they ought to know which of these services have made explicit commitments to defend that speech against bullies who would try to take it down.
As with our April “Who Has Your Back” report, which addresses government requests for personal data, the categories we evaluate in this report are based on objectively verifiable, public policy statements. In order to preserve that quality, we’ve chosen not to award stars unless we can cite a public policy, even in cases where internal policies may meet our evaluation thresholds. We’ve also chosen not to award stars in cases where we’ve learned that a company has not heeded its own public policies. If users believe that a company’s actions don’t match its policies, and can provide specific examples, please let us know.
We compiled the information in this report by examining each company’s published terms of service, copyright and trademark policies, and transparency reports where available. As part of our evaluation, we contacted each company to explain our findings and to give them an opportunity to improve their public stances.
We evaluated the following five criteria for each service:
- DMCA takedown notices. Services earn a star in this category for requiring a formal, complete, and valid DMCA notice for copyright-based takedowns of content. Services must also commit to forwarding the information contained in that notice to the affected user. In some cases services make that information available only upon request; where a service has made a public promise to do so, we have awarded a star.
- DMCA counter-notices. In order to earn a star in this category, services must have a publicly documented counter-notice procedure that includes a commitment to promptly restoring all counter-noticed works after the required 10–14 business days of downtime. Additionally, services must commit to excluding counter-noticed works from “repeat infringer” policies.
- Trademark complaints. As with the first category, services earn a star here for requiring a formal notice of trademark complaint, including information about the relevant trademark, and forwarding that information to any user whose uploads are affected.
- Trademark disputes. In order to earn a star in this category, services must outline a procedure by which users can contest trademark complaints, or commit to additional human review of the takedown. In the services we’ve evaluated, we’ve awarded a star to companies that provide a documented internal dispute resolution process that includes the user and to companies that require complainants to obtain a court order for takedown.
- Publishing a transparency report on copyright and trademark complaints. Finally, services earn a star here for publishing information on takedown requests. We’ve intentionally left this category flexible, but in future editions may increase the requirements to include some of the best practices we’ve observed, like publishing compliance rates, breaking down information about non-compliance, and forwarding actual notices to the Chilling Effects database.
EFF arrived at the evaluation criteria after careful consideration of actual industry practices, the state of copyright and trademark law, and experience with users who have dealt with abuse from copyright and trademark bullies. We were pleased to find that services have been largely receptive to our concerns, and in many cases were able to point to policies that met these criteria, or, where their “star-worthy” internal practices were not reflected in their public policies, to revise their public-facing statements so that users would know about those practices.
We should be very clear: we believe that these five evaluation criteria are a floor, not a ceiling. They are minimum standards for what a service can do to defend its users’ speech against copyright and trademark bullies. For example, even a robust, user-friendly DMCA takedown policy can still present problems for speech, because of flaws in the statute itself. Even policies that earn all five stars cannot prevent all bullies.
We also note that some services have gone above and beyond the evaluation criteria in this report. Automattic, the parent company behind WordPress, has filed lawsuits in response to abusive takedown requests. Etsy prepares educational materials and blog posts about the public’s right to use trademarks and copyrighted works. Twitter has issued a thorough Transparency Report every six months for over two years. And YouTube has occasionally proactively restored content targeted by a DMCA notice, ahead of the DMCA’s 10 business day waiting period, where that content was clearly non-infringing.
Still, the report can be read to reflect a broad commitment across many of the services we’ve surveyed to handle takedown requests in a way that recognises the rights and responsibilities of users as well as senders.
Download the complete “Who Has Your Back? 2014: When Copyright and Trademark Bullies Threaten Free Speech” report as a PDF. This article first appeared on Electronic Frontier Foundation and is republished here under Creative Commons licence.