Google And YouTube Need More Transparent Takedown Procedures - InformationWeek


Thomas Claburn


The removal of content from the Internet needs more safeguards. Right now, it's just too easy to make unsubstantiated claims that lead online services providers to block lawful content.

As the Electronic Frontier Foundation and the American Civil Liberties Union pointed out in a joint blog post on Monday, user-generated content is playing a significant role in the political and civic debate, but "political speech has been threatened repeatedly by claims that controversial material violates a site's terms of use or infringes copyrights or trademark rights."

Among the examples of stifled speech cited by the groups: the International Olympics Committee's use of a bogus copyright claim to demand the removal of a video of a protest by Students For A Free Tibet; the removal of a video critical of presidential candidate John McCain because of graphic war images, ostensibly a violation of YouTube's Terms of Service; and the Associated Press' use of the Digital Millennium Copyright Act to force the removal of blog entries quoting excerpts of AP news stories.

The two rights groups want to see content owners respond more carefully before filing takedown complaints.

But more needs to be done. As keepers of what has become a public forum for civic debate, Google and other content-hosting sites owe the public a more transparent takedown procedure.

The problem is that content removal can appear to be arbitrary or politically motivated.

For example, a YouTube user recently contacted me claiming that his video depicting the violence in Tiananmen Square in 1989 had been censored in the United States due to complaints from China. He claimed that YouTube's built-in analytics showed a spike in viewers from China just before the Olympics and that YouTube then contacted him and blocked his video. YouTube, he said, told him that his video violated community standards due to its images of graphic violence. It was Chinese discontent with his video that spurred YouTube's action, he insisted.

The video was later reinstated, but behind YouTube's age-verification wall, making it less accessible to the public and, the video maker said, inaccessible in China.

Upon further examination, the YouTube user's claims don't hold up. Other Tiananmen Square videos remain available on YouTube in the United States, which indicates YouTube had issues with this one particular video rather than with the subject matter. So much for claims that YouTube censors U.S. content at the behest of Chinese authorities.

My suspicion is that this filmmaker is just hoping to get some free publicity from the press. But getting to the bottom of this isn't easy because the YouTube takedown process is still too opaque.

Content-related takedowns on YouTube begin with YouTube users, who can flag videos as inappropriate or offensive. This raises the possibility of politically motivated campaigns to claim that certain videos are inappropriate.

Flagged videos get reviewed by YouTube staff, but the determination of whether or not a video should be blocked happens outside of public view.

"It's really important right now that intermediaries take care not to take down speech improperly," EFF attorney Corynne McSherry said during a phone call earlier today. She called Terms of Use-based takedowns a "nebulous standard" and acknowledged that they're "worrisome because it's not clear what the real basis [for content removal] may be at the end of the day."

The Internet community would be better served by a more public takedown process, in which content publishers can confront and respond to complaints. Online content creators and publishers should be able to file counter-notifications against claims of community standards violations, just as the DMCA allows them to do when hit with copyright complaints, to defend against capricious or unjust takedowns.
