
This is one wiki wormhole YouTube might not want to go down.
The video sharing site and Google subsidiary announced at SXSW this week that it will begin including Wikipedia links alongside videos touting conspiracy theories.
YouTube CEO Susan Wojcicki said she wants to make sure “video and text work together” to combat fake news. The site already labels state-run media clips as part of this initiative.
The new feature will go live in the coming months on videos about events like the moon landing, which some conspiracy theorists believe was faked.
The feature will include a brief excerpt from the topic’s Wikipedia page and a link to the full encyclopedia entry. Strangely, the example excerpt cuts off mid-sentence.
There are still many questions about this approach.
Wojcicki said YouTube will focus more on “hateful” content than on false content—so, for example, videos featuring flat earth truthers wouldn’t be labeled.
That distinction dovetails with YouTube’s longtime assertion that it is a platform rather than a media company, and therefore isn’t liable for falsehoods its users post. It also means the site theoretically bears less responsibility for policing user content.
YouTube has made some promising changes: the company recently committed to adding 10,000 people to its moderation team this year.
This human element is necessary because its algorithm still makes big mistakes. For example, after the Florida school shooting, a video calling survivor David Hogg a crisis actor began trending.
Wikipedia may not be able to fix all of these problems, however—it has its own issues. Many of the online encyclopedia’s pages for topics like money and acupuncture are filled with falsehoods.
The site’s moderators have dealt with some of these issues by banning sources like The Daily Mail. But because Wikipedia, like YouTube, is a predominantly user-generated platform susceptible to false information, problems persist.
And this rhetoric isn’t just coming from outside commentators: Wikipedia’s own editors acknowledge the site’s blind spots as a news source.
The Wikimedia Foundation, a Wikipedia-affiliated charity advocating for an open internet, urged its members to proceed with caution regarding Wikipedia and YouTube.
Foundation executive director Katherine Maher expanded on these feelings in a lengthy tweetstorm.
The foundation also said it was not given advance notice of the YouTube-Wikipedia partnership.
A Wikipedia forum on breaking news said that relying on the site to properly source stories was a risky proposition.
“Wikipedia is an encyclopedia, not a newspaper,” the post read. “Our processes and principles are designed to work well with the usually contemplative process of building an encyclopedia, not sorting out the oft-conflicting and mistaken reporting common during disaster and other breaking news events.”
Indeed, Wikipedia has made big mistakes before.
In 2011, the site (along with many other media outlets) mistakenly reported that Arizona Congresswoman Gabrielle Giffords had died in a mass shooting. Giffords survived the attack.
The site provided several tips about responding to breaking news.
First, editors should wait until two or three reliable sources have reported a detail before adding it to a page. And in light of the Giffords error, editors should be particularly careful with reports of someone’s death.
Given that Wikipedia is still working out its own fact-checking procedures, it may not be the cure-all YouTube thinks it is.