YouTube Responds to Conspiracy-Videos Controversy With Plan to Link to Wikipedia
Variety
After weeks of scrutiny over the proliferation of conspiracy-theory videos on its service, YouTube CEO Susan Wojcicki announced a new strategy to combat misinformation: pointing users to Wikipedia and other third-party sites that debunk such theories.
Wojcicki outlined the plan during a session Tuesday at SXSW in an on-stage interview with Wired’s Nicholas Thompson. “Our goal is to start with a list of internet conspiracies listed on the internet where there is a lot of active discussion on YouTube,” she said, as reported by Wired.
YouTube will not block any conspiracy-theory content, however, unless it runs afoul of the video platform’s community guidelines. “People can still watch the videos, but then they have access to additional information,” Wojcicki said. She didn’t provide a specific timeline for when the new features would roll out.
The Google-owned video giant’s role in promoting conspiracy theories has made headlines of late. Most notably, last month a video suggesting that one of the high-school students who survived the mass killing in Parkland, Fla., David Hogg, was an actor hired by gun-control advocates briefly became YouTube’s No. 1 trending video. YouTube removed the clip within a few hours, citing violation of its policy on harassment and bullying.
YouTube has also been criticized for using algorithms that promote conspiracy theories — but so far, it hasn’t signaled any intent to modify its video-recommendation engine. YouTube’s autoplay algorithms appear designed to promote increasingly radical content, across politics and other subject areas, in an effort to keep users watching as long as possible, according to Zeynep Tufekci, an associate professor at the University of North Carolina, who wrote an op-ed on the topic published in this weekend’s New York Times.
Wojcicki described YouTube’s new counter-conspiracy initiative as providing “information cues”: text boxes linking to Wikipedia and other sources that will show up in videos flagged as promoting alternate facts. For example, on a video questioning whether humans ever actually landed on the Moon, YouTube might provide a link to Wikipedia’s page on the Apollo 11 lunar-landing mission in 1969.
How well YouTube’s “information cues” will work in practice remains to be seen. It’s worth noting that Wikipedia is also an open platform subject to bad actors manipulating information, although the nonprofit organization’s site by design includes community-based controls to edit out falsehoods. During the SXSW interview, Wojcicki acknowledged the challenge of policing the world’s largest internet-video service, and said YouTube would need to continue employing human moderators to screen content in addition to automated processes.