Thursday, December 20, 2018

How YouTube Built a Radicalization Machine for the Far-Right

https://www.thedailybeast.com/how-youtube-pulled-these-men-down-a-vortex-of-far-right-hate

Kelly Weill
12.17.18

•••••

YouTube has become a quiet powerhouse of political radicalization in recent years, powered by an algorithm that a former employee says suggests increasingly fringe content. And far-right YouTubers have learned to exploit that algorithm and land their videos high in the recommendations on less extreme videos. The Daily Beast spoke to three men whose YouTube habits pushed them down a far-right path and who have since logged out of hate.

•••••

Launched in 2005, YouTube was quickly acquired by Google. The tech giant set about trying to maximize profits by keeping users watching videos. The company hired engineers to craft an algorithm that would recommend new videos before a user had finished watching their current video.

Former YouTube engineer Guillaume Chaslot was hired onto the team that designed the algorithm in 2010.

“People think it’s suggesting the most relevant, this thing that’s very specialized for you. That’s not the case,” Chaslot told The Daily Beast, adding that the algorithm “optimizes for watch-time,” not for relevance.

“The goal of the algorithm is really to keep you online the longest,” he said.

•••••

“I realized really fast that YouTube’s recommendation was putting people into filter bubbles,” Chaslot said. “There was no way out. If a person was into Flat Earth conspiracies, it was bad for watch-time to recommend anti-Flat Earth videos, so it won’t even recommend them.”
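Chaslot's description amounts to a choice of objective function: rank candidate videos by predicted watch time rather than by relevance. The short Python sketch below is purely illustrative (the titles, scores, and functions are invented, not YouTube's actual system or data), but it shows how that single choice can bury a debunking video that viewers tend to click away from, sealing the filter bubble he describes.

    # Hypothetical sketch: how optimizing for watch time instead of relevance
    # can keep a filter bubble closed. All values are invented for illustration.

    candidates = [
        # (title, relevance_to_viewer, predicted_watch_minutes)
        ("Flat Earth 'proof' compilation",  0.6, 42.0),
        ("Why the Earth is round (debunk)", 0.9,  3.5),  # viewer likely clicks away
        ("Unrelated music video",           0.2,  4.0),
    ]

    def rank_by_relevance(videos):
        # What many users assume the recommender is doing.
        return sorted(videos, key=lambda v: v[1], reverse=True)

    def rank_by_watch_time(videos):
        # What Chaslot says it actually optimizes for: expected minutes watched.
        return sorted(videos, key=lambda v: v[2], reverse=True)

    print("By relevance: ", [v[0] for v in rank_by_relevance(candidates)])
    print("By watch time:", [v[0] for v in rank_by_watch_time(candidates)])
    # Ranked by watch time, the debunking video falls to the bottom: recommending
    # it would be "bad for watch-time," so it effectively never surfaces.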

Becca Lewis, a researcher at Data & Society, and others have noted that recommended videos often tend toward the fringes. Writing for The New York Times, sociologist Zeynep Tufekci observed that videos of Donald Trump led to recommended videos “that featured white supremacist rants, Holocaust denials and other disturbing content.”

•••••

Chaslot, the former YouTube engineer, said he suggested the company let users opt out of the recommendation algorithm, but claims Google was not interested.

Google’s chief executive officer, Sundar Pichai, paid lip service to the problem during a congressional hearing last week. When questioned about a particularly noxious conspiracy theory about Hillary Clinton that appears high in searches for unrelated videos, the CEO made no promise to act.

•••••

Other YouTube-fed conspiracy theories have similarly resulted in threats of gun violence.

•••••

Some fringe creators try to spread their views by getting their videos to appear in the search results for less extreme videos.

•••••

Young people, particularly those without fully formed political beliefs, can be easily influenced by extreme videos that appear in their recommendations. “YouTube appeals to such a young demographic,” Lewis said. “Young people are more susceptible to having their political ideals shaped. That’s the time in your life when you’re figuring out who you are and what your politics are.”

•••••

In some cases, YouTube videos can supplant a person’s previous information sources. Conspiracy YouTubers often discourage viewers from watching or reading other news sources, Chaslot has previously noted. The trend is good for conspiracy theorists and YouTube’s bottom line; viewers become more convinced of conspiracy theories and consume more advertisements on YouTube.

The problem extends to young YouTube viewers, who might follow their favorite channel religiously, but not read more conventional news outlets.

“It’s where people are getting their information about the world and about politics,” Lewis said. “Sometimes instead of going to traditional news sources, people are just watching the content of an influencer they like, who happens to have certain political opinions. Kids may be getting a very different experience from YouTube than their parents expect, whether it’s extremist or not. I think YouTube has the power to shape people’s ideologies more than people give it credit for.”

•••••
