Sunday, December 17, 2017

‘The Basic Grossness of Humans’

https://www.theatlantic.com/technology/archive/2017/12/the-basic-grossness-of-humans/548330/

Alexis C. Madrigal, Dec 15, 2017

Lurking inside every website or app that relies on “user-generated content”—so, Facebook, YouTube, Twitter, Instagram, Pinterest, among others—there is a hidden kind of labor, without which these sites would not be viable businesses. Content moderation was once generally a volunteer activity, something people took on because they were embedded in communities that they wanted to maintain.

But as social media grew up, so did moderation. It became what the University of California, Los Angeles, scholar Sarah T. Roberts calls "commercial content moderation," a form of paid labor that requires people to review posts—pictures, videos, text—very quickly and at scale.

Roberts has been studying the labor of content moderation for most of a decade, ever since she saw a newspaper clipping about a small company in the Midwest that took on outsourced moderation work.

•••••

Roberts has traced the history of the development of moderation as a corporate practice. In particular, she’s looked at the way labor gets parceled out. There are very few full-time employees working out of corporate headquarters in Silicon Valley doing this kind of stuff. Instead, there are contractors, who may work at the company, but usually work at some sort of off-site facility. In general, most content moderation occurs several steps removed from the core business apparatus. That could be in Iowa or in India (though these days, mostly in the Philippines).

•••••

LaPlante, for example, works on Mechanical Turk, which serves as a very flexible and cheap labor pool for various social-media companies. When she receives an assignment, she will have a list of rules that she must follow, but she may or may not know the company or how the data she is creating will be used.

Most pressingly, though, LaPlante drew attention to the economic conditions under which workers are laboring. They are paid by the review, and the prices can go as low as $0.02 per image reviewed, though there are jobs that pay better, like $0.15 per piece of content. Furthermore, companies can reject judgments that Turkers make, which means they are not paid for that time, and their overall rating on the platform declines.

This work is a brutal and necessary part of the current internet economy. The workers are also producing valuable training data that companies use to build machine-learning systems. And yet they are lucky to make minimum wage, have no worker protections, and must work at breakneck speed to try to earn a living.
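As a back-of-the-envelope illustration of why piece rates land workers near or below minimum wage, the sketch below converts the $0.02-per-image rate quoted above into hourly pay. The review speed and rejection rate are assumptions for illustration, not figures from the article.

    # Effective hourly pay at the low-end piece rate cited above.
    # Review speed and rejection rate are illustrative assumptions.
    rate_per_item = 0.02      # dollars per image reviewed (low end cited)
    items_per_hour = 300      # assumed: one image every 12 seconds, sustained
    rejection_rate = 0.05     # assumed: 5% of judgments rejected, hence unpaid

    paid_items = items_per_hour * (1 - rejection_rate)
    hourly_pay = paid_items * rate_per_item
    print(f"${hourly_pay:.2f} per hour")
    # -> $5.70 per hour, below the U.S. federal minimum wage of $7.25

Even at a relentless pace and a modest rejection rate, the low-end rate stays under the federal minimum; only the better-paying jobs (around $0.15 per item) change that picture.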

As you might expect, reviewing violent, sexual, and disturbing content for a living takes a serious psychological toll on the people who do it.

“When I left Myspace, I didn’t shake hands for like three years because I figured out that people were disgusting. And I just could not touch people,” Bowden said. “Most normal people in the world are just fucking weirdos. I was disgusted by humanity when I left there. So many of my peers, same thing. We all left with horrible views of humanity.”

•••••

LaPlante emphasized, too, that it's not as if the people doing these content-moderation jobs can seek counseling for the disturbing things they've seen. They're stuck dealing with the fallout themselves, or with some sort of support from their peers.

“If you’re being paid two cents an image, you don’t have $100 an hour to pay to a psychiatrist,” LaPlante said.

In a hopeful sign, some tech companies are beginning to pay more attention to these issues. Facebook, for example, sent a team to the content-moderation conference. Others, like Twitter and Snap, did not.

Facebook, too, has committed to hiring 10,000 more people dedicated to these issues.

•••••
