Policing the Web’s Lurid Precincts


Ricky Bess spends eight hours a day in front of a computer near Orlando, Fla., viewing some of the worst depravities harbored on the Internet. He has seen photographs of graphic gang killings, animal abuse and twisted forms of pornography. One recent sighting was a photo of two teenage boys gleefully pointing guns at another boy, who is crying.

[Photo: At Caleris, Stacey Springer, left, vice president for support operations, reviewing images with Marie Wittry.]
An Internet content reviewer, Mr. Bess sifts through photographs that people upload to a big social networking site and keeps the illicit material — and there is plenty of it — from being posted. His is an obscure job that is repeated thousands of times over, from office parks in suburban Florida to outsourcing hubs like the Philippines.

With the rise of Web sites built around material submitted by users, screeners have never been in greater demand. Some Internet firms have tried to get by with software that scans photos for, say, a large area of flesh tones, but nothing is a substitute for a discerning human eye.
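To make the limits of that software concrete, here is a minimal sketch of such a flesh-tone filter in Python. It assumes the Pillow imaging library; the RGB rule and the 40 percent threshold are illustrative assumptions, not any particular company’s filter.

```python
# A toy illustration of the flesh-tone heuristic described above.
# The RGB rule and the 40% threshold are illustrative assumptions.
from PIL import Image


def looks_like_skin(r: int, g: int, b: int) -> bool:
    """A crude RGB skin-tone test (one classic rule of thumb)."""
    return (r > 95 and g > 40 and b > 20
            and r > g and r > b
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15)


def needs_human_review(path: str, threshold: float = 0.40) -> bool:
    """Flag an image for manual review if a large share of its
    pixels fall in the skin-tone range."""
    img = Image.open(path).convert("RGB")
    img.thumbnail((128, 128))  # downsample; a coarse scan is enough
    pixels = list(img.getdata())
    skin = sum(1 for r, g, b in pixels if looks_like_skin(r, g, b))
    return skin / len(pixels) >= threshold


if __name__ == "__main__":
    import sys
    for path in sys.argv[1:]:
        print(path, "-> review" if needs_human_review(path) else "-> pass")
```

Even a heuristic this crude shows why screeners remain necessary: it cannot tell a beach photo from something illicit, only that a lot of skin is on screen.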

The surge in Internet screening services has brought a growing awareness that the jobs can have mental health consequences for the reviewers, some of whom are drawn to the low-paying work by the simple prospect of making money while looking at pornography.

“You have 20-year-old kids who get hired to do content review, and who get excited because they think they are going to see adult porn,” said Hemanshu Nigam, the former chief security officer at MySpace. “They have no idea that some of the despicable and illegal images they will see can haunt them for the rest of their lives.”

David Graham, president of Telecommunications On Demand, the company near Orlando where Mr. Bess works, compared the reviewers to “combat veterans, completely desensitized to all kinds of imagery.” The company’s roughly 50 workers view a combined average of 20 million photos a week.

Mr. Bess insists he is still bothered by the offensive material, and acknowledges the need to turn to the cubicle workers around him for support.

“We help each other through any rough spots we have,” said Mr. Bess, 52, who previously worked in the stockrooms at Wal-Mart and Target.

Last month, an industry group established by Congress recommended that the federal government provide financial incentives for companies to “address the psychological impact on employees of exposure to these disturbing images.”

Mr. Nigam, co-chairman of the group, the Online Safety and Technology Working Group, said global outsourcing firms that moderate content for many large Internet companies do not offer therapeutic care to their workers. The group’s recommendations have been submitted to the National Telecommunications and Information Administration, which advises the White House on digital policy.

Workers at Telecommunications On Demand, who make $8 to $12 an hour, view photos that have been stripped of information about the users who posted them. Rapidly cycling through pages of 300 images each, they are asked to flag material that is obviously pornographic or violent, illegal in a certain country or deemed inappropriate by a specific Web site.

Caleris, an outsourcing company based in West Des Moines, Iowa, says it reviews about 4.5 million images a day. Stacey Springer, its vice president for support operations, says that the job is not for everybody and that “people find they can do it, but it is usually a lot harder than they thought.” The company offers counseling as part of its standard benefits package for workers.

Ms. Springer says she believes that content moderators tend to become desensitized to the imagery, making it easier to cope. But she is called on to review the worst material, like sexual images involving children, and says that she finds some of it “hard to walk away from.”

“I do sometimes take it really personally,” she said of the pictures she reviews. “I remind myself, somebody has to do it.”

A common strategy at Web sites is to have users flag questionable content, then hand off material that needs further human review to outsourcing companies that can do so at low cost.
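As a rough illustration, here is one way such a flag-then-escalate pipeline might be wired up, sketched in Python; the counter, queue, and threshold names are hypothetical, not drawn from any site described in this article.

```python
# A minimal sketch of the flag-then-escalate workflow described above.
# FLAGS_BEFORE_REVIEW and all identifiers here are hypothetical.
from collections import defaultdict
from queue import Queue

FLAGS_BEFORE_REVIEW = 3  # assumed: escalate after this many user flags

flag_counts: dict[str, int] = defaultdict(int)
review_queue: "Queue[str]" = Queue()  # consumed by human moderators


def record_user_flag(content_id: str) -> None:
    """Count a user flag; once enough accumulate, hand the item
    off to the human review queue (often an outsourced team)."""
    flag_counts[content_id] += 1
    if flag_counts[content_id] == FLAGS_BEFORE_REVIEW:
        review_queue.put(content_id)


# Example: three independent users flag the same photo.
for _ in range(3):
    record_user_flag("photo-8671")
print(review_queue.get())  # -> "photo-8671"
```

The design lets the site’s own users do the cheap, high-volume triage, while the expensive human judgment is reserved for the small fraction of items that cross the flag threshold.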

Global outsourcing firms like Infosys Technologies, based in Bangalore, India, and Sykes Enterprises, based in Tampa, Fla., have leapt to offer such services.

Internet companies are reluctant to discuss the particulars of content moderation, since they would rather not draw attention to the unpleasantness that their sites can attract. But people in the outsourcing industry say tech giants like Microsoft, Yahoo and MySpace, a division of the News Corporation, all outsource some amount of content review.

YouTube, a division of Google, is an exception. If a user flags a video as inappropriate, software scans the video looking for warning signs that the clip is breaking the site’s rules or the law. Flagged videos are then sent for manual review by YouTube-employed content moderators who, because of the nature of the work, are given only yearlong contracts and access to counseling services, according to Victoria Grand, a YouTube spokeswoman.

For its part, Facebook, the dominant social network with more than 500 million members around the world, has relied on its users to flag things like pornography or harassing messages. That material is reviewed by Facebook employees in Palo Alto, Calif., and in Dublin.

Simon Axten, a Facebook spokesman, said the company had tried outsourcing the manual review of photos but had not done so widely.

Outsourcing companies are also reluctant to discuss the business on the record, since their clients demand confidentiality. One executive at a global outsourcing firm, who did not want to be named, said that large Internet firms “are paying millions a year to do this kind of thing and essentially provide some type of control over the beast that is the Internet, which for the most part is uncontrollable.”

“If they don’t do it, their commercial interests will completely die,” he added.

One major outsourcing firm with staff in the Philippines was aware of the risks of this type of work and hired a local psychologist to assess how it was affecting its 500 content moderators. The psychologist, Patricia M. Laperal of Behavioral Dynamics, said she had developed a screening test so the company could evaluate potential employees, and helped its supervisors identify signals that the work was taking a toll on employees.

Ms. Laperal also reached some unsettling conclusions in her interviews with content moderators. She said they were likely to become depressed or angry, have trouble forming relationships and suffer from decreased sexual appetites. A small percentage said they had reacted to unpleasant images by vomiting or crying.

“The images interfere with their thinking processes. It messes up the way you react to your partner,” Ms. Laperal said. “If you work with garbage, you will get dirty.”

Carlos Conde contributed reporting.
