With 100 hours of video uploaded to YouTube every minute, it's impossible for the site's employees to keep tabs on the mass of content continuously pouring in. While most of it is innocuous enough, some prohibited material slips through the net, including pornography, gratuitous violence, and abuse of various forms.

In a bid to catch such material more quickly, Google-owned YouTube has hired around 200 individuals and organizations to flag any material they deem to be in contravention of the video-sharing site's guidelines, the Wall Street Journal reported on Monday.

A person with knowledge of the matter told the Journal that most of those in the "flagger program" are individuals, though some are said to be "government agencies or non-governmental organizations such as anti-hate and child-safety groups."

While the site already allows users to report videos containing suspect content, material highlighted by those in the flagger program is likely fast-tracked to the YouTube team for evaluation. In addition, the Web giant has reportedly set up the system so that the flaggers can highlight content "at scale," instead of selecting one video at a time.

UK government flaggers

The Journal's report comes a few days after the Financial Times said Google had already given a number of UK security officials "super flagger" powers in an effort to contain the proliferation of jihadist material prompted by the war in Syria, a move the paper said was likely to stir concern among civil liberties campaigners.

Google confirmed to the FT that a UK government agency is indeed working to search for particular types of material, with a government spokesperson adding that it was looking for content that might violate the country's Terrorism Act.

Commenting on the system, a spokesperson for YouTube said the site has a "zero-tolerance policy...towards content that incites violence," adding, "Our community guidelines prohibit such content and our review teams respond to flagged videos around the clock, routinely removing videos that contain hate speech or incitement to commit violent acts."

Google was keen to point out that the final decision on whether a video is removed rests with the company and the company alone.

"Any suggestion that a government or any other group can use these flagging tools to remove YouTube content themselves is wrong," a spokesperson for the Mountain View company told the Journal.