After facing fire
over reports that its moderators protect far-right activists and under-age
accounts, Facebook says it is continually training its more than 7,500 content
reviewers in how to handle posts related to hate speech, terrorism and child
sexual exploitation on its platform.
The content reviewers are a mix of full-time
employees, contractors and partner companies, covering every time zone and
more than 50 languages around the world.
"Content review at this size has never
been done before. After all, there has never been a platform where so many
people communicate in so many different languages across so many different
countries and cultures. We recognise the enormity of this challenge and the
responsibility we have to get it right," Ellen Silver, Vice President of Operations
at Facebook, wrote in a blog post.
"Language proficiency is key and it lets
us review content around the clock. If something is reported in a language that
we don't support 24/7, we can work with translation companies and other experts
who can help us understand local context and language to assist in reviewing
it," Silver added.
The company came under heavy criticism after Channel
4 Dispatches - a documentary series - sent an undercover reporter to work as a
content moderator at a Dublin-based Facebook contractor.
The programme showed that moderators were
shielding Pages run by far-right activists from deletion even after they
had violated the rules.
In a blog post, Monika Bickert, Vice President
of Global Policy Management at Facebook, said the TV report on Channel 4 in the
UK raised important questions about the company's policies and processes.
Facebook has also promised to double the number
of people working on its safety and security teams this year to 20,000.
Silver said the company trains its content
reviewers in three stages: pre-training, which covers what to expect on the
job; hands-on learning, which includes a minimum of 80 hours with a live
instructor followed by hands-on practice; and ongoing coaching.
"We want to keep personal perspectives and
biases out of the equation entirely - so, in theory, two people reviewing the
same posts would always make the same decision. Of course, judgments can vary
if policies aren't sufficiently prescriptive.
Facebook said it audits a sample of reviewer
decisions each week to find out if a wrong call was made.
"Our auditors are even audited on a
regular basis. In addition, we have leadership at each office to provide
guidance, as well as weekly check-ins with policy experts to answer any
questions," said the social media giant.
Facebook said it has a team of four clinical
psychologists across three regions who are tasked with designing, delivering
and evaluating resiliency programmes for everyone who works with graphic and objectionable content.
"This group also works with our vendor
partners and their dedicated resiliency teams to help build industry
standards," said Silver.