Facebook To Hire 3,000 More Staff To Take Down Violent Content And Hate Speech

(FILES) This file photo taken on September 27, 2015 shows Facebook CEO Mark Zuckerberg attending a Townhall meeting at Facebook headquarters in Menlo Park, California. Facebook said on May 3, 2017, it would add 3,000 people to screen out violent content as the social media giant faces scrutiny for a series of killings and suicides broadcast on its platform.

“If we’re going to build a safe community, we need to respond quickly,” chief executive Mark Zuckerberg said on his Facebook page. SUSANA BATES / AFP

Facebook said Wednesday it is hiring an extra 3,000 staff to remove violent content like the gruesome killings and suicides broadcast on its video platform.

The move ramps up efforts by Facebook to filter content amid growing criticism of the social media giant for allowing the platform to be used to promote violence and hateful activity.

“If we’re going to build a safe community, we need to respond quickly,” chief executive Mark Zuckerberg said on his Facebook page.

Zuckerberg’s announcement came a week after a 20-year-old Thai man broadcast live video on the world’s most popular social media platform, showing him killing his baby daughter before committing suicide.

The previous week, a US man dubbed the “Facebook Killer” shot and killed himself after a frantic three-day nationwide manhunt. The murder and the video of it sparked outrage across the world and renewed scrutiny of the growing number of grisly videos being posted on social media.

Facebook removed the footage hours after the attack. Zuckerberg acknowledged that the world’s largest social network had a role to play in stemming the worrisome trend.

“We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down,” Zuckerberg said.

The 3,000 new recruits, added over the coming year, will increase by two thirds the size of Facebook’s community operations team, which currently numbers 4,500.

– ‘We can do better’ –

“We’ve seen people hurting themselves and others on Facebook — either live or in video posted later,” Zuckerberg said.

“It’s heartbreaking, and I’ve been reflecting on how we can do better for our community.”

The additional reviewers will “help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation,” he said.

“And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it — either because they’re about to harm themselves, or because they’re in danger from someone else.”

Critics say the social network has been too slow to react to online violence, and have questioned whether Facebook Live — a strategic area of development for the company — should be disabled after several cases in which it was used to broadcast rapes.

Zuckerberg said Facebook has been working on better technology that can identify violent or inappropriate content — and that its efforts to screen for acts of violence appeared to be paying off.

“Just last week, we got a report that someone on Live was considering suicide,” he said.

“We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate.”

It was not immediately clear how and where Facebook would deploy the new monitors.

As Facebook approaches a global user base of two billion, it has been grappling with its role as a platform for sharing news as well as calls to violence and political propaganda.

The company has insisted it is not a “media company” that manages content seen by users, but it has faced growing calls to weed out “fake news” that may influence elections as well as “hate speech” barred in some European countries.

Last month, Facebook stepped up its security to counter efforts by governments and others to spread misinformation or manipulate discussions for political reasons.

Facebook also recently rolled out new tools to combat “revenge porn” on its main social network as well as on its Messenger and Instagram services.
