Cover that butt! Facebook unveils nudity, terrorism, revenge porn policy updates

Reuters / Dado Ruvic

Facebook doesn’t want its users posting naked butts or women’s nipples on the social network, according to its newly clarified Community Standards. The company also outlined what constitutes hate speech, revenge porn and terrorism.

In a detailed explanation Monday, the company outlined its new
Community Standards, which replace the previous version. At
nearly 2,500 words, the updated policy is almost three times as
long as the old guidelines, the BBC reported. The guidelines are
designed to respond to criticism Facebook has faced and to
questions raised over how content on the site is moderated.

“These standards are designed to create an environment where
people feel motivated and empowered to treat each other with
empathy and respect,”
Monika Bickert, Facebook’s head of
global policy management, and Chris Sonderby, Facebook’s deputy
general counsel, said in the post. “In particular, we’ve
provided more guidance on policies related to self-injury,
dangerous organizations, bullying and harassment, criminal
activity, sexual violence and exploitation, nudity, hate speech,
and violence and graphic content. While some of this guidance is
new, it is consistent with how we’ve applied our standards in the
past.”

Bickert told the BBC that the rewrite was intended to address
confusion about why some takedown requests were rejected. She
stressed that the changes were meant as a clarification of the
existing guidelines, rather than a change in policy.

“We [would] send them a message saying we’re not removing it
because it doesn’t violate our standards, and they would write in
and say I’m confused about this, so we would certainly hear that
kind of feedback,”
she said.

“And people had questions about what we meant when we said we
don’t allow bullying, or exactly what our policy was on
terrorism,”
Bickert continued. “[For example] we now make
clear that not only do we not allow terrorist organisations or
their members within the Facebook community, but we also don’t
permit praise or support for terror groups or their acts or their
leaders, which wasn’t something that was detailed before.”

Facebook updated its definition of hate speech (Screenshot from Facebook)

In the last year alone, Facebook received heavy criticism for:
limiting Russian users’ access to a page created in support of
Russian activist and blogger Aleksey Navalny (but not blocking
copycat pages); doing too little to monitor terror plots;
suspending Native Americans’ accounts over its “real names only”
policy ‒ after allowing drag queens to keep their stage names on
their accounts and apologizing to the LGBT community; not
allowing teachers, victims of domestic violence and others to use
aliases for their personal safety; manipulating its users’
emotions as part of a psychological experiment ‒ and not
apologizing for it; allowing fake tribute pages for Malaysia
Airlines flight MH17 to link instead to pop-up ads for online
gambling, pornographic websites and other suspicious products;
contemplating whether to penalize websites that use “click-bait”
headlines; and introducing a “satire” tag to help unaware users
understand that satirical articles are not a reflection of
reality.

In its section on nudity, the social network says it will
“remove photographs of people displaying genitals or focusing in
on fully exposed buttocks,” and “restrict some images of female
breasts if they include the nipple, but we always allow photos of
women actively engaged in breastfeeding or showing breasts with
post-mastectomy scarring.”

Always is a relative term. Facebook’s official policy of allowing
(most) breastfeeding pics was only instituted in June 2014. The
shift came after feminists initiated a campaign against what they
believed was gender-based discrimination.

The company also banned “revenge porn,” or sexually
explicit content posted without the subject’s permission. A Texas
woman has sued
Facebook for failing to delete
falsified, lewd images of her
after repeated requests. Google, Twitter and Reddit have all
banned the sharing of sexual imagery without permission, and the
UK, multiple US states and other jurisdictions have made sharing
revenge porn a specific criminal offense.

Members of the five independent organizations that make up
Facebook’s safety advisory board applauded the move, though with
some reservations.

“I think it’s great that Facebook has revamped its community
standards page to make it both more readable and
accessible,”
Family Online Safety Institute (FOSI) chief
executive Stephen Balkam told the BBC. “I wish more social
media sites and apps would follow suit.”

However, Balkam noted that the site gives members no way to
shield young users from graphic videos, which play automatically
until Facebook receives a complaint. Only then can the company’s
staff add a warning interstitial image.

“It is frustrating that after all this time, Facebook users
are still not able to put up interstitials on violent or
controversial images and videos,”
Balkam said. “Facebook
has done the right thing to place interstitials themselves once a
user has reported an image or extreme content, but my hope is
that they will bring this to ordinary users sooner rather than
later.”

The social media site clarified how it will handle posts that
violate a country’s laws, even if they don’t violate the Facebook
Community Standards.

“Questions about free expression and how governments regulate
it are some of the most difficult and important issues we
face,”
Facebook founder Mark Zuckerberg wrote in a post on
the site. “In an ideal world, we would all feel empowered to
express everything we want, freely and safely. In reality, there
are many obstacles in the way. Every country, including the
United States, has laws preventing you from sharing certain
things to protect public safety and intellectual property.”

“[I]f a country requests that we remove content because it is
illegal in that country, we will not necessarily remove it from
Facebook entirely, but may restrict access to it in the country
where it is illegal,”
Bickert and Sonderby wrote, citing
hate speech as an example.

At the beginning of March, an American helicopter mechanic
working in the United Arab Emirates was arrested and now faces
cyber slander charges after he criticized his employer and made
disparaging comments about “filthy Arabs” in a Facebook post. In
December, a Moroccan-born Danish bookseller was given a four-year
jail term for supporting terrorism after posting extremist views
on his Facebook page.

Users must report violations of the Community Standards (Screenshot from Facebook)

It remains up to Facebook users to report violations of the
updated guidelines, and the company will not automatically scan
and remove potentially offensive content, Bickert told the New
York Times in an interview.

As well as outlining the policy changes, the company’s blog post
discussed its Global Government Requests Report, also released
Monday. The report, which covers the second half of 2014, details
the requests Facebook received from governments for content
removal and account data, along with national security requests.

The company promised to “scrutinize each government request
and push back when we find deficiencies,”
Bickert and
Sonderby wrote. “We will also continue to push governments
around the world to reform their surveillance practices in a way
that maintains the safety and security of their people while
ensuring their rights and freedoms are protected.”
