Picture: REUTERS

FACEBOOK sought again to clarify its policies on when it removes content, providing details on how it handles nudity, terrorism and hate speech. It also released a new round of data on government requests to remove content.

The policy updates are the latest from Facebook, which has repeatedly refined its messaging on sensitive topics such as privacy and abuse. Late last year, the company tried to simplify its privacy policy in the face of criticism over how it uses customer data.

Facebook stressed that its policies were not changing, but that they were often misunderstood. The social network has more than 1-billion users worldwide, making it difficult to develop a policy that is transparent and satisfies everyone. Facebook said as much in its new community standards update: "People from different backgrounds may have different ideas about what’s appropriate to share."

That is a fair point. The company cites examples such as "blasphemous" speech, which can be illegal in some countries but does not necessarily violate Facebook’s own policies.

In a post on the social network explaining the updated policies, CE Mark Zuckerberg said Facebook complied with "lawful government orders" to remove content to ensure the site was not blocked for millions of people. If Facebook were blocked, the content in dispute would essentially get removed anyway, he argued. He made a similar point on a visit to South America early this year.

The post coincides with Facebook’s "transparency report" of government takedown requests for the second half of 2014. The data showed that the number of requests from many Western nations, including the US, fell compared with the first half of 2014, the first such drop for the US since Facebook started releasing the data two years ago. The amount of content restricted grew in India, where more Facebook content was restricted than in any other nation, and jumped sharply in Turkey, where then-prime minister Recep Tayyip Erdogan’s government briefly blocked YouTube and Twitter ahead of local elections last year.

The clarifications to Facebook’s policies involve nudity, hate speech and terror — issues that other social networks have grappled with recently. (Twitter has famously wrestled with how to handle abuse and terror, while Google reversed a ban on explicit material on Blogger within days.) In explaining its policies, Facebook said it tried to be consistent, to make it easier for its staff to make decisions.

Facebook’s nudity policy restricts photos of genitals, fully exposed buttocks, sexual activity and nipples. Images of breast-feeding and art, or those involving medicine and surgery, are allowed, and images used for education, satire or social commentary are permitted in some cases. Detailed descriptions of sex acts can also be removed.

The company will also remove direct attacks based on people’s race, ethnicity, national origin, religion, gender identity, sexual orientation or disabilities.

It stressed that speech that upsets some people would not necessarily be removed; it had to rise to the level of hate speech. Facebook also said it did not allow organisations that engaged in terror activity or organised crime to have a presence on its site, nor did it allow content that expressed support for such behaviour.
