Facebook now allows users to appeal certain removed posts

Facebook CEO Mark Zuckerberg at the company’s headquarters in Menlo Park, California.

"At the time they told us they could not do it, they would not do it, and actually stopped engaging at that point", Cyril said.

For example, Facebook has long faced criticism over how it deals with photos of breastfeeding mothers, while iconic images such as the "Napalm Girl" photograph have been censored for breaching the company's broad nudity guidelines.

Illustrative: A Gazan man looking at a pro-Palestinian post on Facebook on April 7, 2013.

In some cases, Facebook employees also contact police or other officials to investigate risky situations or possible crimes. The social networking giant's guidelines cover everything from violence and bullying to privacy and copyright.

Though Facebook's content moderation is still very much driven by humans, the company does use technology to assist in its work.

Employees of the Competence Call Center (CCC) work for the Facebook Community Operations Team in Essen, Germany, Thursday, Nov. 23, 2017.

"We know that sometimes people share nude images of their own children with good intentions; however, we generally remove these images because of the potential for abuse by others and to help avoid the possibility of other people reusing or misappropriating the images," Facebook says. The guidelines hint at the complexity of the role of moderator. (You don't have to use your real name on Instagram, for one.) The underlying policies haven't changed, Bickert said, though they now include extra guidance on making decisions.

Another complex issue is how moderators decide how and when expressions of ideas and opinions violate policy.

Monika Bickert, the company's VP of Global Product Management, said in a release that company staff stationed in 11 offices worldwide include experts on terrorism, child safety and hate speech.

Determining whether something is shared for amusement or to raise awareness, and whether it was received as such, is an inherently hard task.

Bickert says the company's challenges in enforcing its community guidelines are, first, identifying posts that break the rules, and then accurately applying policies to flagged content.

If a second moderator reviewing the appeal disagrees with the original decision, the post will be restored.

Facebook may also notify law enforcement if it believes there is a genuine risk of physical harm or a direct threat to public safety.

Facebook has for the first time detailed exactly what it will ban.

It "added a warning to a small portion that was shared for informational or counter speech purposes", Bickert and Fishman said in an online post.

If Facebook removes a photo, video or post, it will alert the user that the content was removed for violating the site's Community Standards. (Until now, Facebook users were allowed to appeal the shutdown of an entire account but not the removal of individual posts.) The Washington Post previously documented how people have likened this predicament to being put into "Facebook jail" without being told why they were locked up. The service is also beginning to give users specific reasons when content is removed. Facebook says it receives millions of reports every week, and automation helps route those reports to the right content reviewers.

Under intense scrutiny following a series of scandals, Facebook is now releasing the internal enforcement guidelines that govern what content is and is not allowed on its platform. A separate change makes it easier for people to approve or deny the company permission to use different kinds of user data. With a team of just 7,500 content reviewers, Facebook has quite a task on its hands.

"Without the users, the advertising business collapses".

Bryan Lynn wrote this story for VOA Learning English, based on information from Facebook and reports from the Associated Press and Reuters.
