Facebook Now Allows Users to Appeal Decisions to Delete, Leave Up Content
How can you know that the person in a photo agreed to have it posted on Facebook? Bickert told reporters at Facebook’s Menlo Park headquarters last week that she fears terrorist or hate groups could game the rulebook so their posts avoid detection by moderators, but said transparency outweighs those concerns. Moderators analyze posts, images, and videos to determine whether they violate community standards.
Most recently, Facebook fought accusations that it censored conservative personalities like Diamond and Silk in the United States.
Testifying before the US Congress, Facebook CEO Mark Zuckerberg said his organisation was committed to ensuring the integrity of elections around the world, including in India.
According to Facebook, the team is made up of 7,500 content reviewers working across 40 languages.
The company has said it will double its 10,000-person safety and security team by the end of this year. Four bullet-pointed paragraphs make the points that Facebook is removing more content; that it finds the vast majority of this content itself; that it takes down newly uploaded content quickly; and that old content is removed with the same vigour as new. It says the rules apply around the world to all types of content, and claims they are created to be comprehensive – “content that might not be considered hate speech may still be removed for breaching our Bullying Policies”. “We receive millions of reports every week, and automation helps us to route those reports to the right content reviewer.”
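Facebook has not published how that routing works, but a report-triage step of this kind can be illustrated with a minimal sketch. Everything below – the categories, queue names and report fields – is hypothetical, not Facebook’s actual system:

```python
# Hypothetical sketch of report triage: route each user report to a
# reviewer queue based on the policy area it was flagged under.
# Categories, queue names and report fields are illustrative only.
from collections import defaultdict

REVIEWER_QUEUES = {
    "hate_speech": "hate-speech-reviewers",
    "graphic_violence": "violence-reviewers",
    "bullying": "bullying-reviewers",
}
DEFAULT_QUEUE = "general-reviewers"

def route_report(report: dict) -> str:
    """Pick a reviewer queue from the category a report was flagged under."""
    return REVIEWER_QUEUES.get(report.get("category"), DEFAULT_QUEUE)

queues = defaultdict(list)
for report in [
    {"id": 1, "category": "hate_speech"},
    {"id": 2, "category": "unrecognised"},
]:
    queues[route_report(report)].append(report["id"])

print(dict(queues))  # {'hate-speech-reviewers': [1], 'general-reviewers': [2]}
```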
Bickert says the company’s challenges in enforcing its community guidelines are, first, identifying posts that break the rules, and then accurately applying policies to flagged content.
Facebook’s updated Community Standards guidelines are what moderators use when deciding what is or is not acceptable content – including bullying, gun sales, nudity and hate speech. If a post was removed by mistake, the user will be notified and the post restored. “We have more work to do there, and we are committed to making those improvements”. Users sharing otherwise-violating content for a legitimate purpose, such as raising awareness, must make that purpose clear, or moderators will remove it.
Quartz noted that some of the new rules were “clearly developed in response to a backlash Facebook received in the past”. But it is the first time Facebook has publicly shared complete and detailed information about all its restrictions on user content.
Being more open about its content moderation strategy and similar processes may earn Facebook more support, yet new details about why and how it removes certain content will also likely open up more opportunities for user scrutiny.
In Facebook’s “Graphic Violence” guidelines section, for example, Facebook explains that it removes content that “glorifies violence or celebrates the suffering or humiliation of others” but allows graphic content, with some limitations, to help people raise awareness about issues. Software can also identify the language of a post and some of its themes, helping the post reach the reviewer with the most expertise. For example, you can’t post addresses or images of safe houses, or explicitly expose undercover law enforcement.
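The article does not say what software Facebook uses for this, but the language-identification step it describes can be approximated with off-the-shelf tools. A minimal sketch using the open-source langdetect package (an assumption chosen for illustration, not Facebook’s internal tooling):

```python
# Minimal sketch: detect a post's language so it can be sent to a
# reviewer pool that reads that language. Requires `pip install langdetect`;
# Facebook's own classifiers are not public.
from langdetect import detect, DetectorFactory

DetectorFactory.seed = 0  # make detection deterministic across runs

def reviewer_pool_for(post_text: str) -> str:
    """Map a detected ISO 639-1 language code to a hypothetical pool name."""
    lang = detect(post_text)  # e.g. 'en', 'hi', 'de'
    return f"reviewers-{lang}"

print(reviewer_pool_for("This post was reported for review."))  # reviewers-en
print(reviewer_pool_for("Dieser Beitrag wurde gemeldet."))      # reviewers-de
```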
Facebook said “coordinating harm” is also banned, which means you can’t use the platform to whip up an armed insurrection, organise a small guerilla war or plan a riot. Its policy teams debate the pros and cons of potential policies. “Even if we are operating at 99 percent accuracy, we are still going to have a lot of mistakes every day”.
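That quote is easy to make concrete using the volume figure the article cites. Taking “millions of reports every week” at an illustrative five million (the exact count is an assumption), a 99-percent-accurate process still misjudges tens of thousands of reports:

```python
# Back-of-envelope arithmetic for the "99 percent accuracy" quote.
# "Millions of reports every week" is the article's figure; the exact
# count below is an assumption for illustration.
reports_per_week = 5_000_000
accuracy = 0.99

mistakes_per_week = reports_per_week * (1 - accuracy)
print(f"{mistakes_per_week:,.0f} misjudged reports per week")  # 50,000
print(f"{mistakes_per_week / 7:,.0f} per day")                 # 7,143
```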