Facebook has announced plans to expand its appeals process, which will for the first time allow users to formally request a review of decisions about individual pieces of content on the social network.
The company is also publishing an updated set of Community Standards that spells out its policies and how it enforces them. The changes come as Facebook tries to increase transparency across many of its policies, following criticism of its privacy practices and its handling of other issues.
When it comes to its Community Standards, one of the biggest criticisms the company has faced is that users have had very little recourse when they believe Facebook has made a wrong decision.
Over the next year, Facebook will extend its appeals process to cover more cases. Initially, this means that users whose content was removed for violating the company's policies on hate speech, nudity, sexual activity, or violence will be able to ask Facebook to reconsider the decision. Later, Facebook will allow appeals in cases where content has been reported but not removed. Monika Bickert, Facebook's VP of policy, says a robust appeals process is necessary at Facebook's scale.
"With millions of reports every week, even if you maintain 99 percent accuracy, you are still getting a lot of mistakes," she said. "Offering an appeal is a way of saying that we want to make sure you have a voice. It is about empowering people in our community."
The company says that reviews will be carried out by human moderators rather than AI tools, will happen within 24 hours, and users will be notified if their post is restored after review.
This is an important detail, because appeals were previously available only to those whose entire profiles, pages, or groups were removed, which meant that day-to-day decisions about individual posts could not be flagged for another look.
In addition, Facebook is trying to clarify its existing policies with an updated set of community guidelines. These include finer details about Facebook's rules that were previously visible only to the company's army of content moderators.