In November, Facebook announced a plan to reorganize how the company makes content policy decisions on its social network: it will begin sending some of its most contentious decisions to an independent review board. The board will serve as the final point of escalation for appeals over reported content, acting as a sort of Supreme Court for Facebook. Today, Facebook is sharing (PDF) more details on the structure of this body and how the review process will work.
Facebook had previously explained that the review board would not make the first, or even the second, decision on reported content. Instead, when someone reports content on Facebook, the first two calls will still be handled by Facebook's internal moderation systems. But if the person is not satisfied with Facebook's decision, the case can be appealed to the new review board.
However, the board may decide not to take on every case escalated up the chain. Instead, it will focus on those it views as the most important, the company said.
Today, Facebook is explaining in more detail how the board will be composed and how its decisions will be handled.
In a draft charter, the company states that the board will include experts with experience in "content, privacy, free expression, human rights, journalism, civil rights, safety and other relevant disciplines." The list of members will also be public. The board will be supported by full-time staff who will ensure that its decisions are properly implemented.
While decisions on the board's composition have not yet been finalized, Facebook currently suggests the board should have 40 members. These will be chosen by Facebook after it publicly announces the qualifications required to join. It also says it will pay particular attention to factors such as "geographical and cultural" background and a "diversity of backgrounds and perspectives."
The board will not include former or current Facebook employees, Facebook contractors, or government officials.
Once the board is up and running, it will be responsible for selecting future members as existing members' terms end.
Facebook believes the ideal term length is three years, automatically renewable once for those who wish to continue serving. Board members will also serve "part time," a necessary consideration, as many will likely hold other jobs beyond reviewing Facebook content.
Facebook says the board will ultimately have the last word: it can overturn Facebook's decisions where necessary. The company can then choose to incorporate some of those final decisions into its own policy development. Facebook can also seek the board's advice even when a decision is not urgent.
Cases will be referred to the board both through the user appeals process and directly by Facebook. For the latter, Facebook will likely send its most significant or controversial decisions, or those where existing policy seems to conflict with Facebook's own values.
To further guide board members, Facebook will issue a final charter that includes a statement of its values.
The board will not rule on a case, however, where doing so would violate the law.
Cases will be heard by smaller panels made up of an odd number of rotating members. Decisions will be attributed to the review board as a whole, but the names of the individual members who adjudicated a given case will not be attached to the decision, likely to protect them from targeted threats and harassment.
The board's decisions will be made public, though it will not compromise users' privacy in its explanations. Once a decision is made, the board will have two weeks to publish the decision and its reasoning. In non-unanimous decisions, a dissenting member may choose to publish their viewpoint alongside the final decision.
Like a higher court, the board will refer to its prior decisions before ruling on a new case.
After deciding its own slate of cases, the first panel's members will choose the list of cases to be heard by the next panel. That panel will then select the third set of cases, and so on. A majority of a panel's members will have to agree that a case should be heard for it to make the docket.
Because 40 people cannot reasonably represent the entire planet, let alone Facebook's more than 2 billion users, the board will rely on consultants and experts as needed to bring in the "linguistic, cultural and socio-political expertise" its decisions require, Facebook says.
To keep the board impartial, Facebook plans to establish recusal rules for conflicts of interest, and board members will not be allowed to be lobbied or to accept incentives. The board will, however, receive a standardized, fixed salary set before its term begins.
None of the announced plans are final; these are only Facebook's initial proposals. The company is publishing them as a draft to gather feedback, and says external stakeholders will be able to submit their own proposals in the coming weeks.
The company also plans to hold a series of workshops around the world over the next six months, where experts will convene to discuss issues such as freedom of expression, technology and democracy, procedural fairness and human rights. The workshops will be held in Singapore, Delhi, Nairobi, Berlin, New York, Mexico City and other cities not yet announced.
Facebook has been criticized over its handling of issues like the calls to violence that fueled genocide in Myanmar and riots in Sri Lanka; election interference by state-backed actors from Russia, Iran and elsewhere; its failure to take down child abuse posts in India; the weaponization of Facebook by the Philippine government to silence its critics; Facebook's approach to Holocaust denial and conspiracy theorists like Alex Jones; and much more.
Some might argue that Facebook is now shirking its responsibilities by handing difficult decisions off to an outside board. Doing so, after all, could shield the company from being held accountable for war crimes and the like. On the other hand, Facebook has not shown itself capable of making sensible policy decisions around, for example, hate speech and propaganda. Perhaps it is time for the company to call in the experts and let someone else make the calls.