In this article, we’ll delve into Meta’s response to the Oversight Board’s recommendations regarding its cross-check program.
Key Takeaways:
- Meta has agreed to modify the cross-check program on Facebook and Instagram, which shields high-profile figures from the company’s automated moderation system.
- The Oversight Board gave Meta 32 recommendations on how to enhance its cross-check system. Meta has committed to fully implementing 11 of those recommendations and partially adopting 15 more.
- The cross-check program has faced scrutiny for allegedly shielding politicians, celebrities, and popular athletes from the automated moderation system.
- Meta has committed to making the cross-check system more transparent through regular reporting and better accounting for human rights interests and equity.
- The Oversight Board expressed some dissatisfaction with the changes the company is willing to make.
Meta Agrees to Modify Cross-Check Program after Oversight Board’s Recommendations
In response to the Oversight Board’s recommendations regarding its cross-check program, Meta (formerly known as Facebook) has agreed to modify the program on Facebook and Instagram, which shields high-profile figures from the company’s automated moderation system.
The Oversight Board is an independent group responsible for evaluating Meta’s content moderation choices.
After examining Meta’s cross-check program, they presented 32 recommendations to help enhance it. Meta agreed to fully implement 11 of the suggestions and partially adopt 15 more.
Background on Meta’s Cross-Check Program
The program came under scrutiny following a 2021 report by The Wall Street Journal revealing that Meta used cross-check to shield politicians, celebrities, and athletes from its moderation system.
The Oversight Board criticized the program, saying it appears to prioritize business concerns over the company’s human rights commitments.
Meta’s Response to Oversight Board’s Recommendations
In response to the Oversight Board’s recommendations, Meta has agreed to make the cross-check system “more transparent through regular reporting” and adjust the criteria used to add people to the program “to better account for human rights interests and equity.”
The company has also committed to reducing the cross-check program’s backlog, which the Oversight Board found could cause harmful content to stay online longer than it should.
The Oversight Board’s Assessment of Meta’s Response
While the Oversight Board called Meta’s response a “landmark moment,” it expressed some dissatisfaction with the changes the company is willing to make.
“Several aspects of Meta’s response haven’t gone as far as we recommended to achieve a more transparent and equitable system,” the Oversight Board writes.
Meta’s Explanation of Cross-Check Program
According to Meta, the cross-check system was built to prevent potential over-enforcement and to double-check cases where there could be a higher risk for a mistake or when the potential impact of a mistake is especially severe, such as when journalists are reporting from conflict zones.
Meta says it has dedicated teams and resources to improving the cross-check system, including standardizing the review process, expanding eligibility for cross-check reviews, and adding more controls.
Meta’s Commitment to Addressing Recommendations
To fully address the remaining recommendations, Meta has agreed to review them and respond to the board within 90 days.
Holding Meta accountable for its content policies and processes, as well as its decisions, is exactly why the Oversight Board was established.
The company says it welcomes the Oversight Board’s recommendations and the independent oversight they provide, and that it will continue to engage with the board to answer its questions. Under the board’s bylaws, Meta must publicly respond to recommendations within 30 days.