
Anyone who shares hate speech or nude images on Facebook risks being banned. But Facebook doesn’t apply those rules to everyone: some celebrities and politicians are allowed far more, and Facebook has been lying about it.
The Wall Street Journal was able to view internal documents and spoke to dozens of current and former employees about XCheck, or “cross check”: a program under which people of a certain status are exempted from Facebook’s normal rules. The list now numbers about 5.8 million people worldwide.
Anyone who calls for violence, as former US President Trump did, or posts nude images of a woman without her consent, as Brazilian footballer Neymar did in 2019, would normally be temporarily or permanently suspended, and the content would be taken offline. In Trump’s case, that content was left up several times; in Neymar’s case, it took at least another 24 hours before Facebook, through higher-ranking moderators, took action.
These are just a few of the many consequences of XCheck, a program that Facebook concealed from lawmakers, regulators, and its own independent Oversight Board. The consequences are not minor. In the case of the nude photos, 56 million users were shown images of a woman who had never consented to them. Conversely, a lesser-known photographer can be banned for a month for posting a photo with a partially exposed nipple.
The favouritism has serious consequences in other areas too: posts claiming that vaccines kill, claims debunked by Facebook’s own independent fact-checkers, pass through silently when shared by certain public figures. Documents the Wall Street Journal reviewed about an internal audit of the program speak of “a breach of trust.” “We don’t do what we say we do,” the company acknowledges internally. Yet the program has not been completely discontinued.
Facebook has struggled with its own popularity since its early years, simply because so many people share photos, videos, and messages. It is up to Facebook to moderate quickly and efficiently in order to catch atrocities such as beheading videos, pedophilia, or human trafficking, and it must also act against threats. Such content may be reported by users, but the volume is so large that moderation cannot be done by people alone. Facebook therefore also relies on artificial intelligence, but AI makes mistakes from time to time.
A second reason is Facebook’s image. XCheck was introduced, among other things, to prevent VIPs from being sanctioned too quickly, which, given their reach, could put Facebook in a bad light. Those who are considered too newsworthy, influential, popular, or “PR risky” are allowed more. Posts by such people are at best forwarded to more specialized moderators and removed much later, or simply excused.
In the internal documents, Facebook itself acknowledges that the system is not actually defensible. They also show that Facebook is well aware of the many problems on its platform: matters about which the company has claimed in recent years to be taken by surprise, and which it has repeatedly promised to address.