The level of biased censorship in social media has reached a fever pitch, with mega-companies like Facebook leading the charge. But new internal documents show that Facebook has a secret list of VIP users who are exempt from the rules. It’s called XCheck.
The internet has become a seemingly essential tool for communication and the dissemination of information. Gone are the days of newspapers, libraries, and town hall meetings: if you have something to say or a question to answer, you need to get online.
But your ability to speak freely online is quickly evaporating. For the past year, companies like Facebook have been aggressively censoring information and opinions. And this “Nazi-esque” style of censorship has been hailed as a brave fight against terrorism and misinformation.
Facebook Founder and CEO Mark Zuckerberg has said that the social media platform allows its billions of users to communicate on a level playing field with the celebrities of media, politics, and pop culture.
Turns out, that was a big, fat LIE.
The program, known as “cross check” or “XCheck,” was initially intended as a quality-control measure for actions taken against high-profile accounts, including celebrities, politicians and journalists.
Today, it shields millions of VIP users from the company’s normal enforcement process, the documents show. Some users are “whitelisted” — rendered immune from enforcement actions (kind of like Big Pharma with vaccines) — while others are allowed to post rule-violating material and outright lies, pending Facebook employee reviews that often never come.
But if you cite the CDC’s VAERS data or dare to post a vaccine package insert, you’re likely to get banned for “posting disinformation”…
In response to what the documents describe as “chronic underinvestment in moderation efforts,” many teams around Facebook chose not to enforce the rules with high-profile accounts at all — a practice referred to as whitelisting. In some instances, whitelist status was granted with little record of who had granted it and why, according to the 2019 audit.
A 2019 internal review of Facebook’s whitelisting practices, marked attorney-client privileged, found favoritism toward those users to be both widespread and “not publicly defensible.”
“This problem is pervasive, touching almost every area of the company,” the review states, citing the audit. It concluded that whitelists “pose numerous legal, compliance, and legitimacy risks for the company and harm to our community.”
“We are not actually doing what we say we do publicly,” said the confidential review. It called the company’s actions “a breach of trust” and added: “Unlike the rest of our community, these people can violate our standards without any consequences.”
In 2020, XCheck included almost 6 million users, ranging from politicians and journalists to athletes and musicians. But when Facebook’s Oversight Board asked about the special system, the company said in writing that XCheck was used in “a small number of decisions.”
The documents show that Facebook knows, in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands. And it seems that Facebook has no interest in fixing them.
At least some of the documents have been turned over to the Securities and Exchange Commission and to Congress by a person seeking federal whistleblower protection, according to people familiar with the matter.
For ordinary users, Facebook dispenses a kind of rough justice in assessing whether posts meet the company’s rules against bullying, sexual content, hate speech and incitement to violence. Sometimes the company’s automated systems summarily delete or bury content suspected of rule violations without a human review. At other times, material flagged by those systems or by users is assessed by content moderators employed by outside companies.
Users designated for XCheck reviews are generally exempt from these rules. Facebook designed the system to minimize what its employees have described in the documents as “PR fires”… negative media attention that comes from botched enforcement actions taken against VIPs.
If Facebook’s systems conclude that one of those accounts might have broken its rules, they don’t remove the content (at least not right away). They route the complaint into a separate system, staffed by better-trained, full-time employees, for additional layers of review.
Another major concern is the ease with which employees can add users to the program. We’ve seen what happens when regulators have personal bias regarding their subject matter. Scientists with ties to the Wuhan Institute of Virology insisted that the virus was not man-made, inhibiting our ability to fight the illness.
Most Facebook employees were able to add users to the XCheck system, the documents say, and a 2019 audit found that at least 45 teams around the company were involved in whitelisting. Users aren’t generally told that they have been tagged for special treatment. An internal guide to XCheck eligibility cites qualifications including being “newsworthy,” “influential or popular” or “PR risky.”
While the program included most government officials, it didn’t include all candidates for public office, at times effectively granting incumbents an advantage over their challengers. The discrepancy was most prevalent in state and local races, the documents show, and employees worried Facebook could be subject to accusations of favoritism.
Facebook recognized years ago that the enforcement exemptions granted by its XCheck system were unacceptable, with protections sometimes granted to what it called abusive accounts and persistent violators of the rules, the documents show. Nevertheless, the program expanded over time, with tens of thousands of accounts added just last year.
In addition, Facebook has asked fact-checking partners to retroactively change their findings on posts from high-profile accounts, waived standard punishments for propagating what it classifies as misinformation and even altered planned changes to its algorithms to avoid political fallout.
The shocking truth is that Facebook is not doing what it says it’s doing. Exceptions are routinely made for powerful people, directly shaping the public discourse.
And many Facebook executives agree…
“One of the fundamental reasons I joined FB is that I believe in its potential to be a profoundly democratizing force that enables everyone to have an equal civic voice,” wrote Samidh Chakrabarti, an executive who headed Facebook’s Civic Team. “So having different rules on speech for different people is very troubling to me.”
“FB’s decision-making on content policy is influenced by political considerations,” wrote an economist in the company’s data-science division.
“Separate content policy from public policy,” recommended Kaushik Iyer, then lead engineer for Facebook’s civic integrity team, in a June 2020 memo.
Still, Facebook went to great lengths to create the appearance of integrity and transparency. But the company misled the Oversight Board, says Kate Klonick, a law professor at St. John’s University. The board was funded with an initial $130 million commitment from Facebook in 2019, and Ms. Klonick was given special access by the company to study the group’s formation and its processes.
“Why would they spend so much time and money setting up the Oversight Board, then lie to it?” she said of Facebook after reviewing the XCheck documents. “This is going to completely undercut it.”
In a written statement, a spokesman for the board said it “has expressed on multiple occasions its concern about the lack of transparency in Facebook’s content moderation processes, especially relating to the company’s inconsistent management of high-profile accounts.”
Is it any surprise that people don’t trust the media anymore? We’ve seen over and over again how people can be silenced or banned for “misinformation” only to have those same policies reversed later. We saw Mark Zuckerberg jump at the opportunity to help Anthony Fauci disseminate any message he wanted.
Big Tech is protecting its interests. Famous, powerful people are its bread and butter, and until we have true transparency and accountability to the First Amendment, companies like Facebook will continue to do whatever they want.