Facebook is attempting to silence coronavirus vaccine skeptics through an algorithm that targets and demotes their remarks on the site, according to two employees of the social media giant who leaked internal memos to Project Veritas.
The leaked documents purportedly show that Facebook has tested a beta version of its algorithm on about 1.5% of its nearly four billion users worldwide to find those who are hesitant about the vaccines.
When asked about these claims, a Facebook spokesperson told the New York Post, “We proactively announced this policy on our company blog and also updated our help center with this information.”
According to the leaked memos, the vaccine hesitancy ratings are divided into three tiers, with the lowest tier constituting an explicit violation of Facebook policy. Content in the next tier is labeled “Alarmism & Criticism.”
The third tier is considered “Indirect Vaccine Discouragement,” which includes, for example, “shocking stories” that have the potential to deter others from taking the vaccine, even if the stories turn out to be true.
The algorithm then assigns each remark a score that can cause the comment to be removed altogether or at least hidden from the “most relevant” category.
One of the anonymous Facebook workers explained to Project Veritas founder James O’Keefe that based on the assigned score, “we will demote or leave the comment alone depending on the content within the comment,” adding that “They’re trying to control this content before it even makes it onto your page, before you even see it.”
One of the leaked memos describes the company’s goal as drastically reducing “user exposure to vaccine hesitancy in comments,” emphasizing that “reducing the visibility of these comments represents another significant opportunity for us to remove barriers to vaccination that users on the platform may potentially encounter.”
The memo also noted that the algorithm was the only method the social media giant knew of at the time to contend with “the high prevalence of vaccine hesitancy in Health comments,” and that a better “detection” tool would be used when it was developed.
Facebook has previously admitted that it may lessen the visibility of content that does not violate company policy but could discourage someone from getting vaccinated, such as information shared out of context, according to the New York Post.
Facebook CEO Mark Zuckerberg announced in March that the company would add labels to posts concerning vaccines in order to fight against the spread of misinformation.
“For example, we’re adding a label on posts that discuss the safety of COVID-19 vaccines that notes COVID-19 vaccines go through tests for safety and effectiveness before they’re approved,” Zuckerberg wrote.
Source: Newsmax