I recently found an ad on the back of one of the few actual hard-copy magazines I read. It was from Facebook, and it read, “Facebook is taking action to keep its platform safe. We have 40,000 people working on safety and security. That’s more than the size of the FBI.”
That number may be larger than the FBI, but it will still be less than half the size of the IRS. I don’t know about you, but I certainly sleep better at night knowing that Zuckerberg’s security team is making sure that what we see is carefully curated for our own good, including posts like this one from House Republicans, which simply stated: “If you take out a loan, you pay it back. Period.”
WOW: @facebook says our post about paying back loans violates their “community standards.”
Big Tech’s at it again. pic.twitter.com/Oo5lESfxwU
— House Judiciary GOP (@JudiciaryGOP) August 25, 2022
Those words were posted on the House Judiciary GOP page on Wednesday. Not only were people unable to share the post, but Facebook censored it, stating: “Your post goes against our Community Standards so only you can see it.” At least one person claimed to have gone directly to FB jail for trying to share the post. As of Thursday morning, the post was back up on the page, probably because enough people, particularly members of the House Judiciary GOP, complained.
Some people blamed the FB algorithm for the mistake, and that is probably a safe bet. But you can read, re-read, and even diagram the sentence “If you take out a loan, you pay it back. Period.” and you won’t find anything inflammatory, revolutionary, or violent about it. Yes, the post is critical of Biden’s student loan forgiveness plan, but it is not urging armed insurrection, massive civil disobedience, or even a boycott. If anything, it is pretty vanilla. Biden’s name isn’t even mentioned.
So what triggered the algorithm? An algorithm does not exercise judgment. It does what it is programmed to do; in the case of the web, it looks for certain indicators and then executes an action based on those indicators. That is a very basic explanation, based on my experience building content websites for a private business.
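To make that concrete, here is a minimal sketch of what indicator-and-action logic can look like, written in Python. Every indicator name, weight, and threshold in it is invented purely for illustration; it does not describe Facebook’s actual system or any real platform’s rules.

```python
# Purely illustrative sketch of indicator-based moderation logic.
# The indicator names, weights, and threshold are hypothetical and
# do not describe any real platform's rules.

FLAGGED_INDICATORS = {
    "keyword:loan forgiveness": 0.4,   # hypothetical topic indicator
    "source:political_page": 0.5,      # hypothetical source-based indicator
    "reports:multiple_users": 0.4,     # hypothetical behavioral indicator
}

ACTION_THRESHOLD = 0.8  # hypothetical cutoff for hiding a post


def score_post(indicators):
    """Sum the weights of whichever indicators a post triggers."""
    return sum(FLAGGED_INDICATORS.get(name, 0.0) for name in indicators)


def moderate(indicators):
    """Map a score to an action; no judgment, just the programmed rule."""
    if score_post(indicators) >= ACTION_THRESHOLD:
        return "hide_post"
    return "allow"


if __name__ == "__main__":
    # A post that trips the topic and source indicators gets hidden,
    # whether or not a human would find it objectionable.
    print(moderate(["keyword:loan forgiveness", "source:political_page"]))
```

The point is that the program simply executes whatever rule it was given; the judgment, good or bad, was exercised by whoever chose the indicators and the weights.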
So what is the algorithm programmed to see? That, of course, is up to the person who wrote it. Since there is nothing ostensibly triggering (if you will excuse the term) in the post, is it possible that the algorithm is set to flag anything from Republicans or conservatives for review? And if that is the case, did a member of Zuck’s Army with the requisite man-bun, a bio full of pronouns, and a hefty student loan tab decide to censor the post as a matter of principle, figuring they could get away with it? If so, how many things are never seen simply because no one ever challenges the powers that be?
Granted, Facebook is well on its way to the same graveyard as Myspace, thanks to TikTok and other apps. I once read a comment that Facebook was simply a place where “the olds share conspiracy theories.” That isn’t quite true or fair. There are plenty of other useless and banal things on Facebook besides conspiracy theories. But the people who are writing these algorithms are the ones who benefit from left-wing policies, and the ones who will control the channels of speech.
They are the people at Yale Law School who created so much chaos, including threats, that a panel discussion on finding common ground on free speech had to be canceled and the panelists escorted from the room. They are people like Kristen Gonzalez, the winner of the primary for New York’s Senate District 59, who, according to the Daily Caller, said, “I know we’re saving the speeches for a little later, but today we really prove that socialism wins and we are not going anywhere… We are not going anywhere and we will not stop until we see a socialist slate across this city.”
These are the people who believe that freedom means they have the right to do what they want to do, while you have the right to shut up and condone and pay for it. And if you object, you may be suspended from social media. Or even subjected to the Yale Treatment.
Source: PJ Media