Mark Zuckerberg, CEO of Meta, recently appeared on Joe Rogan's podcast, where he discussed the circumstances surrounding Facebook's suppression of the story about Hunter Biden's laptop.
Shortly after the story's release, Twitter notoriously banned it, despite widespread consensus that it played a significant role in the 2020 election. Facebook announced at the time that it was limiting the story's dissemination, without providing many specifics as to why or how.
In late October of 2020, Zuckerberg testified before the Senate on why Facebook briefly stifled the story: the FBI had told the company to be on heightened alert for hack-and-leak operations in the run-up to the election.
Although Zuckerberg did not reveal much new information in his conversation with Rogan, the interview has generated a great deal of buzz and focused further attention on the government's role in social media censorship.
Zuckerberg explained the context of the situation as follows: "I think the FBI came to us, some guys on our team, and was like, 'Hey, just so you know, you should be on high alert. The 2016 election seemed to be rife with Russian propaganda, at least to us. It's been made clear to us that something like that leak is imminent, so everyone should just keep an eye out.'"
Mark Zuckerberg emphasized that unlike Twitter, Facebook did not completely block the story but rather limited its reach.
The story's reach on Facebook was cut in half for a period of about five to seven days while its accuracy was being evaluated. Users could still consume and share it, Zuckerberg elaborated.
Rogan pressed for details: what does it mean to decrease a story's distribution, and how does that mechanism actually work?
Essentially, the story was ranked lower in the News Feed, so it reached a smaller audience than it otherwise would have, Zuckerberg explained.
Rogan asked by what proportion the story's reach was reduced. Zuckerberg said he could not recall the exact figure offhand, but that the reduction was meaningful.
After calling the story "hyper-political," Zuckerberg again stressed the importance of the FBI's warning in reaching the decision.
"We just sort of thought, 'Hey, look, if the FBI,' which I still view as a legitimate and extremely professional law enforcement institution in this country, 'comes to us and tells us that we need to be on alert about something, then I want to take that seriously.'"
Rogan asked whether the FBI had warned them to be wary of that story in particular.
Zuckerberg said he could not recall the details, but that the story fit the pattern.
For a long time, those who make their living creating content have suspected that Big Tech platforms like Facebook quietly restrict it behind the scenes. Because of the opaque nature of these services, we cannot say for sure why certain content becomes popular while similar content fails to gain traction. There have been persistent rumors that "the algorithm" gives preferential treatment to certain stories and accounts over others, although proof has been difficult to come by without access to the platforms' backends.
Zuckerberg's recent comments have done little to improve our understanding of Facebook's filtering practices or the types of content that have been suppressed. Reading between the lines of his remarks, however, one thing is clear: there is a group of people who regulate which content gets distributed, and if they are even suspicious of your content, its reach will suffer.
That alone should send shivers down your spine. It is profoundly worrisome to think that a panel of "experts" is determining which stories reach a national audience and which do not. There is a risk of people being exposed to "wrong" or "misleading" information, but that risk pales in comparison to the risk of letting a small, possibly biased minority govern the national debate.
The preceding is a summary of an article that originally appeared on Headline Wealth.