The Invisible Influence of Big Tech on Politics and Elections: Allum Bokhari

How do big tech companies impact the political beliefs of their users? In this episode, we sit down with investigative journalist Allum Bokhari, author of “#DELETED: Big Tech’s Battle to Erase the Trump Movement and Steal the Election.”

Bokhari conducted a number of interviews with big tech insiders to learn more about how Silicon Valley tech giants can influence users, and even elections.

This is American Thought Leaders, and I’m Jan Jekielek.

Jan Jekielek: Allum Bokhari, such a pleasure to have you on American Thought Leaders.

Allum Bokhari: Hi, Jan. Great to be on.

Mr. Jekielek: Allum, you’ve written a very, very, very interesting and extremely timely, I’ll say, book, “#Deleted.” I’m going to ask you to tell me—there’s actually a number of issues that you cover in here. A big one of them is big tech censorship. There’s also the visible censorship, and there’s also what you describe as the invisible censorship. But the big question on a lot of people’s minds—and you actually cover this as well—is what does this all have to do with our election process? Let’s get started there.

Mr. Bokhari: Well, as I cover in the book, and first of all, I’ll say this book is not just my opinions about big tech. It’s based on interviews I’ve conducted with people inside Facebook, people inside Google, whistleblowers who are concerned about the direction of these companies and the influence they’re having on democratic politics—which is completely unaccountable influence.

There’s no oversight of these companies. And one of the things my sources have told me is that 2016 changed everything inside Silicon Valley because there was this enormous panic inside these companies, most of which, well, all of which, really leaned to the far left.

They did not expect Trump to win, and as soon as the 2016 election happened, there was this huge push among the most anti-Trump employees to create initiatives against misinformation, against fake news, against hate speech. This is when this all started, this trend started to accelerate.

The focus has been on, I think, as the book claims, suppressing the Trump movement, making sure 2016 doesn’t happen again. You could say this is partisan hyperbole. A lot of people said my title, “Big Tech’s Battle to Erase the Trump Movement and Steal the Election” is alarmist. But if you look at the events of the past two weeks, I don’t think you can call it alarmist.

We see these big tech companies censoring one of America’s oldest newspapers, The New York Post. We see them shutting down the White House Press Secretary’s account. We’ve seen them censor the President’s account on numerous occasions on Twitter. So it’s pretty obvious what they’re doing. They have an agenda.

There’s no regulation stopping them from interfering in the election, and they have a vast amount of control over the flow of information. So they have the motive to use that power. There’s no regulator stopping them using that power. And it’s going to have a big impact on what Americans are allowed to see, what Americans are allowed to read about as we come up to this crucial election.

Mr. Jekielek: This point, that you’ve actually spoken with a whole suite of industry insiders, whistleblowers, essentially. Why don’t you give me an example of the one that you think is the most significant and what they told you, before we move further?

Mr. Bokhari: Well, I’ll focus on one of the things that one of my Facebook sources told me in the book—and by the way, if you want to check out the website, it has a full extensive interview with this guy and many others inside Silicon Valley. One of the things he told me is that one of Facebook’s priorities after 2016 was this so-called depolarization effort.

Facebook was getting blamed in the media quite a bit for contributing to a culture of hyper-partisanship and polarization. So what my Facebook source told me is that they’ve been looking at people who have gone from the so-called far-right to the center on Facebook, looking at the kind of stuff they read, looking at the kind of videos they watched. And my source says they can build a model of users like that to influence other so-called far-right people on the platform.

And of course, this is Silicon Valley we’re talking about. It’s a very far-left environment. That’s something even Mark Zuckerberg admits, that the climate’s very far-left in Silicon Valley, so their definition of far-right is going to be different to ours.

But the general gist of what my source told me is that they’ve effectively developed methods for invisibly influencing people. They know so much about us, about what we watch and how our opinions, interests, likes, and dislikes change over time, that they’re now building models to change our political opinions in ways we can’t even see, that we’re not even aware of.

That struck me as particularly Orwellian. It seemed like a kind of brainwashing model that Facebook is working on. And they call it depolarization, of course, which is a very nice sounding word. It makes it sound politically neutral. “We’re just trying to make everyone less partisan, less polarized,” but it conceals, I think, a very insidious program.

Mr. Jekielek: Well, it all depends exactly where you envision the center to be in your depolarization effort.

Mr. Bokhari: Precisely. And I’ve talked to Facebook people off the record before, left-wing Facebook people, and they say, “Well, Black Lives Matter isn’t a political movement. Everyone should agree with it. We don’t consider this to be political.” And they say the same things about feminism and all sorts of other far-left causes. So as you say, where Silicon Valley perceives the political center to be is probably way off where most Americans perceive the political center to be.

Mr. Jekielek: Well, so tell me a little more. This is actually very interesting. Tell me a little more about how this kind of invisible influence works. You’ve talked about creating models to try to shift people towards this hypothetical center that the powers that be, I guess at Facebook in this case, would have decided upon. But how does this actually work in practice? Do you have an example of an instance where this kind of invisible censorship is actually being enacted? Oh, in fact, I know you do. But why don’t you tell me about that?

Mr. Bokhari: Well, it’s difficult to find examples. But one very obvious thing we’ve seen is what Google, a different company to Facebook, obviously, is doing: completely erasing links to conservative media. Breitbart News, for example: we published data back in July showing that the visibility of Breitbart News links in Google search results, for any kind of query, had gone down by 99 percent compared to the same period in 2016.

They’ve completely erased Breitbart News, a great news organization. I’ve worked with them for five years. They’ve published all my biggest scoops about these big tech companies. They publish exclusive interviews with the President. And you search for these headlines on Google and you can’t even find them, even though it’s an exclusive interview with the President.

Imagine if you’re an undecided voter and you’re trying to find out more information about Joe Biden or about Donald Trump, or even about the person running in your local district. You’re just going to [use Google], and you’re very unlikely to find any media that isn’t progressive, that isn’t mainstream, that isn’t corporate.

And that’s one way that Google influences people because people don’t assume that the search results they’re seeing are biased. They think they’ve found the information themselves so their critical faculties are not on high alert as they would be if they were reading a story on The New York Times or reading a Fox News story, where you kind of know that there’s a certain perspective along with it. People don’t assume that with Google, and that’s how they can influence people invisibly.

Mr. Jekielek: Well, this is something that reminds me of Dr. Robert Epstein’s research. At least on the Google side, he’s done a number of things, but he was showing how reordering the relative ranking of search results actually has influence, especially when they’re results about political candidates. It actually influences significantly how people would ultimately decide to vote.

Mr. Bokhari: That’s absolutely correct. His research shows undecided voters being swung by double-digit margins by a search engine, which is more than enough to swing a close race, and this coming election will be a close race. We’re really going to see if big tech can swing an election, because they certainly haven’t concealed what they’re doing, especially over the past two weeks.

But I really want to get down to the core of invisible censorship, because it affects absolutely everything you see on the internet. I’ll focus on just two aspects of it, one of which, discussed extensively in the book, is the quality ranking. Now the quality ranking means that everything you post on Twitter or Facebook, and every website you try to get listed on Google, is given a secret score.

And this score is how the algorithms of these platforms determine what’s going to appear at the top of search results, what’s going to appear at the top of your Facebook feed or your YouTube feed or your Twitter feed. This is how big tech sorts through the vast amounts of content that’s on the internet because they have to decide what’s going to appear at the top of your feed. They have to make some sort of decision. And the quality ranking is how they do it.

Now, in the past, the quality ranking used to be impacted by things that were pretty straightforward and reasonable. If your website had malware and viruses, if it had spam or phishing or other unsafe material, that would impact your quality score. You probably wouldn’t appear anywhere close to the first page of Google. And the same goes for Twitter and Facebook—if your posts contain unsafe material, it’s not going to be shown to people. And that’s pretty reasonable.

But over the past four years, they’ve introduced all sorts of political criteria to the score as well. Now the algorithms are looking at other things that are not so straightforward. So they’re asking: does the website or the post or the YouTube video contain misinformation or hate speech or fake news or conspiracy theories?

So now, this quality ranking is determined in part by your conformity to the values of Silicon Valley, to the values of the ruling elite. And that, I think, strikes me as a system that’s becoming eerily similar to the Chinese social credit system where again, you are ranked based on your level of conformity to the values of the ruling elite. That’s essentially what Silicon Valley is moving towards.
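The scoring-and-sorting mechanism described here can be sketched in miniature. This is purely an illustration of the idea, not any platform’s actual code: every field name, weight, and label below is hypothetical.

```python
# Illustrative sketch only: real scoring systems are proprietary.
# All field names, labels, and weights here are invented.

def quality_score(post):
    """Combine safety signals and content-policy labels into one hidden score."""
    score = 1.0
    # Older, straightforward criteria: unsafe content tanks the score.
    if post.get("has_malware") or post.get("is_phishing"):
        score -= 0.9
    if post.get("is_spam"):
        score -= 0.5
    # Newer, judgment-laden criteria: labels attached by classifiers
    # for categories like "hate speech" or "misinformation".
    for _label in post.get("policy_labels", []):
        score -= 0.3  # each flag quietly lowers ranking, without a ban
    return max(score, 0.0)

def rank_feed(posts):
    """Sort a feed so low-scoring posts sink out of sight."""
    return sorted(posts, key=quality_score, reverse=True)
```

Note that nothing in this sketch deletes a post: flags merely drag the score down, so flagged content sinks in the sorted feed, which is exactly the invisible effect being described.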

And there’s really no escape from it, because Breitbart News reported a few months ago that all of these companies are now agreeing on a common definition of hate speech. They’ll probably do the same with misinformation, fake news, and conspiracy theories. So there’s going to be no escape from this on alternative platforms either.

Mr. Jekielek: This is really fascinating. And actually, you mentioned this whole censorship effort around the Hunter Biden emails, right, that has happened recently. And that’s very interesting because the Facebook representative even said, quite transparently, “We’re going to suppress this even prior to a fact check,” however dubious that fact check may have been anyway. It seemed, to me, to be simply because the story was politically inconvenient based on, as you describe, the values of the people who hold the levers of decision making.

Mr. Bokhari: Yes. And Facebook, by the way, is completely compromised on this issue. We’ve published a story recently showing that Anna Makanju, Facebook’s global director for content moderation, used to be Biden’s top advisor on Ukraine. She was actually on the calls that Biden had with the top Ukrainian politician.

And she’s also a fellow at the Atlantic Council, which took over $400,000 from Burisma, the company at the heart of this very scandal in The New York Post story. To have someone like that in such a senior position inside Facebook just strikes me as hopelessly compromised, a complete conflict of interest.

Facebook, by the way, has not commented on this story at all. They were completely silent; they completely stonewalled us. And as you said, even though Facebook has spent the last four years building up this censorship apparatus to deal with so-called misinformation and fake news, building up this network of third-party fact checkers, they actually went around that system in order to censor the story.

And they were even condemned by the Poynter Institute, which accredits their third party fact checkers, for going around them. So it was really weird. Even for Facebook, this was an extraordinary, unprecedented move.

Mr. Jekielek: Of course, just as we’re recording here, yesterday we had the news of Glenn Greenwald seemingly being effectively forced out of his own news organization, which was ironically founded specifically to speak truth to power. I mean, that was the reason he felt he needed to have a separate organization. Again, we don’t know the entirety of the story here, but we’re seeing more and more editors simply refusing to publish anything negative related to, let’s say, the Biden family.

Mr. Bokhari: It was also very threatening to the mainstream media, because suddenly you had a situation where independent blogs, YouTubers, and people on Periscope could compete with them on an even playing field. And that was obviously a huge threat for them.

And it used to be the case that this stuff would get out through the internet, it would get out through these social media platforms. This was the case in 2016. And I think that’s why they’ve worked so hard—and by they, I mean both the tech companies and the mainstream media, which always pressures these tech companies to do even more censorship than they’re currently engaged in—that’s why they’ve spent the last four years trying to regain control over these platforms because the sheer momentum that I think the anti-establishment populist movement had in 2016 was very threatening.

And we’re now seeing the end result of all the work they’ve done over the past four years to build these systems to contain the viral spread of information. I’m not sure they succeeded in the case of The New York Post; I think The New York Post story still gained a pretty wide reach. It didn’t gain as wide a reach on Facebook, actually, as comparable stories from the mainstream media.

But people still read about it, they still shared it, they still found it. But there are so many people who get censored on these social media platforms and no one ever hears about it because they’re not as prominent, and they’re not in the public eye.

Mr. Jekielek: You talked about how the rating system for information or sources … that these tech companies use is eerily similar to what the Chinese Communist Party has been developing. It just made me think: there’s an intersection here with this group at Facebook that The New York Post revealed, which I think was called the hate speech engineering group [Hate Speech Engineering Team], and which apparently includes numerous people from Chinese universities acting, essentially, it seems, to police hate speech. What do you make of this reality, or these revelations?

Mr. Bokhari: Well, this is something that my sources told me about, my Facebook sources told me about in the book as well: that a lot of Facebook’s foreign employees see the efforts around hate speech and the efforts around misinformation as a way for them to influence American elections because Facebook’s foreign employees are part of the company culture. They were just as upset by Donald Trump’s win as Facebook’s American employees.

I also spoke to Ryan Hartwig—he’s a Facebook whistleblower—and he told me he was allowed to moderate content relating to elections in Venezuela, in Mexico, and in Canada, all around the world. So it really exposes the lie of these efforts around foreign interference that the media has made such a big stink about over the past four years, obviously completely focused on Russia.

But the real danger of foreign interference is all these foreign employees who work for these tech companies, including, as you said, Chinese employees. Do we know whether their sympathies lie with the interests of America or the interests of the Chinese Communist Party? We don’t.

Just last week, I published a story about a former operative for the Venezuelan government, someone who actually worked for that government for many, many years and is now working for Facebook. So this whole panic around foreign interference seems to me to be a complete red herring when there’s been no focus on individuals like that.

Mr. Jekielek: Let me get this straight. Your contention is that the biggest source of interference to the election, bar none, is going to be from big tech.

Mr. Bokhari: Absolutely, yes. And here’s why. Big tech has been able to rob the Trump movement of its momentum. Let me break that down a little bit more. I talked a little about quality rankings. There are two other tools that Silicon Valley uses that we should be aware of, two other key tools in their arsenal of censorship. One is network analysis, and the other is language analysis.

Language analysis, which you briefly touched on before, is how Silicon Valley trains its algorithms to recognize certain types of speech: training algorithms to recognize hate speech, training algorithms to recognize misinformation. Facebook even held a competition this summer with a $100,000 prize for the programmer who came up with the best way to detect a hateful meme. So they’re even training algorithms to recognize images.

Then there’s network analysis. Network analysis is how Silicon Valley trains its algorithms to recognize the connections between people, so who’s following who on social media, who’s linking to who. And taken together, this is a formula for censoring and suppressing entire political movements.

So consider what happens when a big account on Twitter or on Facebook gets banned for hate speech or for misinformation. Their followers might not get banned. But what’s happening is, as soon as that big account gets banned, it sends a signal to the algorithm saying all of the people who follow this account, they’re now in the system as having followed an account known for posting hate speech or known for posting misinformation.

And while they won’t be banned themselves, their quality ranking might go down, so they might appear less in search, and less in Facebook and Twitter news feeds or anywhere else these tech platforms rank and sort people’s information. So that’s a formula for suppressing entire networks and entire political movements. And that, I think, is what has prevented the Trump movement from building on the momentum it had in the digital world in 2016.
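The follower-downranking effect described here can be sketched in a few lines. Again, this is a hypothetical illustration of the mechanism, not code from any real platform; the data structures and penalty value are invented.

```python
# Hypothetical sketch of network-analysis downranking: when a big
# account is banned, every follower's hidden score quietly drops.

def propagate_ban(followers_of, scores, banned_account, penalty=0.1):
    """Apply the follower penalty described in the interview.

    followers_of: dict mapping account -> set of accounts following it
    scores: dict mapping account -> current quality score (0.0 to 1.0)
    """
    for follower in followers_of.get(banned_account, set()):
        # Followers aren't banned, but their ranking is nudged down.
        scores[follower] = max(scores[follower] - penalty, 0.0)
    scores.pop(banned_account, None)  # the banned account itself is removed
    return scores
```

The point of the sketch is that no follower is ever notified: the only observable effect is that their posts surface less often wherever scores are used for ranking.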

Mr. Jekielek: Okay, fascinating. So are you saying that you actually have evidence that exactly these sorts of systems are in play? Or is this a theoretical construct based on the capabilities that exist today?

Mr. Bokhari: Well, they don’t try to hide it. Like I said, Facebook even had a public competition advertised on their website for programmers to train algorithms to detect hate memes. So the language analysis stuff is right out in the open. As far as network analysis, all the big tech platforms use this. You can Google “network analysis.” It’s a very mainstream field in computer science.

And again, much like with how the quality ranking used to be used for pretty benign purposes, like weeding out sites that contain viruses and malware, the network analysis used to be used for detecting networks of people who go to Chick-fil-A or networks of people who buy books on Amazon.

But now, of course, that’s being used as a tool to suppress hate speech and misinformation. And again, the companies are very open about what they’re doing. These are just the tools that they’re going to use to do it.

Mr. Jekielek: I’m just thinking of something we talked about offline as we were preparing for this. I mentioned that I’m often very unhappy with the schadenfreude that some conservative commentators show when seeing people upset at the fact that a Republican won, or something in this vein. Because looking at some of these people, especially in some of the favorite meme videos, some of them look legitimately traumatized and in really rough shape, right? And so when I look at these videos, I think about the power of the ability to shape information. Tell me a bit about what you’re thinking with respect to this.

Mr. Bokhari: Well, I see what you’re saying, and I suppose you can make an argument that social media does polarize people. The problem is that Facebook doesn’t seem to be going after whatever is polarizing the left and leading to these moments we see on social media of people just completely losing their composure over politics.

Now, I don’t think Facebook should control anyone’s politics; they shouldn’t try to depolarize anyone. But there’s no doubt that social media is contributing to an uptick in that trend. And it’s really not just social media; you have to understand these people are being indoctrinated on college campuses.

They’re being indoctrinated by the media; the entire education system, from when they’re born to when they graduate college, is feeding them propaganda. So that’s a big part of it, too. But as for the really worrisome aspect with regard to big tech: I agree that it’s kind of cruel to mock these people for their obsession with politics, because they seem genuinely convinced that the US is about to descend into a fascist dictatorship. They seem genuinely distraught, as you’re saying.

The problem is, all of these young people who are coming out of these campuses completely radicalized, they’re going into Silicon Valley. Back in 2018, I obtained a video published on Breitbart News of Google’s top executives reacting to the Trump election, and it was much the same. They were breaking down; they were in tears. They were talking about what needs to be done about the election.

One executive said he wanted to make the populist movement a blip in history. Those were his words. Google co-founder Sergey Brin said he found the Trump election offensive, and he compared Trump voters to extremists. So it’s one thing to mock a random member of the public who’s caught on video screaming about the Trump election, but when the leaders of the most powerful tech company in the world are doing it, then we have a problem, because they have an extraordinary amount of influence over the election, over political information, over the news media. And not just in America, but all around the world as well.

Mr. Jekielek: Looking at all of this, a lot of people might feel very helpless, right? Should they feel helpless?

Mr. Bokhari: Well, it’s certainly a very serious situation we find ourselves in because the tech platforms control the means of political organization. Most political speech and activity and organization today, most grassroots activity takes place on these platforms. So if you’re going to try and organize against them, well, they control the platforms where you’ll be organizing. So it’s a really difficult situation.

The other big difficulty—and there’s a whole chapter in the book on this—is the way Silicon Valley has just flooded Washington DC with lobbying money. So they fund conservative institutions; they fund progressive institutions; they fund Republican politicians; they fund Democrat politicians. And this is why although we have these endless committee hearings, nothing much gets done on Capitol Hill to rein in these tech giants.

So I’m being very pessimistic now. But the one glimmer of hope is that the Trump administration does seem to be taking meaningful action on the issue of big tech’s power. So the FCC has come out and said they’re going to look at doing a rulemaking change around Section 230, which is the critical law that allows tech companies to censor almost anything without any legal ramifications. So that’s very important.

The Trump administration has also been making some key appointments to the federal bureaucracy. Nathan Simington, who’s very good on social media issues, is being appointed to the FCC. Adam Candeub, who’s a great free speech lawyer—he’s actually fought free speech cases against Twitter—has been placed in a key position inside the federal government as well, specifically to deal with these tech giants. So we do have an executive branch now that seems serious about tackling the problem.

The question is, who’s going to win the election in a few days? Because Joe Biden is going to use that same executive power to force the tech companies to censor even more. He spent his entire campaign demanding Facebook censor his political opponent. That’s absolutely going to happen if he gets elected. So internet freedom is definitely on the ballot in a few days. That’s the one thing I think that people can do. Probably the most important thing they can do is go out and vote in this presidential election.

Mr. Jekielek: To your point, I recently saw Senator Hawley—it seemed like he was calling out to Congress saying, “We need to do something about this,” which I thought was fascinating because of course, he’s a US senator. And there seems to be this sense that there’s been this, unexpected perhaps, lack of action around all these issues through the legislative branch.

Mr. Bokhari: Yes. So it’s not that unexpected to me, because the Democrats’ response to the crisis of big tech’s power has essentially been the complete opposite of the Republicans’—the sole exception perhaps being Tulsi Gabbard, who seems to understand the censorship issue.

But most Democrats, while they condemn the tech platforms, condemn them for the opposite reasons the Republicans do. When most Democrat politicians talk about the tech giants, their argument is, “Well, why aren’t you censoring more hate speech? Why aren’t you doing more to rein in disinformation?”

Whereas the Republicans are asking them, “Why are you censoring?” So that’s why we have the impasse. You can’t get a majority in a divided Congress when one side wants the tech companies to stop censoring and the other side wants them to censor even more.

Mr. Jekielek: Allum, we just saw very recently Senate hearings with the leadership of the top tech giants. Being called to a Senate hearing is supposed to be something that strikes the fear of God into a person, with serious consequences to follow. Just watching these hearings, it didn’t seem to me like these leaders were terribly concerned about being there. This might just be my impression, right? What do you expect will be the impact of these Senate hearings, given everything we’ve just talked about?

Mr. Bokhari: I think you’re right. I think they’re not afraid of the Senate, and you saw that in their behavior. Jack Dorsey especially seemed to be telling the Senate, “If you don’t like it, just get off Twitter.” So I don’t think these people are scared of the lawmakers at all. They’re not scared of Congress. They’re not scared of the Senate.

They know these people are not going to do anything to them. They know the votes aren’t there. They know that their lobbying money, which goes to both conservatives and Democrats, has paid off. So I think the only branch of government they’re afraid of right now is the executive branch, because there simply aren’t enough votes in the House or the Senate to rein them in.

Mr. Jekielek: You mentioned that a number of the Democratic congressional members are basically saying they want more censorship in certain areas. On the Republican side, I see a lot of people speaking about the free market: how can you try to constrain a company in terms of how it does business, right? So if we’re talking about regulating the tech giants, there is, of course, the recent antitrust suit against Google that the Department of Justice has launched. But there doesn’t seem to be a lot of interest in this kind of regulation on the congressional side in general. What are your thoughts?

Mr. Bokhari: Yes, even on the Republican side, while there are a few good senators and Republican congressmen who understand the issue—people like Hawley, Cruz, Blackburn—many of them just don’t get it. Some of them are actually opposed to regulating the tech giants because they’re still stuck in this 1980s free market mentality.

And you also have a lot of Republican and conservative think tanks, like the Heritage Foundation and the American Enterprise Institute, that are also feeding them talking points, saying, “Well, if you interfere with big tech, then you’re interfering in the free market.” It’s, of course, a totally nonsensical argument, because big tech owes its current position and power to Section 230 of the Communications Decency Act, which is a special government privilege given to them in the 1990s.

So it’s totally nonsensical to say that interfering in this law is interfering in the free market when the law is actually a government handout, a special privilege that no other type of company enjoys. Republicans are really going to have to get out of this mentality if these tech giants are going to be reined in.

This is a big issue, I think, for Republican primary voters. If they care about this issue, they need to start looking at who’s funding their local representatives. Don’t assume that you’re going to have a powerful voice against big tech as your representative just because there’s a red “R” next to their name.

Mr. Jekielek: So let’s talk a little bit about prescriptions here, because obviously, you describe—and this is detailed very extensively in the book—a very serious problem, an unprecedented issue in American society. These large companies have effectively become the public square, yet they’re private entities with very little accountability at the moment for any actions they take to control speech, for example. Here we are, right? And, of course, we know the FCC is actually studying rulemaking around Section 230 at the moment. What should happen next, according to Allum Bokhari?

Mr. Bokhari: I think there’s a very simple solution for social media censorship, and that’s simply to say to these companies that if you want these special legal privileges that your entire business model depends on, you have to allow users to opt out of all of your filters on legal, constitutionally protected speech.

If you want to make a hate speech filter or a misinformation filter, that’s fine. But you need to prove that users chose to have that filter switched on. You can’t switch it on on their behalf and forbid them from turning it off, which is what they’re doing right now. We know they’re able to do this because both Twitter and Google, for example, allow users to turn off their obscenity filters: you can turn off Google’s SafeSearch setting. But they don’t do that when it comes to fake news or misinformation.

And that’s very telling because it shows they care more about controlling political information, controlling the flow of news, than they do about maximizing consumer choice. So that has to be the standard across the entire industry for social media.
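The opt-in standard proposed here could be sketched as follows. This is a hypothetical illustration of the policy, with invented filter names: every filter defaults to off, and a post is hidden only if the user has explicitly opted into a filter matching one of its labels.

```python
# Hypothetical sketch of user-controlled content filters: filters on
# legal speech default to OFF, and only the user can switch them on.

DEFAULT_FILTERS = {"obscenity": False, "hate_speech": False, "misinformation": False}

class FilterSettings:
    """Per-user filter choices, recorded so the opt-in can be proven."""

    def __init__(self):
        self.enabled = dict(DEFAULT_FILTERS)  # everything off by default

    def opt_in(self, name):
        if name not in self.enabled:
            raise ValueError(f"unknown filter: {name}")
        self.enabled[name] = True

    def opt_out(self, name):
        if name not in self.enabled:
            raise ValueError(f"unknown filter: {name}")
        self.enabled[name] = False

def visible(post_labels, settings):
    """A post is hidden only if it carries a label the user chose to filter."""
    return not any(settings.enabled.get(label, False) for label in post_labels)
```

The design choice worth noting is that the user’s setting, not the platform’s classifier, has the final say: a “misinformation” label does nothing unless the user turned that filter on.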

The second thing you have to do, especially for search engines, is impose transparency and oversight. We need to be able to see behind the curtain, to see exactly how these algorithms are being trained. We need to see what sort of things lower your quality score. Your quality ranking can’t be hidden from users. So those are the two prescriptions I would make: number one, opt-outs of all the filters, and number two, more transparency.

Mr. Jekielek: Another interesting thing—we were discussing this on a Periscope the other day—is the insight you really need into how these algorithms are being trained: what data is being fed to them to represent, say, hate speech or whatever else the algorithm is being trained to detect.

Mr. Bokhari: Yes, well, algorithms are actually not as complicated as you might think. I managed to explain what algorithms are in the space of a couple of pages in the book. Essentially, they’re just trained. A hate speech algorithm, for example: you’re just training it to recognize hate speech.

You’re feeding it a bunch of data, a bunch of examples, saying, “This is hate speech; that is hate speech. This isn’t hate speech.” Then you’re sending it out to test it, correcting its errors, and improving it over time. So the way to impose some sort of oversight on an algorithm like that is simply to say, “Well, we should be able to see what sort of data it’s being trained on. What does Silicon Valley consider to be hate speech? What does it not consider to be hate speech?” Then we’ll know if the algorithm is biased or not. We can all guess at the kind of things they ignore. They probably don’t categorize Antifa as hate speech.
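The training process described above can be sketched in a few lines. This is a deliberately toy illustration, not any platform’s actual system: the classifier, the word-counting approach, and the training examples are all invented here. The point it demonstrates is the one being made in the interview: the model’s notion of “hate speech” comes entirely from which examples the trainers chose to label as such.

```python
from collections import Counter

def train(examples):
    """Build word counts per label from (text, label) pairs.
    The 'model' knows only what the labelers told it: change the
    labels in the training data and its notion of 'hate' changes."""
    counts = {"hate": Counter(), "ok": Counter()}
    for text, label in examples:
        counts[label].update(text.lower().split())
    return counts

def classify(model, text):
    """Score a text by how often its words appeared under each label."""
    words = text.lower().split()
    hate_score = sum(model["hate"][w] for w in words)
    ok_score = sum(model["ok"][w] for w in words)
    return "hate" if hate_score > ok_score else "ok"

# Hypothetical training set: the labels, not the algorithm, encode any bias.
training_data = [
    ("slur slur attack group", "hate"),
    ("attack that group", "hate"),
    ("nice weather today", "ok"),
    ("great game last night", "ok"),
]

model = train(training_data)
print(classify(model, "attack group"))  # flagged: similar text was labeled "hate"
print(classify(model, "nice game"))     # passes: similar text was labeled "ok"
```

Auditing the `training_data` list, rather than the code, is exactly the kind of oversight being proposed: the code is neutral, while the labeled examples are where editorial judgment enters.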

In fact, I actually talked to a Facebook whistleblower who confirmed exactly that. They’re not allowed to categorize Antifa as hate speech. They’re specifically told not to. At least, this was the case at the start of 2019, when he was last employed there. But that’s one way to look at the algorithm, to look at the data they’re being trained on.

Mr. Jekielek: The other question I have is, how can you actually remove all filters? Because whatever it is that we’re getting through social media, there’s so much data there, right? There has to be some system to compartmentalize or summarize it for us to be able to consume it and make it a good user experience. All of this is legitimate, but it is a type of filtering, isn’t it?

Mr. Bokhari: Of course. And I don’t think a lot of people want their search results or their Twitter feed to be flooded with obscenity. And that’s why both Twitter and Google filter that out. But even with that, you still have the option to turn the filter off. So I’m not saying we should have no filters whatsoever.

Of course, we don’t want to be flooded with trolling and harassment and all this stuff. I’m just saying it should be up to the user to turn those filters on or off. That needs to be the standard. And I think the ultimate end game for me is for these tech platforms to completely open up the filtering system. So even allow third parties to make their own filters that you can then plug into Google or Twitter or Facebook.

That would be a true free market. That would maximize consumer choice. But the tech companies will never do it on their own because if they did that, they’re going to sacrifice all their control, all their power over political information and over news.
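The “open filtering system” idea above can be sketched as a simple pluggable interface. Everything here is hypothetical: no platform exposes such an API today, and the filter names are invented for illustration. What the sketch shows is the architecture being proposed: filters are interchangeable components, and the user, not the platform, decides which ones run on their feed.

```python
from typing import Callable, List

# A filter is just a predicate over a post: True means "hide this post."
Filter = Callable[[str], bool]

def obscenity_filter(post: str) -> bool:
    """A platform-provided filter (stand-in for SafeSearch-style filtering)."""
    return "expletive" in post.lower()

def third_party_news_filter(post: str) -> bool:
    """Imagined filter from an outside vendor the user chose to trust."""
    return "unverified claim" in post.lower()

def build_feed(posts: List[str], enabled_filters: List[Filter]) -> List[str]:
    """Show every post that no user-enabled filter objects to.
    The key design point: enabled_filters is the user's choice."""
    return [p for p in posts if not any(f(p) for f in enabled_filters)]

posts = ["hello world", "EXPLETIVE rant", "unverified claim about X"]

# One user opts in to both filters; another opts out entirely.
print(build_feed(posts, [obscenity_filter, third_party_news_filter]))
print(build_feed(posts, []))
```

Under this design, a platform competes on feed quality by hosting the best marketplace of filters rather than by imposing a single editorial policy, which is the free-market framing discussed next.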

Mr. Jekielek: That’s actually a very fascinating solution because this is really the free market solution: to basically democratize filters, and people can choose which company’s or even which individual’s viewpoint they would like to bring into the picture. Otherwise, it’s some unnamed engineering employee, with their own view of what counts as hate speech, who could be determining these things. Right?

Mr. Bokhari: Exactly. This is what we need to be aiming towards and what people should be demanding that their representatives impose on these tech companies because as I said, the tech companies will not do this on their own.

They have way too much power now. They’re enjoying that power. They’re using that power, as we’re seeing in this election campaign. And the only way to take that power away from them is to simply impose it on them via regulation. And I know conservatives kind of cringe at the idea of using the power of government to rein in the private sector. But let’s remember, the president most famous for doing that at the close of the 19th century was Teddy Roosevelt, a Republican.

Mr. Jekielek: Allum, any final words before we finish up?

Mr. Bokhari: I will just reinforce how much of a threat these tech companies are to the independent media, to free expression, and to free and fair elections, because unlike the bias of academia or of the mainstream media, it’s not just that these companies are able to control what information we see; they also know so much about us. And they’re using that information to subtly manipulate our political beliefs.

I strongly encourage people to find out more about what these tech companies are doing. You can find the book “#Deleted: Big Tech’s Battle to Erase the Trump Movement and Steal the Election,” and yes, just learn everything you can about these tech giants, because I think this issue is going to dominate American politics for a long time to come.

Mr. Jekielek: Allum Bokhari, such a pleasure to speak with you.

Mr. Bokhari: Jan, great to be on.

This interview has been edited for clarity and brevity.

American Thought Leaders is an Epoch Times show available on YouTube, Facebook, and The Epoch Times website. It airs on Verizon Fios TV and Frontier Fios on NTD America (Channel 158).