Google Vote Reminders Only Went to Liberals, Not Conservatives for at Least Four Days: Dr. Robert Epstein

During his 2020 election monitoring project, Dr. Robert Epstein found stark data showing Google election bias, he says, and he’s only just started to pore over all the data that was collected.

In this episode, Dr. Epstein, a Senior Research Psychologist at the American Institute for Behavioral Research and Technology, explains what he’s found so far, and why he believes the public has the ability to stop big tech election bias, even without legislative action.

This is American Thought Leaders, and I’m Jan Jekielek.

Jan Jekielek: Dr. Robert Epstein, such a pleasure to have you back on American Thought Leaders.

Dr. Robert Epstein: It’s my pleasure, of course.

Mr. Jekielek: Dr. Epstein, you have some pretty significant allegations about activity from Google, suggesting that Google was disproportionately targeting Democratic voters with messaging about getting out the vote. I’m hoping you can speak to that to start us off. Of course, this is all part of a large monitoring project that you’ve been involved in for many years now.

Dr. Epstein: First of all, let me say I don’t have any allegations, I have some data that my team and I have collected. I should also say—I think it’s relevant to this issue—that I was contacted by a pretty prominent Washington DC attorney a few days ago who thinks I should go into hiding. But I have no allegations. I do have some data. I think they’re possibly quite important data.

Mr. Jekielek: So tell me about this data, please.

Dr. Epstein: Well, you need a little background. We’ve set up systems since 2016 to try to determine what the big tech companies were showing people in the days leading up to the 2016, 2018, and now the 2020 election. So to do that we recruit people we call field agents. We had 95 of them back in 2016.

We equip them with special software, which allows us in effect to look over their shoulders as they’re doing election-related activities on the internet, for example, doing searches on Google, Bing, and Yahoo. This year what we did was far more ambitious.

In 2016, we were able to preserve 13,000 of what Google calls ephemeral experiences; in that case, these were searches on Google, Bing, and Yahoo. Ephemeral experiences: that’s a very important phrase. It means those experiences we have online that involve very fleeting content, content that impacts us and then disappears, gone forever, so it can’t be analyzed. That’s the whole point.

And we know from leaks from Google that ephemeral experiences are used quite deliberately and strategically to shift people’s thinking and behavior. That’s beyond any doubt at this point.

The point is, this year we recruited not 95 field agents, but 733 field agents, deliberately in three very critical battleground states. Those are Arizona, Florida and North Carolina. We wanted to go where the action was, where we would be most likely to find some evidence of bias or manipulation.

We have indeed found evidence of bias. We’ve also found what some people might want to call a “smoking gun.” We found that during the week of October 26th—that’s quite close to the election—only our liberal field agents were getting vote reminders on Google’s home page. I deliberately began to go public with this information on Thursday, October 29th.

I shared a lot of information with a reporter from the New York Post. That was quite strategic on my part, because I knew that all New York Post emails are shared with Google. I wrote about that in an article a couple of years ago. The same is true for the New York Times, The Guardian, and a lot of other major media sources. Not you, not your source, but definitely the New York Post—those emails are shared.

So I knew that all the information I was giving to this reporter would be seen by Google with my name in there. I assumed an algorithm would immediately boot it upstairs to real people. And then real people would look at all this information.

The point is that two things happened that night, Thursday night, October 29th. Number one, the article, which I gather was about to go to press, got pulled from the New York Post. So think about that. That’s a fairly conservative news outlet. In effect, I got censored by this news outlet.

Relevant, perhaps, is the fact that the New York Post gets 32 percent of its traffic from Google. Maybe someone reminded someone else at the New York Post that they’re quite vulnerable to being harmed by Google. In other words, Google has the power literally to put them out of business almost overnight.

The other thing that happened was that targeted messaging on Google’s homepage disappeared. It was like someone flipped a light switch, and it just disappeared. So starting at midnight on October 29, just days before the election, all of our field agents began to receive that vote reminder on Google’s homepage. That continued until the very end of Election Day on November 3rd.

So that was certainly one of the interesting findings that we’ve detected so far. But then things took a step in another direction, because at that point I was receiving calls from various members of Congress and some state attorneys general that I work with. I explained to them what we had found.

Next thing I knew—this was November the 5th, I believe, right after the election—three U.S. Senators sent a letter to the CEO of Google talking about some of my findings, and basically accusing him of lying before Congress, which is a felony, when he said, “We never, ever, tilt any of our content toward one political party or another.”

They’re going after him based on my data. And that could explain why that Washington, DC attorney told me I should go into hiding.

Mr. Jekielek: Dr. Epstein, is there any chance that this data was somehow wrong? The other question is, did you just start monitoring this from October 26th? Or did the results as you describe them just start on October 26th?

Dr. Epstein: October 26th is the day we decided that we were fully operational. At that point we had over 500 field agents. It takes a while to get a system like this going; there’s a lot of recruitment involved, and there are a lot of technical challenges. We were being attacked digitally, more so than in any of our previous monitoring projects. One DDoS attack (a distributed denial of service attack) was pretty serious, so we had some outages.

But basically, on Monday the 26th, we decided we were fully operational. We’ve been focusing our analysis on the data we received from that point forward. Over time, of course, we’re going to look at everything we have. But at that moment, basically about a week before the election, our focus was to see what was going on.

Keep in mind that the most dramatic kinds of manipulations are going to occur very close to the election, because that’s when you want to do three things if you can. If you’re supporting one candidate, of course, you want to mobilize the base. In other words, you want to get those voters off of their sofas, if they haven’t yet voted by mail.

Secondly, you want to discourage supporters of the candidate you oppose from voting. So you want to keep those people home. But most important of all, and this is where our monitoring project is very important, you want to impact the people who are still undecided.

And those last few days are absolutely critical. That’s where you’re going to apply the most pressure you can, in every way you can, to try to nudge those undecided voters in one direction or the other. Those are the people who end up deciding who wins a close election.

Mr. Jekielek: Is there any chance that your data is wrong? This is pretty stark. You’re basically saying that for three days at least, voting reminders were going to one ideological group?

Dr. Epstein: Well, it would be four days, the 26th through the 29th. I don’t know how to answer the question. I was startled when I saw the numbers appear on my screen. I can’t say for sure what people around the country were getting.

But I can say that we had recruited a diverse group of 733 registered voters: Republicans, Democrats, and independents. And I can say that during that period of time, our field agents who identified themselves as liberal all got this vote reminder on Google’s homepage. I can say that, among those who call themselves conservative, not a single person saw that reminder on the homepage.

That’s a strange thing to see on a computer screen when you’re looking at data because you have 100 percent in one group, and zero percent in the other group. One doesn’t see that very often. Let’s put it this way—you don’t need to do a statistical analysis to see if there’s any difference between the two groups—100 percent versus zero percent.

Let’s think about it another way. Let’s say that Google was monitoring us; if I were Google, I would have been monitoring us. Even so, they wouldn’t have altered our data. Tampering that produces a pattern like that just doesn’t make any sense.

We also have an enormous number of security precautions in place when we do these projects. I won’t go into detail. The point is that although we can be attacked, and our systems frozen for short periods of time, there’s no way for anyone really to change the data.

So we’re seeing what the field agents are seeing. In effect, we are looking over their shoulders using software. With their permission, we’re looking over their shoulders and seeing what they see on their screens. What we saw during that period was that those vote reminders only went to people who call themselves liberal. Not one of our conservative field agents received a vote reminder.

Again, I’m not making any allegations; I’m simply reporting what we found. Of course, we also found a bias in Google search results and other things. We’ve preserved so much data, it’s going to take us months to understand what we have. The main point of this project is to preserve content that is normally lost forever. It’s all ephemeral content.

The folks at Google know this. In a leak of emails to The Wall Street Journal in 2018, one Google employee says to others, “How can we use ephemeral experiences to change people’s views about Trump’s travel ban?” Now, that’s from inside Google.

They know that ephemeral experiences like search results, reminders on the homepage, search suggestions, and news feeds, are ephemeral. They appear in front of your eyes; they impact you; they disappear; they’re gone forever. No one can go back in time and see what these companies were showing people or saying to people on their personal assistant devices.

So that’s what we’ve done that is quite unique: in 2016 we preserved 13,000 ephemeral experiences, which are normally lost forever, and this time we preserved more than half a million, of all sorts, by the way. We preserved home pages, Google search results, Bing and Yahoo search results, and also thousands of YouTube sequences.

We preserved Facebook homepages: what kind of messaging was Facebook sending to its users? We’ve only taken a quick stab at that; we’re not really sure yet. That’s because the Facebook homepage is quite complex, whereas the Google homepage is quite simple.

So we will, at some point, have an answer. We’ll know whether there was any kind of targeted messaging on Facebook’s homepage as well. If there was, we’re talking about the possibility that maybe a number of Silicon Valley companies were all brazenly pushing votes in just one direction.

I calculated months ago that if all the Silicon Valley companies, the most powerful two being Google and Facebook, are all pushing in the same direction, that could easily shift 15 million votes in this election. That means they, in effect, decide who the next president is going to be.

I lean left myself. I think it’s great that they’re pushing causes and candidates that I like and that my family likes. But I put democracy, free and fair elections, and our country ahead of any personal preferences I might have for a candidate or a party.

The fact is, if we allow companies like Google to control the outcome of our elections, then we have no democracy, there is no free and fair election, all of that is illusory. In effect, we have behind-the-scenes technology lords—technology masters who are actually running the show.

To me, that is unacceptable. Even if they’re supporting candidates and causes that I like, that is unacceptable. In my opinion, that should be unacceptable to all Americans, no matter what your party is. My Democratic friends and family, if you’re out there listening, and you don’t like what I’m saying, then shame on you.

Because you don’t know who these companies are going to be favoring tomorrow. What they do is not transparent, and unlike our elected officials, they’re not accountable to the American public. What they do is highly secretive, and you don’t know what they’re going to be doing next. You don’t know what they’re doing in other countries.

For example, we have evidence that in Cuba, they don’t support the left. They support the right in Cuba, because the left is in power and the left doesn’t like companies like Google. In Mainland China, they support the right as well. Google has worked with the Chinese government to help spy on and control the Chinese people.

There was a tremendous revelation just over a year ago of a secret Google project called Dragonfly, in which Google was basically going back into China and working with the Chinese government to help them control their population. So again, speaking to my Democratic friends and family: I love you. But if you’re mad at me for what I’m saying, then shame on you.

Mr. Jekielek: This is hard to take in, in its entirety, for me and for other viewers. I want to ask you one more quick question about what you call the “smoking gun” data. How many of these 733 field agents were conservative, and how many were liberal, by your metrics? I just want to get a sense of the sample size we’re talking about.

Dr. Epstein: I don’t have those numbers right in front of me at this point, but we had a very diverse group. Very roughly, we had about one third liberals and about one third conservatives. Then we had a smaller group that called themselves moderates, and a still smaller group that called themselves “other.” I don’t have the exact numbers here.

Eventually Chris will write all of this up, and we’ll release lots and lots of details. In this world, there are good problems and bad problems. We have a good problem: we have so much data. What we accomplished in a very short time dwarfs what we’ve done in previous monitoring projects.

This project tells me basically two things. It tells me, yes, it is possible to do large-scale monitoring of these companies, to do to them what they do to us 24 hours a day. We can monitor, year round and on a very large scale, pretty much everything that they are showing users and telling users through their personal assistant devices. We can be looking 24 hours a day. We can be looking for manipulations, for bias, for shenanigans of all sorts.

This project also tells me that if we detect any kind of irregularity and then expose it, we can get these companies to back down; we can get them to stop. And that’s without laws or regulations. It’s just using good tech to fight bad tech.

And you know, laws and regulations move very slowly; tech moves literally at the speed of light. How are you going to think ahead to the future, not just to Google, but to the next Google and the one after that? How are you going to protect humanity, democracy, and free speech from companies like Google, whether they’re well intentioned or truly evil?

You do it with monitoring systems, because monitoring is tech, and it can keep up with whatever new technologies are emerging, on an ongoing basis, 24 hours a day. It can protect us from manipulation, from the undermining of democracy, and over time, even from the undermining of human autonomy.

One of the disturbing things that leaked out of Google in 2018 was an eight-minute video called “The Selfish Ledger.” This was never meant to leave the company. It talks about the power that Google has to re-engineer humanity. The video also includes the phrase “company values”: the idea is to re-engineer humanity in a way that reflects company values.

In effect, they’re acknowledging that the kind of thing we’re finding is real; it’s deliberate; it’s strategic. They’re acknowledging a kind of utopian thinking, as if maybe they know better. How would the rest of the people in the world, the people outside of Google, feel about that? How would they weigh in on this kind of possibility, this kind of plan, these kinds of actions?

I really don’t think people around the world would welcome a private company in the United States exercising the power it has to rig elections, to re-engineer humanity according to company values. I think it’s outrageous.

We need transparency, obviously. There are all kinds of ways we might get more transparency; various countries have tried and so far failed. But most importantly, we need to be protected from these manipulations. Because of what’s happened with our current monitoring system, I know we can be protected by having large-scale, permanent monitoring systems in place, not just in the US, but in countries around the world.

Not only do I know that we can do it; I know that we must do it. I don’t think this is optional anymore for humanity. I think that the technological elite is now in control. Whether you know it or not, they’re now in control of our elections. I know of a way to stop them. And with at least one manipulation, it appears that I did stop them.

By the way, a whistleblower, a leak from the company, subpoenaed documents, court discovery: any number of different methods could someday confirm that what happened on the night of Thursday, October 29, really happened the way I am speculating. We might actually get confirmation at some point that Google became aware of the monitoring, became concerned, and turned off a blatant manipulation, which, I’m told, is probably a violation of campaign finance laws, a felony that can be punished with fines or even prison time.

Mr. Jekielek: It’s also possible that there was some sort of rogue actor within Google, someone partisan, who could have been involved in something like this. I don’t presume to speculate. But I’m sure a lot of our viewers right now are very curious about that particular observation, the vote encouragement on the homepage. How many votes could that have shifted, based on your experience or your estimates?

Dr. Epstein: I haven’t done those calculations yet. But I can tell you a couple of things. Number one, that home page is seen in the United States 500 million times a day. If that kind of reminder was being used systematically over a period of time, it affected more than who voted on Election Day; it affected who sent in mail-in ballots, and it affected who registered to vote.

Imagine that kind of targeted messaging continuing for a long period: it can shift a lot of votes, directly by getting someone to vote, and then indirectly by getting more and more people with one particular political orientation to register to vote.

Those are calculations that we know how to do—at some point, I will do them. I certainly did not expect to find this targeted messaging on Google’s homepage. It might also be on Facebook’s home page. We’ve preserved thousands of those home pages. We know who they went to; we know the demographics of the people that they went to. So we’ll be able to look into that as well.

But the point is, I wasn’t expecting this. We have so much data, such an incredible wealth of data, that we’re going to find all kinds of things, things that we just didn’t expect, like YouTube sequences. Imagine the power that YouTube sequences have over opinions and votes: 70 percent of the videos that people watch on YouTube around the world are now suggested by YouTube’s “up next” algorithm.

People sit there letting YouTube, which is part of Google, feed them videos. Imagine the power that a sequence of videos has to shift the thinking of someone who’s vulnerable, someone who’s undecided, someone who’s trying to make up their mind about something. This is a way to take people down rabbit holes. And there are documented cases now of individuals who have turned to right-wing extremism or Islamic extremism because of a sequence of videos they saw on YouTube.

We’ve done something no one’s ever done. We have recorded, in the days before a political election, thousands of YouTube sequences. And again, we know the demographics of the people who were watching these sequences.

Just to show you that we’re not dreaming here: one of the videos that leaked from Google in 2018 was a two-minute video in which the CEO of YouTube, Susan Wojcicki, is talking to her staff, or so it appears. She’s explaining that they’re just not going to let any of this “fake news” affect YouTube users anymore.

So they’re modifying the “up next” algorithm to elevate, to push up, content that they think is valid and to demote content that they think is not. We’re talking about, again, deliberate efforts within the company to engineer the kind of content that people can see, and the kind that people cannot see.

And that second category, content people cannot see, that’s very, very dangerous. Some people would call that censorship. What’s truly dangerous about it is, you don’t know what you don’t know. You don’t know what’s being suppressed because you can’t see it.

We know that there have been deliberate efforts within the company to alter the YouTube “up next” algorithm, guided, once again, by that phrase, company values. How would outsiders have made judgments about which videos people should be able to see or not see? How would people of different political orientations have made those decisions? What exactly was the process they used to make those decisions?

This is unacceptable. We cannot have private companies that are not accountable to the public, deciding what 3 billion people around the world can see and cannot see. By the way, that number, by 2022, will exceed 4 billion people. This cannot be allowed to continue.

Mr. Jekielek: We are in absolutely unprecedented times in terms of the raw ability of private corporations to control information. How soon can we expect to see some results from your work? That’s the first question. And second, I imagine there are viewers watching who might want to help you do some of that processing, if you’re looking for people to do that. Because, as you said, who knows what could be found in these data.

Dr. Epstein: Unfortunately, we can’t really accept volunteer helpers. We would love to, because we get people offering every single day, and I’m sure they’re wonderful people who sincerely want to help us. But the problem is that if someone volunteers, we don’t know whether they have some connection with the big tech companies.

Because Google not only employs 100,000 people, but also employs, that we know of, more than 10,000 outside contractors, just regular people hired for all kinds of purposes. If I were Google, and Dr. Epstein was looking for volunteers to analyze data, I’d be very generous; I would send him dozens and dozens of volunteers.

We have no way of distinguishing a good volunteer from a bad volunteer, so we have to take all kinds of precautions. We find our people through networking, and we’re extremely, extremely cautious: we do background checks, and we have people sign very strict NDAs.

I saw Alan Dershowitz give a talk once in which he said, “There’s a fine line between paranoia, which is a bad thing, and caution, which is a good thing.” We’re very, very cautious. You asked when we’re going to release lots and lots of results. I can’t even give you a date on that because, frankly, we are overwhelmed, and in a good way. We’re overwhelmed with how much information we were able to preserve.

We have to invent new ways of analyzing this kind of data, just as we’ve had to invent new ways to study online manipulations since 2013. We had to figure out how to set up a monitoring system that allows us to look over the shoulders of real computer users. That took us about 10 months back in 2016; we started in January. And we’re still figuring that out.

Obviously, we’re getting better at doing what we’re doing. But in my mind, these projects and the numbers themselves are not going to really solve any problems. What we have done, and the real value in what we’ve done, is to show what’s possible. You can think of these projects as proofs of concept.

We are showing that it is possible to look very, very carefully at what these companies are showing people, and that it can be done on a very large scale. If a system like this were set up permanently in all 50 states, it could collect massive amounts of data, and that data could be analyzed.

It’s possible and it can be done in real time, using algorithms. In other words, we can do the same kind of thing that Google does. Again, using good tech to fight bad tech. We can do large scale monitoring. It can be set up so that it is permanent, constantly looking for irregularities, and then reporting and exposing them.

This is not an option for humanity; it’s a necessity, a requirement, and it must be done. When this happens, just think about how these companies will react. They wouldn’t dare to try rigging an election. They wouldn’t dare to try re-engineering humanity. This is the way to protect ourselves: by setting up monitoring systems that detect and expose threats to our freedom.

Mr. Jekielek: Dr. Robert Epstein, such a pleasure to speak with you again.

Dr. Epstein: Always a pleasure, anytime.

This interview has been edited for clarity and brevity.

American Thought Leaders is an Epoch Times show available on YouTube, Facebook, and The Epoch Times website. It airs on Verizon Fios TV and Frontier Fios on NTD America (Channel 158).
