“Ephemeral experiences”: You might never have heard this phrase, but it’s a very important concept. These are brief experiences you have online in which content appears, affects you, and then disappears, leaving no trace. Those are the kinds of experiences we have been preserving in our election monitoring projects. You can’t see the search results that Google was showing you last month. They’re not stored anywhere, so they leave no paper trail for authorities to trace. Ephemeral experiences are, it turns out, quite a powerful tool of manipulation.
Are people at companies like Google aware of the power they have? Absolutely… In emails leaked from Google to the Wall Street Journal in 2018, one employee says to others, “How can we use ephemeral experiences to change people’s views about Trump’s travel ban?” There is that phrase, “ephemeral experiences.”
During a period of days before the 2020 election, we found that Google, on its home page, was sending “go vote” reminders just to liberals. That’s a powerful ephemeral message, and not a single one went to conservatives. How do we know this? Because we were recording the content our 700 “field agents” were seeing on their computer screens. They were a diverse group of registered voters we had recruited in three key swing states. Google was sending those vote reminders only to liberals. That’s a powerful manipulation that’s entirely invisible to people — unless a group like ours has found a way to monitor what people are seeing.
A preliminary analysis of the more than 500,000 ephemeral experiences we preserved in Arizona, North Carolina, and Florida turned up some disturbing things. Number one, we found a strong liberal bias in the search results people saw on Google when they searched for political topics; this bias was absent on Bing and Yahoo. Ninety-two percent of searches are conducted on Google, and we know from years of experiments we’ve conducted that biased search results can easily shift the voting preferences of undecided voters, and those are the people who decide the outcomes of close elections. In experiments, we can easily shift 20% or more of undecided voters after just one search by showing them biased search results.
In a national study we conducted in 2013, in one demographic group — moderate Republicans — we got a shift of 80% after just one search. Some people, in other words, are especially trusting of search results, and Google knows this. The company can easily use techniques like this to manipulate undecided voters — the people who are most vulnerable to being influenced.
Even before people see search results, biased search suggestions — those phrases Google flashes at you when you start to type a search term — can shift thinking and behavior. We have shown in controlled experiments that biased search suggestions can turn a 50‑50 split among undecided voters into a 90‑10 split, with no one having the slightest idea they have been manipulated.
People have no idea that manipulations like these are being used. They are simply doing what they always do — typing in a search term, clicking (sometimes) on a search suggestion, and then clicking on a high-ranking search result, which takes them to a web page. They are trusting what is high in search results, usually clicking on the first or second item and trusting that this is the best answer to their question.
Unfortunately, people mistakenly believe that computer output must be impartial and objective. People especially trust Google to give them accurate results. Therefore, when people who are undecided click on a high‑ranking search result and are taken to a web page that supports one candidate, they tend to believe the information they’re being shown. They have no idea that they may have been driven to that web page by highly biased search results that favor the candidate Google is supporting.
Dwight D. Eisenhower did not talk about his accomplishments in his famous farewell speech of 1961. Instead, he warned us about the rise of a “technological elite” who could control public policy without anyone knowing. He warned us about a future in which democracy would be meaningless. What I have to tell you is this: The technological elite are now in control. You just don’t know it. Big Tech had the ability to shift 15 million votes in 2020 without anyone knowing that they did so and without leaving a paper trail for authorities to trace. Our calculations suggest that they actually shifted at least six million votes to President Biden without people knowing. This makes the free-and-fair election — a cornerstone of democracy — an illusion.
I am not a conservative, so I should be thrilled about what these companies are doing. But no one should be thrilled, no matter what one’s politics. No private company should have this kind of power, even if, at the moment, they happen to be supporting your side.
Do these companies think they are in charge? Are they planning a future that only they know for all of us? Unfortunately, there are many indications that the answers to these questions are yes. One of the items that leaked from Google in 2018 was an eight‑minute video called “The Selfish Ledger,” which should be accessible here. I also made a transcript of the film.
This video was never meant to be seen outside of Google, and it is about the power that Google has to reshape humanity, to create computer software that “not only tracks our behavior but offers direction towards a desired result.”
How do we protect ourselves from companies like this? It’s more difficult than you might think. How do you control a mind control machine, after all? You might have heard the phrase “regulatory capture” — an old practice in which a large company that is facing punishment from the government works with the government to come up with a regulatory plan that suits the company.
When you are talking about, for example, “breaking up” Google, all this means is that we will force them to sell off a couple of the hundreds of companies they have bought. On average, Google buys another company every week. We force them to sell off some companies, the major shareholders are enriched by billions of dollars, and the company still has the same power and poses the same threats it does today — threats to democracy, to free speech, and even to human autonomy.
Tech moves at the speed of light, but regulation and law move slowly. It’s doubtful that regulations and laws will ever be able to protect us from emerging technologies. But imagine if these companies knew that we were monitoring them on a large scale 365 days a year — that we were, in effect, doing the same thing to them that they do to us and our children 24 hours a day.
Imagine that we were, in effect, looking over the shoulders of thousands of real people (with their permission), just as the Nielsen Company does with its network of families to monitor their television watching. Imagine if these tech companies knew that they were being monitored — that even the answers they are giving people on personal assistants like Amazon’s Alexa and Apple’s Siri were being monitored. Do you think they would risk sending out targeted vote reminders to members of just one political party? I doubt it very much, because we would catch them immediately and report their manipulation to authorities and the media.
On October 30, 2020, a few days before the November 3rd election, we went public with some of our election monitoring findings, and we got Google to back down. From the 31st on, Google started sending those vote reminders to everyone, not just to liberals.
Remember that all the usual election shenanigans — tampering with votes, mail, and voting machines — are inherently competitive. But the kinds of influence that I have been discovering and studying since 2013 are not competitive. That is the difference. In other words, if Google itself wants to favor one cause or one candidate, there is no way to counteract what it is doing. In fact, without monitoring systems in place, you can’t even detect Google’s manipulations, even though they can shift the opinions and votes of millions of people. And people have no idea they’re being manipulated, which makes these kinds of manipulations especially dangerous. People end up concluding that they have made up their own minds when in fact they have not.
We have conducted controlled experiments with tens of thousands of people covering five national elections. We know how powerful these new forms of influence are. We know that people cannot see them. We know that people mistakenly end up believing that they have made up their own minds when in fact we were the ones who decided which candidate they were going to support.
What can we do? In my opinion, the solution to almost all the problems these companies present is to set up large‑scale monitoring systems and to make them permanent — not just in the United States, but around the world. Because monitoring is itself technology, it can keep up with whatever the tech companies are throwing at us, and however they are threatening us, we can get them to stop.
I am envisioning a new nonprofit organization that specializes in monitoring what the tech companies are showing to voters, families, and children — protecting democracy and the autonomy and independence of all citizens. There might also be a for‑profit spinoff that could serve as a permanent funding source for the nonprofit. The for‑profit spinoff could provide commercial services to campaigns, law firms, candidates, researchers, and many others.
And there’s another way to completely eliminate the threats that Google poses to democracy and humanity. As I noted in an article I published in Bloomberg Businessweek in 2019, and as I testified before Congress that year, our government could quickly end Google’s monopoly on search by declaring that the database Google uses to generate search results is a “public commons,” accessible to all. It is a very old legal concept, and it is a light-touch form of regulation. It would rapidly lead to the creation of thousands of competing search platforms, each appealing to different audiences.
On November 5, 2020, three U.S. Senators — Senator Mike Lee, Senator Ron Johnson, and Senator Ted Cruz — sent a letter on U.S. Senate stationery to the CEO of Google. The letter talks about some of the findings from a 2020 online election monitoring project in which my team and I had discovered several things.
We had detected — just as we had in previous elections — a strong liberal bias in Google search results, but not in search results on Bing or Yahoo. That is important for comparison purposes. It was a liberal bias sufficient to have shifted at least six million votes over time toward Biden and toward other Democratic candidates.
We also found a smoking gun. This is what the Senators’ letter focuses on. We found that for a period of days before the election, on Google’s home page the company was sending a “go-vote” reminder just to liberals. Not a single one went to conservatives. How do we know this?
Because we had recruited 733 field agents in key swing states: Arizona, Florida, and North Carolina. The agents were registered voters, diverse both politically and in other demographic respects. We knew who the liberals were, who the conservatives were, and who the moderates were.
With their permission, we had installed special software on their computers that allowed us, in effect, to look over their shoulders as they were doing politically related things on the Internet. We aggregated that data. What we were particularly interested in were what are called “ephemeral experiences.” That phrase comes right from a leak of emails from Google to The Wall Street Journal.
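The core of this approach — preserving fleeting content together with who saw it, where, and when, so it can be audited later — can be sketched in miniature. This is only an illustration of the idea, not the project’s actual software; every name here (`preserve_ephemeral`, the agent IDs, the example URL) is hypothetical:

```python
import hashlib
import time

def preserve_ephemeral(page_html: str, url: str, agent_id: str, archive: list) -> dict:
    """Snapshot one ephemeral experience: store the raw content with enough
    metadata (which agent saw it, at what URL, when, and a content hash)
    that it can be examined long after the live page has changed."""
    record = {
        "agent_id": agent_id,        # which field agent saw this content
        "url": url,                  # e.g. a search-results page
        "captured_at": time.time(),  # Unix timestamp of the capture
        "sha256": hashlib.sha256(page_html.encode()).hexdigest(),
        "content": page_html,        # the otherwise-lost content itself
    }
    archive.append(record)
    return record

# Example: two agents see different content at the same query URL.
archive = []
preserve_ephemeral("<li>Result A</li>", "https://example.com/search?q=vote", "agent-001", archive)
preserve_ephemeral("<li>Result B</li>", "https://example.com/search?q=vote", "agent-002", archive)

# Differing content hashes for the same URL flag experiences worth a closer look.
hashes = {r["sha256"] for r in archive}
print(len(archive), len(hashes))  # → 2 2
```

Aggregating records like these across agents of known political leanings is what makes it possible to notice that one group is being shown content another group never sees.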
Ephemeral experiences — it’s a very important concept. It’s how Google and other tech companies shift opinions and votes without people knowing. We were preserving these fleeting events that impact us every day and then disappear, leaving no trace. Events like these — search results, search suggestions, newsfeeds, or messages coming from Facebook or Google — appear, they impact us, they disappear, and they are then lost forever. You can’t go back in time and see what these events were. You can’t look back at the search results Google showed you last month.
I have been conducting randomized controlled studies on the impact of ephemeral experiences on behavior, thinking, and voting now for almost eight years, so I have learned a great deal about how they work, and they are powerful. Are people at companies like Google aware of the power they have? Absolutely.
In leaked emails from Google in 2018, one employee says to others, “How can we use ephemeral experiences to change people’s views about Trump’s travel ban?” There is that phrase: “ephemeral experiences.”