Are Algorithms Dictating Your Life?

If you’ve ever clicked the first search result you received, then the answer is yes.

Have you ever heard of the search bubble? It’s the term for what happens when your search results become too personalized.

Search engines like Google keep track of the things you search for and the websites you click on. That way, when you search for something, they can filter the results to show you what they think you’ll find most interesting.

So the more Google personalizes search, the more you click on similar websites, and the more you click on similar websites, the more Google shows you similar results. It becomes a closed, ever-tightening loop.
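To make the loop concrete, here’s a toy sketch of the idea — not how Google’s actual system works, just an invented illustration with made-up topic names — in which an imaginary engine boosts whatever you’ve clicked before, so each round of clicks narrows what you’re shown next.

```python
import random
from collections import Counter

# Toy model of a personalization feedback loop (illustration only,
# not how any real search engine works).
TOPICS = ["sports", "politics", "science", "music", "cooking"]

def ranked_results(click_history, n=3):
    """Rank topics by how often the user has clicked them before,
    breaking ties randomly -- the 'personalization' step."""
    counts = Counter(click_history)
    return sorted(TOPICS, key=lambda t: (-counts[t], random.random()))[:n]

random.seed(1)
history = []
for round_number in range(10):
    shown = ranked_results(history)
    clicked = shown[0]           # the user clicks the top result
    history.append(clicked)      # ...which feeds back into the next ranking
    print(round_number, shown, "-> clicked:", clicked)

# After a few rounds, one topic dominates every results page:
# the loop has closed.
```

Run it and you’ll see the first random click harden into the only thing the toy engine ever leads with.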

And while there is nothing inherently wrong with visiting similar pages all the time, the problem arises when you miss out on good, well-crafted pages just because Google has decided you won’t be interested.

But the issue runs deeper. A study conducted by the American Institute for Behavioral Research and Technology (AIBRT) found something called the Search Engine Manipulation Effect (SEME). The study showed that the results of an election could be influenced by the order in which candidates appear on the search page. After just one session, the proportion of people likely to vote for a specific candidate increased from 27% to 63%.

Robert Epstein, a senior psychologist at AIBRT, says that if Google wanted to, it could rig entire elections. And because Google is the preferred search engine for a majority of people the world over, any tweaking would affect the entire globe.

Of course, Google may not want to rig elections at all. The company certainly says it doesn’t. But what Google wants to do and what actually happens are two entirely different things.

If there is one thing this study showed us, it was how easy it is to influence not only what we see, but how we think and perceive information. And it’s scary because, most of the time, we don’t even know that we’re being influenced at all.

And when we do know that we’re being influenced? It only gets worse. Of the people in the study who knew about the influence the search engine could have, 45% shifted their support to the ‘chosen’ candidate, as opposed to 37% of those who did not know.

In an article for Politico Magazine, Robert Epstein explains three ways an election could be rigged by Google. The first is by the company itself, and the second is by a rogue employee who manages to tweak the placements. But the third is by far the scariest: when nobody does anything at all. After all, even without any third-party tweaking, the results still appear in some order, and it is very likely that this order is largely responsible for the outcome.

Search engines are meant to be biased. Their entire premise is to give more importance to the more relevant results. It’s why Google is as popular as it is: its PageRank algorithm doesn’t just return results, it prioritizes the relevant ones.
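The core idea behind PageRank is public — it dates back to Brin and Page’s 1998 paper — and boils down to “a page matters if pages that matter link to it.” Here’s a minimal textbook-style sketch on a made-up four-page web (the link graph and the code are my own illustration; the real ranking pipeline layers hundreds of other signals on top).

```python
# Minimal PageRank by power iteration, on a tiny invented link graph.
# This is the textbook algorithm, not Google's production system.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = list(links)
damping = 0.85                      # the damping factor used in the original paper
rank = {p: 1 / len(pages) for p in pages}

for _ in range(50):                 # iterate until the scores settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outgoing in links.items():
        share = rank[page] / len(outgoing)   # a page splits its score among its links
        for target in outgoing:
            new_rank[target] += damping * share
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))    # "C" comes out on top: everyone links to it
```

Notice that nothing in this sketch asks whether a page is *good* — only whether other pages point to it. That’s the bias working as designed.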

But who decides what is ‘relevant’? Mostly, a series of complex algorithms that look first at what people are clicking on and then at what you’ve been clicking on.

What does this mean? Let’s break it down. The first three listings that show up receive 61% of all clicks, and fewer than 10% of clicks happen on page two. This means that the majority of people are all seeing the same set of results, and the more people click on those results, the more likely it is that even more people will click on them. Again, a closed, ever-tightening loop.
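Here’s a small back-of-the-envelope simulation of that second loop. The page names and click probabilities are invented (only the rough drop-off by position matters): if clicks mostly go to whatever is already on top, and clicks in turn push results up, an early lead hardens into a near-permanent one.

```python
import random

# Toy 'rich get richer' ranking: position decides clicks, clicks decide position.
# Illustration only -- the click-rate numbers are invented, not measured.
random.seed(7)
clicks = {f"page-{i}": 1 for i in range(10)}           # ten competing pages
position_click_rate = [0.30, 0.18, 0.13, 0.08, 0.06,   # rough drop-off by rank
                       0.05, 0.04, 0.03, 0.02, 0.01]

for _ in range(5000):                                   # simulate many searches
    ranking = sorted(clicks, key=clicks.get, reverse=True)
    for position, page in enumerate(ranking):
        if random.random() < position_click_rate[position]:
            clicks[page] += 1                           # a click nudges the page up
            break                                       # one click per search

for page in sorted(clicks, key=clicks.get, reverse=True)[:3]:
    print(page, clicks[page])       # a handful of pages hoover up most of the clicks
```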

It’s the same personalized-search bubble problem, but this time, we’re all trapped inside. And the bubble is influencing not only our elections and what pages we look at, but also the thoughts we think.

And that brings me to the scariest thought of all: is our reality being dictated by a computer program?


This article is based on information from “You Are What You Search”, first published in Sirius #206, 8–21 November 2015.

Have something to say? At Snipette, we encourage questions, comments, corrections and clarifications — even if they are something that can be easily Googled! Or you can simply click on the heart below, so we know you liked reading this.