Blowing Bubbles: Google Filters
Kate O’Brien investigates the ethics of search engines filtering their content…
Are you living in an online bubble, where your online content is filtered to reflect your taste? Filters are for refining searches, giving you what you want. But is that a good thing when you can’t control it? The priority is on getting the most ‘relevant’ information to the individual browsing the web by filtering their search. However, this raises ethical concerns, much as in journalism, where newspaper editors coordinate a diverse range of viewpoints in an effort to deliver unbiased information. Algorithms aren’t so egalitarian.
This is where the controversy of the filter bubble comes in. It arises when websites like Google decide what information a user would like to see, using an algorithm based on information about the user (e.g. location, past websites visited and search history). As a result, users can become separated from information that disagrees with their viewpoints. For example, the filter might show them only articles about Pokémon being great, or suggest articles from just one side of the same-sex marriage debate. This effectively isolates the user in their own ideological bubble.
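To make the idea concrete, here is a minimal sketch of what such personalisation could look like. This is an illustrative assumption, not Google’s actual algorithm: results whose topics overlap with the user’s click history get boosted, so familiar viewpoints float to the top.

```python
# Hypothetical personalised ranking -- NOT any real search engine's code.
# Results that match the user's past behaviour are ranked higher.

def personalised_rank(results, user_history):
    """Rank search results by overlap with the user's past clicks.

    results: list of (title, topics) tuples
    user_history: set of topic strings the user has clicked on before
    """
    def score(result):
        _, topics = result
        # Count how many of the result's topics the user has seen before.
        return len(set(topics) & user_history)

    # Higher overlap with past behaviour -> higher rank.
    return sorted(results, key=score, reverse=True)


results = [
    ("Marriage equality: the case against", {"politics", "con"}),
    ("Marriage equality: the case for", {"politics", "pro"}),
]
history = {"pro", "pokemon"}  # the user has clicked "pro" articles before

for title, _ in personalised_rank(results, history):
    print(title)
```

With this history, the “for” article is ranked above the “against” one, even though both are equally relevant to the query: the bubble in miniature.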
The term was coined by internet activist Eli Pariser. The filter bubble has been defined as a personal ecosystem of information supplied by these algorithms: a sphere created by your past, generating your present and future. This isn’t exclusive to search engines. Amazon’s and Netflix’s recommendations also create a continuous feedback loop for us, perpetuating our selection biases. Facebook’s news feed, meanwhile, progressively shows content from those in our network whom we frequently click on, in the process suppressing those whose content we do not.
And what of the consequences of such adjustments? Well, to begin with, users get less exposure to conflicting viewpoints, encouraging extreme opinions to be cultivated on complex issues. It can close us off to new ideas, subjects, and important information that lie outside our sphere of interest.
A world constructed from the familiar is a world in which there’s nothing to learn … (since there is) invisible autopropaganda, indoctrinating us with our own ideas. — Eli Pariser in The Economist, 2011
It could be harmful to both individuals and society. It favours items that trend. Eli Pariser criticized Google and Facebook for offering “too much candy, and not enough carrots.” They indulge our impulsive selves, who binge entire seasons of Family Guy, and distance us from who we would like to be in the long run: that sophisticated human being who has read all the classics. The sites think about what we want to see, not what we need to know. Pariser’s conclusion is that the creators of the modern web need to strive for an ethical standard that includes showing a diversity of viewpoints, giving users a say in what information is filtered out, and creating opportunities for users to challenge their standard viewpoints.
How about using the filter bubble for good? A website can use your web history to target advertising to you, making its banners appear at every opportunity after you’ve left the site. It’s as if the temptation is following you. Businesses and even bloggers utilize web analytics to customize their sites, learning about you to better adapt to you. Is this a bad thing?
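A minimal sketch of that retargeting idea, with entirely hypothetical names and logic (no ad network’s real API): if tracking data says you visited a shop without buying, that shop’s banner follows you around the web.

```python
# Hypothetical ad retargeting -- cookie names and logic are illustrative
# assumptions, not a real ad network's API.

def choose_banner(cookies):
    """Pick an ad banner based on a visitor's tracking data."""
    visited = cookies.get("visited_shops", [])
    purchased = set(cookies.get("purchased_from", []))
    for shop in visited:
        if shop not in purchased:
            # Visited but didn't buy: the temptation follows you.
            return f"Banner: come back to {shop}!"
    return "Banner: generic ad"


print(choose_banner({"visited_shops": ["shoestore"], "purchased_from": []}))
print(choose_banner({"visited_shops": [], "purchased_from": []}))
```

The same mechanism that traps you in a bubble is what lets a small business spend its ad budget only on people who already showed interest.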
Upworthy is a site trying to promote online content that’s highly shareable and actually important. More people than ever are discovering news through Facebook’s personalized feeds, a climate in which heavy issues like homelessness or climate change can’t compete with goofy viral videos. Upworthy aims to bring attention to issues that matter. Or how about search engines that don’t use these kinds of filters, like DuckDuckGo and Blekko?
Filters are a tool, and as with most tools, it is not the tool itself that can be labelled good or bad, but how we use it.
Whether to live inside the bubble or pop it is your decision.