Have you ever been scrolling through your Facebook feed next to a friend and realised how different your news feeds are?
Have you ever looked up information on Google for a school or university report and noticed how different your results are from those of the person next to you?
Sites such as Google, Facebook, YouTube and many others track our online activity. They collect information such as your current location, your interests, and specifically what you click on. These websites then use algorithms to personalise the content you see online, including your search results.
It probably comes as no surprise that these algorithms also personalise advertisements, generally based on what you as an individual click on and search for.
But what you may not realise is that this has a negative effect, amounting to a form of discrimination and censorship. In his TED talk, Eli Pariser describes what he calls the ‘filter bubble’: as a user, you are exposed only to information filtered according to your past search history, rather than the broad range of content that should be available to all.
This algorithmically designed version of you is invisible: you cannot see it, and you have no choice about whether or not to enter the bubble. Yet without these algorithms it would be close to impossible to filter anything in our post-broadcast age.
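To make the mechanism concrete, here is a minimal sketch of history-based filtering. All names, data, and the scoring rule are hypothetical illustrations, not any real platform’s algorithm: articles whose topics overlap your click history are ranked ahead of everything else, so unfamiliar content quietly drops out of the feed.

```python
# Hypothetical sketch of a filter bubble: rank articles by how much
# their topics overlap with what the user has already clicked on.

def personalised_feed(articles, click_history, size=2):
    """Return the top `size` articles, ranked by overlap with past clicks."""
    seen_topics = {t for article in click_history for t in article["topics"]}

    def score(article):
        # More shared topics with the click history = higher in the feed.
        return len(seen_topics & set(article["topics"]))

    return sorted(articles, key=score, reverse=True)[:size]

articles = [
    {"title": "Celebrity gossip roundup", "topics": ["celebrity", "tv"]},
    {"title": "Election coverage",        "topics": ["politics", "news"]},
    {"title": "New phone review",         "topics": ["tech", "gadgets"]},
    {"title": "Protest in the capital",   "topics": ["politics", "protest"]},
]
history = [
    {"title": "Red carpet photos", "topics": ["celebrity"]},
    {"title": "Gadget unboxing",   "topics": ["gadgets", "tech"]},
]

feed = personalised_feed(articles, history)
# The political stories never make the feed: the bubble in action.
```

The point of the sketch is that nothing here is malicious; a few lines of relevance ranking are enough to make whole categories of content invisible to a user who never clicked on them.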
William Merrin raised a similar concept in his blog post ‘Studying Me-dia: The Problem of Method in a Post-Broadcast Age’. He suggests that we now face a range of problems, which I listed in one of my previous blog posts, ‘What’s the real problem’. Rather than keeping a long list of what isn’t working, I think it can be simplified to one underlying cause, and (inspired by the TED talk above, of course) I have come up with an answer: our real problem is the filter bubble we are all stuck in. Unlike the broadcast age, when we had human gatekeepers, we are now left with algorithms, which lack the ethics of people. These algorithms shouldn’t be based only on what they think we want to see, but also on what we need to see.
Other points of view