I was really intrigued by the TED Talk given by Eli Pariser on “filter bubbles”. We know from movies like Terminator that sometimes technology can get too smart for our own feeble human good. But as more and more companies, media conglomerates, and mobile, application, and Web 2.0 users race toward hyper-local, geo-targeted selling and content, it raises the question: is it really a good thing?
Eli shares that the information superhighway is now gated not by human filters, but by algorithms. Search results are edited based on a number of factors: the content you typically read, what you have searched for in the past, what you have clicked on, and much more. In the past, editors of publications were the gatekeepers to information; you heard and learned what they wanted you to know. Then, with the Internet, the world was opened up. Or so we thought – we now have algorithmic gatekeepers that decide what we get to see. Think about the loss of control, and about the information you don’t even know you’re missing, based on the searches you make and the content you’re consuming on the Internet.
Eli says, “…we are moving very quickly toward a world in which the internet is showing us what it thinks we want to see, but not necessarily what we need to see.”
Personalized content is great – it helps filter through the noise. But what if that noise is something struggling to be heard, that NEEDS to be heard? If you spend your entire day searching for stories on celebrity gossip, does that mean you’re missing out on hearing what celebrities are doing to raise funds for charities and causes? Or, learning about a cause that might directly impact you?
Part of what makes the Internet and social networking so great is the ability to learn about other people’s points of view, and share them. For example, I’m as guilty as the next person of rolling my eyes at a Facebook friend whose political views are radically different from mine, or even hiding their updates from my Facebook newsfeed during election season. But that’s the difference – I’m CHOOSING to hide that from my feed. I don’t want Facebook (or a search engine) to hide it for me.
So what do we need to do to make sure that algorithms have a sense of civic responsibility? Eli says…
- Transparency – users can see the rules that determine what gets filtered
- Control – users can decide what gets through, and what doesn’t
What does this mean for social media? Perhaps social networks force us to be more closed off in nature, feeding the filter bubble. Think about how Facebook works – your newsfeed most often shows posts from people you frequently interact with, or whose profile pages you visit a lot. When you search for a friend, it auto-fills the ones you interact with the most. Frankly, Facebook always knows where you have been and exactly what patterns you follow – from the time of day you log in to the types of posts you generally click on (tech news? political stories?). But does that mean the parameters set by your usage are causing you to miss out on other content? Perhaps.
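To make the idea concrete, the interaction-weighted ranking described above can be sketched as a tiny toy. This is purely illustrative – the function name, data, and weights are invented for the example and bear no relation to Facebook's actual code:

```python
def rank_feed(posts, interaction_counts):
    """Sort posts so that authors the viewer engages with most float to the top."""
    return sorted(
        posts,
        key=lambda post: interaction_counts.get(post["author"], 0),
        reverse=True,
    )

posts = [
    {"author": "alice", "text": "Tech news"},
    {"author": "bob", "text": "Political story"},
    {"author": "carol", "text": "Charity fundraiser"},
]
# Suppose you click on alice's posts constantly and never on carol's.
interactions = {"alice": 42, "bob": 3, "carol": 0}

feed = rank_feed(posts, interactions)
# carol's post sinks to the bottom – the filter bubble in miniature:
# content you never engage with quietly stops reaching you.
```

Even this crude version shows the feedback loop: what you clicked yesterday decides what you see tomorrow.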
This is not to say that the filter bubble is all bad. The online world is a massive space that grows every day. Maybe we need the help of those algorithmic gatekeepers to focus the information and content we receive. Even the Drudge Report gives us only a glimpse of the world at large.
I’m inclined to agree with Eli that it’s the responsibility of both users and content producers/hosts to be open about what’s being filtered, and about how to change it. We all need to be challenged, and to view things in a different way. The power of the Internet still lies in the fact that it’s the information superhighway (or, a series of tubes). The fact that social media has become an integral part of search (and results) just means that we’re all becoming a bit more connected. So maybe instead of our search algorithms choosing content for us, we should be given the power to decide for ourselves.