Technology is, in and of itself, neutral, but sometimes the way we use it can have unintended consequences.
A good example of this might be the personalization of the Internet. In his 2011 book, The Filter Bubble: What the Internet Is Hiding from You, Eli Pariser argues that the great promise of the Internet as a democratizing medium may be having the opposite effect. Rather than expanding our horizons, bringing people together, fostering collaboration and understanding and driving positive change, it is polarizing society by creating individual, self-affirming feedback loops for all of us.
Last week, I recreated an experiment Pariser writes about by asking my friends to search the term ‘Saskatchewan’ on Google. The outcome was not astonishing, but the subtle differences between results illustrate how Google uses whatever information it has gleaned from our online activities to personalize what it sends back. For example, my very top result was the Government of Saskatchewan. There is no way that Google’s algorithms could know for certain it was me sitting at the computer, but I am its primary user, and as a Saskatchewan newspaper reporter I frequently do research on the Saskatchewan government site. None of my friends got this as their top link.
In my results, anything tourism-related was well down the page, whereas for out-of-province friends it was right at or near the top.
As I said, it was subtle, but it goes to show that we have moved away from the original intent of search engines, which was to return the most objectively relevant information about a subject, toward returning what is likely most relevant to me.
On the surface this seems like a nice feature. Who has time to scan through a bunch of stuff they don’t care about? The problem, according to Pariser, is that as personalization becomes more sophisticated, it edits out differing points of view, stuff we might not like, but that we certainly need to know about. It’s all still out there, but with personalization you actually have to go looking for it. And who has time to do that when all the cat videos you could ever want are served up to you without you having to ask?
Pariser illustrates this with the example that when Facebook started to run its personalization code, his conservative friends dropped off his newsfeed, presumably because he didn’t ‘like’ their posts nearly as frequently as those of his liberal friends.
That is the filter bubble, isolation in a self-affirming “You Loop” as he calls it, which you don’t even know you’re in because you can’t see how everyone else’s experience of the Internet differs from yours.
Access to everything was supposed to be one of the strengths of the Web. No longer were we to be bound by gatekeepers of information such as newspaper editors. Most of those gatekeepers, though, were bound, and I like to think (or maybe hope) most still are, by a code of journalistic ethics to operate in the public interest and present fair and balanced information.
It is not that filtering is anything new. We self-filter all the time. I am sure, for example, there is not a huge overlap between the readers of this column and Mike Stackhouse’s. Nevertheless, we are both presented here in this paper, and the filtering is up to the reader, not some algorithm designed, to a great degree, to make money for the website’s advertisers by presenting what it thinks we want to see rather than what we probably need to see.
Facebook founder Mark Zuckerberg has boasted that Facebook is now the number one source of news in the world. I don’t doubt it, but it’s scary when I think about my own experience with Facebook. If my primary source of news were Facebook, and I were not the type of person who verifies everything I read before believing it and seeks out differing points of view, I would be very ill-informed and misinformed indeed.
The Internet still holds the promise of unprecedented access to important information; it’s just getting harder to find it.