Mathew Lowry

“The Filter Bubble”, by MoveOn.org foreign policy director Eli Pariser, shows that the forces creating the Brussels Bubble are about to be reinforced by technology, operated invisibly – and with impunity – by a handful of companies.

When I launched Blogactiv in 2007 I had some interesting experiences engaging with eurosceptics of the more rabid variety. As I wrote a few months later, I learnt to appreciate:

“…the power of groupthink. Put similarly-minded people in the same environment (gated suburb, online community, religious school, whatever) for long enough, and watch extremism rise as all members reinforce each other’s worldview and raise the volume higher and higher to be heard.

… I don’t think for a moment that eurosceptics are the only ones prone to it. Go into any cafe within shouting distance of the Berlaymont here in Brussels, after all, and you’ll find yourself in a large group of similarly-minded people, most of whom have been in the same environment for a very long time…”

I resolved then to explore groupthink and one of its causes – online Echo Chambers – via this blog, but haven’t really done so properly. While almost every post about the Brussels Bubble touches on groupthink (the Bubble is partly constructed from the mutual incomprehension of people inside and outside EU affairs), I haven’t really dug into groupthink psychology.

Groupthink in action

Partly because it’s scary. Back in 2008, one of the eurosceptics I mentioned in my first post – Richard North – provided cogent, intelligent criticism of BlogActiv, and had recently co-written The Great Deception, “one of the best eurosceptic histories of the EU available”, according to @litterbasket.

Two years later, he was riffing off a murderous rampage by a British nutcase called Derrick Bird, who killed 12 people in a shooting spree. Two days after the event, North could be found online, using this terrible tragedy as a spur to suggest that slaughtering officials was justified, and posting links to bomb-making equipment.

From publishing an intelligent book to inciting murder – a long trip in two years. Is this groupthink in action?

Before the internet you needed to get people into the same room to get the Echo Chamber effect required for groupthink to take hold – every nutjob US cult, after all, needs its retreat in the wilds of Montana.

When I was visiting EUReferendum 4 years ago, rebutting some of the loonier assumptions made there about BlogActiv, I realised that Echo Chambers are now one mouseclick away. Anyone who spends long enough in the same online space populated only by people with the same views will probably find those views steadily become more extreme over time.

What the “Filter Bubble” shows is that this effect will now spread to the entire Web.

Enter the Filter Bubble

The Filter Bubble is an excellent book – I recommend you read it. I won’t review it here for reasons of space, so go read some reviews, and/or watch this short TED talk by Eli Pariser, its author:

Now the above 10 minutes barely scratches the surface of what's going on here, but the easiest way of looking at it is this:

1) We've all become used to seeing online ads on Google that match the search terms we just typed in.

2) You've probably also noticed, more recently, that after you search for something, all sorts of websites suddenly start displaying ads (and not just Google Ads) relevant to your search. The website you're viewing knows what you've been looking for, and is serving you relevant, personalised ads.

3) You're probably also aware that this is driven by a lot more data than your latest search terms. In fact, a small handful of companies have massive 'profile data' on you: the sites you visit, what you write online, what you search for, who your Facebook friends are, what you read, where you were educated, and much, much more. They hold much more data than any government has ... which is why governments have made it easy for themselves to take a look:

"Google has received 4,601 user data requests from the US Government over the most recent six-month period and has complied with 94% of those requests..."

4) These companies sell websites this data so the websites can personalise their ads, increasing clickthrough rates and hence revenue.

So far, so OK, right? Why not have relevant, personalised ads? Beats the generic crap on TV.

But what happens when it's not just the ads that are personalised? What happens when sites use your profiles to present the content - the news articles, op-ed pieces, etc. - they think you are most interested in?

Quite a lot happens, actually. You're in a filter bubble, seeing a Web tailored to your interests - or what the profile data thinks are your interests. And you are impoverished as a result.
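The mechanics are simple enough to sketch. Here's a toy illustration (my own hypothetical example, with made-up articles and topic weights – not any real site's algorithm): each article is tagged with topics, and the "feed" shows only the articles whose topics best match the interest profile built from your past behaviour.

```python
# A minimal sketch of profile-based content filtering (illustrative only).
# Each article carries topic tags; the feed ranks articles by how strongly
# their tags match the weights in the user's interest profile.

def personalised_feed(articles, profile, top_n=3):
    """Return the top_n articles that best match the user's profile."""
    def score(article):
        return sum(profile.get(topic, 0.0) for topic in article["topics"])
    return sorted(articles, key=score, reverse=True)[:top_n]

articles = [
    {"title": "EU summit reaches budget deal",    "topics": ["eu", "politics"]},
    {"title": "Famine worsens in Horn of Africa", "topics": ["africa", "crisis"]},
    {"title": "Squirrel photos go viral",         "topics": ["pets", "fun"]},
    {"title": "New privacy rules proposed",       "topics": ["eu", "privacy"]},
]

# A profile built from past clicks: heavy on 'eu', nothing at all on 'africa'.
profile = {"eu": 0.9, "politics": 0.4, "fun": 0.2}

for article in personalised_feed(articles, profile, top_n=2):
    print(article["title"])
```

Note what never appears: the famine story scores zero against this profile, so it is silently filtered out – exactly the impoverishment described above.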

Now the Echo Chamber comes to you (and you can't even see it)

There is no Front Page - there is only your Front Page.

Human editors have been replaced - as gatekeepers and curators of what's interesting, important and relevant - by algorithms. Now the algorithms decide what you see ... and they don't have the same public service ethos written into their code that editors (theoretically) had trained into them through journalistic culture.

The algorithms' only priorities are financial.

And remember, not only can you not see the machines or affect how they work, you can't see the Web they're not showing you.

So say goodbye to news and views which challenge you to think differently, or indeed content on any subject which Mark Zuckerberg thinks won't be relevant enough for you. You remember the Facebook founder, whose definition of privacy is pretty elastic, and whose definition of relevance is encompassed by this quote:

"A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa."

So say goodbye to news about difficult, long-term, complex stories - more people, after all, will Like™ stories which are likable, not tough ones like famine and war. And some of those people will be in your social graph, so that's all you'll see as personalisation rolls out across the Web like a suffocating blanket.

After all, Facebook didn't roll out an 'Important' button.

I'm just scratching the surface, but you get the drift. In the little world of EU communications, eurosceptics will see only content that is negative about the EU, and those within the Brussels Bubble won't see anything to disturb their worldview, either.

You can argue, of course, that this is already the case, but it's disheartening to think that Web personalisation will make things even worse.

But personalisation is a much bigger problem. Greater political and social polarisation is in no one's interests - the last thing this planet needs is for the world's first global medium to fragment into 7 billion echo chambers, each with a population of 1.


Comments

  1. These are important and intriguing ideas. But personalisation algorithms may have difficulty matching the power of real world mechanisms to limit access to information and ideas that challenge those of the tribe (religion, political party, nationality). Right-wingers always did read right-wing newspapers. At least at present, algorithms seem to be better at identifying subjects I might be interested in (such as EU affairs) than the point of view I espouse.

  2. Well, to be fair to the book, one has to read it – I only skimmed the surface, above.

    It’s true that people have always gravitated to media matching their views, so personalisation technology is not required for online echo chambers to exist, as I found years ago.

    But this is something they choose to do. And there was always mainstream media available – newspapers, magazines, TV stations – that had to aim at large markets, as the technology didn’t allow them to do otherwise. As a consequence, they had to present their viewers with a spectrum of content, because they were broadcasting, not narrowcasting. Even regular newspaper readers will be exposed to new subjects, because the paper’s Editor decides that it’s important.

    But when web personalisation becomes the norm (it’s not there yet), this environment basically ceases to exist, replaced with one where you only see content you already think is important and whose overall sentiment you already agree with. You aren’t even aware that the rest exists.

    One important thing I didn’t have room for above is the vicious-circle nature of this – your profile means you only see content matching your beliefs and preferences; and when you click on one of the proffered links, that aspect of your profile is reinforced.

    There are plenty of other implications. I urge you to read the book. I can pass you my copy if you like 😉
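To make that vicious circle concrete, here is a toy simulation (my own illustration, with made-up topic names – not a model from the book): the feed always shows the topic your profile currently favours most, and each click feeds straight back into the profile.

```python
# Toy feedback loop: the feed shows the best-matching topic, then the click
# reinforces the very profile weight that chose it.

def step(profile, topics):
    shown = max(topics, key=lambda t: profile[t])  # feed picks the favourite
    profile[shown] += 1.0                          # the click reinforces it
    return shown

profile = {"eu_news": 1.1, "world_news": 1.0}      # an almost even start
history = [step(profile, ["eu_news", "world_news"]) for _ in range(10)]

print(history)   # the tiny initial lead snowballs: 'world_news' never shows
print(profile)
```

After ten rounds the user has seen nothing but 'eu_news', and the profile gap has widened from 0.1 to over 10 – the bubble closes without the user ever choosing it.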

  3. Dear friends,
    Still an interesting article here; I am delighted to read
    « who your Facebook friends are, what you read, where you were educated, and much, much more », as you wrote in your interesting article
    I would also add that the press said, not so long ago, that Facebook is difficult to control
    I also think it’s a danger, because sometimes you do not know whom you are speaking to and whom you are dealing with, what do you think?
    Do you remember the rallies organised through Facebook in France?
    I read this comment in ‘Charente Libre’ (France):
    “The giants of the Internet world meet in Paris tomorrow and Wednesday for the first edition of the “e-G8”; the discussions on the challenges the digital economy poses for society and the media are intended to feed into the G8 in Deauville (27-28 May 2011).
    From Facebook to Google and Amazon via eBay, all the big names of the internet will defend, at a series of round tables and workshops, their vision of the internet and the economic model to be put in place to ensure its development and sustainability.”
    Kind regards
    Anna

  4. Nice post Mathew, as always.

    This might take things off topic a touch, and I apologise for that, but the point you raise about an ‘Important’ button is interesting. I have seen, a few times recently, people ‘Like’ things that are actually unfortunate incidents in the lives of their friends. Things like not feeling well, or having more free time because they lost their job. Did you really ‘like’ that?

    With other services, StumbleUpon for example, a click on the button helps an algorithm to understand what you want to see and what you think others might be interested in. In the SU, digg, reddit worlds, news stories often rate very highly. And they are a great way to see unconventional news from smaller publications that we might otherwise miss.

    But as we come to use just a few services more and more – and we all are – we lose even this. This of course oversimplifies the personalisation of services that you highlight and that digg et al don’t offer in the same way. Still, where we are going with this does not seem to be a place where all information is available to everyone and they look for it, but one where a little bit of information is available to one person and they do not feel the need to look much further. Not the Alexandrian utopia we were all hoping for, I think…

  5. The book has a lot to say about Facebook. As you can imagine, not much of it is positive. Their choice to call it a Like button – rather than, say Important – makes a huge semantic difference to what is shared and prioritised by you and your social graph, which in an era of ubiquitous personalisation has massive potential negative consequences for society. They don’t seem to care much.

    The problem is that FB is so huge. There are plenty of other services out there that provide more interesting and higher-granularity information, but FB just drowns out the signal.

    He also makes an interesting Facebook vs Twitter comparison, actually, which has helped me understand why I have always instinctively preferred Twitter to FB for my own use.

    Of course, these companies are now lobbying massively, as Anna points out. As they all have a lot of money to make from personalisation, they can all push in the same direction when it comes to legislation, so they don’t cancel each other’s efforts out as much as you’d hope.

    Those warning of the social dangers of personalisation, on the other hand, don’t have deep pockets. I wonder whether the EC is aware of these issues …

  6. Yep, this is one of the key differences between FB and other services – one of the things which made Eli investigate the impact of personalisation was when he noticed that Republican-oriented friends were disappearing from his FB news feed.

    A Democrat, he’d included them in his social graph because he wanted to understand what Republicans and conservatives think. But he didn’t click on their content as often as his Democrat friends’, so FB decided to cut them from his feed.

    And now there are rumours(?) that Twitter will personalise search results. And Google has been personalising search results since 2009.

    Anyway, what works, what you need and what you get are not always the same thing. I’m surrounded by stuff I don’t need and which doesn’t work! ;-(

    And there is absolutely no reason you will be given a choice. Personalisation offers sites better advertising returns. If a site adopts personalisation, you can’t opt out – all you can do is stop visiting that site. IF you realise that’s what it’s doing.

    Unless, of course, legislators decide otherwise.

  7. Dear friends,
    Thank you for your interesting comments, I really like
    I feel a certain fear reading the following sentences in the article
    I read these “worrying sentences” in this interesting article with interest, and all of it is normal, I think
    I wonder whether, in general, we are afraid of the kinds of information we will encounter without freedom, and that would be a pity, what do you think?
    Sentences:
    “Personalisation offers sites better advertising returns. If a site adopts personalisation, you can’t opt out –
    And they are a great way to see unconventional news from smaller publications that we might otherwise miss.
    Still, where we are going with this does not seem to be a place where all information is available to everyone”
    Have a nice day
    Anna

  8. Interesting. I had noticed that my friend feed seemed to be very heavily weighted towards my Brussels pals. Since they are my more immediate friends with whom I share the most now, they appear much more. I had wondered if this was luck or my reticular activation just seeing what I wanted to see. But it seems not…

    Suddenly I trust my friend feed much less than I did!

  9. @Anna, Good point – personalisation could dig out content from “smaller publications that we might otherwise miss”. But only if it matches what the profile the ‘discovery service’ has on you says you want to see. You don’t get to change that profile, or even see that it’s happening.

    Anyway, there’s a huge difference between ‘personally relevant’ and ‘actually Important’. The risk is you’ll completely miss the very important piece of news because it doesn’t match your interest profile. It could be really important to you, or society in general.

    But you may never see it, because you’ve never registered a strong interest in it, so it’s not in your profile. And you probably never registered that interest because you never saw anything about it.

    And speaking of services which help discover content from across the Web…

    … @FinancialGuy, do you trust Google? Here’s an interesting experiment. Ask a friend or two to do a search in Google for the same search terms, and send you a screenshot of the results. Do the same search yourself. Compare.

    Google’s been personalising search results for over a year, tailoring the results to what they think the profile they have on you says you want to see. PageRank is just one of almost 60 different ‘signals’ they use to rank search results. There is no ‘vanilla Google’, where everyone gets the same results, any more.

  10. Oh no Mathew, it is over 200 search signals. But you are quite right, the simple act of being signed into Gmail – or not – has a large impact on results. As does the pc and cookies being used. And the country you are in. And, and, and…

    According to Eli’s book, it’s about 57, but I guess it depends on what you and I mean by the word ‘signal’ in this context. Clearly being logged into your Google account will provide a whole lot more – they’ll have all of your content to index against your name, for starters.

    But it goes a lot further than that. There are algorithms that can accurately forecast your political views from a small handful of seemingly irrelevant signals.

    This is a serious problem in a world of social media connections – banks refusing loans to someone because their friends have missed payments is a good example.

  12. Very interesting exchange, I should have noticed it earlier. Thanks to Mathew (BTW, BlogActiv CO-founder, if I may 🙂 )

    The worrying trends you describe may require either some regulation (not too much) or some European industrial policy, triggering some ‘privacy and independence-driven competitors’ (idem). And, even more importantly, consumer or citizen education.

    I also like this comment by BrusselsBlogger: ‘In my view, 100% personalisation will always fail. You will need an algorithm (or better, a human being) to add some serendipity.’

    These humans can be bloggers but, mostly, we are talking of… journalists, working for bona fide media!

    One point that wasn’t addressed above – the post is already long – is language diversity. While having continentals read English is good, opening their minds, to some degree it contributes to groupthink as well.

    Do continue this debate!

    Christophe Leclercq

  13. Hi Christophe, thanks for dropping by, and while I’m at it thanks for introducing me to the concept of groupthink when I was Blogactiv Launch Director (I never claimed to be the founder, just the guy in charge of the site launch. It’s your baby).

    I certainly agree with the idea of some regulation, in theory – but regulating the global Internet from a national or EU level is pretty difficult in practice, particularly given its light-speed technical development.

    Yes for consumer/citizen education (it’s partly why I posted about this), but I’m not sure about industrial policy ‘triggering competitors’ – what do you have in mind?

    Brusselsblogger is right to say that personalisation will not succeed when compared to human curation, but as I pointed out, “there is absolutely no reason you will be given a choice [or the chance to make the comparison]. Personalisation offers sites better advertising returns. If a site adopts personalisation, you can’t opt out – all you can do is stop visiting that site. IF you realise that’s what it’s doing.”

    As for the role of media, my post points out that personalisation is replacing journalists and editors as gatekeepers, as the media is forced by Internet-driven competitive pressure to adopt personalisation as a business model. Moreover, journalists may not be able to escape the Filter Bubble themselves, and thus reinforce it.

    As for multilingualism, that’s a book in itself. As a general rule, my thesis is that the Filter Bubble will reinforce existing barriers in EU debates. Personalisation, for example, will probably result in monolingual Filter Bubbles, once it’s been rolled out across the web – i.e., even if you speak 6 languages, you’ll probably only be presented with content in your ‘most commonly used’ language. Not helpful to the development of an EU online public space.
