Nearly every high school choral organization routinely performs anthems based upon some version of a familiar trope. The piece is designed to sound musically current, even though it invariably lags more than a bit behind the times (when I was in school, each had an obligatory “hard rock” section). Meanwhile, the lyrics are an earnest and perhaps cloying ode to the ability of the young to create a better and brighter tomorrow. One such title from my school days was in fact “Hope for the Future.”
Unfortunately, the promise always seems better than the execution.
Despite the enormous (and most often negative) impact that our behavioral and cognitive biases have on our thinking and decision-making, the prevailing view is that we can’t do very much about them. In his famous 1974 Caltech commencement address, the great physicist Richard Feynman emphasized the importance of getting the real scoop about things, but lamented how hard it can be to accomplish. “The first principle is that you must not fool yourself – and you are the easiest person to fool.” Even Daniel Kahneman, Nobel laureate, the world’s leading authority on this subject and probably the world’s greatest psychologist, has concluded that we can’t do much to help ourselves in this regard.
But today — maybe — there might just be a tiny glimmer of hope (for the future).
As reported recently in MIT’s Technology Review, the concept of a “filter bubble” entered the public square back in 2011 when the political and internet activist Eli Pariser coined it to refer to the way recommendation engines shield people from certain aspects of the real world. In a practice that is surprising to many, internet search results are often tailored to conform to the interests and prejudices of users based upon prior habits and preferences. While it is understandable (to a point) that search engine designers might want to give their users what they want, this insidious practice merely confirms and accentuates our own tendencies. We all suffer from a more than healthy dose of confirmation bias, our tendency to notice and accept that which fits within our preconceived notions and beliefs while ignoring and rejecting everything else.
But new research by Eduardo Graells-Garrido (Universitat Pompeu Fabra in Barcelona), Mounia Lalmas and Daniele Quercia (both of Yahoo Labs) suggests that there may be ways to counteract the filter bubble. These researchers built a recommendation engine that steers people on social media toward others who hold opposing views on sensitive subjects but share their other interests, exposing them to many more viewpoints than they would encounter otherwise. Because this approach is grounded in their own interests, users even end up being satisfied with the results.
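The paper itself doesn't spell out its ranking method here, but the basic idea, pairing a large "view gap" on one sensitive issue with overlap on everything else, can be illustrated with a toy sketch. Everything below (the profile fields, thresholds, and names) is hypothetical, not the researchers' actual algorithm:

```python
# Toy illustration (not the authors' actual system): suggest accounts that
# disagree on a sensitive topic but share the user's other interests.

def jaccard(a, b):
    """Overlap between two interest sets, from 0.0 (none) to 1.0 (identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(user, candidates, min_shared=0.3, min_gap=1.0):
    """Return candidates with a large stance gap but enough common ground.

    Each profile is a dict with:
      'stance'    - position on the sensitive issue, scaled to [-1, 1]
      'interests' - set of unrelated topics the account posts about
    """
    picks = []
    for name, prof in candidates.items():
        gap = abs(user['stance'] - prof['stance'])        # the "view gap"
        shared = jaccard(user['interests'], prof['interests'])
        if gap >= min_gap and shared >= min_shared:
            picks.append((name, gap, shared))
    # Rank by shared interests so suggestions still feel relevant to the user.
    return sorted(picks, key=lambda t: -t[2])

me = {'stance': -0.8, 'interests': {'astronomy', 'jazz', 'cycling'}}
others = {
    'ana':  {'stance': 0.9,  'interests': {'jazz', 'cycling', 'cooking'}},
    'ben':  {'stance': -0.7, 'interests': {'jazz', 'cycling'}},   # same side
    'cruz': {'stance': 0.8,  'interests': {'chess'}},             # no overlap
}
print(recommend(me, others))  # only 'ana' clears both thresholds
```

The ranking choice mirrors the authors' stated logic: disagreement gets someone onto the list, but shared interests decide how prominently they appear, which is what keeps the nudge palatable.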
“We nudge users to read content from people who may have opposite views, or high view gaps, in those issues, while still being relevant according to their preferences,” say the authors. The results suggest that people may be more open to ideas that oppose their own than is generally thought. “We conclude that an indirect approach to connecting people with opposing views has great potential.” There is no information on whether or how anybody might gain access to such a recommendation engine.
If we are to overcome our inherent biases, we must start by being exposed to different data sets and viewpoints, whether by our own choices or via a recommendation engine, consistent with the benefits of what Kahneman calls “adversarial collaboration.” This research offers nothing like a fix, much less a quick fix, for the problems we create by being so willing and able to fool ourselves. But it does offer at least a little bit of hope for the future.