I try to read opposing views often, as that pretty much fits my job description for running WUWT, but not everyone does this. Some people are so steeped in tribalism that they won't even venture outside of their comfort zone to see what the other side is saying, and when offered information by "outsiders", they flatly refuse to consider it, or even become combative towards anyone who suggests it. They tend to prefer being surrounded only by people they like and content they agree with, and consider giving attention to any other views "false balance". Joe Romm and his Climate Progress blog are a good example of this, which is why he has so few comments these days. WUWT often posts press releases generated by the opposite side of the debate verbatim, so that we can consider their merit, and I also post articles where I disagree with some of the content, but we have our own problems like any collection of like-minded people. On the plus side, love it or hate it, WUWT is read almost equally by both sides of the climate debate; if it weren't, it would not have so much blog spawn.
From MIT Technology Review, h/t to Steven Mosher
How to Burst the “Filter Bubble” that Protects Us from Opposing Views
Computer scientists have discovered a way to number-crunch an individual’s own preferences to recommend content from others with opposing views. The goal? To burst the “filter bubble” that surrounds us with people we like and content that we agree with.
The term "filter bubble" entered the public domain back in 2011 when the internet activist Eli Pariser coined it to refer to the way recommendation engines shield people from certain aspects of the real world.

Pariser used the example of two people who googled the term "BP". One received links to investment news about BP while the other received links to the Deepwater Horizon oil spill, presumably as a result of some recommendation algorithm.

This is an insidious problem. Much social research shows that people prefer to receive information that they agree with instead of information that challenges their beliefs. This problem is compounded when social networks recommend content based on what users already like and on what people similar to them also like.
This is the filter bubble—being surrounded only by people you like and content that you agree with.
And the danger is that it can polarize populations, creating potentially harmful divisions in society.
Read the entire article here: http://www.technologyreview.com/view/522111/how-to-burst-the-filter-bubble-that-protects-us-from-opposing-views/
Ref: arxiv.org/abs/1311.4658: Data Portraits: Connecting People of Opposing Views
Social networks allow people to connect with each other and have conversations on a wide variety of topics. However, users tend to connect with like-minded people and read agreeable information, a behavior that leads to group polarization. Motivated by this scenario, we study how to take advantage of partial homophily to suggest agreeable content to users authored by people with opposite views on sensitive issues. We introduce a paradigm to present a data portrait of users, in which their characterizing topics are visualized and their corresponding tweets are displayed using an organic design. Among their tweets we inject recommended tweets from other people considering their views on sensitive issues in addition to topical relevance, indirectly motivating connections between dissimilar people. To evaluate our approach, we present a case study on Twitter about a sensitive topic in Chile, where we estimate user stances for regular people and find intermediary topics. We then evaluated our design in a user study. We found that recommending topically relevant content from authors with opposite views in a baseline interface had a negative emotional effect. We saw that our organic visualization design reverts that effect. We also observed significant individual differences linked to evaluation of recommendations. Our results suggest that organic visualization may revert the negative effects of providing potentially sensitive content.
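The abstract above describes the core mechanism: estimate each user's stance on a sensitive issue, measure topical relevance, and then inject content that matches the user's topics but comes from authors on the opposite side. As a rough illustration of that recommendation step, here is a minimal Python sketch; the stance scale, the scoring scheme, and all names and data in it are my own assumptions for illustration, not the authors' code.

```python
# Illustrative sketch (not the paper's implementation) of recommending
# topically relevant items authored by people with an opposing stance.
# Assumptions: stance scores lie in [-1, 1] on the sensitive issue, and
# topical interest is a simple topic -> weight mapping.

from dataclasses import dataclass

@dataclass
class Item:
    author: str
    stance: float   # -1.0 .. 1.0 position on the sensitive issue
    topics: dict    # topic -> weight, e.g. {"education": 0.8}
    text: str

def topical_relevance(user_topics: dict, item: Item) -> float:
    """Dot product of the user's topic weights and the item's topic weights."""
    return sum(w * item.topics.get(t, 0.0) for t, w in user_topics.items())

def recommend_opposing(user_stance: float, user_topics: dict,
                       items: list, k: int = 3) -> list:
    """Keep only items whose author sits on the opposite side of the issue,
    then rank them by topical relevance to the user."""
    opposing = [it for it in items if it.stance * user_stance < 0]
    return sorted(opposing,
                  key=lambda it: topical_relevance(user_topics, it),
                  reverse=True)[:k]

# Example: a user against the issue who mostly tweets about education.
user_stance = -0.8
user_topics = {"education": 0.9, "economy": 0.4}

pool = [
    Item("alice",  0.7, {"education": 0.8}, "Pro-issue take on schools"),
    Item("bob",    0.9, {"sports": 0.9},    "Pro-issue take on football"),
    Item("carol", -0.6, {"education": 0.9}, "Anti-issue take on schools"),
]

for item in recommend_opposing(user_stance, user_topics, pool):
    print(item.author, "->", item.text)
# "alice" ranks first: opposite stance AND a shared topic (education).
# "bob" qualifies on stance but has near-zero topical relevance;
# "carol" is filtered out because she shares the user's stance.
```

The point the paper makes is the pairing itself: disagreement on the sensitive issue combined with agreement on topics (the "partial homophily"), so the recommended content still feels relevant to the reader rather than purely adversarial.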