Your opinions are the result of years of paying attention to information which confirmed what you believed while ignoring information which challenged your preconceived notions.
Half a century of research has placed confirmation bias among the most dependable of mental stumbling blocks. Journalists looking to tell a certain story must avoid the tendency to ignore evidence to the contrary; scientists looking to prove a hypothesis must avoid designing experiments with little wiggle room for alternate outcomes.
Without confirmation bias, conspiracy theories would fall apart. Did we really put a man on the moon? If you are looking for proof we didn’t, you can find it.
Another study at Ohio State in 2009 showed subjects clips of the parody show "The Colbert Report," and people who considered themselves politically conservative consistently reported "Colbert only pretends to be joking and genuinely meant what he said."

Of course, I am only showing the parts of the article that confirm my own biases.
In science, you move closer to the truth by seeking evidence to the contrary. Perhaps the same method should inform your opinions as well.

I don't watch Fox News, but then again, I don't watch any other news either (the Daily Show does not count as news). In my normal life, I don't seek out differing opinions, although I would like to think I am open to new ideas if they are presented to me. In my blogging life, I probably spend way too much time at the same set of sites to get my info (Slate, the NYT, FiveThirtyEight). In my professorial life, I do seek out the stuff that disagrees with me — OK, only if it is well executed (I don't read Robert Kaplan or Sam Huntington unless I am compelled to do so, usually by guys in uniform). The reviewers will hammer me if I ignore the opposing arguments, so my confirmation tendencies are overwhelmed by a system that ensures a broader reading.
Perhaps we all need anonymous reviewers to push us to think outside of our boxes?