What you think you think may not be what you think but what you feel

There are these discussions that take place on Facebook, political in nature, that often bring out the best and worst in people. I’m sure you’ve seen them, maybe even participated in a few; they go something along the lines of…

Me: (Referring to the recent Democratic debate on CNN) 15.3 million of us must’ve seen something worth watchin’ last night.

The discussion thread that transpired after that one statement kind of stomped on the toes of one individual. He and I engaged in a series of back-and-forths that probably didn’t mean anything to anybody but us, and yet I think they said a lot about how our nation has become terribly fractured and divided over so many issues that define who we are.

When I stated that 15 million people must’ve seen something worth watchin’, my detractor took it to mean that I was making an assumption about whether or not the 24 million who watched the GOP debate had a better venue. He said:

So that means, by your statement, that since 25 million people watched the Republican Debate, that MORE people saw something worth watching in the Republican Debate.

Which is not what I said at all. But trying to dissuade my detractor was like trying to hit only one grain of sand with a baseball bat after tossing up a handful! And then a close personal friend posted a link to an article that provided me with a new perspective on why such discussions bring out the best and worst in us.

In his article “The Science of Why We Don’t Believe Science,” Chris Mooney (Mother Jones) talks about how we get “so emotionally invested in a belief system” and then go into denial when our belief or conviction is quashed or unsubstantiated. Many of us, myself included, don’t think about it, but our “reasoning is actually suffused with emotion,” and lots of it! Before our brains even have a chance to think about it, our positive and negative feelings towards something cause us to rationalize instead of reason.

Even when we think we’re reasoning, our biased thought process gives rise to a pair of really weird phenomena:

‘Confirmation bias,’ in which we give greater heed to evidence and arguments that bolster our beliefs, and ‘disconfirmation bias,’ in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.

“The Science of Why We Don’t Believe Science” is a very intriguing read, especially if you want to glean a better understanding of why you defend the things you think are worth defending. As for my detractor, I’m going to have to find a better method of persuasion or give up trying; I think the latter would be less dangerous!
