I think this is an important place where sociology can correct a problem inherent to economics. Mainstream economics is built on the assumption that, all else being equal, people behave rationally; that is, given the opportunity, people will maximize their rewards and minimize their costs. It seems simple enough; however, it turns out that we have a lot of evidence to the contrary: people do some pretty stupid shit. If we take seriously, as sociology instructs us, the notion that people are actually irrational actors, we come to a radically different conclusion about the motivations of those who cling to theories and worldviews that are inconsistent with both logic and empirical fact. Insisting on beliefs that are wrong does not make one a bad person; it makes one human.
Consider these three interrelated sociological concepts:
The confirmation bias, simply defined, is the bias toward accepting information that confirms our worldview without critique while at the same time being overly critical of information that counters [our] preconceived notions of the world.
The fundamental attribution error is the idea that each of us as an individual is biased toward viewing our behaviors within the context of our circumstances. However, when we view the behaviors of others, we attribute their behaviors to who they are as a person or to their character.

The backfire effect:
...[T]he corrections [of misinformation] fail to reduce misperceptions for the most committed participants. Even worse, they actually strengthen misperceptions among ideological subgroups in several cases....[C]itizens engage in "motivated reasoning"....[D]irect factual contradictions can actually strengthen ideologically grounded factual beliefs (emphases from original).

In a very PoMo kind of way, facts are suddenly irrelevant. Of central and sole importance are one's worldview and experiences. I am not only predisposed to believe evidence that is consistent with my beliefs and to dismiss/downplay/ignore evidence that is inconsistent with my beliefs, but any attempt to present countervailing evidence or to correct internal inconsistencies in my beliefs is more likely to make me retrench than to persuade me. Compounding all of that, I am unlikely to recognize my own errors but quick to challenge the character of those who make errors similar to mine. In other words, "they" are dumb idiots who cherry-pick only the facts that back them up and should be ashamed for being such horrible people, while "we" are intelligent, informed, and fair and should take a timeout to pat each other on the back for being so virtuous.
Stepping back from the brink, facts do matter, and we should take them seriously. My larger point here is that the only way to be serious about considering facts is to avoid error and bias, and sociology shows us that the only way to do this is to make sure that we are as critical of ourselves as we are of others, a task that is not as easy as it might appear. If we see people as innately rational, we must attribute irrational beliefs and behaviors to some Machiavellian intent; if, on the other hand, we acknowledge that people are actually irrational, we can accept that we--all of us--can have the best of intentions and still get it horribly, horribly wrong.