Today, we couldn’t publish an article that we found fascinating because we could not gather sufficient facts. It’s not easy to accept that what one believes isn’t true, and rarely does one seek out facts that disprove one’s beliefs. In fact, people most commonly seek out information that supports their beliefs, a disposition called “confirmation bias.”
Confirmation bias (also called confirmatory bias or myside bias) is a tendency of people to favor information that confirms their beliefs or hypotheses.
“Confirmation Bias” 
Confirmation bias is all too common. It is particularly prevalent in emotionally charged subjects, as people selectively interpret incoming information, giving undue weight to evidence that resonates with their preexisting beliefs. Along with seeking out corroborating evidence, confirmation bias also manifests in people’s predisposition to overlook or dismiss conflicting evidence.
If you want to assert a truth, first make sure it’s not just an opinion that you desperately want to be true.
There are three major ways that confirmation bias manifests itself: search, interpretation, and memory. When exploring a subject, people commonly focus on avenues of inquiry that confirm their beliefs; they approach a question with the presumption that they are right rather than challenging it. People can also be primed for confirmation bias by the wording of questions, which sends them searching for particular evidence (are they happy or unhappy, is a certain person a good parent or a bad parent) rather than considering the issue as a whole (“how do you feel?”). Even when presented with the same information, people subject conflicting information to higher scrutiny and accept corroborating evidence more readily. Some research indicates that this may be a reaction to relieve cognitive dissonance, the discomfort experienced when holding two conflicting beliefs. Even if one can dispassionately process new information, quite often it is subject to “selective recall,” in which people remember the information that reinforces their beliefs. In one such study, believers in extrasensory perception (ESP) had particularly poor recall of evidence that disproved ESP.
So it would be natural for him to enter the “research” process with a working hypothesis that he’s emotionally attached to. Hence a form of confirmation bias that confronts all journalists, and I don’t think any of us can honestly claim we always surmount it. Certainly I can’t.
Though it may be hard to change one’s mind, a good starting point is seeking out opposing viewpoints to see how one’s own hold up. Scientists have structured processes that expose research not only to internal rigor but also to external critique. Businessmen like Warren Buffett have encouraged and sought out criticism, and companies may solicit outside professionals to review their practices.
Their skepticism is what helps us get to our goal faster. And this is true of every field of science. Every criticism hurts like hell, but after the bruises have healed, we find that our results are more accurate.
Physicist and Science Writer
“Confirmation Bias.” Wikipedia. Wikimedia Foundation, 28 Sept. 2013. Web. 01 Oct. 2013.
Dooley, Roger. “How Warren Buffett Avoids Getting Trapped by Confirmation Bias.” Forbes. Forbes Magazine, 07 May 2013. Web. 01 Oct. 2013.
Lee, Chris. “Confirmation Bias in Science: How to Avoid It.” Ars Technica. N.p., 13 July 2010. Web. 01 Oct. 2013.
Lord, Charles G., Lee Ross, and Mark R. Lepper. “Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence.” Journal of Personality and Social Psychology 37.11 (1979): 2098–2109.
Russell, Dan, and Warren H. Jones. “When Superstition Fails: Reactions to Disconfirmation of Paranormal Beliefs.” Personality and Social Psychology Bulletin 6.1 (1980): 83–88.
Shafir, E. “Choosing versus Rejecting: Why Some Options Are Both Better and Worse than Others.” Memory and Cognition 21.4 (1993): 546–556. PMID 8350746, via Fine 2006, pp. 63–65.
Westen, Drew, Pavel S. Blagov, Keith Harenski, Clint Kilts, and Stephan Hamann. “Neural Bases of Motivated Reasoning: An fMRI Study of Emotional Constraints on Partisan Political Judgment in the 2004 U.S. Presidential Election.” Journal of Cognitive Neuroscience 18.11 (2006): 1947–1958.
Wright, Robert. “How ‘Confirmation Bias’ Can Lead to War.” The Atlantic. N.p., 25 July 2012. Web. 01 Oct. 2013.