Imagine you are trying to grow tomatoes. You believe that you'll get the juiciest, most succulent tomatoes if you prep your soil with ample compost and then water your bushes every evening, right before sunset, for 50 minutes. You've done this every year, and your tomatoes seem to be fine.
Then imagine that someone else tells you that it's actually optimal to give your tomatoes just 2 gallons of water per week, and that your relative overwatering is actually contributing to root rot. This person gives you miles and miles of studies from many different reputable sources showing that the optimal amount is, in fact, exactly 2 gallons per week, and that your current practices are not only wasting water but lowering the quality of your tomatoes.
Faced with the evidence, you'd amend your previous watering behavior, right?
Well, it turns out that most people, when faced with facts that contradict a belief they hold, may actually cling to their initial, incorrect belief even more strongly than ever. There's an interesting article in The Boston Globe right now where Joe Keohane writes that when people are confronted with contradictory facts, instead of changing their minds, they often become even more entrenched in their original beliefs. A lot of it seems to be about defensiveness: it's hard to admit that you were wrong; it's easier to argue ever more forcefully that the facts you are being presented with--not your own beliefs--are flawed.
It has some interesting ramifications, though. Of course I'd like to believe that people will ultimately be swayed by the truth and make their decisions based on it (with a little t, I suppose :). I'd also like to believe that if I hold incorrect beliefs, I would be able to recognize them as such when confronted with reputable contrary evidence. But honestly, with the glut of information available to us right now, it's often hard to tell which "facts" are real and which are simply fabrications. And beyond that, we also tend to be hugely cynical about facts (perhaps wisely), since depending on how they're presented, facts often seem to imply very different things--an ambiguity that makes it ever easier to dismiss the facts that don't align with what you already believe. But if we're all just working from our own beliefs, how can we ever make informed decisions that are based on what's actually going on in the world?
The interesting thing is that people with a good self-concept, or people who feel good about themselves (I know, a very amorphous concept), are more likely to change their minds when facts seem to contradict their beliefs. I like the implication: if you want to change someone's mind, the best way to do it is not to belittle their beliefs or attack them, but to approach them from a place where they don't feel threatened, where they don't immediately jump to the defensive.
I guess for my own life, I want to be vigilant about knowing why I believe what I do, and I want to make sure that I'll change my mind if what I believe is wrong. What's working against me, though, is the fact that I'm stubborn, and that when it comes down to it, I usually think I'm right. Heh. Okay, that's more glib than I mean, but I do have a pretty high regard for my own thoughts, and--especially if I'm being directly challenged--it usually takes more convincing than it probably should to get me to admit that I'm wrong. But I don't want to be the person who keeps watering her tomatoes like a superstitious maniac even as everything around her tells her that she's doing the wrong thing for the outcome she wants. Keep me honest, guys, and each other too.