The Willingness to Accept Challenges to Your Beliefs
Charges of "flip-flopping" can be pretty meaningless (the '04 election did nothing to help this). I'm not a Keynesian, but one of my favorite quotes comes from Keynes himself:
When the facts change, I change my mind. What do you do?
While debating someone whose worldview differs from your own can often be frustrating, it has great potential for learning on both sides, provided neither party suffers from excessive confirmation bias. I fear this in myself, and I try as best I can to keep an open mind in situations where the data seem to indicate a failure of libertarian thought (or wherever I happen to stand). Tim Lee channels Jane Jacobs and makes some interesting points about worldviews and people's ideological blinders.
Somewhat relatedly, here is an interesting case of Federal Air Marshals proving that if you look hard enough for something, you're going to find it. Especially if your job depends on it.
3 Comments:
Two comments:
First, re: the Jacobs interview: one of my more frightening realizations from my brief and humanities-friendly foray into cognitive learning patterns (brief, but one in which I saw a lot of hard and damning evidence about the way we teach kids) is that these worldviews, or, better put, the systems by which we understand the world, form far, far earlier than we like to think. From a young age we don't process information in an absolute sense; we filter it to make it make sense to us. This can lead to radical misconceptions throughout life (one of the classic examples being that a FRIGHTENINGLY high percentage of Harvard physics majors can't answer a basic question about gravity on their graduation day), and the evidence tells us this warping of information to fit our own preconceptions starts very young. So the comment Jacobs made might be a tad optimistic, I think. We're forming our worldviews (admittedly on a much more literal level) long before high school.
Second, I liked this comment a lot from Tim Lee's page:
"The key, I think, is to detach one's sense of moral or self-worth from one's worldview. Because you are wrong about, say, evolution or the effects of the minimum wage doesn't mean you're an evil so-and-so. And being right about those things won't make you a good person either. They just don't have much to do with each other."
Indeed that is a good comment.
As I have pointed out here somewhere before, studies of people's brain activity while they watch political debates have found that the reasoning portion of the brain is essentially inactive, while the emotional centers are highly active. Which is freaky, in a way, but it could be a reasonable approach to dealing with the complexities of the modern world: we can't possibly know enough to comprehend most issues, so we outsource our "thinking" to someone we have decided we trust.
But still...
Right, but what's scary is how literal this can be for kids, and, really, for all of us.
An example (the one we saw was on photosynthesis with 5th graders, but this is a parallel example): you ask a 10-year-old kid what he thinks Alaska is like, and he'll probably say "it's all covered in snow, and everybody lives in an igloo." Given the stereotypes to which kids are exposed, this is a reasonable preconception (i.e. he's not saying it's like the Sahara).
So, as a teacher, you know the preconceptions, and you try to teach this kid how much MORE there is to Alaska. You spend three weeks doing a unit on Alaska, talking about different flora and fauna during different seasons, all the different industries, its history and settlement patterns, its diversity, etc. Great. The kid aces the test, because he studied hard, and he got it, and he knows a lot about Alaska now.
Except, studies show, three months later, at best, he'll think that it's all snow except for three weeks in July, when the snow melts a bit and people come outside, plus a few of the Alaska natives have decided to live in cabins now, but only a few, and most of them still use sleds a lot. (In the photosynthesis example, a kid who got the highest grade in the class on the unit had reverted to his preconception that trees grew because of food (energy) coming through the roots).
It seems that, at best, we alter preconceptions; we can't process information that doesn't fit our internal cognitive models, so we can't divorce our understanding from them entirely.
If this is happening re: trees or Alaska, imagine trying to engage in conversation over a complex debate like evolution v. creationism. Perhaps that's why the conversation rarely seems to get too far.