May 16, 2008

The ethics of expert mumbling

I started this entry a few days ago, when I wrote, "I probably should be doing something else," and obviously I did. But now that I'm back for a few minutes, I want to think aloud (or in text) about policy hubris. I've been batting a book idea back and forth recently, based on the encouragement of a series editor, to explore how a school system has responded to growing demographic diversity over several decades. Like many school systems I have encountered or studied, its key officials have been and are fairly proud of the work the system has done. But while that pride was justified in some circumstances, pride also became a substantial blind spot, allowing officials to ignore problems that festered and to lash out at critics. Pride became institutional hubris.

When we talk about hubris, it's usually in a personal context. Some weeks or months or years ago, I listened to a key legislator or legislative aide talking about education policy. She was sharp: smart, knowledgeable, and quick-witted. How she used that tremendous skill set bugged me; while listening to her respond to questions from the audience, I thought,

She's immersed herself in the reports and materials available at the 40,000-foot level. She knows all of the arguments, and she knows the counter-arguments to push back at others in the conversation. She's cocksure and unaware of where she might be terribly wrong. And she's alienating almost everyone in the room.

I'm obscuring her identity to protect the guilty, but the hubris I witnessed in the room is a personal version of the institutional hubris in a large school system. No one should be allowed to be that cocksure without an occasional whack upside the ego, for the good of the individual (or school system) as well as the world. There's a point to all of the Shakespearean tragedies: hubris hides your flaws, including flawed reasoning.

Some weeks ago, I heard an interview with a scientist who explained his professional epistemology of openness to evidence. He took some of his reasoning from Karl Popper, the giant "science is falsifiable" philosopher of the twentieth century. But to this scientist, falsifiability was not just a stance about testing hypotheses. It was a matter of ethics: you have to be willing to be wrong to be a good scientist.

I think that's true of most disciplines. If you don't get some sense of wonder when a small fact turns over a preconception, you shouldn't call yourself a researcher. If you only go out to prove a prejudice, you're not a researcher. If you ignore evidence that undermines your claims, leave the field.

Unless, of course, you're one of the exceptions whose life story is going to leave me wondering if I should have been so definite in that last paragraph. There's always that possibility...

Posted in Research on May 16, 2008 5:03 PM |