Tuesday, September 09, 2003

Willful ignorance.

Hylton Jolliffe of Corante pointed me to this great post on one of Corante's weblogs that I don't frequent. Very helpful in understanding issues I encounter every day.

'Tis Folly To Be Wise

I came across an article in my files today that I thought I'd share. It's by the late Calvin Mooers, an information scientist. He addressed his colleagues on the question of why some information systems got so much more use than others - often with no correlation between the amount of use and how useful the tools actually were.

"It is my considered opinion, from long experience, that our customers will continue to be reluctant to use information systems - however well designed - so long as one feature of our present intellectual and engineering climate prevails. This feature - and its relevance is all too commonplace in many companies, laboratories, and agencies - is that for many people it is more painful and troublesome to have information than for them not to have it."

When I first read this, I experienced that quick shock of encountering something that you feel as if you'd known all along, without realizing that you knew it. Of course. It's not a new idea, but we keep having to learn it over and over. Mooers again:

"Thus not having and not using information can often lead to less trouble and pain than having and using it. Let me explain this further. In many work environments, the penalties for not being diligent in the finding and use of information are minor, if they exist at all. In fact, such lack of diligence tends often to be rewarded. The man who does not fuss with information is seen at his bench, plainly at work, getting the job done. Approval goes to projects where things are happening. One must be courageous or imprudent, or both, to point out from the literature that a current laboratory project which has had an extensive history and full backing of the management was futile from the outset."

Oh, yes. Yes, indeed. I've seen these examples made real right in front of my eyes, and more than once. Have I mentioned that Mooers wrote all this in 1959? The problem has not lessened one bit since then. If anything, our vast information resources and the powerful tools we have for digging through them have made things worse. Just try being the person who finds a patent claim that stops a project in its tracks, one that was missed while the work went on for months. Or find out that a close analog of the lead compound was shown to be toxic twenty years ago.

We're supposed to be able to find these sorts of things. But everyone assumes that because it's possible to do it, it's been done. Taken care of: "Didn't we see that paper before? I thought we'd already evaluated that patent - isn't that the one that so-and-so found? It can't be right, anyway. We wouldn't have gone this far if there were a problem like that out there, clearly."

My rule, which I learned in graduate school and have had to relearn a few times since, is to never take anything on faith when you join a new project. Go back and read the papers. Root through the primary literature. Look at the data and see if you believe it. If you let other people tell you what you should believe, then you deserve what you get when it comes down around your ears.

 
I don't think we can afford this kind of behavior any longer either as organizations or as individual knowledge workers, although there's no question we continue to reward it. Two things have changed.
 
One is that the excuse that it is too difficult or expensive to track down and check relevant information is no longer tenable. The problem has inverted: the risk today is that the potentially relevant information is so vast and so easily obtained that it threatens to overwhelm you. This can be managed with a modest investment in learning how to search.
 
The second thing that has changed is the requirement to understand what kinds of information pose the greatest risks to an initiative. You may be reluctant to go searching for the "ugly fact," but your competitors may not be so hesitant.
 
What's tricky is that you still operate in an environment of imperfect information. One of the entries in my personal collection of quotes worth thinking about comes from Samuel Butler: "Life is the art of drawing sufficient conclusions from insufficient premises." More information may be available, but you still have to make a decision, and there's always a timetable. You now have to think explicitly about what information to seek out within the limits of the time available. The old excuses are gone.
[McGee's Musings]

4:25:32 PM