Friday, July 23, 2004
Don Norman interview with TechWorld: And he is continuing to look into the computer's future. The world wants compatibility now, he says. It wants to communicate, and this means one brand dominating. "This is not the 'computer age' any more; this is the age of very smart chips hooked into a huge worldwide network. Infrastructure is about sharing." This means that having two or more incompatible ways of doing things is counterproductive. [via MacNN]

I hope for the sake of Don Norman's reputation that he is being misquoted. All the evidence from the break-up of the Bell System shows that communication can actually be improved by breaking a monopoly's stranglehold. The old telephone monopolies also protested that allowing third-party devices and other networks to connect to theirs would hurt quality, but we don't recall any tales of woe from when you could finally buy a phone anywhere and plug it into Ma Bell's circuits. The development of the Internet likewise shows that a monopoly is not required for effective communication. In both cases, what was needed was simply the definition of standard protocols. These are not, and don't need to be, perfect, but they are good enough for the enormous explosion of communication we are witnessing. For example, the mobile phone industry did not need a monopoly to come up with SMS, which by all measures is an extraordinary success, just as the Internet did not need a monopoly to create the Web. In turn, the Web provides a fertile environment for other non-monopolistic services like search. By construction, search cannot be monopolistic: anybody with the required resources can crawl the Web and put up a search engine, without requiring much cooperation from the indexed resources.
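The point about standard protocols can be made concrete with HTTP, the Web's own protocol: two parties that have never shared code, let alone a vendor, can interoperate from the written specification alone. A minimal sketch in Python, pairing a stdlib server with a client that writes the protocol bytes by hand:

```python
# Two independent parties speaking HTTP: a stdlib server and a
# hand-rolled client. Neither side shares code with the other --
# only the published protocol.
import socket
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Hello)  # port 0: pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# A client written from the spec alone: plain text over a socket.
with socket.create_connection(("127.0.0.1", server.server_port)) as s:
    s.sendall(b"GET / HTTP/1.1\r\nHost: localhost\r\nConnection: close\r\n\r\n")
    reply = b""
    while chunk := s.recv(4096):
        reply += chunk

print(reply.split(b"\r\n")[0])         # status line
print(reply.split(b"\r\n\r\n", 1)[1])  # body, independent of the headers
server.shutdown()
```

No central supplier appears anywhere: the client interoperates with any conforming server, which is exactly why anyone with a socket can join the network.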
If the interview is accurate, Don Norman has fallen into the trap of believing that valuable services need deep cooperation between many parties, which can only be achieved by having a single supplier for the software deployed by all of them. I seem to remember that the Soviet Union had the same command-and-control view of its economy. The reality is that deep semantic agreement between many parties is impossible to achieve, even if they have the same software supplier, because the assumptions of the parties, which go into how they configure their software, depend on many unstated judgments of value and likelihood. What has worked, on the wired phone system, on wireless, and on the Internet, are relatively lightweight protocols with few semantic assumptions, which can be implemented by many parties independently. This came about not because of amazing foresight by all involved, but because it is good engineering and good organization to minimize interdependencies between components and participants, and most projects that fail to do so end up floundering.
One interesting side question is why smart people like Don Norman would believe, against all historical evidence, that particular large enterprises (like Microsoft) are exempt from the problems of over-design that have selected against top-heavy systems. Even Windows and Office, at their best, succeeded because they made few semantic assumptions. An operating system doesn't care what files and packets mean, except for a very limited set of protocol distinctions (is this an executable file? is that an ACK packet?), nor does a word processor need to distinguish between a novel and a newspaper article. To the extent that software tries to go deeper into the intent of its users, it fails miserably, and that's why the paper clip was so ridiculed.
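The shallow-distinction principle above can be sketched as a dispatcher that branches only on a small header field and never interprets the payload. The frame format here is invented purely for illustration:

```python
# A toy frame: one type byte, then an opaque payload. The handler
# branches only on the shallow protocol distinction (data vs. ack),
# never on what the payload "means" -- that is the endpoints' business.
DATA, ACK = 0x01, 0x02

def handle(frame: bytes) -> str:
    kind, payload = frame[0], frame[1:]
    if kind == ACK:
        return "ack"                                  # control traffic, no payload
    if kind == DATA:
        return f"stored {len(payload)} opaque bytes"  # payload never parsed
    return "dropped unknown frame"                    # robust by ignoring, not understanding

print(handle(bytes([DATA]) + b"novel or newspaper, who knows"))
print(handle(bytes([ACK])))
```

Because the handler knows nothing about payload semantics, any party can put anything inside the frames without coordinating with the infrastructure, which is the same reason the word processor need not know whether it holds a novel or a newspaper article.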
To paraphrase George Washington, beware of semantic entanglements.