Saturday, April 10, 2004
In Language Log a few days ago, Mark Liberman recommended Peter Culicover's review of Huddleston & Pullum's Cambridge Grammar of the English Language (CGEL). [Language Log] I've just read the review, and I agree. Reviews this well-written, this civilized, this informative, and this thought-provoking are rare. After discussing the extraordinary level of accurate descriptive detail, and CGEL's radical departure from modern syntactic theory in rejecting abstract entities unsupported by direct evidence, Culicover asks:

"Although CGEL appears to be a descriptive grammar of the traditional sort, there are vast differences in detail, and in this case the devil is in the details. The problem for the theorist is to explain how in the world someone could have all of this stuff in his or her head. This is, to my mind at least, a vastly interesting question, and it becomes more interesting, the more detail there is that has to be gotten into the head. For it is clear that most of this knowledge cannot, I repeat, cannot be a priori. That is, it cannot arise from a setting of parameters, where the learner, given some exemplars from the language, settles quickly on a generalization that accounts for all, or virtually all, subsequent examples from the language."

Culicover goes on to give a nice example of the kind of detail that needs to be learned just to be able to use appropriate adjectives with verbs such as fall and turn (fall asleep is acceptable, but not turn asleep; turn sad is acceptable, but not fall sad; and many other such facts). Using a thought experiment, he eventually concludes that

"the grammar does not correspond in any principled way to the architecture of the Language Faculty. Rather, it is a description of the way in which the Language Faculty behaves [...] It is an open question of whether the explanations of the observed phenomena are to be found in the architecture of Language Faculty, in the organization of Conceptual Structure, in an account of the constraints on real time processing of natural language [...] in the cognitive and computational constraints on language learners, in the dynamics of information structure in discourse, or in the social and cognitive dynamics of language interaction and change."

After this, what remains of the idea of a well-defined "Language Faculty"? 7:10:01 PM
Cringely on Microsoft's hegemony:

"[...] Microsoft is going to dominate computing for the next decade or two. The question is whether the company will sink its revenue-sucking needle into everything that contains software, including embedded devices and all information that moves." [Dan Gillmor's eJournal]

FUD from the Microparanoia camp. Microsoft has little involvement in most of the computations that find, route, and store bits (search engines, routers, cell phones, storage farms), little involvement in most embedded computation, and little involvement in most high-performance computation in science, engineering, and finance. The up-and-coming hubs of information technology in Asia are quite unlikely to let themselves be rolled over by Microsoft. Re-fighting the last war, Microsoft-obsessed writers show a serious lack of understanding of what computing is becoming, and a lack of imagination about what it might become. Microsoft is much more serious about figuring out that future and influencing it than most of its opponents are, which helps explain its staying power.

Sun's story was totally predictable: they just repeated in the 80s and 90s the mistakes DEC made in the 70s and 80s. In particular, they never cared to understand software and foster a rich software ecosystem. Useful advances like Java were undermined by arrogance, indecision, and lack of focus. Microsoft didn't win the war; Sun and others lost it. Monopolistic practices grew unchecked because the competition was so clueless. 3:40:19 PM