Updated: 5/1/2003; 12:32:39 PM.
Synthetic Morpheme
Christopher Taylor's editorials on Science, Technology, Salsa dancing and more

Chris Winstead made the observation the other day that C++ programs he had written just a few years ago won't compile today without modifications, yet Euclid can still be read and used after 2300 years [comments]. What a timely comment that was [The Hundred-Year Language]:

> What programmers in a hundred years will be looking for, most of all, is a language where you can throw together an unbelievably inefficient version 1 of a program with the least possible effort. At least, that's how we'd describe it in present-day terms. What they'll say is that they want a language that's easy to program in.

There are many things that are going to change because of speed. On the one hand, the tools we use to construct programs should be using that available speed. Emacs should finally die, and the editors we use will become more proactive in helping construct the program. Furthermore, interfaces should become simpler, since the libraries we use should start to do more work for us. For instance, it isn't necessary to use specialized data structures that allow for efficient iteration if you have tons of cycles to spare. Instead, you can use simple lists and sets and iterate over them in an inefficient manner. The point is to save development time.

> The desire for speed is so deeply ingrained in us, with our puny computers, that it will take a conscious effort to overcome it. In language design, we should be consciously seeking out situations where we can trade efficiency for even the smallest increase in convenience.

I've been trying to incorporate these principles in my own development for years. Even now, it is very often the case that time spent optimizing is time wasted. I generally try to find solutions to problems that are easy to implement. After I have a solution in place, I go back and optimize the trouble spots.
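To make the "simple lists instead of specialized data structures" idea concrete, here is a minimal Python sketch (mine, not from the post; the record names are invented for illustration). Rather than maintaining an index for fast lookup, it keeps everything in a plain list and answers every query with a linear scan — wasteful in cycles, trivial to write and change:

```python
# Trading cycles for convenience: a plain list and O(n) scans
# instead of a specialized indexed structure.

records = [
    ("euclid", -300),   # hypothetical sample data
    ("graham", 2003),
    ("taylor", 2003),
]

def lookup(name):
    # Linear scan; inefficient, but there is no index to build or maintain.
    for key, year in records:
        if key == name:
            return year
    return None

def names_from(year):
    # Another throwaway query: just iterate again rather than
    # pre-building a reverse index.
    return [key for key, y in records if y == year]

print(lookup("graham"))    # 2003
print(names_from(2003))    # ['graham', 'taylor']
```

The design choice is exactly the trade described above: each new query is one more throwaway loop, and nothing has to stay in sync with an index when records change.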
The key is to save development time, not processor time.

I am not as confident as Paul Graham about how well we will be able to predict the 100-year language. I think the work of Stephen Wolfram has shown that the central paradigm of mathematics has been very limiting to both computer science and science in general. The current generation of computers and software is based on building systems that are predictable and reducible at every layer. The next big shift may start to exploit the potential of systems where the underlying operation is not clearly understood, only the final results. Finding "algorithms" where the underlying process is not clearly understood will require brute-force searches within the problem domain. Genetic algorithms and other possible techniques will help in the search, but vast amounts of computing power will bring it all together and make it possible. This approach will mark an important departure from the way things have been done over the past 50 years.

9:38:46 AM
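A toy sketch of what that kind of search might look like (my illustration, not from the post): a bare-bones genetic algorithm that evolves bit strings toward an all-ones target. The only thing we encode is a fitness score — the search, not a designed procedure, produces the answer, which is the sense in which the underlying "algorithm" is never explicitly understood:

```python
import random

# Bare-bones genetic algorithm: evolve 20-bit strings toward all ones.
# The target and parameters are arbitrary choices for illustration.

random.seed(0)
LENGTH, POP, GENERATIONS = 20, 30, 200

def fitness(bits):
    return sum(bits)  # count of 1s; the only domain knowledge we supply

def mutate(bits):
    i = random.randrange(LENGTH)
    return bits[:i] + [1 - bits[i]] + bits[i + 1:]  # flip one bit

def crossover(a, b):
    cut = random.randrange(1, LENGTH)
    return a[:cut] + b[cut:]  # splice two parents at a random point

population = [[random.randint(0, 1) for _ in range(LENGTH)]
              for _ in range(POP)]

for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP // 2]  # elitism: keep the fittest half
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print(fitness(best))  # approaches LENGTH given enough generations
```

Nothing here is efficient, and for a problem this trivial a direct solution is obvious; the point is the shape of the approach — burn cycles on blind variation and selection instead of deriving the answer analytically.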