Yesterday, I told you about a new concept applied to an old technology: flexible television screens that you'll be able to unroll wherever you want. For details, read "Roll up for the floppy television."
Today, we'll look at an old concept applied to a new technology, a far geekier subject: computers with no clocks.
Before discarding this web page as a headache generator, please remember that you have the whole weekend ahead of you to read about the concept and assimilate it.
The two authors, Ivan E. Sutherland and Jo Ebergen, work for Sun Microsystems, and their research group has already produced interesting results: the UltraSPARC IIIi processor recently introduced by Sun includes asynchronous circuits developed by the group.
Let them explain the concept of asynchronous computers.
How fast is your personal computer?
When people ask this question, they are typically referring to the frequency of a minuscule clock inside the computer, a crystal oscillator that sets the basic rhythm used throughout the machine. In a computer with a speed of one gigahertz, for example, the crystal "ticks" a billion times a second. Every action of the computer takes place in tiny steps, each a billionth of a second long. A simple transfer of data may take only one step; complex calculations may take many steps. All operations, however, must begin and end according to the clock's timing signals.
Because most modern computers use a single rhythm, we call them synchronous. Inside the computer's microprocessor chip, a clock distribution system delivers the timing signals from the crystal oscillator to the various circuits, just as sound in air delivers the beat of a drum to soldiers to set their marching pace. Because all parts of the chip share the same rhythm, the output of any circuit from one step can serve as the input to any other circuit for the next step. The synchronization provided by the clock helps chip designers plan sequences of actions for the computer.
The use of a central clock also creates problems. As speeds have increased, distributing the timing signals has become more and more difficult. Present-day transistors can process data so quickly that they can accomplish several steps in the time that it takes a wire to carry a signal from one side of the chip to the other. Keeping the rhythm identical in all parts of a large chip requires careful design and a great deal of electrical power. Wouldn't it be nice to have an alternative?
Our research group at Sun Microsystems Laboratories seeks such alternatives. Along with several other groups worldwide, we are investigating ways to design computing systems in which each part can proceed at its own pace instead of depending on the rhythm of a central clock. We call such systems asynchronous. Each part of an asynchronous system may extend or shorten the timing of its steps when necessary, much as a hiker takes long or short steps when walking across rough terrain.
So what are the potential benefits of asynchronous systems? They include faster speeds, lower power consumption and less radio interference.
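
To put rough numbers on the wire-delay problem the authors mention (the figures below are my own ballpark assumptions, not from the article), here is a small back-of-the-envelope calculation in Python:

# Back-of-the-envelope comparison of a clock period with the time a signal
# needs to cross a large chip. All numbers are illustrative assumptions.
clock_hz = 1e9                      # a 1 GHz clock, as in the example above
clock_period_ns = 1e9 / clock_hz    # one tick lasts 1 nanosecond

# Long on-chip wires are limited by their resistance and capacitance, so a
# signal can need a few nanoseconds to cross a large die (assumed value).
cross_chip_delay_ns = 2.5

ticks_elapsed = cross_chip_delay_ns / clock_period_ns
print(f"clock period: {clock_period_ns} ns")
print(f"assumed cross-chip wire delay: {cross_chip_delay_ns} ns")
print(f"clock ticks that pass while the signal crosses the chip: {ticks_elapsed:.1f}")

In other words, under these assumptions the clock ticks two or three times before a signal launched on one side of the chip reaches the other, which is exactly the timing headache the authors describe.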
Are you intrigued? Convinced? Do you want to learn more about the local coordination circuits designed to ensure orderly flows of data? Do you want to explore the two most important of these circuits, the Rendezvous and the Arbiter?
If your answer is yes, read the full article. If your answer is no, you can go to the beach and unroll your TV screen.
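
For those who answered yes but want a taste right away, here is a very loose software analogy of those two circuits (my own sketch in Python, not the authors' hardware design): a Rendezvous lets a computation proceed only once both of its inputs have arrived, and an Arbiter grants a shared resource to exactly one of two competing requests at a time.

import threading
import time

rendezvous = threading.Barrier(2)   # Rendezvous: wait until both participants arrive
arbiter = threading.Lock()          # Arbiter: let only one contender through at a time

def stage(name, delay_s):
    # Each stage runs at its own pace, with no global clock setting the rhythm.
    time.sleep(delay_s)             # simulate a data-dependent processing delay
    rendezvous.wait()               # block here until the partner stage is ready too
    with arbiter:                   # whichever stage gets here first wins the arbitration
        print(f"{name} passed the rendezvous and was granted the shared resource")

threads = [threading.Thread(target=stage, args=("stage A", 0.1)),
           threading.Thread(target=stage, args=("stage B", 0.3))]
for t in threads:
    t.start()
for t in threads:
    t.join()

In real asynchronous hardware these are tiny circuits handshaking in nanoseconds rather than operating-system threads, but the behavior is the same: nothing proceeds on a fixed beat; each part waits only for the data it actually needs.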
Source: Ivan E. Sutherland and Jo Ebergen, for Scientific American, July 15, 2002