Robotic speech recognition has made huge advances in recent years, allowing for easy voice interaction with robots. But robotic vision processing is still very rudimentary. Some robots "see," but they need powerful computers and are not very mobile. A team of researchers at the University of Arizona wants to change all this by mixing biology and electronics to create robotic vision. The team has designed a visual navigation system by mimicking insect vision and demonstrated the concept by building a robot named Gimli. Instead of using standard microprocessors, the team devised electronic vision circuits based on a bunch of slower analog processors working in parallel. The next step will be to develop a microchip-based vision system able to do specific tasks, such as following "a moving object like a soccer ball without getting confused by similarly shaped or colored objects." The team thinks the first such microchip might cost $30,000 to produce. But when the price goes down to $20, the market will be huge. Read more...
Charles Higgins, assistant professor of Electrical and Computer Engineering (ECE) at the University of Arizona, thinks that robots can't compete with humans, at least not today, because of their lack of vision.
However, Higgins is working to change that by giving robots sight and the ability to react to what they see.
"Right now, robots in general are just pitiful in terms of visual interaction," he said. True, a few of today's robots can see in some sense, but they aren't mobile. These vision systems are connected to large computers, which precludes their use in small, mobile robots.
Outside of these few vision-only systems, today's robots see very little. "Wouldn't it be nice to have a robot that could actually see you and interact with you visually?" Higgins asks. "You could wave at it or smile at it. You could make a face gesture. It would be wonderful to interact with robots in the same way that we interact with humans."
Higgins and other researchers at his lab are trying to reach this goal through neuromorphic engineering, which combines biology and electronics.
Higgins and his students are developing an airborne visual navigation system by creating electronic clones of insect vision processing systems in analog integrated circuits. The circuits create insect-like self-motion estimation, obstacle avoidance, target tracking and other visual behaviors on two model blimps.
These circuits don't use standard microprocessors. Instead, they rely on "parallel processing" -- a bunch of slower, simpler analog processors working simultaneously on a problem. A traditional digital computer works in serial fashion: a single fast processor flashes through a series of steps, solving the problem one piece at a time.
Why did they choose to use parallel processing?
The human eye, for instance, processes information at the equivalent of about 100 frames per second (fps) -- much faster than a movie camera, which trundles along at 24 fps or a video camera that runs at 30 fps.
Each frame is processed for luminance, color, and motion, and the resulting images aren't blurred or smeared. Doing that with a conventional computer is extremely complicated, requiring expensive processors and huge gulps of power, Higgins says. "It requires a lot of data moving at a very high rate of speed and in a very small instant of time."
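Some back-of-the-envelope arithmetic shows why this is demanding for a serial machine. The frame size below is an assumption chosen for illustration; the article gives no resolution.

```python
# Rough arithmetic behind the "lots of data, very fast" point.
WIDTH, HEIGHT = 640, 480        # hypothetical camera resolution (pixels)
BYTES_PER_PIXEL = 3             # one byte each for red, green, blue
EYE_RATE, FILM_RATE = 100, 24   # frames per second (eye-equivalent vs. film)

def data_rate_mb_per_s(fps):
    """Raw pixel data a serial processor must move at a given frame rate."""
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * fps / 1_000_000

eye_rate = data_rate_mb_per_s(EYE_RATE)    # eye-equivalent processing rate
film_rate = data_rate_mb_per_s(FILM_RATE)  # movie-camera rate
```

At the eye's equivalent of 100 fps, even this modest hypothetical camera produces roughly 92 MB of raw pixel data per second, all of which a single serial processor would have to touch before the next frame arrives.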
It's a little like sending a digital computer out to play baseball. It has to rush between all nine positions on the field, catching a fly ball in center field and then sprinting to first base in time to catch its own throw.
Parallel processing -- which mimics the way biological systems solve problems -- would play baseball by stationing a slower processor at every position.
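The baseball analogy can be sketched in code. This toy example (not the team's actual analog circuitry) estimates motion between two "frames" by differencing brightness values, either with one sequential pass over every region or with a worker stationed at each region; the frames, region count, and motion measure are all illustrative assumptions.

```python
from concurrent.futures import ThreadPoolExecutor

def motion_in_region(prev_patch, curr_patch):
    """One slow 'processor': total absolute brightness change in its patch."""
    return sum(abs(c - p) for p, c in zip(prev_patch, curr_patch))

def split(frame, n_regions):
    """Carve a flat list of brightness values into equal regions."""
    size = len(frame) // n_regions
    return [frame[i * size:(i + 1) * size] for i in range(n_regions)]

def serial_motion(prev_frame, curr_frame, n_regions):
    """Digital-computer style: one processor visits every region in turn."""
    return [motion_in_region(p, c)
            for p, c in zip(split(prev_frame, n_regions),
                            split(curr_frame, n_regions))]

def parallel_motion(prev_frame, curr_frame, n_regions):
    """Insect-eye style: a worker stationed at every region, all at once."""
    with ThreadPoolExecutor(max_workers=n_regions) as pool:
        return list(pool.map(motion_in_region,
                             split(prev_frame, n_regions),
                             split(curr_frame, n_regions)))

prev = [10, 10, 10, 10, 50, 50, 50, 50]  # brightness values, frame t
curr = [10, 12, 10, 10, 90, 50, 50, 50]  # frame t+1: most change in region 2
serial_result = serial_motion(prev, curr, 2)
parallel_result = parallel_motion(prev, curr, 2)
```

Both versions produce the same per-region motion estimates; the difference is that in the parallel version no single processor ever has to scan the whole frame, which is the property the analog circuits exploit.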
Here is a photo of Gimli, a robot with insect-like vision, which was designed in Higgins's lab. (Credit: University of Arizona)
And here, a student is adapting Gimli's eyes. (Credit: University of Arizona)
The two pictures above were extracted from this short movie (QuickTime format, 6 MB). (Credit: University of Arizona)
Higgins now wants to develop a microchip-based vision system. If he can pack enough vision processing power into a microchip, he thinks the possibilities are endless.
"I'd like to give engineers a vision chip set like this and see what they would do with it," Higgins said. "My bet is that they would use it for things we could never imagine now. And I know it would be a really big thing."
I wish him good luck. And for more information about research efforts mixing biology and electronics, please read this previous entry, "Biomimetic Robots: A Photo Gallery."
Sources: Ed Stiles, University of Arizona News, October 22, 2004; and various websites