Roland Piquepaille's Technology Trends
How new technologies are modifying our way of life


Thursday, October 28, 2004

Robotic speech recognition has made huge advances in recent years, allowing for easy voice interaction with robots. But robotic vision processing is still very rudimentary. Some robots "see," but they need powerful computers and are not very mobile. A team of researchers at the University of Arizona wants to change all this by mixing biology and electronics to create robotic vision. The team has designed a visual navigation system by mimicking insect vision and demonstrated the concept by building a robot named Gimli. Instead of using standard microprocessors, the team devised electronic vision circuits based on a bunch of slower analog processors working in parallel. The next step will be to develop a microchip-based vision system able to do specific tasks, such as following "a moving object like a soccer ball without getting confused by similarly shaped or colored objects." The team thinks the first such microchip might cost $30,000 to produce. But when the price goes down to $20, the market will be huge. Read more...

Charles Higgins, assistant professor of Electrical and Computer Engineering (ECE) at the University of Arizona, thinks that robots can't compete with humans, at least today, because of their lack of vision.

However, Higgins is working to change that. He hopes to make robots more interactive by giving them sight and the ability to react to what they see.
"Right now, robots in general are just pitiful in terms of visual interaction," he said. True, a few of today's robots can see in some sense, but they aren't mobile. These vision systems are connected to large computers, which precludes their use in small, mobile robots.
Outside of these few vision-only systems, today's robots see very little. "Wouldn't it be nice to have a robot that could actually see you and interact with you visually?" Higgins asks. "You could wave at it or smile at it. You could make a face gesture. It would be wonderful to interact with robots in the same way that we interact with humans."

Higgins and other researchers at his lab are trying to reach this goal through neuromorphic engineering, which combines biology and electronics.

Higgins and his students are developing an airborne visual navigation system by creating electronic clones of insect vision processing systems in analog integrated circuits. The circuits create insect-like self-motion estimation, obstacle avoidance, target tracking and other visual behaviors on two model blimps.
These circuits don't use standard microprocessors. Instead, they rely on what's called "parallel processing" -- a bunch of slower, simpler analog processors working simultaneously on a problem. A traditional digital computer works serially instead: a single fast processor flashes through a series of steps, solving the problem one stage at a time.
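
To give you a feel for what these circuits compute, here is a minimal software sketch of the Hassenstein-Reichardt elementary motion detector, the classic model of insect motion vision. It's written in Python purely for illustration -- the function names, the parameters, and the choice of a simple low-pass filter as the delay stage are my assumptions, not the lab's actual analog design.

    import numpy as np

    def emd_response(signal, tau=5.0):
        """Direction-selective response for a 1-D row of photoreceptors.

        signal: array of shape (time, receptors) of light intensities.
        tau:    time constant (in samples) of the delay (low-pass) stage.
        """
        delayed = np.zeros_like(signal)
        alpha = 1.0 / tau
        # A first-order low-pass filter plays the role of the "delay" arm.
        for t in range(1, signal.shape[0]):
            delayed[t] = delayed[t - 1] + alpha * (signal[t] - delayed[t - 1])
        # Each detector multiplies a delayed receptor by its undelayed
        # neighbor, in both directions, then subtracts the two (opponency).
        rightward = delayed[:, :-1] * signal[:, 1:]
        leftward = signal[:, :-1] * delayed[:, 1:]
        return rightward - leftward   # positive means rightward motion

    # Usage: a narrow bright bar drifting rightward across 32 receptors.
    t, r = np.meshgrid(np.arange(200), np.arange(32), indexing="ij")
    bar = np.exp(-0.5 * (((r - 0.2 * t) % 32) - 16.0) ** 2)
    print(emd_response(bar).mean())   # > 0, i.e. rightward motion detected

An array of such detectors, one per pair of neighboring photoreceptors, produces a motion estimate with no central processor at all, which is exactly what makes the approach so attractive for a small blimp.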

Why did they choose to use parallel processing?

The human eye, for instance, processes information at the equivalent of about 100 frames per second (fps) -- much faster than a movie camera, which trundles along at 24 fps, or a video camera, which runs at 30 fps.
Each frame is processed for luminance, color, and motion, and the resulting images aren't blurred or smeared. Doing that with a conventional computer is extremely complicated, requiring expensive processors and huge gulps of power, Higgins says. "It requires a lot of data moving at a very high rate of speed and in a very small instant of time."
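
To put rough numbers on that claim, here is a quick back-of-envelope calculation in Python. The 640x480 resolution and 3 bytes per pixel are my own illustrative assumptions, not figures from the lab:

    # Back-of-envelope: raw data rate for eye-like frame processing.
    # Resolution and bytes-per-pixel are illustrative assumptions.
    width, height = 640, 480       # a modest VGA-class sensor
    bytes_per_pixel = 3            # luminance plus two color channels
    fps = 100                      # the eye's rough equivalent rate

    rate = width * height * bytes_per_pixel * fps
    print(f"{rate / 1e6:.0f} MB/s")   # ~92 MB/s, before any processing

And that is just moving the raw pixels; extracting luminance, color, and motion from every frame multiplies the work.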
It's a little like sending a digital computer out to play baseball. It would have to rush between all nine positions on the field sequentially, catching a fly ball in center field and then sprinting to first base to catch the throw it had just made itself.
Parallel processing -- which mimics the way biological systems solve problems -- would play baseball by stationing a slower processor at every position.
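
In software terms, the baseball analogy might look like the sketch below: carve each frame into patches and station one simple worker at every patch. I'm using Python's multiprocessing and a crude frame-difference measure purely as an illustration; the patch size and function names are my inventions, and the lab's real system does this in analog hardware, not software.

    from multiprocessing import Pool
    import numpy as np

    def patch_motion(args):
        """One 'slow' worker stationed at a single patch of the image."""
        prev_patch, cur_patch = args
        # Crude motion measure: mean absolute change between frames.
        return float(np.abs(cur_patch - prev_patch).mean())

    def frame_motion(prev_frame, cur_frame, patch=16, workers=8):
        """Split two frames into patches and process them side by side."""
        h, w = cur_frame.shape
        jobs = [(prev_frame[y:y+patch, x:x+patch],
                 cur_frame[y:y+patch, x:x+patch])
                for y in range(0, h, patch)
                for x in range(0, w, patch)]
        with Pool(workers) as pool:
            return pool.map(patch_motion, jobs)   # all patches at once

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        a = rng.random((64, 64))
        b = np.roll(a, 2, axis=1)   # the scene shifted 2 pixels rightward
        print(max(frame_motion(a, b)))

No single worker is fast, but each one only ever watches its own patch, so the whole field is covered on every frame.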
Here is a photo of Gimli, a robot with insect-like vision, which was designed in Higgins' lab. (Credit: University of Arizona)
And here, a student adjusts Gimli's eyes. (Credit: University of Arizona)

The two pictures above were extracted from this short movie (QuickTime format, 6 MB). (Credit: University of Arizona)

Higgins now wants to develop a microchip-based vision system. If he can pack enough vision processing power into a microchip, he thinks the possibilities are endless.

"I'd like to give engineers a vision chip set like this and see what they would do with it," Higgins said. "My bet is that they would use it for things we could never imagine now. And I know it would be a really big thing."

I wish him good luck. And for more information about research efforts mixing biology and electronics, please read this previous entry, "Biomimetic Robots: A Photo Gallery."

Sources: Ed Stiles, University of Arizona News, October 22, 2004; and various websites

