Roland Piquepaille's Technology Trends
How new technologies are modifying our way of life


Sunday, March 28, 2004
 

When you enter a room for the first time, you take a look at what's inside. Do you pay as much attention the second time? Of course not, because you have a sense of memory. This is why researchers from Trinity College in Dublin are adding memory to virtual reality characters to give them a more realistic gaze. Technology Research News says the idea is to prevent a virtual human from always looking at the same spot when it enters the same virtual room again and again. In other words, virtual characters will pay attention to their environment the same way we do.

Researchers from Trinity College in Ireland have added memory to a neurobiological model of visual attention in order to generate more realistic animation for virtual reality characters.
The idea is to endow characters with internal characteristics like memory and attention that can guide their movements, according to Christopher Peters, a computer science researcher at Trinity College [working in the Image Synthesis Group].
The key to providing a character with an internal representation of its environment is memory, said Peters. "The memory system provides a means of storage for information about what the character has previously perceived."

The memory model they used distinguishes between long-term and short-term memory to select what needs to be stored. But that was not an easy task.
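To make that idea more concrete, here is a minimal Python sketch of such a two-level store. The class names, the seven-item short-term capacity and the promotion rule are my own assumptions for illustration, not details of the Trinity model.

from dataclasses import dataclass, field
from time import time

@dataclass
class Percept:
    # A single observation of an object in the scene (hypothetical structure).
    object_id: str
    position: tuple
    timestamp: float = field(default_factory=time)

class CharacterMemory:
    # Toy short-term / long-term memory: recent percepts sit in a small buffer,
    # and objects seen often enough are promoted to long-term storage.
    def __init__(self, stm_capacity=7, promote_after=3):
        self.stm = []            # short-term memory: most recent percepts
        self.ltm = {}            # long-term memory: object_id -> last known percept
        self.seen = {}           # how many times each object has been perceived
        self.stm_capacity = stm_capacity
        self.promote_after = promote_after

    def perceive(self, percept):
        self.stm.append(percept)
        if len(self.stm) > self.stm_capacity:
            self.stm.pop(0)      # the oldest short-term percept is forgotten
        self.seen[percept.object_id] = self.seen.get(percept.object_id, 0) + 1
        if self.seen[percept.object_id] >= self.promote_after:
            self.ltm[percept.object_id] = percept

    def is_familiar(self, object_id):
        # Familiar objects should attract less of the character's gaze.
        return object_id in self.ltm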

The challenge was figuring out, given an internal representation, or memory, of an environment, what parts of the representation would take precedence in attracting a character's interest, said Peters. "If you're walking down the street, what determines the priority [of what you] look at?" he said.

In order to meet this challenge, they combined an attention model from the University of Southern California with their own memory module.

The Trinity researchers combined the scene-based attention metrics from the USC attention module with object-based information from their memory module to find the objects in a scene that attract a character's attention, taking into account temporal changes in the scene such as character or object movement, said Peters.
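Here is a rough sketch of how such a combination could work, reusing the CharacterMemory class above; the weights and the motion bonus are illustrative guesses, not the values used by the researchers.

def attention_priority(saliency, memory, object_id, moved=False):
    # Mix bottom-up saliency (scene-based) with memory-based novelty (object-based).
    novelty = 0.0 if memory.is_familiar(object_id) else 1.0
    motion_bonus = 0.5 if moved else 0.0   # temporal change attracts attention
    return 0.5 * saliency + 0.4 * novelty + motion_bonus

def choose_gaze_target(candidates, memory):
    # candidates: list of (object_id, saliency, moved) tuples
    best = max(candidates, key=lambda c: attention_priority(c[1], memory, c[0], c[2]))
    return best[0]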
The researchers' gaze generator module then produces appropriate gaze and blinking motions, based on factors derived from the psychology literature, to provide the final animation for the virtual human, said Peters.
[Figure: On the left, animation frames from the gaze generator for three different eccentricities. On the right, the gaze generator at work in a virtual city environment (Credit: Trinity College Dublin).]
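Purely as an illustration, a gaze generator of this kind could boil down to emitting eye-rotation and eyelid keyframes over time. The 0.1-second saccade duration and the 15-blinks-per-minute rate below are placeholder figures, not the parameters the Trinity team derived from the psychology literature.

import random

def gaze_keyframes(eccentricity_deg, blink_rate_per_min=15, duration_s=2.0, fps=25):
    # Returns a toy list of (time_s, eye_rotation_deg, eyelid_closed) keyframes.
    frames = []
    blink_prob = blink_rate_per_min / (60.0 * fps)     # chance of blinking on any frame
    for i in range(int(duration_s * fps)):
        t = i / fps
        rotation = min(1.0, t / 0.1) * eccentricity_deg   # simple 0.1 s saccade to target
        blinking = random.random() < blink_prob
        frames.append((t, rotation, blinking))
    return frames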

So you think this method is ready for prime time? Not so fast.

The researchers are currently refining their models to add task-driven attention requirements. They are also looking into adding an auditory sense, according to Peters. The ultimate goal is to provide virtual humans whose gaze behaviors are indistinguishable from those of real humans, said Peters.
A real-time virtual human performance that involves a full attention system will be practical in three to six years, said Peters.

This work was presented at the SIGGRAPH 2003 conference under the title "Attention-Driven Eye Gaze and Blinking for Virtual Humans" (PDF format, 1 page, 2.22 MB).

For more information about this subject, you can also read two previous papers, "A Memory Model for Autonomous Virtual Humans" (PDF format, 6 pages, 131 KB) and "Bottom-Up Visual Attention for Virtual Human Animation" (PDF format, 7 pages, 546 KB).

Sources: Kimberly Patch, Technology Research News, March 24/31, 2004; Trinity College Dublin website

