Sony’s Aibo may be discontinued, but robotic pets of all shapes and sizes continue to stake a claim in the hearts of people around the world. Despite the apparent intelligence of some of these robot pets, their behavior is usually nothing more than a pre-programmed response to a stimulus – being patted in a particular location or hearing a voice command, for example. Real flesh-and-blood pets are far more complex in this regard, even discerning and responding to a person’s emotional state. Robotic pets could be headed in that direction: researchers in Taiwan are turning to neural networks to break the cycle of repetitive behavior in robot toys and endow them with almost emotional responses to interaction.
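The contrast between the two approaches can be sketched in a few lines of code. The stimuli, responses, and "mood" variable below are entirely hypothetical illustrations, not the Taiwanese team's actual model: a scripted pet maps each stimulus to one fixed response, while a neural-network-style pet passes the stimulus and an internal state through a weighted function, so the same pat can produce different behavior at different times.

```python
import math

# Pre-programmed pet: a fixed lookup table, so a given stimulus
# always triggers the same canned response (hypothetical examples).
SCRIPTED_RESPONSES = {
    "pat_head": "wag_tail",
    "voice_command": "sit",
}

def scripted_pet(stimulus):
    """Always returns the same response for the same stimulus."""
    return SCRIPTED_RESPONSES.get(stimulus, "idle")

def sigmoid(x):
    """Standard logistic activation, squashing any input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def neural_pet(stimulus_strength, mood, w_stimulus=1.5, w_mood=2.0, bias=-1.0):
    """A single artificial neuron weighing the stimulus against an
    internal 'mood' state; the weights here are illustrative guesses."""
    activation = sigmoid(w_stimulus * stimulus_strength + w_mood * mood + bias)
    return "playful" if activation > 0.5 else "withdrawn"

# The scripted pet is perfectly repetitive:
print(scripted_pet("pat_head"))   # wag_tail, every single time

# The neural pet's reaction to the very same pat depends on its mood:
print(neural_pet(1.0, mood=1.0))   # playful
print(neural_pet(1.0, mood=-1.0))  # withdrawn
```

In a real system the weights would be learned from interaction history rather than hand-set, which is what lets behavior drift away from a fixed script.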
Building fully autonomous artificial creatures with intelligence akin to humans is a very long-term goal of robot design and computer science. Along the way, home entertainment and utility devices have appeared: “Tamagotchi” digital pets, domestic toy robots such as Aibo the robotic dog, and even the Roomba robotic vacuum cleaner. At the same time, popular science-fiction culture has raised consumer expectations.
In an effort to provide entertaining and realistic gadgets that respond to human interaction in ever more nuanced ways – mimicking the behavior of real pet animals or even people – researchers in Taiwan are now looking at a new design paradigm that could lead to a robot vision module able, one day, to recognize human facial expressions and respond appropriately.
“With current technologies in computing and electronics and knowledge in ethology, neuroscience and cognition, it is now possible to create embodied prototypes of artificial living toys acting in the physical world,” Wei-Po Lee and colleagues at the National Sun Yat-sen University (NSYSU), Kaohsiung, explain.