The robot can fix its gaze on a given point without being disturbed by the movement of its own neck. As a result, the robot appears to have its own intentions as it follows and attends to the people and environment around it. Using a camera sensor to track the viewer's eyes, it achieves interactive gaze.
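The gaze-stabilization behavior described above can be sketched as a simple compensation: as the neck rotates, the eyes counter-rotate so that the gaze direction in the world stays fixed on the target. The joint names, angle conventions, and mechanical limits below are illustrative assumptions, not the robot's actual specification.

```python
import math

# Assumed mechanical limit of the eye joint (illustrative value).
EYE_YAW_LIMIT = math.radians(35)

def stabilized_eye_yaw(target_yaw_world: float, neck_yaw: float) -> float:
    """Eye yaw (relative to the head) that keeps the gaze on a
    world-fixed target while the neck rotates.

    All angles are in radians, measured in a common horizontal plane.
    """
    eye_yaw = target_yaw_world - neck_yaw
    # Clamp to the eye's mechanical range; beyond this limit the neck
    # itself would have to turn toward the target.
    return max(-EYE_YAW_LIMIT, min(EYE_YAW_LIMIT, eye_yaw))

# If the neck turns 10 degrees to the left, the eyes rotate 10 degrees
# to the right, so the gaze direction in the world is unchanged.
print(round(math.degrees(stabilized_eye_yaw(0.0, math.radians(10))), 1))
```

The same subtraction applies to the pitch axis; running both at the camera's frame rate is what makes the gaze appear anchored to the world rather than to the moving head.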
In addition, by drawing the curve of the eyebrows with soft elastic wire, I was able to enrich the robot's expression, as if it were alive with emotion. With its ability to track and mirror facial motion, it gives the viewer an impression of connection.
The purpose of his research and development is not to answer the philosophical question "Will a robot (or computer) obtain a mind or emotions like humankind's?" but to portray the kind of conscious emotion that a human can produce. He believes it is possible to represent human-like communication by constructing an adequate interaction system between facial sensing and expression. If we come to understand and identify with robots that can learn the functions and uses of emotional expression from interactions with people, and deploy them appropriately according to situation and context, could we still distinguish them from beings with real minds and emotions?
As a first step toward realizing this, he identified two necessary conditions: eyes that can detect another person's face, and a face that appeals to the viewer's eye.
The contributors to his project were Takanari Miisho (software coding) and Yuki Koyama (circuit and electronics design).