Abstract:
Within the F-2 companion robot project, an applied cognitive architecture is being developed to replicate natural human communicative abilities. The architecture processes incoming speech, visual, and tactile events; simulates emotional dynamics; supports natural-language output on the robot; and reproduces facial expressions and gestures. Its central component uses scenarios, analogs of production rules that simulate emotional evaluation. By searching for the nearest emotional situations in the scenario graph, the robot can evaluate incoming events by their connection to positive or negative events in the current context. A series of experiments examined how the robot's behavior patterns, such as smiles, gaze, expressions of emotion, and embarrassment, affect human interlocutors.
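The nearest-situation search described above can be illustrated with a minimal sketch. The graph structure, node names, and valence labels here are hypothetical illustrations, not the project's actual data model: an event node is evaluated by the valence of the closest labeled situation reachable in the scenario graph, found by breadth-first search.

```python
from collections import deque

# Hypothetical sketch: the scenario graph is an adjacency list; some nodes
# carry an emotional valence ("positive" / "negative"). An unlabeled event
# is evaluated by the valence of the nearest labeled situation (BFS order).
def evaluate_event(graph, valences, event):
    seen = {event}
    queue = deque([event])
    while queue:
        node = queue.popleft()
        if node in valences:          # nearest labeled situation found
            return valences[node]
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return "neutral"                  # no emotional situation reachable

# Illustrative graph and labels (invented for this example)
graph = {
    "guest_arrives": ["greeting", "loud_noise"],
    "greeting": ["smile"],
    "loud_noise": ["startle"],
}
valences = {"smile": "positive", "startle": "negative"}

print(evaluate_event(graph, valences, "guest_arrives"))  # → positive
```

Here "guest_arrives" has no valence of its own, so its evaluation is borrowed from the closest emotionally labeled situation in the graph.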