Author_Institution :
Dept. of Mech. Eng., Georgia Southern Univ., Statesboro, GA, USA
Abstract :
Within entertainment applications, animatronics must be able to identify human partners in order to establish dynamic interactions, enhancing their acceptance and effectiveness as socially interactive agents. This research covers the design and implementation of human identification in an animatronic dragon using a depth camera (Carmine from PrimeSense), open-source middleware (NITE from OpenNI), the Java-based Processing environment and an Arduino microcontroller. Using data from the depth camera, people are identified by approximating a person's skeletal information. Based on the movements of the individual, the program tracks a human body, or bodies, within the camera's field of view. Joint locations in the tracked human are isolated for specific use by the program; the joints include the head, torso, shoulders, elbows, hands, knees and feet. The dragon's capabilities include a four-degrees-of-freedom neck, moving wings, a tail, a jaw, blinking eyes and sound effects. These outputs instigate movement in the tracked human, which establishes the cycle of human-to-animatronic interaction. This animatronic creature design will allow for future research into the effectiveness of interactive elements in themed environments.
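The tracking pipeline described above could be prototyped along these lines: a minimal Processing sketch, assuming the SimpleOpenNI wrapper for OpenNI/NITE (the abstract names only OpenNI/NITE and Processing, not this library), that tracks a user's head joint and streams a servo angle to the Arduino over serial. The serial port, baud rate, choice of joint, coordinate mapping and one-byte protocol are illustrative assumptions, not the authors' implementation.

```java
// Hypothetical Processing sketch; SimpleOpenNI is assumed as the bridge to NITE.
import SimpleOpenNI.*;
import processing.serial.*;

SimpleOpenNI context;   // OpenNI/NITE context providing depth and skeleton data
Serial arduino;         // serial link to the Arduino driving the dragon's servos

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  context.enableDepth();   // depth stream from the Carmine sensor
  context.enableUser();    // NITE user detection and skeleton tracking
  // Port index and baud rate are placeholders; match them to the actual hardware.
  arduino = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  context.update();
  image(context.depthImage(), 0, 0);

  // Examine every person NITE is currently aware of.
  int[] users = context.getUsers();
  for (int userId : users) {
    if (!context.isTrackingSkeleton(userId)) continue;

    // Isolate one joint of interest (the head, here) in real-world coordinates.
    PVector head = new PVector();
    context.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_HEAD, head);

    // Map the head's horizontal position (mm) to a 0-180 degree servo angle and
    // send it to the Arduino; this one-byte protocol is purely illustrative.
    int angle = (int) constrain(map(head.x, -600, 600, 0, 180), 0, 180);
    arduino.write(angle);
  }
}

// SimpleOpenNI callback: start skeleton tracking as soon as a new user appears.
void onNewUser(SimpleOpenNI curContext, int userId) {
  curContext.startTrackingSkeleton(userId);
}
```

On the Arduino side, the received angle would simply be applied to the neck servo (and the remaining degrees of freedom handled analogously), closing the loop in which the dragon's motion prompts further movement from the tracked person.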
Keywords :
cameras; entertainment; human-robot interaction; microcontrollers; middleware; object tracking; robot programming; service robots; Arduino microcontroller; Java-based Processing; NITE; OpenNI; animatronic dragon; depth camera; entertainment applications; four degrees-of-freedom neck; guest response analysis; human body tracking; interactive animatronics; open-source middleware; socially-interactive agents; Animatronics; Operating systems; Robots; Arduino; Depth Camera; Dragon; Human Tracking