2. Brief
The installation captures the user's movements and maps them
onto a pre-designed character on the screen.
A Kinect detects the user's movements, and the motion data is
processed and rendered using Processing.
4. Concept
The user can choose from multiple scenes. The Kinect detects the user's
movements, and the character in the chosen scene moves according to them.
10. Code + Iteration
Using SimpleOpenNI and Processing, two iterations were tried:
1. Using the user map as the body, with an image of a helmet / Grim's head
2. Using skeletal tracking and pasting images of body parts onto the joints
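Iteration 2 rests on a small piece of geometry: given two tracked joints, find the angle and length of the segment between them so a body-part image can be pasted along it. A minimal sketch of that geometry in plain Java (Processing is Java-based; the method names here are illustrative, not from the project):

```java
public class LimbPlacement {
    // Angle (radians) of the segment from joint A to joint B,
    // measured counter-clockwise from the positive x-axis.
    static double limbAngle(double ax, double ay, double bx, double by) {
        return Math.atan2(by - ay, bx - ax);
    }

    // Length of the segment, used to scale the pasted image so it
    // spans exactly from one joint to the other.
    static double limbLength(double ax, double ay, double bx, double by) {
        return Math.hypot(bx - ax, by - ay);
    }

    public static void main(String[] args) {
        // Shoulder at (100, 100), elbow at (100, 200): the upper arm
        // points straight down the screen (+y), so the angle is pi/2.
        System.out.println(limbAngle(100, 100, 100, 200));  // 1.5707963267948966
        System.out.println(limbLength(100, 100, 100, 200)); // 100.0
    }
}
```

In the actual sketch the joint coordinates would come from SimpleOpenNI's skeletal tracking, projected into screen space; the body-part image is then rotated by this angle before being drawn.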
14. Astronaut Scene
• Uses images of body parts rather than stick-figure lines.
• The parts are disjointed rather than connected, which, combined with
the unsmooth motion, gives the character the feel of a puppet. The
parts were therefore deliberately pasted to look like a puppet.
• Music was added as a fun factor to engage the user.
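Because each part is anchored to its own joints rather than chained to its neighbours, tracking jitter pulls the parts slightly apart, which is what produces the puppet look. A sketch of that independent placement (illustrative code, not the project's):

```java
public class PuppetPart {
    // Each body-part image is centered on the midpoint of its own pair
    // of joints, independently of every other part. No part inherits a
    // neighbour's position, so tracking jitter leaves visible gaps.
    static double[] partCenter(double ax, double ay, double bx, double by) {
        return new double[] { (ax + bx) / 2.0, (ay + by) / 2.0 };
    }

    public static void main(String[] args) {
        // Torso image centered between neck (320, 150) and hip (320, 300).
        double[] c = partCenter(320, 150, 320, 300);
        System.out.println(c[0] + ", " + c[1]); // 320.0, 225.0
    }
}
```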
15. Grim Scene
• Uses stick-figure lines for the hands.
• Music was added as a fun factor to engage the user.
• I intended to add face recognition to Grim's face and give it burning
eyes, but could not try that out due to time constraints.
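The stick-figure hands reduce to drawing line segments between consecutive arm joints. A sketch of how an arm decomposes into segments, assuming shoulder, elbow, and hand coordinates already projected to screen space (the joint values below are made up):

```java
public class StickArm {
    // An arm as two line segments: shoulder -> elbow and elbow -> hand.
    // Each segment is {x1, y1, x2, y2}; in Processing each would be
    // drawn with line(x1, y1, x2, y2).
    static double[][] armSegments(double[] shoulder, double[] elbow, double[] hand) {
        return new double[][] {
            { shoulder[0], shoulder[1], elbow[0], elbow[1] },
            { elbow[0], elbow[1], hand[0], hand[1] }
        };
    }

    public static void main(String[] args) {
        double[][] segs = armSegments(new double[] { 200, 150 },
                                      new double[] { 230, 220 },
                                      new double[] { 280, 260 });
        System.out.println(segs.length); // 2
    }
}
```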
17. Computational Choices
• Trigonometry and calculus were used to calculate the angles at which
images are pasted.
• The coordinate system was translated to obtain the desired anchor
point for rotation.
• A scaling factor keeps the character's size consistent with movement
along the Z-axis.
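The anchor-point and scaling bullets correspond to two small pieces of math: rotating a point about a chosen anchor (translate the anchor to the origin, rotate, translate back, which is what Processing's translate()/rotate() pair achieves), and a scale factor that shrinks the character as depth increases. A sketch under my own naming assumptions:

```java
public class CharacterMath {
    // Rotate point (px, py) by theta radians about anchor (ax, ay):
    // translate so the anchor is at the origin, apply the 2D rotation
    // matrix, then translate back.
    static double[] rotateAbout(double px, double py,
                                double ax, double ay, double theta) {
        double tx = px - ax, ty = py - ay;
        double rx = tx * Math.cos(theta) - ty * Math.sin(theta);
        double ry = tx * Math.sin(theta) + ty * Math.cos(theta);
        return new double[] { rx + ax, ry + ay };
    }

    // Scale factor inversely proportional to depth: at the reference
    // depth zRef the character is drawn at natural size; farther away
    // it shrinks, closer it grows.
    static double depthScale(double z, double zRef) {
        return zRef / z;
    }

    public static void main(String[] args) {
        // Rotate (2, 1) a quarter turn about (1, 1): lands at (1, 2).
        double[] p = rotateAbout(2, 1, 1, 1, Math.PI / 2);
        System.out.printf("%.1f, %.1f%n", p[0], p[1]); // 1.0, 2.0
        System.out.println(depthScale(2000, 1000));    // 0.5
    }
}
```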
18. Suggested Improvements
1. Smoothing of joint detection
2. Face tracking
3. Try capturing motion in 3D
4. Use of user maps instead of skeletal tracking.
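For improvement 1, one lightweight option is exponential smoothing of each joint coordinate: blend each raw reading with the previous smoothed value so jitter is damped at the cost of a little lag. A sketch (the alpha value is an assumption to be tuned):

```java
public class JointSmoother {
    // Exponential moving average with alpha in (0, 1]. Lower alpha
    // gives smoother but laggier joints; alpha = 1 disables smoothing.
    static double smooth(double prev, double raw, double alpha) {
        return prev + alpha * (raw - prev);
    }

    public static void main(String[] args) {
        double x = 0.0;
        // A joint that jumps to 10 is approached gradually over frames.
        for (int i = 0; i < 3; i++) {
            x = smooth(x, 10.0, 0.5);
        }
        System.out.println(x); // 8.75
    }
}
```

Applied per frame to each joint's x, y (and z) before the character is drawn, this would reduce the disjointed-puppet jitter without any extra hardware.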