This document presents iPlayset, a tangible interface that brings toys to life using computer vision. iPlayset allows users to place physical toys in front of an ordinary webcam, which detects them and lets users interact with them through a digital environment; computer vision tracks the toys without any sensors or gloves. The system supports both turn-based and real-time interaction modes, and uses change detection, colour recognition, and moment calculations to identify the toys and determine their positions. Preliminary user tests were conducted with children playing digital versions of board games and apps using physical toys.
iPlayset: Tangible Interface that brings Toys to life (Eiman Kanjo)
1. iPlayset: Tangible Interface that brings Toys to life
Dr. Eiman Kanjo, Senior Lecturer
Mobile Sensing and Pervasive Computing
Nottingham Trent University
ED285, Clifton Lane, Nottingham, UK, NG11 8NS
email: eiman.kanjo@ntu.ac.uk
2. Tangible and Table Top Interfaces
User interacting with the DigitalDesk (Wellner, 1993)
Graspable Interfaces: drawing with bricks (Fitzmaurice, 1996)
EnhancedDesk system (Oka et al., 2002)
3. Tangible and Table Top Interfaces
Information being projected onto the active pucks and the SenseTable (Patten, 2001)
The display setting of the Tangible Viewpoints system (Mazalek et al., 2002)
4. Tangible and Table Top Interfaces
Zowie Playset (Shwe & Francetic, 2000)
SAM project (Cassell et al., 2000)
5. From Tangible Interfaces to Interactive Toys
• Recently, attention has focused on the development of computer-user interfaces that combine digital information with physical environments.
• In this work, we developed a novel sensing technology that allows users to use their home computer and an ordinary web camera to detect multiple small physical objects, referred to as "Interactive Toys" (e.g. small dolls, cars, animal figurines), in a collaborative environment, referred to as an "Interactive Toys Environment" (e.g. playsets or board games).
7. Object Tracking Technologies

| Type            | Accuracy   | Bandwidth         | Interference hazard | Contact / non-contact       | Orientation detection |
|-----------------|------------|-------------------|---------------------|-----------------------------|-----------------------|
| Mechanical      | 0.1-2.5 mm | > 3000 Hz         | Physical occlusion  | Direct contact              | Yes                   |
| Optical         | 0.1-0.5 mm | 100-2500 Hz       | Occlusion           | Markers attached to objects | Yes                   |
| Magnetic        | ~5 mm      | 20-100 Hz         | Metal objects       | Detectors attached to users | Yes                   |
| Acoustic        | ~1 mm      | 500-1000 Hz       | Acoustic sources    | None                        | -                     |
| RFID tags       | 3-10 cm    | 100 kHz-5.9 GHz   | -                   | Tags attached to objects    | -                     |
| Computer vision | Varies     | Camera frame rate | Physical occlusion  | None                        | Yes                   |
8. Why Computer Vision?
• Computer vision, which relies on imaging devices, image processing and programming techniques, has been accelerated by the recent availability of fast computers and simple, low-cost cameras.
• By choosing computer vision to facilitate toy tracking and identification, it is possible to develop the Interactive Toys Environment with both sensor-free toys and glove-free hand interaction.
10. Turn-Based Mode
User's hand interacting with the system in the turn-based mode: (a) user is performing a task; (b) user has made an input to the system.
11. System diagram for the turn-based interaction mode, covering both single and multiple toys:
1. Capture the background Bck and the first frame f1.
2. Detect the interactive panel and normalise it; if the panel is partially or completely out of the camera's view, a warning message is issued.
3. Capture the second frame f2.
4. Apply motion detection and skin detection.
5. Apply change detection and colour recognition; with multiple toys, label the regions and repeat for all toys.
6. Calculate moments to obtain each toy's position, orientation, length and width, and pass these to the application.
7. Copy the second frame f2 to the first frame f1 and continue.
14. System diagram for the real-time background-independent interaction mode:
Capture first frame f1 and second frame f2 → change detection → colour recognition → moment calculations → toy positions and orientations → application.
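The real-time mode above compares consecutive frames rather than a stored background. A minimal numpy sketch of that frame-differencing step (the function name and threshold value are illustrative, not taken from the original system):

```python
import numpy as np

def frame_difference(f1, f2, threshold=30):
    """Binary change mask between two consecutive greyscale frames."""
    diff = np.abs(f2.astype(np.int16) - f1.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

# Hypothetical pair of frames: a bright "toy" patch moves between frames.
f1 = np.zeros((8, 8), dtype=np.uint8)
f2 = np.zeros((8, 8), dtype=np.uint8)
f1[1:3, 1:3] = 200   # toy at its old position in frame 1
f2[5:7, 5:7] = 200   # toy at its new position in frame 2

# Changed pixels appear at both the old and the new toy position.
mask = frame_difference(f1, f2)
```

Because both the vacated and newly occupied pixels change, the mask marks the old and new positions, which is why colour recognition follows to decide which blob is the toy.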
18. Image Rectification
The results of image rectification using rotation, translation and scaling: (a) the original image; (b) the output image.
An example of image rectification using bilinear warping: (a) the original image; (b) the resulting image.
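Rectification by rotation, translation and scaling can be sketched with inverse mapping: each output pixel is mapped back into the source image and sampled. This is a simplified nearest-neighbour version (the slides use bilinear warping; parameters and function name here are illustrative):

```python
import numpy as np

def rectify(img, angle=0.0, scale=1.0, tx=0.0, ty=0.0):
    """Warp a greyscale image by rotation/scale/translation, inverse nearest-neighbour mapping."""
    h, w = img.shape
    out = np.zeros_like(img)
    c, s = np.cos(angle), np.sin(angle)
    for y in range(h):
        for x in range(w):
            # Inverse transform: map the output pixel back to source coordinates.
            xs = ( c * (x - tx) + s * (y - ty)) / scale
            ys = (-s * (x - tx) + c * (y - ty)) / scale
            xi, yi = int(round(xs)), int(round(ys))
            if 0 <= xi < w and 0 <= yi < h:
                out[y, x] = img[yi, xi]
    return out

# Demo: identity transform leaves the image unchanged; tx=1 shifts it right by one pixel.
img = np.zeros((6, 6), dtype=np.uint8)
img[2, 2] = 255
same_img = rectify(img)
shifted = rectify(img, tx=1)
```

Mapping backwards from the output grid (rather than forwards from the input) guarantees every output pixel receives a value, avoiding holes in the rectified image.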
20. Change Detection for Toy Tracking in Real-Time Mode
(a) Original image; (b) image after background subtraction; (c) image after thresholding.
21. Motion History Approach to Turn Detection
A hand moving a toy: (a) input image; (b) corresponding MHI image.
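A motion history image (MHI) records where motion occurred and fades it over time: moving pixels jump to a maximum value while stale pixels decay, in the style of Davis and Bobick's motion templates. A numpy sketch (tau and decay values are illustrative):

```python
import numpy as np

def update_mhi(mhi, motion_mask, tau=255, decay=32):
    """One MHI timestep: currently-moving pixels jump to tau, all others fade by `decay`."""
    mhi = np.maximum(mhi.astype(np.int16) - decay, 0)  # fade old motion
    mhi[motion_mask > 0] = tau                         # stamp new motion
    return mhi.astype(np.uint8)

# Demo: one pixel moves in the first frame, then the scene goes still.
mhi = np.zeros((4, 4), dtype=np.uint8)
moving = np.zeros((4, 4), dtype=np.uint8)
moving[1, 1] = 1
mhi = update_mhi(mhi, moving)                      # pixel stamped to 255
still = np.zeros((4, 4), dtype=np.uint8)
mhi_later = update_mhi(mhi, still)                 # same pixel fades to 223
```

The fading gradient is what makes the MHI in panel (b) useful: recent motion is bright, older motion dimmer, so the direction and recency of the hand's movement are both visible in one image.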
22. Motion History Approach to Turn Detection
A histogram of an MHI image in the turn-based interaction mode.
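The MHI histogram supports a simple turn-detection rule: when almost no pixels carry recent (high) MHI values, the hand has withdrawn and the user's turn can be considered finished. A heuristic sketch (the thresholds are illustrative, not taken from the slides):

```python
import numpy as np

def turn_finished(mhi, recent_level=200, min_active=5):
    """Heuristic: the turn is over when almost no pixels carry recent motion."""
    recent = np.count_nonzero(mhi >= recent_level)
    return recent < min_active

# Demo: a quiet MHI (no recent motion) versus a busy one (hand still moving).
quiet = np.zeros((8, 8), dtype=np.uint8)
busy = quiet.copy()
busy[:2, :5] = 255   # ten pixels of fresh motion
```
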
23. Region Description using Moments
Example images with the results of moment calculations in three interaction modes: (a) turn-based mode; (b) real-time background-dependent mode; (c) real-time background-independent mode.
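The position and orientation reported in these examples come from image moments: the centroid from the first-order moments, and the orientation of the principal axis from the second-order central moments. A numpy sketch of that standard computation:

```python
import numpy as np

def region_moments(mask):
    """Centroid (cx, cy) and principal-axis orientation (radians) of a binary region."""
    ys, xs = np.nonzero(mask)
    cx, cy = xs.mean(), ys.mean()              # centroid from first-order moments
    mu20 = ((xs - cx) ** 2).mean()             # normalised second-order central moments
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    return (cx, cy), theta

# Demo: a horizontal bar (orientation 0) and a diagonal line (orientation pi/4).
bar = np.zeros((8, 8), dtype=np.uint8)
bar[3, 1:8] = 1
(cx, cy), theta_bar = region_moments(bar)

diag = np.zeros((8, 8), dtype=np.uint8)
for i in range(5):
    diag[i, i] = 1
_, theta_diag = region_moments(diag)
```

The same moments also give the toy's length and width (via the eigenvalues of the second-moment matrix), which matches the position/orientation/length/width outputs named in the slide 11 pipeline.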
24. Colour Recognition
Human hand skin samples (Asian, African and Caucasian) plotted in normalised rg space.
Human hand skin samples (Asian, African and Caucasian) plotted in Cb-Cr space.
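Normalised rg chromaticity divides out overall brightness, which is why skin tones from different ethnic groups cluster together in that space. A sketch of the conversion plus a simple box classifier; the skin range boundaries below are illustrative placeholders, not the measured clusters from the plots:

```python
import numpy as np

def to_normalised_rg(rgb):
    """Convert RGB pixels to illumination-normalised (r, g) chromaticity."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=-1, keepdims=True)
    total[total == 0] = 1.0                    # avoid division by zero on black pixels
    r = rgb[..., 0:1] / total
    g = rgb[..., 1:2] / total
    return np.concatenate([r, g], axis=-1)

def is_skin(rg, r_range=(0.36, 0.46), g_range=(0.28, 0.36)):
    """Box classifier in rg space; the ranges here are assumed, not measured."""
    r, g = rg[..., 0], rg[..., 1]
    return (r_range[0] <= r) & (r <= r_range[1]) & (g_range[0] <= g) & (g <= g_range[1])

# Demo: a skin-like pixel versus a blue one.
pixels = np.array([[160, 120, 100], [50, 50, 200]], dtype=np.uint8)
skin = is_skin(to_normalised_rg(pixels))
```
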
25. Skin Detection
An example of the application of hand skin detection: (a) original image with a hand on a bright background; (b) resulting image.
28. Toy Colour Recognition (2)
An example of colour recognition results for multiple coloured objects: (a) original colour image; (b) resulting image.
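Distinguishing multiple coloured toys can be reduced to labelling each detected region with its nearest reference colour. A sketch using nearest-neighbour matching in RGB; the palette below is hypothetical, since the slides do not list the real toy colours:

```python
import numpy as np

# Hypothetical reference palette (the actual toy colours are not given in the slides).
PALETTE = {
    "red":   np.array([200, 40, 40], dtype=np.float64),
    "green": np.array([40, 180, 60], dtype=np.float64),
    "blue":  np.array([40, 60, 200], dtype=np.float64),
}

def classify_colour(region_pixels):
    """Label a toy region by the palette colour nearest (Euclidean) to its mean RGB."""
    mean = region_pixels.reshape(-1, 3).astype(np.float64).mean(axis=0)
    return min(PALETTE, key=lambda name: np.linalg.norm(mean - PALETTE[name]))

# Demo: a reddish 4x4 region is labelled "red".
region = np.tile(np.array([190, 50, 45], dtype=np.uint8), (4, 4, 1))
label = classify_colour(region)
```

Averaging over the whole region before matching makes the label robust to per-pixel noise, at the cost of assuming each region is a single-coloured toy.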
29. Preliminary User Tests
Three boys playing the "Mouse Trap" board game during the preliminary user tests.
Children interacting with the farm playset during the preliminary user tests.
30. Wizard of Oz Test
Screenshots of the Wizard of Oz test with four children (two boys and two girls), recorded from Top Camera 1, Side Camera 1, Side Camera 2 and the screen.