KISMET | Project Document
Project Name:
KISMET
(Kinect In Small form for Mobile Envisioned Technologies)
Version: 0.1 (Draft)
Prepared by: Rolly Seth
Location: Hyderabad, India
Date: 14th July, 2014
Contents
A. EXECUTIVE SUMMARY
   A.1 PROJECT OVERVIEW
   A.2 PURPOSE AND SCOPE OF THE DOCUMENT
B. SCENARIOS
   B.1 MAIN SCENARIOS
      B.1.1 Lego Kinect
      B.1.2 Dimensions based retail experience
      B.1.3 Kinect and Cortana integrated personal assistant
      B.1.4 Visual assistant for students
      B.1.5 Kinect + a small projector = Live Stencils
      B.1.6 Mobile health-checkup
      B.1.7 Mood based environment creation
      B.1.8 Auto-sync of multi Kinects
      B.1.9 Kinectduino
      B.1.10 K-Enchant - A magical framework
      B.1.11 Harry Potter style interactions
      B.1.12 Self-paced K-12 education
      B.1.13 Kinect enabled real-time road safety warnings
      B.1.14 3D modelling and gesture enabled print
      B.1.15 Kinect Active Health
   B.2 ACCESSORY BASED SCENARIOS
      B.2.1 Virtual OneNote Surface
      B.2.2 Five senses Kinect
      B.2.3 Automated Indoor Mapping Personal Assistant
      B.2.4 Kinect enabled smart casing - a self-protection device
      B.2.5 Plug & play Kinect components
      B.2.8 Holo-Kinect
      B.2.9 A fashion accessory
      B.2.10 viSparsh - a haptic belt for the visually impaired
   B.3 EXTRAS
      B.3.1 Kinect Green
      B.3.2 Minority report design for transparent world
      B.3.3 One Microsoft gesture story for home: New XBOX SmartGlass
      B.3.4 Kinect Personified
C. APPENDIX
A. Executive Summary
A.1 Project Overview
As Moore's law continues to hold, the push to increase chip performance while shrinking device size carries on, and Microsoft Kinect is no exception. A lot of research is already under way in the industry to integrate Kinect-like devices into small-sized products. Some major efforts in this direction are already visible. Examples include:
- Next-generation 3D photography in the HTC One - The application takes a picture along with depth data. The depth data enables additional photo filters based on who is in front or behind, post-click refocusing, and taking pictures with gestures instead of pressing a button.
- Dynamic Perspective in the Amazon Fire Phone - Amazon recently showed its latest phone, which provides 3D parallax effects in places like the lock screen, a capability it has named 'Dynamic Perspective'.
- Human-scale understanding of space and motion for mobile phones - Google's Project Tango aims to see the world in 3D, take visual cues from the surroundings, and thereby provide more contextually relevant information.
- Context-sensitive information and shopping experience - Yet another feature of the Amazon smartphone is 'Firefly', which extracts contextual information such as barcodes, audio clips and eye tracking to fetch related shopping content.
- Augmented reality games - Augmented reality is another area many companies are banking on. Vuforia by Qualcomm is one of them; it creates high-fidelity, engaging 3D environments on mobiles.
- 3D computing SDKs - Several companies have already released SDKs for creating 3D, interactive experiences on small devices. The Intel Perceptual Computing SDK is just one of them; SoftKinetic is another, enabling 3D vision and experiences for portable devices. The list is enormous.
The aim of this document is not to focus on those efforts. It aims to present new scenarios that open up when Kinect's capabilities are encapsulated in small, portable devices like smartphones, tablets and phablets. The project is named 'KISMET', which stands for 'Kinect In Small form for Mobile Envisioned Technologies'.

Terminology used: For the sake of simplicity and to avoid confusion, the term 'KISMET device' is used in place of 'Kinect-enabled future Windows mobile device' (smartphone, tablet or phablet). Since mobile devices already have an RGB camera and microphones (whose capabilities keep growing), using the term 'KISMET' essentially means adding a depth sensor to a mobile device, as shown in the legend below.

Legend: 'KISMET' = Depth Sensor + Smartphone/tablet/phablet
A.2 Purpose and scope of the document
The purpose of this document is to present scenarios/applications that meet the following two criteria:
a. The depth sensor is integrated into a smartphone, tablet or phablet
b. They address a large consumer base instead of being enterprise-centric
The scenarios presented in this document are classified into three main categories:
- Main scenarios - These revolve around enhanced applications written on the mobile device that utilize the depth-sensor data along with the already existing RGB camera and microphone data.
- Accessory-based scenarios - These are not standalone applications; they require an add-on to the mobile device to extend what can be done with the depth data.
- Extras - These might not directly relate to the use of depth data, but they provide an understanding of how social, secure and sustainable experiences can be created for the new world, be it through initiatives, design changes, etc.
Based on the scenario, all or some of the following sections would be covered in the rest of the
document:
i. Illustration
ii. Context
iii. Feature Overview
iv. Use Case
v. Addressed Consumer Need
vi. Other Applications
vii. Target Audience
viii. Technology Considerations
ix. References
B. Scenarios
B.1 Main scenarios
B.1.1 Lego Kinect
B.1.1.1 Illustration
B.1.1.2 Context
Lego bricks are famous worldwide. They are powerful yet simple DIY (Do It Yourself) bricks. As of 2013, around 560 billion LEGO parts had been produced, and each year 20 billion more LEGO elements are made. The world's children spend 5 billion hours a year playing with LEGO bricks. Because this number is huge, it automatically promotes a big hacking culture in the community right from childhood. It puts imagination in your hands to make what you want.
B.1.1.3 Feature Overview
Basic LEGO bricks are an empowerment tool. However, over the years there has been evolution in LEGO itself.
In order to understand where ‘KISMET’ would fit, let us understand the current LEGO ecosystem that exists
now. As shown in the left side of the image below, LEGO has a base layer which are the plastic bricks. They
don’t have a brain of their own. In order to provide that brain, LEGO came up with Mindstorms series robotics
kit (Layer 2). On top of it, computer program (Layer 3) comes where kids and students can built robots or
other electronics stuff. Since LEGO bricks are the essential building blocks, these can’t be replaced. However,
there is an opportunity area to replace Layer 2 and Layer 3 with KISMET based applications. The mobile KISMET
device through depth sensing will provide capabilities like object detection, line follower, color detection,
speech recognition or any accessory add-ons. Using these capabilities might be challenging and that is where
Layer 3 revamp comes. KISMET device will have touch develop based applications for even youngsters to build
innovative stuff without having programming languages depth.
PS: For fixing KISMET with LEGO bricks, KISMET device edges can be grooved to fit into the LEGO brick or a
special cover can also be designed.
B.1.1.4 Use Case
Use Case ID: UC-1
Use Case Name: Creating obstacle detection DIY robot
Actor: A 6-year-old kid
Description: A kid has a number of basic LEGO bricks and other LEGO accessories at home. His
mother gets a new KISMET device (a depth-camera-enabled mobile device). He is
interested in building an 'obstacle avoidance' robot with it.
[Figure: The LEGO ecosystem, present vs. future. Present: Layer 1 - Bricks; Layer 2 - Mindstorms NXT; Layer 3 - Computer Program. Future: Layer 1 - Bricks; Layer 2 - KISMET (object detection, line follower, color detection, speech recognition, accessory add-ons); Layer 3 - Touch Develop app.]
Pre-conditions:  Availability of an easy to plug-in phone casing/border design for the LEGO
elemental brick
 A Touch Develop app for LEGO Kinect in the Windows Store
 Availability of small-sized depth sensors
Post-conditions: The kid creates a moving robot with LEGO bricks and a Windows Phone (KISMET device)
Normal Course of Events:  A kid is interested in building an intelligent LEGO robot which can see and
move around the house
 He combines different LEGO bricks and tyres to make a DIY vehicle
 He downloads the Kinect LEGO app from the Windows Store
 He selects the input sensor (depth), the command ('Move while avoiding
obstacles') and the output (LEGO servo motor) in the app and saves this
configuration
 He plugs the phone onto the LEGO-built vehicle using the specially designed
casing/cover and connects the LEGO servo motor cord to the phone's ports
 He runs the saved configuration
 His cost-saving and easy-to-build robot is ready
 The robot moves around the entire house, avoiding obstacles with the help
of the small depth sensor
Assumptions: Availability of a simple-to-develop app to enable further automation of LEGOs
The Windows Phone casing directly fits the device into the LEGO brick, or there is a
phone version with grooved rather than smooth edges for fixing into the brick
Availability of hardware ports to power the motor and provide asynchronous, duplex
communication
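The select-save-run flow described above can be sketched in code. The snippet below is a minimal, hypothetical model of such a configuration app; the `RobotConfig` class, the command strings and the 0.5 m obstacle threshold are all illustrative assumptions, not an existing KISMET or Touch Develop API.

```python
from dataclasses import dataclass

# Hypothetical configuration model: the user picks an input sensor,
# a command, and an output device, saves the triple, and then runs it.
@dataclass(frozen=True)
class RobotConfig:
    input_sensor: str   # e.g. "depth"
    command: str        # e.g. "move_avoiding_obstacles"
    output: str         # e.g. "lego_servo_motor"

def decide_action(config: RobotConfig, obstacle_distance_m: float) -> str:
    """Very simplified control step: turn if an obstacle is closer than 0.5 m."""
    if config.command != "move_avoiding_obstacles":
        raise ValueError(f"unknown command: {config.command}")
    return "turn" if obstacle_distance_m < 0.5 else "forward"

config = RobotConfig("depth", "move_avoiding_obstacles", "lego_servo_motor")
print(decide_action(config, 0.3))  # -> turn
print(decide_action(config, 2.0))  # -> forward
```

A real app would loop this decision over the live depth stream and drive the servo motor through the phone's hardware port.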
B.1.1.5 Addressed consumer need
In the recent past there has been an outburst of the hacking culture and the DIY (Do-It-Yourself) revolution. This promotes the consumer need for basic building blocks with which people can create something for tomorrow.
B.1.1.6 Other Applications
- Build imaginative things like a seeing flower vase, LEGO 3D-vision helicopters, etc.
- Build interactive robots with basic LEGO bricks using KISMET's sensing capabilities
- People would be able to create their own creative stands with the Windows device
- Other creative uses that cannot be imagined now.
B.1.1.7 Target audience
In this case, the two major audiences will be kids (1-17 years old) in the 130 countries where LEGO is available, and hobbyists interested in exploring the same. Since it will be a creative tool, there is a high possibility that people of other age groups also get attracted to it.
B.1.1.8 Technology Considerations
- Re-usable port for plugging/powering and running external devices like servo motor
- Power consumption
- Dissipated heat due to small size of KISMET device
B.1.1.9 References
During my last MIT Media Lab visit, I met Prof. Mitchel Resnick (head of the Lifelong Kindergarten group that made the LEGO Mindstorms robotics kit). He has a clear view of the future: don't make something that people can merely use; give people something with which they can create unlimited things with their imagination. That is where the realization came that technology's purpose is not only to aid but also to empower people.
B.1.2 Dimensions based retail experience
B.1.2.1 Illustration
B.1.2.2 Context
Online shopping is a growing trend. The number of digital buyers in 2013 was around 157 million, and this number is expected to rise to 180 million by 2017 according to industry estimates. While shopping online, one of the major obstacles is figuring out what size would be a perfect fit for you, be it shoes, t-shirts or other items. With deals like cash on delivery or one-month replacement, consumers are less reluctant to buy things the digital way. Most replacements and order failures happen due to improper size fittings. This leads the company to spend extra money enabling door-to-door replacements. There is an opportunity to save this expenditure by providing the buyer with the right size the first time itself.
B.1.2.3 Feature Overview
3D scanning to know the right size of items being bought: this feature informs you, through depth scanning, of the right size of a product that you wish to buy. As each store has its own custom sizes, this feature can further be integrated with online stores to automatically search for items that will suit a buyer's requirements.
B.1.2.4 Use Case
Use Case ID: UC-2
Use Case Name: Buying shoes from online retail store
Actor: Youngster
Description: A boy wishes to buy new sports shoes for his upcoming sports event. He doesn't have
much time to go to the market. Thus, he opens the online store's site to buy
the shoes. However, he gets confused on seeing multiple options and is unsure which
of them would provide the perfect fit for his foot size.
Pre-conditions: A boy wants to buy sports shoes
Post-conditions: He orders shoes online using automated personalized fitting options
Normal Course of Events: - Once the boy opens the shoe retailer's site, it asks him to scan his
feet to help estimate his shoe size
- A KISMET 3D-scan child window opens
- The boy scans his feet and clicks 'Done'
- The child window closes
- The site now has a 3D image of the boy's feet along with all the required dimensions
- The site suggests shoe options based on an exact size fit for the boy's feet
- The boy chooses a shoe and orders it online
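The measurement step in this flow can be sketched as code. The sketch below is purely illustrative: it assumes the scan yields a 3-D point cloud of the foot in metres, and the size chart is a made-up example (real stores would supply their own).

```python
# Illustrative sketch (not a real KISMET API): estimate foot length from a
# 3-D point cloud of the scanned foot and map it to a catalogue shoe size.
def foot_length_cm(points):
    """points: iterable of (x, y, z) in metres; length taken along the y axis."""
    ys = [p[1] for p in points]
    return (max(ys) - min(ys)) * 100.0

def suggest_size(length_cm, chart=((24.0, 38), (25.0, 40), (26.0, 41), (27.0, 43))):
    """Return the smallest catalogue size whose maximum foot length fits."""
    for max_len, size in chart:
        if length_cm <= max_len:
            return size
    return chart[-1][1]  # fall back to the largest size

scan = [(0.02, 0.000, 0.5), (0.03, 0.152, 0.5), (0.05, 0.248, 0.5)]
print(suggest_size(foot_length_cm(scan)))  # 24.8 cm foot -> size 40
```

A production version would also measure width and arch height, which is why the scenario mentions "all the required dimensions".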
B.1.2.5 Addressed consumer need
Getting the right product through online shopping during the first attempt itself
B.1.2.6 Other Applications
- Multiple retail experiences like buying dress, goggles etc.
B.1.3 Kinect and Cortana integrated personal assistant
B.1.3.1 Illustration
B.1.3.2 Context
Microsoft Cortana is great at the following three things:
a. Finding something, if we divide things into verbs and objects
b. Helping you communicate with people in your life or people in the world, for example via Facebook or Twitter
c. Helping you remember things
Cortana is best at handling large amounts of information or text, as it directly jumps into the data.
B.1.3.3 Feature Overview
With the help of the depth sensor, Cortana will be able to see the 3D world. It can have a facial-feature-mimicking app to give a sense of talking to a friend. The depth sensor and RGB camera will see and analyze the environment, and based on the picture or environment mapping, Cortana will figure out the contextually relevant information for the individual. It will also provide suggestions accordingly.
B.1.3.4 Use Case
Use Case ID: UC-3
Use Case Name: Walking on the street with a digital friend named ‘Kinect Cortana’
Actor: A visually impaired person
Description: A visually impaired person needs a friend who is always nearby to verbally explain the
surrounding environment and alert him in case of danger nearby.
Pre-conditions: - Cortana (personal assistant) integration with the depth camera and RGB camera to
provide contextual information through voice
- Kinect and Facebook (social network) integration to do facial recognition and search
for names
Post-conditions: The visually impaired person is able to walk around with his new eyes, Kinect Cortana
Normal Course of Events: - The visually impaired person opens the 'See World' app
- He puts the KISMET device in his t-shirt's front pocket such that the Kinect
depth sensor and RGB camera sit slightly above the pocket
- He starts walking casually on the road as on any other day
- Kinect Cortana does real-time environment analysis and provides audio
feedback messages like, "Hey James, watch out. There is a skateboard
coming towards you from the front." or "I guess Rita, your new Facebook friend, is
coming towards you."
- The visually impaired person can hold a dialogue with Kinect Cortana. Visually impaired: "Hey
Cortana, can you see and tell me what the building in front is named?"
Kinect Cortana: "Sure. It is Macy's store."
Assumptions: Kinect sensing technology works in sunlight
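The feedback step in this use case can be sketched as code. The sketch below is a hedged illustration of the alert logic only: the detection tuples, user name and distance thresholds are assumptions, and the real perception pipeline (object detection, face recognition) is out of scope here.

```python
# Sketch of the alert logic in the hypothetical "See World" app: given
# detections from the depth/RGB pipeline as (label, distance in metres,
# approaching?) tuples, produce the spoken messages Kinect Cortana reads out.
def alerts(detections, user_name="James"):
    messages = []
    for label, distance_m, approaching in detections:
        if approaching and distance_m < 3.0:
            # moving hazards get a warning well before they arrive
            messages.append(f"Hey {user_name}, watch out. A {label} is coming towards you.")
        elif distance_m < 1.0:
            # static objects only matter when they are very close
            messages.append(f"{label} is right in front of you.")
    return messages

print(alerts([("skateboard", 2.0, True), ("bench", 5.0, False)]))
```

The thresholds would need tuning against walking speed and sensor range in any real deployment.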
B.1.3.5 Addressed consumer need
- Context based help
- 24*7 personal assistant availability
B.1.3.6 Other Applications
- Accident prevention on roads
- Real world based search
- Learning
- Increasing social consciousness
- Psychological therapy
B.1.3.7 Target audience
Anyone
B.1.3.8 Technology Considerations
- Kinect working in sunlight
- Advanced pattern matching and machine learning to determine context
B.1.3.9 References
Lend an eye: https://www.youtube.com/watch?v=eeFlpjuv8rs
B.1.4 Visual assistant for students
B.1.4.1 Illustration
B.1.4.2 Context
While many activities have been automated, there are still several tasks that require manual intervention. Some of these are high-precision activities like soldering and welding. These are cumbersome and prone to mistakes, as a single mistake can ruin the entire PCB (printed circuit board) or welding material. One way to reduce the inefficiencies is real-time guidance on where and how to perform the task (like soldering or welding).
B.1.4.3 Feature Overview
Digital instructor to aid learning and provide real-time feedback: this feature of the KISMET device works in three steps:
a. The ideal design is pre-fed into the KISMET device
b. The depth and RGB cameras compare the real-time work in progress with the ideal design
c. Real-time audio feedback is provided as to where the person is going wrong
B.1.4.4 Use Case
Use Case ID: UC-4
Use Case Name: Context aware KISMET soldering guide
Actor: Learner
Description: A person has to quickly solder a microcontroller onto a PCB (printed circuit board). It
is difficult for him to read, one by one, which pin is to be soldered where. He
needs assistance with soldering.
Pre-conditions: Person is learning to solder
Post-conditions: Person is able to complete the soldering accurately
Normal Course of Events: - The person feeds an image of the circuit into the 'Kinect Image Guide' app
- He fixes the PCB in a circuit-board holder
- He hangs the KISMET device upside down at a distance above the table so
that the RGB camera can clearly see the PCB at the bottom
- The person places the microcontroller on the PCB and starts the soldering
process for the circuit design
- The Kinect Image Guide app maps the original circuit image against the
real-time view of the circuit from Kinect's RGB and depth cameras
- The app provides audio instructions in case the person is not soldering the
correct pins
- Ideal behavior is suggested by the Kinect Image Guide app
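The compare-and-correct step above can be sketched as code. The sketch is a deliberately simplified illustration: it assumes the "ideal design" has already been reduced to a set of pad coordinates, and that the camera pipeline reports which pads currently have solder; all names are invented.

```python
# Simplified comparison step for the soldering guide: warn about joints that
# land on pads not in the design, and report which designed pads remain.
def check_progress(ideal_pads, detected_joints):
    wrong = sorted(detected_joints - ideal_pads)       # soldered but not in the design
    remaining = sorted(ideal_pads - detected_joints)   # in the design, not yet soldered
    feedback = [f"Pad {p} should not be soldered." for p in wrong]
    if not remaining and not wrong:
        feedback.append("All joints done. Good work!")
    return feedback, remaining

fb, todo = check_progress({(1, 1), (1, 2), (2, 1)}, {(1, 1), (3, 3)})
print(fb)    # warning about the stray joint at (3, 3)
print(todo)  # pads still to solder
```

Mapping camera pixels to pad coordinates (registration between the fed-in design and the live view) is the hard part a real app would add on top of this.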
B.1.4.5 Addressed consumer need
Shortage of real-time instructors for 1:1, personalized, quality teaching
B.1.4.6 Other Applications
- Performance assessment/quality checks of items being developed
- Manufacturing and product-design factories can use it to train people or verify the accuracy of the
work being accomplished
- Coaching institutions can make a customized learning package
B.1.4.7 Target audience
- Student
- Learner
B.1.5 Kinect + a small projector = Live Stencils
B.1.5.1 Illustration
B.1.5.2 Context
Stencils are a great way to learn. If a depth sensor and a projector are both integrated into the mobile device (KISMET), we can build 'Live Stencils' for learning purposes.
B.1.5.3 Feature Overview
A learning tool to build and project a stencil and provide real-time feedback on the quality of tracing or drawing.
B.1.5.4 Use Case
Use Case ID: UC-5
Use Case Name: Learning alphabet writing through live stencils
Actor: A 3-year-old kid
Description:
Pre-conditions: - Depth sensor and small projector integrated in the mobile device (phablet, tablet or
smartphone)
- ‘Kinect Stencil’ App downloaded on the device
Post-conditions: The kid learns how to write the alphabet
Normal Course of Events: - The KISMET device is kept on a small stand facing down towards the table
- White paper is placed on the table
- The 'learning stencil' (alphabet letters in this case) is projected onto the paper
- The kid traces the same letters on the paper and learns
- The RGB camera and depth sensor track whether the tracing is complete
- Once it is complete, the app reviews the final output and encourages the kid:
'Good work'
- The kid says 'Next' or 'Repeat'
- The app moves on to the next projection or stays, depending on the audio
command given
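The "track whether the tracing is complete" step can be sketched as code. In this minimal sketch, the projected letter and the pen strokes are both reduced to sets of cells on a coarse grid; the 90% completion threshold is an assumed value, not part of the scenario.

```python
# Sketch of the tracing check in the hypothetical "Kinect Stencil" app:
# completion is the fraction of stencil cells that the pen has covered.
def coverage(stencil_cells, traced_cells):
    return len(stencil_cells & traced_cells) / len(stencil_cells)

def verdict(stencil_cells, traced_cells, threshold=0.9):
    return "Good Work" if coverage(stencil_cells, traced_cells) >= threshold else "Keep tracing"

letter_l = {(0, 0), (1, 0), (2, 0), (2, 1)}                 # a blocky letter "L"
print(verdict(letter_l, {(0, 0), (1, 0), (2, 0), (2, 1)}))  # complete trace
print(verdict(letter_l, {(0, 0), (1, 0)}))                   # half done
```

A real implementation would also penalize strokes outside the stencil, which this sketch ignores.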
B.1.5.5 Addressed consumer need
Lack of intuitive ways to do drawing-based learning after content digitization
B.1.5.6 Other Applications
In electronics engineering, for soldering purposes
B.1.5.7 Target audience
- Students
- Artists
B.1.5.8 Technology Considerations
- Miniaturizing projector size
B.1.5.9 References
Not directly related (Cube keyboard projection: https://www.youtube.com/watch?v=rCghJvjB7rI)
B.1.6 Mobile health-checkup
B.1.6.1 Illustration
B.1.6.2 Context
This scenario addresses the need to gather health-related data for preventive care, and then to use that data to build free, tracked health exercises aimed at health improvement.
B.1.6.3 Feature Overview
a. Health Tests application:
It provides the capability to do mobile-based health check-ups. The application provides a list of test options,
such as an eye scan or a dental scan.
b. Game-based health-improvement exercises:
The test reports are used to create custom game exercises on the KISMET device, like rolling the eyes 10 times
or moving the neck up and down.
B.1.6.4 Use Case
Use Case ID: UC-6
Use Case Name: Sending eye reports to doctor & improving health
Actor: A man
Description: -
Pre-conditions: A man’s eyes are paining
Post-conditions: He gets his eyes tested and have preliminary report sitting at home
Normal Course of Events: - He opens health test app
- Selects eye scan from the test list & follows the instructions given on the device
- He takes his eyes near the KISMET’s RBG Camera and depth sensor
- On a count of 3, 2 and 1… HD eye scanning starts
- Post scan, preliminary analysis is done by KISMET device to find any major issue like
stressed eye etc.
- In parallel, eye tests reports are sent to the doctor on his email
- Based on preliminary analysis, the app finds three tailored eye exercises for the
person
- User selects an exercise for trying out
- Kinect’s depth camera tracks the eye gestures & guides on real time basis
B.1.6.5 Addressed consumer need
Consumerization of basic health check-ups
B.1.6.6 Other Applications
-
B.1.6.7 Target audience
Everyone (this scenario has a wider reach than the previous one)
B.1.7 Mood based environment creation
B.1.7.1 Illustration
B.1.7.2 Context
Ubiquitous systems are present everywhere, 24*7. They are actuated only when the person needs them; otherwise they remain hidden behind the virtual walls of the world. This need doesn't necessarily have to be stated explicitly through words, like saying, "XBOX On". Many times the needs are implicit too, and implicit needs are very often conveyed by body language and facial expressions. If we can tap these implicit messages, we can create a more intelligent and ubiquitous environment.
B.1.7.3 Feature Overview
Mood-based environment actions: non-verbal messages are read and understood by the KISMET device on a real-time basis. Based on this knowledge, appropriate suggestions are made through the notification bar on the KISMET device, or actions are taken in the environment.
B.1.7.4 Use Case
Use Case ID: UC-7
Use Case Name: Mood based music listening
Actor: Working Professional
Description: A working professional is driving home after a hectic day. He needs some
refreshing music. He puts his KISMET device near the car's dashboard and switches on
the radio. Slow music starts playing, but the person's expression shows that
he doesn't like the music.
Pre-conditions: The person does not like the music, as is evident from his expressions
Post-conditions: The person gets to hear music of his choice
Normal Course of Events: - The KISMET device detects changes in facial expressions
- From those expressions, it infers that the person does not like something
- To determine this 'something', the environment is scanned for the external
cause
- The microphone detects slow music being played from the car's radio
- Based on this inference, the KISMET device says, "You look dull, friend. Shall
I put on a radio station playing livelier music for you?"
- Seeing the proactive behavior of the KISMET device, the person's face glows
with the remark, "Yes"
- KISMET scans other radio stations and switches to a more suitable station.
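The infer-then-act step in this use case can be sketched as a simple rule table. The rules below are invented for illustration; a real system would learn the mapping from expression and context to action rather than hard-code it.

```python
# Rule-based sketch of the mood-to-action step: combine the inferred facial
# expression with a scanned environmental cue (here, what the microphone
# hears) and look up a proposed action.
RULES = {
    ("dull", "slow_music"): "Suggest a livelier radio station",
    ("stressed", "silence"): "Suggest a calming playlist",
}

def propose_action(expression, environment_cue):
    # fall back to doing nothing when no rule matches, so the device
    # stays hidden unless it is reasonably confident
    return RULES.get((expression, environment_cue), "No action")

print(propose_action("dull", "slow_music"))
print(propose_action("happy", "slow_music"))  # no rule fires
```

The "No action" default reflects the context section's point that ubiquitous systems should stay hidden unless actually needed.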
B.1.7.5 Addressed consumer need
Difficulty in managing non-intuitive devices
B.1.7.6 References
Apple’s patent on mood based ad targeting: http://techcrunch.com/2014/01/23/apple-patent-explores-
mood-based-ad-targeting/
Apple’s patent on mood based delivery system: http://appleinsider.com/articles/14/01/23/apple-investigating-
mood-based-ad-delivery-system
Mood based music: http://www.moodfuse.com/
Depression detection through Kinect: http://www.technologyreview.com/view/513126/kinect-powered-
depression-detector-is-amazing-and-creepy/
B.1.8 Auto-sync of multi Kinects
B.1.8.1 Illustration
B.1.8.2 Context
In the recent past, a lot of focus has been given to quick syncing between multiple devices, television synchronization with other smart devices being chief among them. With the advent and miniaturization of smart cameras and sensors, a holistic model is needed for quick syncing of multiple sensor and camera feeds to produce immersive experiences.
B.1.8.3 Feature Overview
Multi-KISMET device auto-sync feature: based on a master-slave model, one device from among several KISMET devices acts as the master. All the other KISMET devices are slaves that send real-time information about the environment to the master device. This can help in creating a virtual 3D world out of real-world 3D data on a real-time basis.
B.1.8.4 Use Case
Use Case ID: UC-8
Use Case Name: Hollywood style real-time CGI cinematography
Actor: Cinematographer
Description: Using multi-KISMET sync and 3d output manipulation application, the
cinematographer is able to blend real world with virtual world on a real-time basis.
This helps in creating a draft of CGI movies by finalizing the scene view (angle of shoot
for a particular scene)
Pre-conditions: - Development of app for viewing multi-KISMET sync 3d output and doing
manipulation
- For multiple mobile devices, the option to make one of them ‘Master’ which
would receive wireless outputs from all other nearby ‘SLAVE’ devices
Post-conditions: Real time cinematized CGI shot
Normal Course of Events: - A cinematographer places multiple KISMET devices at different angles
- Sets all these devices as 'SLAVE' to send parametric values to a master device
- He keeps one KISMET device (mobile Kinect) in his hand, which acts as the master device
- He opens the CGI app
- He can now see the different slave views, either sliced or consolidated in a 3d format
- He manipulates the real-time view by placing smart objects or removing others
- The cinematographer is able to see and understand the CGI shot from multiple angles and take quick decisions on which would suit the movie's requirements. This also helps in deciding where virtual objects will fit in the scene
B.1.8.5 Addressed consumer need
- People have been facing a lot of trouble syncing and utilizing multi-Kinect data
- Real-time cinematography for quick multi-scene views at the shooting location, enabling effective decisions. The multi-Kinect sync feature can help provide the following additional features:
  - Zoom in 3d
  - Add a virtual set in 3d
  - Crop and shift items in real time
  - Map actor intent to a CG actor
  - Add visual effects
- Increasing the area of influence for life-size interactions
- Difficulty in overlapping Kinect data with DSLR footage
B.1.8.6 Other Applications
- Home Security with room based view and the master device can also be at remote location
B.1.8.7 Target audience
- Hollywood
- Regional movie-making enthusiasts
- 3d modelers and consultants
B.1.8.8 Technology Considerations
- Real-time Bluetooth communication model for sending high bandwidth data like video
- Battery life sustenance during video feed/HD image sync
B.1.8.9 References
RGBD Toolkit: http://www.rgbdtoolkit.com/
B.1.9 Kinectduino
B.1.9.1 Illustration
B.1.9.2 Context
Arduino is one of the most talked-about open source platforms for prototyping Internet of Things scenarios within your home. While it can do a brilliant job of communicating with multiple appliances and activating different experiences at home, it is cut off from real-world knowledge unless the person carries an Arduino or similar development platform everywhere he goes. This makes it difficult to create context-aware environments by merging outside-home and inside-home ontologies and knowledge. One way to deal with this is to have two companions: one which provides real-world information, and another which uses this information to create a context-aware, personalized environment. The best way to get this external-world information is by integrating with a device that stays and moves around with the person in the external world.
B.1.9.3 Feature Overview
Connecting KISMET (a Kinect-enabled mobile device) with Arduino or a similar development platform:
KISMET devices are mobile, like smartphones, tablets and phablets, and a user usually carries one along everywhere in the external world. This provides an untapped opportunity to gather and analyze contextual data to activate personalized experiences at home. Thus, the two companions discussed in the 'Context' section above are: Arduino fueling the home, and the KISMET device understanding the context from external-world information.
KISMET (external world) + Arduino (internal world) = seamless world creation, by merging the learnings of one world with the personalization settings determined for the other world.
There are two ways in which the two companions can interact:
A. Real-time (proactive): For example, if it is too sunny outside, the refrigerator (via Arduino) is informed in real time to lower its temperature and have cold water available once the person gets home.
B. Asynchronous (reactive): The person gets home after a long day of work and plugs his KISMET into the Arduino at home. The Arduino takes the data dump and converts it into knowledge useful for the home setting. One example is determining the person's mood from the day's happenings; based on that, the Arduino changes the environment, e.g. if he is upset, the curtains display lively flower animations.
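The two interaction modes can be sketched as a small message-routing model. All class and event names here are illustrative assumptions; a real system would push urgent events over the network and hand the rest over when the device docks.

```python
# Sketch of the proactive/reactive companion model (names are hypothetical).

class KismetCompanion:
    def __init__(self):
        self.pending = []          # events held for the asynchronous dump

    def observe(self, event, urgent=False):
        """Proactive events go out immediately; the rest wait for docking."""
        if urgent:
            return ("send_now", event)   # e.g. "hot outside" -> chill the water
        self.pending.append(event)
        return ("queued", event)

    def dock(self):
        """Reactive mode: hand the day's data dump to the home Arduino."""
        dump, self.pending = self.pending, []
        return dump
```

The Arduino side would act on `send_now` events immediately and mine the docked dump (mood samples, places visited) at leisure.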
B.1.9.4 Use Case
Use Case ID: UC-9
Use Case Name: Preparing for party through real-time feedback from IoT world
Actor: Businessman
Description: A businessman is throwing a party for his friends at home tonight. He needs to buy
some food items for cooking dinner. He is not sure what items are available at home,
and he doesn't want to buy duplicates. He doesn't have time to go home, check, and
come back to the food store.
Pre-conditions: The businessman needs to buy food items which are not available at home
Post-conditions: He buys only those items which are not available at home
Normal Course of Events: - The person goes to the food store
- Scans a cold-drink bottle in 3d and sends the depth-based model to the Arduino connected to the refrigerator at home.
- The Arduino, connected to the Azure platform, receives the 3d images and compares them with the items available in the refrigerator.
- After comparing, the Arduino figures out that the item (a specific brand of cold-drink bottle) is not available at home.
- The person is notified that the item is unavailable at home.
- The person goes ahead and completes his shopping.
B.1.9.5 Addressed consumer need
- Enabling IoT (Internet of Things) at home
- Creating a more personalized and context-aware environment at home based on happenings in the outside world
B.1.9.6 Other Applications
- Keeping family ubiquitously connected
- Remote access to home
B.1.9.7 Target audience
Anyone
B.1.9.8 Technology Considerations
Arduino is a DIY platform. In an actual scenario, more reliable devices can be used for communication via Azure.
B.1.9.9 References
Windows Phone 8 communicating with Arduino using Bluetooth:
http://developer.nokia.com/community/wiki/Windows_Phone_8_communicating_with_Arduino_using_Bluetooth
B.1.10 K-Enchant- A magical framework
B.1.10.1 Illustration
B.1.10.2 Context
Teaching kids is not an easy task. Innovative ways have been explored all around to capture their interest in learning and knowledge acquisition. One of the most effective among them is the creation of stories to teach kids. These stories transport them to a new world where they can relate and learn more; 'fairy stories' are one among them. What if we could provide a framework for teachers to create 3d fairy stories which they can play in school, telling the stories using gestures?
B.1.10.3 Feature Overview
A framework for creating 'fairy-tale-effect powered stories' using 3d animations activated by gestures. It can be thought of as a futuristic version of PowerPoint, where teachers create a new storyline in four parts:
a. Select an animated 3d story template
b. Add the text, videos or hyperlinks (basic storyline)
c. Add the 3d animated transitions (made with After Effects or other 3d software), like pixie dust
d. Map Kinect gestures to the transitions as triggers for activating them
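Step (d) above amounts to a lookup from recognized gestures to transitions. A minimal sketch, with gesture and transition names invented for illustration:

```python
# Hypothetical gesture-to-transition mapping for a story (step d above).
TRANSITIONS = {
    "zoom_in": "time_travel_countdown",
    "raise_both_hands": "drum_beat_sand_overlay",
    "say_story_starts": "pixie_dust_intro",
}

def trigger(gesture):
    """Return the transition to play for a gesture, or None to ignore it."""
    return TRANSITIONS.get(gesture)
```

At authoring time the teacher fills this mapping; at playback the gesture recognizer feeds detected gestures into `trigger` and the returned animation is rendered on the projected screen.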
B.1.10.4 Use Case
Use Case ID: UC-10
Use Case Name: Teaching history lesson using 3d animated story telling framework
Actor: Teacher
Description: The teacher has to make her students learn about World War II.
Pre-conditions: The teacher has created a 3d animated storyline which will react to her gestures in
the class. This has been done before the class starts.
Post-conditions: Kids learn about World War II through an engaging and immersive experience
Normal Course of Events: - The teacher connects her KISMET device (where she has created the Kinect student story) to the class projector
- She places her KISMET device right below the projector with the Kinect camera facing towards her
- With her varying hand gestures and audio, different 3d animations happen on the projected screen, making the kids feel that they are in that story era. A few examples:
  - While starting the story, the teacher says "and the story starts" to trigger the pixie-dust introduction
  - She does a zoom-in gesture and a time-travel countdown starts, with parameters pre-defined by the teacher
  - The teacher raises both her hands and drum beats start playing, with sand particles overlaying the text
B.1.10.5 Addressed consumer need
Need for immersive story creation for better understanding and memorability
B.1.10.6 Other Applications
- Interactive Museums
- Elevator Pitch
- Theme parks
B.1.10.7 Target audience
- Teachers
- Story tellers
- Presenters
B.1.10.8 References
Marco Tempest- A magical tale:
http://www.ted.com/talks/marco_tempest_a_magical_tale_with_augmented_reality
B.1.11 Harry Potter style interactions
B.1.11.1 Illustration
B.1.11.2 Context
The future is building on Wi-Fi-enabled connected environments. Currently, each person uses roughly 2 connected devices at home, mainly phones and tablets. However, with the dawn of pervasive computing, an outburst of additional connected home devices is imminent. Not every such device would require huge processing power; some would be just low-power animated displays, like Wi-Fi-enabled photo frames. This would actualize Harry Potter style animated devices too.
B.1.11.3 Feature Overview
Wi-Fi enabled environmental accessories: this would allow form-factor-independent content transfer for the IoT world. Environmental accessories include everyday things like smart wallets, smart walls, smart photo frames etc. These accessories will be Wi-Fi enabled, low-power displays and will have only one mode of communication (from the KISMET device to the accessory). The information process includes two simple steps:
a. Pick-up gesture: generally, a short, animated clip is picked up from a KISMET device.
b. Drop gesture: with the KISMET device in hand, a drop gesture is directed towards the other device where this looping clip needs to be shown. Based on the direction and the depth, a Wi-Fi connection is automatically established with the related device and the content is transferred seamlessly.
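The drop-gesture step can be sketched as choosing the accessory whose registered direction and depth best match the gesture. The device records and tolerance values below are assumptions, not part of any real protocol:

```python
# Sketch of resolving a drop gesture to a target accessory: the device
# whose bearing and depth best match the gesture wins. Tolerances and
# the (name, bearing_deg, depth_m) record shape are assumptions.

def resolve_target(gesture_bearing, gesture_depth, devices,
                   max_bearing_err=15.0, max_depth_err=0.5):
    """devices: list of (name, bearing_deg, depth_m). Returns best name or None."""
    best, best_err = None, None
    for name, bearing, depth in devices:
        b_err = abs(bearing - gesture_bearing)
        d_err = abs(depth - gesture_depth)
        if b_err > max_bearing_err or d_err > max_depth_err:
            continue
        # normalize both errors so neither dominates the score
        err = b_err / max_bearing_err + d_err / max_depth_err
        if best_err is None or err < best_err:
            best, best_err = name, err
    return best
```

Once a target is resolved, the Wi-Fi session with that accessory can be opened and the clip pushed.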
B.1.11.4 Use Case
Use Case ID: UC-11
Use Case Name: Transfer of short family video clip to smart wallet
Actor: Husband
Description: A husband is going out of town for a few months, leaving his family at home. He
deeply loves his family and wants to carry a looping, 2-second animated family clip along
in his wallet
Pre-conditions: The wallet has a low-power flexible display enabled by Wi-Fi
Post-conditions: The husband can see his family's clip whenever he misses them
Normal Course of Events: - The husband opens the clip on his KISMET device
- Using the Kinect integrated in the KISMET device, he does a pick-up gesture. This opens the 'send to…' page
- He then does a drop gesture towards the wallet within the Kinect camera's field of view
- Based on direction and depth, the related Wi-Fi module is identified and automated content transfer is enabled
- The husband's wallet now has the animated clip running
- To save power, the clip is ON only when the wallet is opened
- Whenever the wallet is closed, the display also shuts down
B.1.11.5 Addressed consumer need
Need to create more intuitive displays for memory lane
B.1.11.6 Other Applications
- Finding animated wallpapers and putting them on home walls
B.1.11.7 Target audience
Anyone
B.1.11.8 Technology Requirements
- Protocol to enable automated Wi-Fi transfer based on Kinect gestures
- Requirement for small-sized, low-power displays
B.1.11.9 References
Nil
B.1.12 Self-paced K-12 education
B.1.12.1 Illustration
B.1.12.2 Context
According to the World Literacy Foundation, more than 796 million people in the world cannot read and write, and 67 million children do not have access to primary education. This untapped potential, if empowered with self-paced education, can create many more future leaders and change agents. Several organizations worldwide are working towards providing easy access to K-12 education for all children below the age of 14 years.
B.1.12.3 Feature Overview
A self-paced learning framework enabled by mobile Kinect. This requires building free educational applications using contextual information present in the environment.
B.1.12.4 Use Case
Use Case ID: UC-12
Use Case Name: Understanding monumental structures
Actor: Kid
Description: A kid goes out for a picnic with her family. Her family gives her a KISMET device to stay in touch
while they rest in the park for some time.
Pre-conditions: The kid sees a monument and is interested in knowing more about it
Post-conditions: The kid learns about the monument based on her interest and environment
Normal Course of Events: - The kid raises her phone with the Kinect RGB & depth cameras facing the monument
- The kid points a finger at the monument and asks, "Cortana, what is this?"
- The KISMET device does a 3d scan and depth mapping
- Based on the above analysis, it crops the image region pointed at by the finger
- The context of the image is determined based on other environmental factors
- An advanced Bing image search is done based on context, like GPS location etc.
- Results are presented through audio, based on confidence level
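The recognition flow above can be sketched as a small pipeline in which the cropped region and the gathered context feed a search function, and only confident results are spoken. The search function is a stub standing in for Bing image search; all names and the confidence threshold are assumptions:

```python
# Sketch of the identify-and-speak flow. `search` stands in for a
# context-aware image search (e.g. Bing); its result shape is assumed.

def identify(pointed_region, context, search, min_confidence=0.6):
    """Run the search with context and speak only confident results."""
    results = search(pointed_region, context)
    best = max(results, key=lambda r: r["confidence"], default=None)
    if best and best["confidence"] >= min_confidence:
        return f"This looks like {best['label']}."
    return "I am not sure what this is."
```

The returned string is what the device would read out via text-to-speech.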
B.1.12.5 Addressed consumer need
- Providing pull based mechanism instead of push
- Rural empowerment through free educational games
B.1.12.6 Other Applications
Guide for visually impaired
B.1.12.7 Target audience
Students/Learners
B.1.12.8 Technology Requirements
Seamless integration of RGB and depth sensors with Bing
B.1.12.9 References
Microsoft’s Project Adam
B.1.13 Kinect enabled real-time, road safety warnings
B.1.13.1 Illustration
B.1.13.2 Context
As the world is switching towards digital, people are more focused on their devices than on the road. Absorbed in reading or watching something on their mobile device, they become less conscious of the environment while walking. This makes them accident-prone on the road and in other similar places. Is there a way they can be safeguarded while walking?
B.1.13.3 Feature Overview
Environment-based message alert model: an RGB camera & depth sensor are placed on the back of the mobile device. Alternatively, a miniaturized rotating Kinect can sit on top of the mobile device; rotating sensors and camera provide the flexibility to integrate multiple scenarios based on directional need. When an emergency is detected, alert messages are sent to the device's notification bar.
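The alert model can be sketched as estimating the closing speed of the nearest obstacle from two consecutive depth samples and warning when the time to contact falls below a threshold. The sampling interval and threshold are illustrative assumptions:

```python
# Sketch of depth-based hazard warning: two depth samples of the nearest
# obstacle give a closing speed and hence a time to contact (TTC).

def time_to_contact(d_prev, d_now, dt):
    """Seconds until contact, or None if the obstacle is not approaching."""
    closing = (d_prev - d_now) / dt          # m/s toward the user
    if closing <= 0:
        return None
    return d_now / closing

def should_alert(d_prev, d_now, dt, threshold_s=2.0):
    """True when the obstacle would be reached within the threshold."""
    ttc = time_to_contact(d_prev, d_now, dt)
    return ttc is not None and ttc < threshold_s
```

When `should_alert` fires, the background app would push the warning to the notification bar.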
B.1.13.4 Use Case
Use Case ID: UC-13
Use Case Name: Preventing road accident
Actor: College Student
Description: A college student is walking on the road. He is busy texting his friends and his eyes
are always on the smartphone.
Pre-conditions: A speeding car is coming towards the student. There is also an open pothole 200 m
away which he must avoid.
Post-conditions: The student is saved from being hit and from falling into the pothole.
Normal Course of Events: - The RGB camera & depth sensor scan the 3d world
- A background app on the device analyzes the environment
- Discrepancies and dangerous items are identified in the forward walking path
- The notification bar is used to inform the student about the dangers ahead
B.1.13.5 Addressed consumer need
Personal safety on the road
B.1.13.6 Other Applications
-
B.1.13.7 Target audience
Anyone
B.1.13.8 Technology Requirements
KISMET device to work in daylight
B.1.14 3d modelling and gesture enabled print
B.1.14.1 Illustration
B.1.14.2 Context
3d modelling and design have often been left to the mercy of high-end software, which is often complicated and has a steep learning curve due to the complexities of 3d-world manipulation. People thus require a new way of designing for the future. NUI (Natural User Interface) shows a promising future in this regard.
B.1.14.3 Feature Overview
- Gesture based 3d modelling
- Gesture based panoramic shot building through 2d data stitching
B.1.14.4 Use Case
Use Case ID: UC-14
Use Case Name: 3d scanning of face
Actor: A man named ‘X’
Description: X is interested in 3d printing his face. He doesn't have much knowledge of high-cost
3d scanners, nor does he know how to use 3d software like Autodesk's
Pre-conditions: - An application which stitches 2d images asynchronously using depth data,
with gestures to define the border and take the snapshot
- A wireless messaging model to transfer the 3d model for 3d printing
Post-conditions: A 3d model of the person's face, for 3d printing or for use in the virtual world
Normal Course of Events: - X downloads the 3d scan app
- He puts the KISMET device at a distance
- He then defines the four corners of the real-world face which he wants to capture
- The app utilizes depth data to map only the relevant face data from one side
- X repeats the same process from other angles
- Automatic 2d snapshot stitching happens within the same app to make the 3d model
- He uses the export button to wirelessly send the face model for 3d printing
B.1.14.5 Addressed consumer need
- Cost-effective 3d modelling
- Steep learning curve of existing 3d applications
B.1.14.6 Other Applications
Exporting real-world characters into the digital world, for Second Life or a 3d avatar for Skype
B.1.14.7 Target audience
Multi-segments including families
B.1.14.8 Technology Considerations
- Near-field sensor miniaturization
- Depth camera usage for background cutout and border definition (includes automated green-screen editing)
B.1.15 Kinect Active Health
B.1.15.1 Illustration
B.1.15.2 Context
With the advent of digitization, people tend to spend more time in front of electronic devices. Excessive usage of these devices often brings health problems like back pain, dry eyes, fatigue etc. By integrating Kinect's capabilities into a mobile device, active health feedback about the person can be provided to the user. Let us see how.
B.1.15.3 Feature Overview
Unconscious health chart preparation for the user: KISMET's RGB camera and depth sensors track real-time health information about the person in the background while he works on other items on the mobile device. Whenever an emergency is found, a notification message pops up on the device giving real-time feedback. If the user wants to consciously see his health report, he can quickly access a graphical health-chart app prepared from Kinect's data; the KISMET health chart can be navigated through a calendar.
The user's health is determined using some of the following parameters:
- Body gestures
- Smiles
- Voice changes
- Eye movement
- Heart beat
- Facial changes
- Environmental factors
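A minimal sketch of turning such parameters into the notification feed, assuming made-up parameter names and threshold values:

```python
# Sketch of the unconscious health chart: per-day observations are
# checked against simple thresholds to produce notification flags.
# Parameter names and limits are illustrative assumptions.

LIMITS = {
    "wrong_posture_hours": 2.0,
    "dry_eye_hours": 4.0,
    "heart_rate_bpm": 100,
}

def daily_flags(observations):
    """observations: dict of parameter -> measured value for one day."""
    return [f"{param}: {value} exceeds {LIMITS[param]}"
            for param, value in observations.items()
            if param in LIMITS and value > LIMITS[param]]
```

Each day's flags would feed both the pop-up notifications and the calendar-navigable chart.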
B.1.15.4 Use Case
Use Case ID: UC-15
Use Case Name: Providing active health feedback to the mobile user
Actor: Mobile user
Description: So far, tracking and quantifying real-time health has been difficult, as sensors
couldn't be present everywhere. However, with the possibility of a depth sensor
integrated into a mobile device, there is now a means to track and understand
patterns of deteriorating/improving health.
Pre-conditions: A person using a KISMET device has been continuously working for the past 8 hours.
Without eating food, he is compromising his health for work.
Post-conditions: He gets to see a detailed health report on a real-time basis
Normal Course of Events: - The mobile user has been continuously working on the mobile device for the past 8 hours
- The mobile user gets a notification message on the KISMET device. It says, "Kinect Message: You have an increased heartbeat. Take rest. To know more, click here."
- The user clicks on the message to see the trend
- The graphical detailed report looks like:
- 5th July 2014: 20% voice congestion (deviation from original voice)
- 6th July 2014: Four times stress detection (bow gestures)
- 7th July 2014: 3 hours of wrong sitting posture
- 8th July 2014: 8 hours of dry eyes
- 9th July 2014: No smile detected in the entire day
- 10th July 2014: Increased heart-beat rate in the last hour
B.1.15.5 Addressed consumer need
Preventing health related issues due to excessive usage of electronic devices
B.1.15.6 Other Applications
These reports can be used by the doctors to suggest health improvement measures
B.1.15.7 Target audience
Extensive users of electronic devices
B.1.15.8 Technology Considerations
Building Kinect integrated calendar app for seeing date wise or accumulated health over a time span
B.1.15.9 References
Kinect’s heart rate monitor: http://thenextweb.com/microsoft/2013/05/22/the-new-xbox-one-kinect-tracks-your-heart-rate-happiness-hands-and-hollers/
B.2 Accessory based scenarios
B.2.1 Virtual OneNote Surface
B.2.1.1 Illustration
B.2.1.2 Context
Dual effort is required when whiteboard information written with marker pens has to be converted into a digital format. Interactive whiteboards have been developed to eliminate this problem and have been used in several places. However, they are very costly, at around $1000, which is beyond the reach of the normal consumer. Thus, there is a need for a cost-effective solution for translating whiteboard text into a digital format.
B.2.1.3 Feature Overview
- Real-world area to OneNote mapping for writing: Kinect maps the four corners of any real-world surface to a OneNote page. To do so, the user is required to calibrate the four corners of the real-world surface using hand-gesture locking.
- Real-time transfer of content being written on the board to OneNote: to support transferring content to OneNote in real time, one of two methods can be adopted.
Method 1: Replacing the whiteboard with a magic slate and using a wireless pen to write on the slate
The age-old magic slate for kids helps in writing glowing text. A big, sophisticated version of the same can be used for writing on the board. This will only be used for visual feedback to the user. To do the real data transfer, a wireless pen mouse (which opens OneNote on a click) would have to be designed for writing on the magic slate. The slider at the bottom would erase the written text without using any consumable material like a marker.
Method 2: Coating any surface with ZnS and using a wireless UV pen to glow text and transfer content
Per the photoluminescence property, whenever UV light is shone onto a ZnS-coated surface, the lit area starts glowing for a specific time. Some people have explored writing text with it. How long the text stays depends on the ZnS combination being used.
This is still a concept and might not work in daylight scenarios; its workability needs to be tested.
- Use of #tags for special commands:
Using special #tags while writing on the board will help perform certain operations in OneNote, like opening a new section named after the text following the #tag.
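The #tag idea can be sketched as scanning recognized board text for tags and mapping each to a OneNote-side action. The `#section` tag and the action vocabulary are invented for illustration:

```python
import re

# Sketch of #tag command parsing: recognized board text is scanned for
# tags, each mapping to a OneNote-side action. Tag names are assumed.

def parse_commands(line):
    """Return (plain_text, commands) for one recognized line of board text."""
    commands = [("new_section", name)
                for name in re.findall(r"#section\s+(\w+)", line)]
    plain = re.sub(r"#section\s+\w+", "", line).strip()
    return plain, commands
```

The plain text goes into the current OneNote page, while each command is executed against the notebook (e.g. creating a section).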
B.2.1.4 Use Case
(Illustration: a wireless pen mouse transfers the content digitally to OneNote; visual feedback is given to the person writing on the board through the magic slate or ZnS.)
Use Case ID: UC-16
Use Case Name: Auto-transfer of whiteboard content to OneNote
Actor: Working Professional (WP)
Description: WP is having a meeting and wants to explain some concepts on the whiteboard.
After the explanation, he wants the text auto-transferred to OneNote and sent to
all meeting attendees
Pre-conditions: - Setting up one of the methods (as mentioned above) to transfer content
to OneNote and provide visual feedback to the user
- A KISMET app to compare text coming from the wireless pen mouse against visual data
from the RGB camera for a sanity check
Post-conditions: OneNote copy for text written on whiteboard for sharing and receiving feedback
Normal Course of Events: - WP comes 5 min prior to the meeting and puts his KISMET device on a stand facing the whiteboard
- He starts the 'OneNote Text Transfer' app to calibrate the four corners of the whiteboard for mapping to OneNote
- He goes to a corner of the board, points at the corner with a finger and locks the corner with another gesture
- WP repeats a similar process for the other corners
- Once the other employees join the meeting, he starts the text-recording function of the 'OneNote Text Transfer' app
- Data being written on the board is sent to OneNote in real time using the wireless pen, and visual feedback is provided to the user using one of the two methods mentioned in the section above
- The KISMET OneNote compare feature cross-checks the data with visuals from the RGB camera for integrity
- After the meeting, WP shares this digital note copy with the meeting attendees
B.2.1.5 Addressed consumer need
Difficulty in converting text and drawings written on a whiteboard into a digital format for future reference and usage
B.2.1.6 Other Applications
This is a limited use scenario
B.2.1.7 Target audience
- Business Professionals
- Teaching Professionals
B.2.1.8 References
- Light Writer: http://www.gijsvanbon.nl/kryt.html
- Interactive UV t-shirts: http://www.thisiswhyimbroke.com/interactive-uv-light-shirts
B.2.2 Five senses Kinect
B.2.2.1 Illustration
B.2.2.2 Context
Instead of focusing on just two senses (mainly sight and hearing), the feature set of next-generation Kinects should be extended to enable the other three senses too, namely touch, smell and taste. While taste still seems to be in the early stages of exploration, there has been some good progress with smell sensors and haptic feedback.
B.2.2.3 Feature Overview
Kinect-enabled haptic cover: the KISMET device will have a specially designed cover to transfer haptic messages. It is a flexible cover with small pins inside which can increase or decrease their height to create 3d shapes; the cover changes shape based on the different heights of the protruded pins.
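Driving the pin matrix can be sketched as downsampling a depth map to the cover's pin grid, with the nearest point in each cell setting the pin height. Grid size, height range and the 'nearer is taller' convention are assumptions:

```python
# Sketch of mapping a depth map onto the haptic cover's pin grid.
# Nearer surfaces produce taller pins; parameters are assumptions.

def to_pin_heights(depth_map, pins_x, pins_y, max_height_mm=5.0):
    """Map an HxW depth grid (metres; larger = farther) to pin heights."""
    h, w = len(depth_map), len(depth_map[0])
    d_min = min(min(row) for row in depth_map)
    d_max = max(max(row) for row in depth_map)
    span = (d_max - d_min) or 1.0            # avoid division by zero
    heights = []
    for py in range(pins_y):
        row = []
        for px in range(pins_x):
            # the nearest depth within this cell drives the pin
            ys = range(py * h // pins_y, (py + 1) * h // pins_y)
            xs = range(px * w // pins_x, (px + 1) * w // pins_x)
            d = min(depth_map[y][x] for y in ys for x in xs)
            row.append(round((d_max - d) / span * max_height_mm, 2))
        heights.append(row)
    return heights
```

The resulting height matrix would be streamed to the cover at location B to reproduce the shape captured at location A.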
B.2.2.4 Use Case
Use Case ID: UC-17
Use Case Name: Remote feel of 3d object
Actor: A woman named 'A'
Description: A wants to communicate the feel of a real-world object remotely to her friend at a
geographically different location
Pre-conditions: - Availability of Kinect enabled haptic cover with A
Post-conditions: People at two different geographic locations can feel physical shapes present at the
other location.
Normal Course of Events: - A puts a real-world object in front of the KISMET device
- Kinect's depth sensor maps the 3d data of the object at location A
- Based on the protrusion-resolution parameter of the haptic cover, a rough
mapping of the 3d data is made on the haptic cover at location B
B.2.2.5 Addressed consumer need
Providing immersive experiences by communicating all five real senses to a remote location
B.2.2.6 Other Applications
- SDK for developers to build haptic applications like braille for visually impaired or feeling remote items
- Skype integration
- Playing haptic tic-tac-toe or chess between people at different locations
B.2.2.7 Target audience
Geographic distributed individuals
B.2.2.8 Technology Requirements
Building self-adjusting pin matrix in the form of a haptic cover to map a 3d object is a complex feature and
would require feasibility analysis
B.2.2.9 References
inFORM - MIT Media Lab: http://www.tested.com/art/makers/459049-mit-media-labs-inform-shapeshifting-table/
B.2.3 Automated Indoor Mapping Personal Assistant
B.2.3.1 Illustration
B.2.3.2 Context
Home is a space where people prefer to put personalized items, and it is beyond the reach of GPS. For outside 'app-ification and device-ification' models to enter these houses, it is important to gather and understand the contextual information within these spaces. To enable this, the real-world house must be mapped into a virtual world where people can play, explore and personalize.
B.2.3.3 Feature Overview
Indoor mapping with robotic-device plug-in: the KISMET device would quickly plug into quad-copters and other indoor robotic devices like Roomba. Once plugged in, as the robot moves through the entire house, it maps out a 3d view of the indoors to be used for various purposes.
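The mapping step can be sketched as transforming each depth sample, taken at a known robot pose, into a shared floor-plan frame and accumulating the points. The simplified 2d pose/range model below is an assumption; a real system would build a full 3d point cloud:

```python
import math

# Sketch of indoor map accumulation: each range reading, taken at a
# known robot pose, is transformed into world coordinates and stored.
# The 2d (x, y, heading) pose model is a simplifying assumption.

def to_world(pose, bearing_deg, range_m):
    """pose = (x, y, heading_deg); returns the hit point in world coords."""
    x, y, heading = pose
    a = math.radians(heading + bearing_deg)
    return (round(x + range_m * math.cos(a), 3),
            round(y + range_m * math.sin(a), 3))

def accumulate(scans):
    """scans: list of (pose, [(bearing_deg, range_m), ...]) -> point set."""
    cloud = set()
    for pose, hits in scans:
        for bearing, rng in hits:
            cloud.add(to_world(pose, bearing, rng))
    return cloud
```

The accumulated point set is what an export step would mesh and write out as a 3d object file.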
B.2.3.4 Use Case
Use Case ID: UC-18
Use Case Name: Re-designing the home with automated indoor 3d mapping platform
Actor: College Student
Description: A college student wants to change the interior of his house and thus requires a 3d
model of it to play with
Pre-conditions: - The student has a quad-copter or a robotic device like Roomba
- Availability of a hardware accessory enabling 360-degree rotation of the KISMET device
- A Windows Store app to build live 3d geometry and export it as a .3dx or .obj file for further manipulation or consumption
- A touch-based app to place and manipulate smart objects on top of this 3d geometry
Post-conditions: The student finalizes his new house look and shares with family for approval
Normal Course of Events: - The student plugs the KISMET device (depth-sensor-enabled mobile device) into the Roomba robot (an obstacle-avoiding, floor-moving device). The present-day Roomba has an additional protruding accessory where the KISMET is connected and can rotate 360 degrees.
- He downloads the Indoor Mapping app from the Windows Store
- He switches on the robot
- The app is started to begin making a 3d model of the house
- The device autonomously maps the house
- Once done, it comes back to the user based on face detection
- The app outputs a 3d object (.3dx, .obj etc.)
- The student uses the touch-based 3d manipulation feature of the KISMET device to cut and replace new virtual furniture in the house
B.2.3.5 Addressed consumer need
- Autonomous mapping
- 3D map of homes with exact dimension for real-time visualizations example while home furniture
shopping
- Access to contextual information at home to make it more safe and secure
B.2.3.6 Other Applications
- Augmented reality based games for the entire house rather than a limited location like hunt for 5 virtual
ducks in your home.
B.2.3.7 Target audience
Multiple segments: housewives, kids, interior designers, sociologists, psychologists, data scientists etc.
B.2.3.8 Technology Considerations
Mapping time of flight with quad-copter’s or robot’s moving speed
B.2.3.9 References
Automated Indoor Mapping: http://www-video.eecs.berkeley.edu/research/indoor/
B.2.4 Kinect enabled smart casing – a self-protection device
B.2.4.1 Illustration
B.2.4.2 Context
A few years back, a survey of around 2000 iPhone users on broken phones found that Americans had spent an estimated $5.9 billion on repairing and replacing broken phones over the previous five years. Accidental drops were found to be the most common reason for breakage, with submersion in water (tub, sink and toilet) second. For Windows Phone too, the statistics would likely be similar. This shows a major gap: smart devices cannot take care of themselves. Is there a way a miniaturized form of Kinect or other sensors could solve this issue?
B.2.4.3 Feature Overview
Automated shock-resistant mobile cover/pouch: the smart protector, in the form of a pouch or metallic casing, is activated in one of two cases:
1. Some object is falling on top of the device: detected using depth mapping and real-time velocity calculation
2. The device itself is falling to the ground: detected simply through accelerometer integration
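Trigger 2 above can be sketched with the standard free-fall signature: a falling phone reads near-zero total acceleration, so the casing fires when the magnitude stays low for a few consecutive samples. The thresholds are illustrative assumptions:

```python
# Sketch of accelerometer-based fall detection (trigger 2 above).
# A device at rest reads ~9.8 m/s^2 total; in free fall it reads ~0.

def is_free_fall(samples, g_threshold=2.0, min_samples=3):
    """samples: list of (ax, ay, az) in m/s^2; True if the device is falling."""
    run = 0
    for ax, ay, az in samples:
        mag = (ax * ax + ay * ay + az * az) ** 0.5
        run = run + 1 if mag < g_threshold else 0   # count consecutive lows
        if run >= min_samples:
            return True
    return False
```

Requiring several consecutive low readings avoids firing the cover on a single noisy sample.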
B.2.4.4 Use Case
Use Case ID: UC-19
Use Case Name: Protecting device in rain
Actor: Working woman
Description: A working woman is waiting at the bus stop. As it is drizzling, she is holding an umbrella in one hand. She has some urgent mails to send, so she is working on her smartphone with the other hand.
Pre-conditions: A rain drop from the umbrella is about to fall on the device, with a high probability of damaging the phone
Post-conditions: The KISMET device protects itself from getting damaged
Normal Course of Events: - The KISMET device detects that something is about to fall on it. Almost instantly the water droplet starts moving towards the device.
- The depth camera measures the speed of the falling object and the distance remaining between it and the device.
- As soon as the moving object crosses the safety distance, the device detects danger
- It signals the safety cover to deploy instantly
- The safety cover covers the entire device
- The water droplet falls on the cover, thereby protecting the device
B.2.4.5 Addressed consumer need
Accidental breakage of phone or smart devices
B.2.4.6 Other Applications
Nil. This feature has limited usage
B.2.4.7 Target audience
Anyone
B.2.4.8 Technology Requirements
Hardware (protection cover) activation based on depth sensor data points
B.2.5 Plug & play Kinect components
B.2.5.1 Illustration
B.2.5.2 Context
Plug & Play Kinect Components and Sensors: The smartphone interior is designed as shown below. The four corner spaces have magnetic clutch points where different sensors or Kinect components can be attached as required. Thus, instead of a complete Kinect bundle, you can use basic Kinect components like the depth camera or microphone, or integrate additional sensors like temperature, based on your needs.
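A minimal sketch of how such hot-pluggable clutch points might be modelled in software; the corner names and the sensor callback are purely illustrative stand-ins for real hardware drivers:

```python
class ClutchPoint:
    """One magnetic clutch point; its assumed endpoints (power, input,
    output, ground) are abstracted behind attach() and read()."""

    def __init__(self, corner):
        self.corner = corner
        self.sensor = None   # nothing plugged in yet

    def attach(self, name, read_fn):
        """Register a hot-plugged sensor and its read callback."""
        self.sensor = (name, read_fn)

    def detach(self):
        self.sensor = None

    def read(self):
        if self.sensor is None:
            raise RuntimeError(f"no sensor attached at corner {self.corner}")
        name, read_fn = self.sensor
        return name, read_fn()

# Four corner clutch points; swap sensors in and out as needed
corners = {c: ClutchPoint(c) for c in ("NW", "NE", "SW", "SE")}
corners["NW"].attach("temperature", lambda: 24.5)
```

The key design point is that the device enumerates whatever is currently attached, so sensor packages can be swapped without reconfiguring the phone.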
B.2.5.3 Feature Overview
Use based plug-in sensors to provide smart variations:
Each magnetic clutch point will have different endpoints to enable activation and seamless data flow with sensors. The major clutch-point endpoints would be: Power, Input, Output and Ground.
B.2.5.4 Use Case
Use Case ID: UC-20
Use Case Name: Field diagnostics of diseases by rural health workers with integrated microscope
Actor: Rural health worker/volunteer
Description: Since people living in rural areas of Africa and India don't have adequate access to healthcare and medication, their diseases often remain undiagnosed or misdiagnosed. Health workers are often provided with a mobile device for effective health management of the nearby area. With a plug-and-use portable microscope piece, health workers will be able to diagnose diseases early and provide proper medication to citizens.
Pre-conditions: All the rural health workers are given a KISMET device (Kinect enabled mobile device
like smartphone, tablet or phablet).
The device has plug & use sensor outlets on the four corners
A smart microscope sensor is provided to them for plugging into the KISMET device
Post-conditions: Real-time disease diagnostics in the rural areas
Normal Course of Events: - Health workers go to rural areas for doing regular inspection
- They see a person with a high fever and suspect malaria.
- He needs immediate medication, but the health worker can't give it without a proper diagnosis. She takes his blood sample.
- The health worker takes out her KISMET device
- She unclutches the camera from her device and clips the microscope piece in its place
- She opens the microscope app and starts testing for the disease with 500x zoom, depth and optical focus
- Through her quick diagnostics, she confirms the person has malaria
- The person's medication is started in real time.
B.2.5.5 Addressed consumer need
- Sensor personalization based on needs
- Handy access to sensors
B.2.5.6 Other Applications
- Pollution tracking
- Home based diagnostics
This use case can be extended to make personal diagnostics common in every household. People would be able to conduct health tests at home and instantly share the results with a doctor for real-time feedback.
B.2.5.7 Target audience
- Rural markets
- People requiring daily use of multiple sensors for testing, benchmarking and data collection, e.g. in environmental science, medicine or photography
B.2.5.8 Technology Considerations
- Distributed Power and data flow model for quick plug and use of sensor packages
For the specific use case (UC-20):
- Embedding polymer optics in a portable sensor
- Optical and depth based focus
- Knob for changing magnification levels: 140x-2100x
- Multi view modes- Grey scale, inverse, emboss mode
- Illumination package based on type of microscopy- light field, dark field, fluorescence etc.
- Panning, changing 3d field of view
B.2.5.9 References
a. 50-cent microscope: https://www.youtube.com/watch?v=xUzZ_01N0eE
b. Portable microscope: https://www.youtube.com/watch?v=wPdCBebQEuY
B.2.8 Holo-Kinect
B.2.8.1 Illustration
B.2.8.2 Context
Holography and 3d projections have long fascinated people, as they bring a realism factor to the static world. Kinect-enabled mobile devices can help put this field of exploration in the hands of consumers.
B.2.8.3 Feature Overview
- Holographic projection device casing: The flap case of the device has been converted to a holographic
projection display for 3d viewing and manipulation.
- Holographic prism accessories for KISMET device: A holographic prism kept on top of KISMET device can
support Kinect gestures for smart object manipulation in 3d.
B.2.8.4 Use Case
Use Case ID: UC-21
Use Case Name: Designing an art piece for the house using holographic display cover
Actor: Artist
Description: An artist is interested in designing and printing a 3d art piece for his house. He uses the holographic projection cover with the KISMET device to carve his piece and 3d-print it with gestures
Pre-conditions: - Availability of holographic flap case with the artist
- Availability of an app for 3d gesture based designing with holographic viewing
Post-conditions: The artist 3d-prints the art piece and keeps it at his house
Normal Course of Events: - Artist inserts KISMET device in the holographic case
- He opens the holographic projection app
- He imports a 3d design
- The figure is changed via 3d operations through finger gestures
- Happy with the final design, the artist sends it for 3d printing
B.2.8.5 Addressed consumer need
- Easy 3d interactions & manipulations
- Safety while car driving: Lot of people have to turn to their cellphones to read/see while driving which is not
safe and increases chances of accident. This problem can be minimized by projecting phone’s display
holography on car’s front glass.
Etc.
B.2.8.6 Other Applications
Such projections can also be used wherever a person cannot constantly look at the smart device (phone, tablet or phablet), for example while driving. An enhancement of this would enable creative, 3d car dashboards.
B.2.8.7 Target audience
Multiple Segments: 3d artists, designers, architects, young generations looking for cool interactions, car dealers
etc.
B.2.8.8 Technology Considerations
- Near field detection
- Extended list of finger based gestures
B.2.8.9 References
- Creative use on car's dashboard (heads-up car display)
https://www.youtube.com/watch?v=Qg8dq9FlX78
- Holho
http://www.holhocollection.com/
B.2.9 A fashion accessory
B.2.9.1 Illustration
B.2.9.2 Context
As technology options multiply, people often choose the product that lets them make a style statement. Not many products meet that criterion.
B.2.9.3 Feature Overview
Separating the depth sensors and Kinect's RGB camera from the main mobile device to create a fashion accessory: One of the main requirements for Kinect to collect and utilize real-world data is that it be placed at a visible location. It would be difficult to always carry the phone in hand with the Kinect camera and sensors pointing at the environment. One approach is therefore to separate the Kinect capabilities from the mobile device and make Kinect a wearable fashion accessory that can be flaunted over a dress. This would provide real-time data without the wearer having to bother about carrying the device around. Options for a wearable Kinect include an earring, fashion locket, fashion belt, stylish bag or belt buckle. The wearable provides wireless input to the KISMET device for further processing.
B.2.9.4 Use Case
Use Case ID: UC-22
Use Case Name: Searching for Facebook friends in the party and informing on smart watch
Actor: Model
Description: A model goes to a party where she hardly knows anyone. She is wearing the depth sensor and RGB camera as her earring and has synchronized the smart earring with her smart watch.
Pre-conditions: Model is alone in the party
Post-conditions: She finds mutual friends and hangs out with them
Normal Course of Events: - The Kinect camera in the earring takes an image of the people around and, using depth sensing, crops the faces of individual people
- It then syncs with the KISMET device to share this data and search for mutual Facebook friends
- Based on facial matching algorithms, mutual friends are identified.
- This information is sent wirelessly to the smart watch, e.g. "Hey! The person to your right is Arjun's friend."
- This gives the model a conversation starter for interacting with people at the party.
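The facial-matching step could, for instance, compare face embeddings by cosine similarity. A sketch with numpy, where the embedding vectors, names and threshold are made-up stand-ins for a real face-recognition pipeline:

```python
import numpy as np

def best_match(face_vec, known_faces, threshold=0.8):
    """Return the name whose stored embedding is most similar (cosine)
    to the cropped face's embedding, or None if nothing clears the
    threshold."""
    face = face_vec / np.linalg.norm(face_vec)
    best_name, best_score = None, threshold
    for name, vec in known_faces.items():
        score = float(face @ (vec / np.linalg.norm(vec)))
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy embeddings standing in for real face-recognition features
friends = {
    "Arjun's friend": np.array([0.9, 0.1, 0.2]),
    "Maya": np.array([0.1, 0.95, 0.1]),
}
match = best_match(np.array([0.88, 0.12, 0.21]), friends)
```

Returning None below the threshold matters here: the watch should stay silent rather than misidentify a stranger.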
B.2.9.5 Addressed consumer need
Making technology cool and fashionable to wear
B.2.9.6 Other Applications
- Style-changing bags: The bag's visual appearance can switch between three styles – office, party or casual – so the same bag adapts to different environments. For example, if a party is happening around, the Kinect sensor detects it and changes the bag cover into an animated glowing-flower design, using LEDs or flexible OLED displays.
- Kinect-enabled living bags: Smart pouches in the bag where the KISMET device can be placed. These pouches have endpoints connecting to sensors in the bag. They can also be solar-charging bags. Other options include weather-indicator sensors built into the bag providing real-time suggestions, with inputs based on the changing environment outside or the things inside the bag.
- Next-generation home shopping: Example – a person scans his feet. The depth and RGB cameras calculate the foot size automatically and, from among the available choices, suggest the best-fitting shoes online.
B.2.9.7 Target audience
Young generation (people who are 15-25 yrs. old)
B.2.10 viSparsh – a haptic belt for visually impaired
B.2.10.1 Illustration
B.2.10.2 Context
Worldwide, 285 million people are visually impaired with 90% of them living in the developing countries.
Empowering such a huge population to lead a normal life through assistive technologies can help the world
by utilizing their undiscovered potential. ‘viSparsh’ is a small step towards that empowerment.
B.2.10.3 Feature Overview
The focus of the viSparsh project is to create a haptic (touch-based feedback) waist belt for the blind with a
simple goal that a blind person can avoid obstacles that are on and above ground height. This belt has an
infrared optic sensor that projects thousands of dots over a 3-12 feet range to determine what obstacles are
present. Based on the proximity and relative direction of the obstacle to the wearer, a series of vibrators on
the belt alert the wearer of the presence of obstacles around them. This system is lightweight, low-cost (~Rs
5000), easy to use, non-conspicuous and also frees up the ears and hands of the blind person. The system
provides a 160 degree view of the environment and is complementary to the limited reach of the walking stick.
It also does not require any additional infrastructure in the environment.
B.2.10.4 Use Case
Use Case ID: UC-23
Use Case Name: Guiding visually impaired to walk on the road
Actor: Visually impaired
Description: For ages, the visually impaired have walked using canes. However, canes are not very effective at identifying above-ground obstacles, as a cane searches for obstacles on the floor. Canes also make the disability much more visible, so people develop biases even before talking. This contributes to the visually impaired being cut off from society. They therefore need more intuitive technologies, which is where viSparsh comes to the rescue.
Pre-conditions: A visually impaired person needs to cross the road
Post-conditions: He crosses the road successfully, avoiding on- and above-ground obstacles.
Normal Course of Events: - The visually impaired person wears the viSparsh waist belt
- He starts walking on the road
- The belt detects obstacles in front of him (using the Microsoft Kinect sensor)
- The information is sent for processing
- Based on the analysis, the wearer is informed of the distance and direction of the obstacle through vibrations on three inner sides of the belt (front, right and left)
- For example, if the obstacle is on the right, only the right side of the belt vibrates, and the intensity of the vibration increases as the obstacle comes closer, and vice versa.
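The zone-and-intensity behaviour described above can be sketched as simple per-zone processing of the depth frame; the range limits follow the 3-12 feet figure quoted earlier, converted to metres:

```python
import numpy as np

MIN_RANGE = 0.9   # metres, ~3 feet: closest reliable Kinect reading
MAX_RANGE = 3.6   # metres, ~12 feet: beyond this, no vibration

def vibration_levels(depth_frame):
    """Split a depth frame (rows x cols, metres) into left/front/right
    thirds and return a 0..1 vibration intensity per zone, growing as
    the nearest obstacle in that zone gets closer."""
    zones = np.array_split(depth_frame, 3, axis=1)
    levels = {}
    for name, zone in zip(("left", "front", "right"), zones):
        nearest = float(zone.min())
        if nearest >= MAX_RANGE:
            levels[name] = 0.0   # nothing in range: keep this side still
        else:
            nearest = max(nearest, MIN_RANGE)
            levels[name] = (MAX_RANGE - nearest) / (MAX_RANGE - MIN_RANGE)
    return levels

# Obstacle ~1 m away on one side of the frame, nothing elsewhere
frame = np.full((4, 6), 5.0)
frame[:, 4:] = 1.0
levels = vibration_levels(frame)
```

Each zone's level would then drive the corresponding vibrator's motor duty cycle; the linear mapping is an illustrative choice, not the project's documented transfer function.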
B.2.10.5 Addressed consumer need
Unavailability of a sophisticated tool to help the visually impaired navigate
B.2.10.6 Other Applications
Converting the belt form factor to the size of a buckle which can be inserted into any belt to track real-world information and relay it as required.
B.2.10.7 Target audience
Visually impaired
B.2.10.8 Technology Considerations
1. The weight and size of the belt are the biggest challenges that we are facing in this prototype.
2. Though we can design a new casing for the Kinect by removing the black plastic casing and some of the unneeded components inside, the size does not reduce drastically, and we also lack the expertise to give it a product shape.
3. User experience: The user sitting down with the device ON can also cause problems for the device. In addition, in some scenarios, such as for Indian women who wear a 'saree' (an Indian attire), a belt may not serve the purpose well.
4. The belt still does not work outdoors due to infrared interference from sunlight.
B.2.10.9 References
Project Video: https://www.youtube.com/watch?v=KTmGZ6sElnU
CNN IBN news: https://www.youtube.com/watch?v=F2xGf-Cr6nI
Blog: http://visparsh.blogspot.in/
B.3 Extras
B.3.1 Kinect Green
B.3.1.1 Illustration
B.3.1.2 Context
The world needs to move towards more eco-friendly ways to support and promote technology, and using green technology is one of those ways. What if we could smartly utilize readily available energy sources – light, heat, vibrations, power generated from body heat – or use solar computing add-ons?
B.3.1.3 Feature Overview
Promoting use of green and clean technology through add-ons, initiatives, partnerships to make a sustainable
future.
a. Solar-power-enabled casing for providing power to Kinect – an option to power the KISMET device with solar
panels
b. Building a real-time global ecological awareness chart – Since all KISMET device users will have Kinect power in their hands, they can become change agents tracking the green health of a place through smart sensors. This region-specific sensor data (like air pollution) is uploaded to the cloud to create a global awareness chart, which governments and organizations can use to make intelligent decisions about a sustainable future. Incentivizing people to adopt green technology can help enable this vision; one option could be 'Reuse, Reduce and Recycle to gain game points'.
B.3.1.4 Use Case
Use Case ID: UC-24
Use Case Name: Underwater pollution check
Actor: Student
Description: A specific range of IR light can help detect pollution in aquatic life
Pre-conditions: Underwater pollution check needs to be done
Post-conditions: Pollution level is checked, uploaded and game points are earned
Normal Course of Events: - The student goes underwater with his KISMET device
- He emits IR light and opens the pollution check application on the device
- The pollution check application records the following parameters:
. Pollution level in aquatic life
. RGB and depth image of the scene
. GPS co-ordinates of the location
- The student uploads this data to an online global awareness chart
- After verification of the data, the student wins extra XBOX game points
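The upload step implies a common record format for the awareness chart. One possible shape is sketched below; all field names and the device id are illustrative, not a defined API:

```python
import json
import time

def pollution_record(pollution_level, gps, device_id):
    """Package one underwater reading as JSON for upload to the shared
    global awareness chart."""
    return json.dumps({
        "device_id": device_id,
        "timestamp": int(time.time()),          # when the reading was taken
        "pollution_level": pollution_level,     # output of the IR analysis
        "lat": gps[0],                          # GPS co-ordinates
        "lon": gps[1],
    })

record = pollution_record(0.42, gps=(17.385, 78.4867), device_id="kismet-001")
```

Keeping the record flat and timestamped makes server-side verification (the prerequisite for awarding game points) straightforward.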
B.3.1.5 Addressed consumer need
Need to move to a sustainable future
B.3.1.6 Other Applications
- Powering KISMET device through solar panels
B.3.1.7 Target audience
Everyone
B.3.1.8 Technology Consideration
- Exploring green technology ways of powering Kinect and KISMET device
- Water proof mobile device
- Embedding additional sensors in Kinect
B.3.1.9 References
Mid-range IR sensor for detecting pollution in aquatic life: http://www.mdpi.com/1424-8220/9/8/6232/pdf
B.3.2 Minority report design for transparent world
B.3.2.1 Illustration
B.3.2.2 Context
Transparent glass screens are the next generation of displays. A lot of research is already going into actualizing smart mirror/glass concepts, as shown in Corning's video 'A day made of glass'.
B.3.2.3 Feature Overview
Interactive Heads-Up Display (HUD) design for the transparent world: Often the first thing people do in the morning is look into the mirror or check their mobile for updates. What if we combined the two activities and built a new Minority Report-like design for the glass or mirror? This requires a Kinect application that integrates with Microsoft products like PowerPoint, Outlook, SkyDrive etc. The KISMET device can be wirelessly connected to a projector projecting onto the transparent screen (using a holographic projection sheet coated on the screen). The KISMET device is placed right below the screen and tracks user gestures to show and open related content on the glass.
B.3.2.4 Use Case
Use Case ID: UC-25
Use Case Name: Getting morning updates on smart mirror
Actor: User 1
Description: The user is in the habit of checking the day's calendar first thing in the morning while brushing his teeth. However, such multitasking carries risks, like the mobile device getting wet or broken (as you are not fully alert right after waking up). He therefore needs an NUI-based interaction to get the required updates.
Pre-conditions: User needs morning updates
Post-conditions: He gets to check his day’s calendar, emails and world news on the smart mirror
Normal Course of Events: - The user wakes up and walks to the sink (with the smart mirror above it)
- He puts the KISMET device in a stand just below the mirror
- KISMET automatically makes a wireless connection with the back-end projector projecting onto the mirror and starts tracking gestures
- Once the connection is established, the application running on KISMET pulls up the calendar in a simple, gesture-navigable view and shows it on the mirror
- The user uses gestures to navigate the calendar, check emails or send a reply
B.3.2.5 Addressed consumer need
NUI based, real world integration with Microsoft products
B.3.2.6 Other Applications
- Setting up interactive installations/3d art
B.3.2.7 Target audience
Everyone
B.3.2.8 Technology Requirements
Seamless wireless connection and data transfer between mobile device and projector
B.3.2.9 References
Productivity Future Vision:
https://www.youtube.com/watch?v=t5X2PxtvMsU&list=PLdJTVYW_WV_RkaTk5SbTVyPZRKSOty84w
A day made of glass 2:
https://www.youtube.com/watch?v=X-GXO_urMow
A day made of glass:
https://www.youtube.com/watch?v=6Cf7IL_eZ38&feature=kp
Holographic sheet for displaying information on transparent screen:
http://en.wikipedia.org/wiki/Holographic_screen
DIY projection TV:
http://www.practical-home-theater-guide.com/build-a-projection-tv.html
Phone into projection:
http://content.photojojo.com/diy/turn-your-phone-into-a-photo-projector-for-1/
B.3.3 One Microsoft gesture story for home: New XBOX
SmartGlass
B.3.3.1 Illustration
B.3.3.2 Context
Microsoft Office Division (MOD) has a sub-team named 'CXE (Common Experience)' which looks after the shared experiences across all Microsoft Office products. The popular 'ribbon' features and functionalities are part of that and are standardized across MOD. As the future unfolds and people move towards augmented reality and gesture-driven technologies, a time will soon arrive when the entire Microsoft language ecosystem has to be represented by a gesture story. Thus, there is a need to define standardized NUI-based experiences for this upcoming world and to truly depict the One Microsoft gesture story.
B.3.3.3 Feature Overview
Enablement of Microsoft products for the NUI world in the following two ways:
1. Shared-experience NUI platform: Defining a standardized set of features (including gestures and audio commands) common to all Microsoft products, like air keyboard, air scroll and zoom in/out. This can be termed the 'Common Gesture Experience'.
2. Product-specific NUI experience: Each Microsoft product can define its own gestures for transitions etc., along with a sign for each product, like 'x' for Xbox, 'b' for Bing and 'p' for PowerPoint.
Embedding Kinect's capabilities into the mobile device will in this way pave the way for next-generation XBOX SmartGlass functionalities.
B.3.3.4 Use Case
Use Case ID: UC-26
Use Case Name: On the go productivity at home
Actor: Working woman
Description: A woman has organized a party at home. While cooking, she needs to get some urgent office work signed off and sent to others on the team.
Pre-conditions: The working woman needs to reply to some emails while cooking
Post-conditions: She sends the emails using NUI on the KISMET device
Normal Course of Events: - She is preparing food in the kitchen
- She has kept her KISMET device (Kinect-enabled mobile device) on the kitchen table on a kickstand
- She gets an email alert on the device, which she can't pick up to read as her hands are occupied
- As there is a lot of noise around, with multiple people coming in, she can't use Cortana to open, read and reply to the mail
- She uses an air-gesture password along with facial recognition to unlock the device
- Next she makes a two-full-circles gesture to let the device know she is about to issue command gestures. This prevents accidental gesture tracking
- She then makes an alphabet 'E' gesture to open email, followed by a gesture to transfer the content to a bigger TV screen nearby
- She zooms in on the big screen with the next gesture and scrolls through the mail to read it
- Lastly she opens the air keyboard (again with a gesture), types the reply "Approved" and makes a thumbs-up gesture to send the mail.
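The "two full circles, then commands" flow above amounts to a small state machine that gates command gestures behind a wake gesture. A sketch, where the gesture labels are invented stand-ins for a real recognizer's output:

```python
class GestureGate:
    """Ignore command gestures until the wake gesture ('two_circles')
    arms the gate; this prevents accidental gesture tracking."""

    WAKE = "two_circles"

    def __init__(self):
        self.armed = False

    def feed(self, gesture):
        """Return the gesture if it should be executed, else None."""
        if gesture == self.WAKE:
            self.armed = True      # start listening for commands
            return None
        if self.armed:
            return gesture         # recognised command, pass it on
        return None                # not armed: ignore stray movements

gate = GestureGate()
ignored = gate.feed("letter_E")    # ignored: gate not armed yet
gate.feed("two_circles")           # wake gesture arms the gate
command = gate.feed("letter_E")    # now recognised as a command
```

A production version would also disarm after a period of inactivity, so the gate does not stay open indefinitely.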
B.3.3.5 Addressed consumer need
Standardization and customization of gestures for use with all Microsoft products
B.3.3.6 Other Applications
Home automation to interact with IoT world (like light, fan, TV) using shared gestures
B.3.3.7 Target audience
Working professionals
B.3.4 Kinect Personified
B.3.4.1 Illustration
B.3.4.2 Context
Everyone wants a personalized device. However, it often happens that your device gets used by another person, accidentally or intentionally. Even if you are open to sharing the device, you are rarely interested in sharing personal data like messages, the phone directory or specific applications.
B.3.4.3 Feature Overview
Multiple personalized views for the same device: This is an enhanced version of the Kid's Corner view launched on earlier Windows Phone devices. The main difference is that instead of manually switching on the Kid's Corner section, the RGB camera does facial tracking, followed by a random audio test, to decide which view is right for a particular user. This would also cover scenarios where the device detects that someone behind you is peeping at the data on your device without your awareness. This is done through contextual depth mapping of not just who is in front but who is behind, and what their behavior is.
B.3.4.4 Use Case
Use Case ID: UC-27
Use Case Name: Privacy Protection from a stranger
Actors: Mobile User (MU) and a stranger (S)
Description: A stranger is interested in getting some private and personal data of a user, and has somehow got hold of the user's password. His initial plan is to access the smart device once the user is not around.
Pre-conditions: A stranger is attempting a data privacy breach
Post-conditions: The smart KISMET device doesn’t allow data access to unauthorized users
Normal Course of Events: The mobile user goes to the washroom, leaving his KISMET device (mobile Kinect device) on his desk in a locked state
Seeing the opportunity, the stranger (S) quickly goes to access the user's device. Being smart, he also brings along a printed facial mask of the user to fool the device in case facial detection is done.
During the facial check with the depth camera, the device detects an inconsistency between real facial contours and the flat printed mask
To confirm the user's identity, the device generates a random line of text and asks the stranger to repeat it in his voice
Unaware of this feature, S tries to mimic the original user's voice but fails to unlock the device. Since the text is different every time, S cannot use a pre-recorded voice sample of the original user
Alternative Courses: Unable to get access to the required data, the stranger (S) thinks of peeping from behind while the user is accessing the data
S stands behind the user's shoulder without the user's knowledge
Standing at a distance on his heels, S tries to view the data from behind
The depth camera in the KISMET device, constantly scanning the background, sees an additional face
The depth camera studies the facial characteristics of S and finds something suspicious
The background app on the device matches the face against known relatives and finds no correlation
The user is instantly informed of the probable privacy breach through a notification-bar message
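The random spoken challenge in the normal course above could be prototyped as follows; the word list and string matching are deliberately naive stand-ins (real verification would also compare the speaker's voiceprint, not just the transcript):

```python
import random

WORDS = ["river", "orange", "kite", "marble", "seven", "cloud", "tiger", "lamp"]

def make_challenge(n_words=4, rng=random):
    """Generate a fresh phrase to read aloud; because it changes every
    time, a pre-recorded voice sample cannot simply be replayed."""
    return " ".join(rng.choice(WORDS) for _ in range(n_words))

def matches(spoken_text, challenge):
    """Check the transcribed speech against the challenge text."""
    return spoken_text.strip().lower() == challenge.lower()

challenge = make_challenge()
```

The security of the scheme rests on the phrase being unpredictable, which is why the challenge is generated fresh per unlock attempt rather than reused.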
B.3.4.5 Addressed consumer need
This eliminates a major cause of data being mishandled through use by another person. A simple example: your one-year-old kid unknowingly picks up your mobile device and accidentally deletes your phone book.
B.3.4.6 Target audience
Family
C. Appendix
Use Case 1: Creating obstacle detection DIY robot
Use Case 2: Buying shoes from online retail store
Use Case 3: Walking on the street with a digital friend named ‘Kinect Cortana’
Use Case 4: Context aware KISMET soldering guide
Use Case 5: Learning alphabet writing through live stencils
Use Case 6: Sending eye reports to doctor & improving health
Use Case 7: Mood based music listening
Use Case 8: Hollywood style real-time CGI cinematography
Use Case 9: Preparing for party through real-time feedback from IoT world
Use Case 10: Teaching history lesson using 3d animated story telling framework
Use Case 11: Transfer of short family video clip to smart wallet
Use Case 12: Understanding monumental structures
Use Case 13: Preventing road accident
Use Case 14: 3d scanning of face
Use Case 15: Providing active health feedback to the mobile user
Use Case 16: Auto-transfer of whiteboard content to OneNote
Use Case 17: Remote feel of 3d object
Use Case 18: Re-designing the home with automated indoor 3d mapping platform
Use Case 19: Protecting device in rain
Use Case 20: Field diagnostics of diseases by rural health workers with integrated microscope
Use Case 21: Designing an art piece for the house using holographic display cover
Use Case 22: Searching for Facebook friends in the party and informing on smart watch
Use Case 23: Guiding visually impaired to walk on the road
Use Case 24: Underwater pollution check
Use Case 25: Getting morning updates on smart mirror
Use Case 26: On the go productivity at home
Use Case 27: Privacy Protection from a stranger
bartholomeocoombs
 
SIXTH SENSE TECHNOLOGY REPORT
SIXTH SENSE TECHNOLOGY REPORTSIXTH SENSE TECHNOLOGY REPORT
SIXTH SENSE TECHNOLOGY REPORT
JISMI JACOB
 
Location based services using augmented reality
Location based services using augmented realityLocation based services using augmented reality
Location based services using augmented reality
IAEME Publication
 
Project glass ieee document
Project glass ieee documentProject glass ieee document
Project glass ieee document
bhavyakishore
 

Ähnlich wie KISMET (Kinect In Small Form for Mobile Envisioned Technologies) (20)

Augmented Reality in Medical Education
Augmented Reality in Medical EducationAugmented Reality in Medical Education
Augmented Reality in Medical Education
 
A case study on Adstuck's Augmented Reality ventures (Amity University) by Pa...
A case study on Adstuck's Augmented Reality ventures (Amity University) by Pa...A case study on Adstuck's Augmented Reality ventures (Amity University) by Pa...
A case study on Adstuck's Augmented Reality ventures (Amity University) by Pa...
 
AbstractThis work presents the design and implementation of an.docx
AbstractThis work presents the design and implementation of an.docxAbstractThis work presents the design and implementation of an.docx
AbstractThis work presents the design and implementation of an.docx
 
finalgoogle
finalgooglefinalgoogle
finalgoogle
 
Seminar report on Google Glass, Blu-ray & Green IT
Seminar report on Google Glass, Blu-ray & Green ITSeminar report on Google Glass, Blu-ray & Green IT
Seminar report on Google Glass, Blu-ray & Green IT
 
IRJET- Augmented Reality in Interior Design
IRJET- Augmented Reality in Interior DesignIRJET- Augmented Reality in Interior Design
IRJET- Augmented Reality in Interior Design
 
Interior Designing Mobile Application based on Markerless Augmented Reality (AR)
Interior Designing Mobile Application based on Markerless Augmented Reality (AR)Interior Designing Mobile Application based on Markerless Augmented Reality (AR)
Interior Designing Mobile Application based on Markerless Augmented Reality (AR)
 
SIXTH SENSE TECHNOLOGY REPORT
SIXTH SENSE TECHNOLOGY REPORTSIXTH SENSE TECHNOLOGY REPORT
SIXTH SENSE TECHNOLOGY REPORT
 
Augmented reality : Possibilities and Challenges - An IEEE talk at DA-IICT
Augmented reality : Possibilities and Challenges - An IEEE talk at DA-IICTAugmented reality : Possibilities and Challenges - An IEEE talk at DA-IICT
Augmented reality : Possibilities and Challenges - An IEEE talk at DA-IICT
 
Google Glass: A Futuristic Fashion Failure Gadget
Google Glass: A Futuristic Fashion Failure  GadgetGoogle Glass: A Futuristic Fashion Failure  Gadget
Google Glass: A Futuristic Fashion Failure Gadget
 
Technical Report on Google Glass/Department of INFORMATION TECHNOLOGY
Technical Report on Google Glass/Department of INFORMATION TECHNOLOGYTechnical Report on Google Glass/Department of INFORMATION TECHNOLOGY
Technical Report on Google Glass/Department of INFORMATION TECHNOLOGY
 
Location based services using augmented reality
Location based services using augmented realityLocation based services using augmented reality
Location based services using augmented reality
 
Google glass documentation
Google glass documentationGoogle glass documentation
Google glass documentation
 
Raspberry Pi Augmentation: A Cost Effective Solution To Google Glass
Raspberry Pi Augmentation: A Cost Effective Solution To Google GlassRaspberry Pi Augmentation: A Cost Effective Solution To Google Glass
Raspberry Pi Augmentation: A Cost Effective Solution To Google Glass
 
IRJET- Smart Mirror
IRJET- Smart MirrorIRJET- Smart Mirror
IRJET- Smart Mirror
 
How effective is Swift’s AR technology in developing.pdf
How effective is Swift’s AR technology in developing.pdfHow effective is Swift’s AR technology in developing.pdf
How effective is Swift’s AR technology in developing.pdf
 
Controlling Electrical Appliances Using IOT and AR
Controlling Electrical Appliances Using IOT and ARControlling Electrical Appliances Using IOT and AR
Controlling Electrical Appliances Using IOT and AR
 
Smart Glasses Technology
Smart Glasses TechnologySmart Glasses Technology
Smart Glasses Technology
 
Project glass ieee document
Project glass ieee documentProject glass ieee document
Project glass ieee document
 
IRJET- Deep Dive into Augmented Reality
IRJET-  	  Deep Dive into Augmented RealityIRJET-  	  Deep Dive into Augmented Reality
IRJET- Deep Dive into Augmented Reality
 

Kürzlich hochgeladen

Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers:  A Deep Dive into Serverless Spatial Data and FMECloud Frontiers:  A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Safe Software
 
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Victor Rentea
 
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
?#DUbAI#??##{{(☎️+971_581248768%)**%*]'#abortion pills for sale in dubai@
 

Kürzlich hochgeladen (20)

Platformless Horizons for Digital Adaptability
Platformless Horizons for Digital AdaptabilityPlatformless Horizons for Digital Adaptability
Platformless Horizons for Digital Adaptability
 
MS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectorsMS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectors
 
Introduction to Multilingual Retrieval Augmented Generation (RAG)
Introduction to Multilingual Retrieval Augmented Generation (RAG)Introduction to Multilingual Retrieval Augmented Generation (RAG)
Introduction to Multilingual Retrieval Augmented Generation (RAG)
 
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers:  A Deep Dive into Serverless Spatial Data and FMECloud Frontiers:  A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
 
DEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 AmsterdamDEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
 
ICT role in 21st century education and its challenges
ICT role in 21st century education and its challengesICT role in 21st century education and its challenges
ICT role in 21st century education and its challenges
 
Six Myths about Ontologies: The Basics of Formal Ontology
Six Myths about Ontologies: The Basics of Formal OntologySix Myths about Ontologies: The Basics of Formal Ontology
Six Myths about Ontologies: The Basics of Formal Ontology
 
Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...
Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...
Apidays New York 2024 - The Good, the Bad and the Governed by David O'Neill, ...
 
[BuildWithAI] Introduction to Gemini.pdf
[BuildWithAI] Introduction to Gemini.pdf[BuildWithAI] Introduction to Gemini.pdf
[BuildWithAI] Introduction to Gemini.pdf
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdf
 
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
Connector Corner: Accelerate revenue generation using UiPath API-centric busi...
 
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
 
Corporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptxCorporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptx
 
Mcleodganj Call Girls 🥰 8617370543 Service Offer VIP Hot Model
Mcleodganj Call Girls 🥰 8617370543 Service Offer VIP Hot ModelMcleodganj Call Girls 🥰 8617370543 Service Offer VIP Hot Model
Mcleodganj Call Girls 🥰 8617370543 Service Offer VIP Hot Model
 
AWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of Terraform
 
CNIC Information System with Pakdata Cf In Pakistan
CNIC Information System with Pakdata Cf In PakistanCNIC Information System with Pakdata Cf In Pakistan
CNIC Information System with Pakdata Cf In Pakistan
 
MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024
 
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
 
Exploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with MilvusExploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with Milvus
 
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ..."I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...
 

KISMET (Kinect In Small Form for Mobile Envisioned Technologies)

Page | 2
KISMET | Project Document

Project Name: KISMET (Kinect In Small form for Mobile Envisioned Technologies)
Version: 0.1 (Draft)
Prepared by: Rolly Seth
Location: Hyderabad, India
Date: 14th July, 2014
Contents

A. EXECUTIVE SUMMARY .................................................................. 4
   A.1 PROJECT OVERVIEW ............................................................... 4
   A.2 PURPOSE AND SCOPE OF THE DOCUMENT ............................................. 5
B. SCENARIOS .......................................................................... 6
   B.1 MAIN SCENARIOS ................................................................. 6
      B.1.1 Lego Kinect ............................................................... 6
      B.1.2 Dimensions based retail experience ....................................... 9
      B.1.3 Kinect and Cortana integrated personal assistant ........................ 12
      B.1.4 Visual assistant for students ........................................... 14
      B.1.5 Kinect + a small projector = Live Stencils .............................. 17
      B.1.6 Mobile health-checkup ................................................... 19
      B.1.7 Mood based environment creation ......................................... 21
      B.1.8 Auto-sync of multi Kinects .............................................. 23
      B.1.9 Kinectduino ............................................................. 26
      B.1.10 K-Enchant – A magical framework ........................................ 30
      B.1.11 Harry Potter style interactions ........................................ 32
      B.1.12 Self-paced K-12 education .............................................. 35
      B.1.13 Kinect enabled real-time, road safety warnings ......................... 38
      B.1.14 3d modelling and gesture enabled print ................................. 40
      B.1.15 Kinect Active Health ................................................... 42
   B.2 ACCESSORY BASED SCENARIOS .................................................... 46
      B.2.1 Virtual OneNote Surface ................................................. 46
      B.2.2 Five senses Kinect ...................................................... 49
      B.2.3 Automated Indoor Mapping Personal Assistant ............................. 52
      B.2.4 Kinect enabled smart casing – a self-protection device .................. 55
      B.2.5 Plug & play Kinect components ........................................... 59
      B.2.8 Holo-Kinect ............................................................. 62
      B.2.9 A fashion accessory ..................................................... 65
      B.1.10 viSparsh – a haptic belt for visually impaired ......................... 67
   B.3 EXTRAS ........................................................................ 70
      B.3.1 Kinect Green ............................................................ 70
      B.3.2 Minority report design for transparent world ............................ 73
      B.3.3 One Microsoft gesture story for home: New XBOX SmartGlass ............... 76
      B.3.4 Kinect Personified ...................................................... 79
C. APPENDIX ........................................................................... 81
A. Executive Summary

A.1 Project Overview

While Moore's law still holds, the push to increase chip performance while shrinking device size continues. Microsoft Kinect is no exception. A lot of research is already under way in the industry to integrate Kinect-like capabilities into small form-factor products, and some major efforts in this direction are already visible:

- Next-generation 3d photography in HTC One – The camera application captures the picture along with depth data. The depth data enables additional photo filters based on who is in the foreground or background, post-click refocusing, and taking pictures with gestures instead of pressing a button.
- Dynamic Perspective in Amazon Fire Phone – Amazon recently announced a phone that provides 3d parallax effects in places like the lock screen, a capability it calls 'Dynamic Perspective'.
- Human-scale understanding of space and motion on mobile phones – Google's Project Tango aims to see the world in 3d, take visual cues from the surroundings, and thereby provide more contextually relevant information.
- Context-sensitive information and shopping experience – Another feature of the Amazon smartphone, 'Firefly', extracts contextual information such as barcodes, audio clips and eye tracking to fetch related shopping content.
- Augmented reality games – Augmented reality is another area many companies are banking on. Vuforia by Qualcomm is one example; it creates high-fidelity, engaging 3d environments on mobiles.
- 3d computing SDKs – Several companies have already released SDKs for creating interactive 3d experiences on small devices. The Intel Perceptual Computing SDK is one; SoftKinetic, which enables 3d vision and experiences for portable devices, is another.

The list is long, and the aim of this document is not to focus on it. Instead, this document presents new scenarios that become possible when Kinect's capabilities are encapsulated in small, portable devices such as smartphones, tablets and phablets. The project is named 'KISMET', which stands for 'Kinect In Small form for Mobile Envisioned Technologies'.

Terminology used: For simplicity and to avoid confusion, the term 'KISMET device' is used in place of Kinect-enabled future Windows mobile devices such as smartphones, tablets and phablets. Since mobile devices already have an RGB camera and microphones (with their own growing capabilities), using the term 'KISMET' essentially means adding a depth sensor to a mobile device, as shown in the legend below.

Legend: 'KISMET' = Depth Sensor + Smartphone/tablet/phablet
A.2 Purpose and scope of the document

The purpose of this document is to present scenarios/applications that meet the following two criteria:

a. The depth sensor is integrated with a smartphone, tablet or phablet
b. They address a large consumer base instead of being enterprise-centric

The scenarios presented in this document are classified into three main categories:

- Main scenarios – These revolve around enhanced applications written on the mobile device that use the depth-sensor data along with the already existing RGB camera and microphone data.
- Accessory-based scenarios – These are not standalone applications; they require an add-on to the mobile device to extend what can be done with the depth data.
- Extras – These might not directly relate to the use of depth data but illustrate how social, secure and sustainable experiences can be created for the new world, whether through initiatives, design changes, etc.

Depending on the scenario, all or some of the following sections are covered in the rest of the document:

i. Illustration
ii. Context
iii. Feature Overview
iv. Use Case
v. Addressed Consumer Need
vi. Other Applications
vii. Target Audience
viii. Technology Considerations
ix. References
B. Scenarios

B.1 Main scenarios

B.1.1 Lego Kinect

B.1.1.1 Illustration

B.1.1.2 Context

Lego bricks are famous worldwide. They are powerful yet simple DIY (Do It Yourself) bricks. As of 2013, around 560 billion Lego parts had been produced, and 20 billion LEGO elements are produced each year. The world's children spend 5 billion hours a year playing with LEGO bricks. Because this number is so large, it naturally promotes a big hacking culture in the community right from childhood. LEGO puts imagination in your hands to make whatever you want.
B.1.1.3 Feature Overview

Basic LEGO bricks are an empowerment tool. However, LEGO itself has evolved over the years. To understand where 'KISMET' would fit, consider the current LEGO ecosystem, shown on the left side of the figure below. At the base is a layer of plastic bricks (Layer 1), which have no brain of their own. To provide that brain, LEGO introduced the Mindstorms NXT robotics kit (Layer 2). On top of it sits the computer program (Layer 3), with which kids and students can build robots and other electronics projects.

[Figure: LEGO ecosystem, present vs. future – Layer 1: Bricks; Layer 2: Mindstorms NXT, replaced by KISMET (object detection, line follower, color detection, speech recognition, accessory add-ons); Layer 3: Computer program, replaced by a Touch Develop app]

Since LEGO bricks are the essential building blocks, they can't be replaced. However, there is an opportunity to replace Layer 2 and Layer 3 with KISMET-based applications. Through depth sensing, the mobile KISMET device can provide capabilities like object detection, line following, color detection, speech recognition, and accessory add-ons. Using these capabilities directly might be challenging, and that is where the Layer 3 revamp comes in: the KISMET device will host Touch Develop based applications so that even youngsters can build innovative projects without deep programming-language knowledge.

PS: To fix KISMET onto LEGO bricks, the KISMET device edges can be grooved to fit a LEGO brick, or a special cover can be designed.

B.1.1.4 Use Case

Use Case ID: UC-1
Use Case Name: Creating an obstacle-detection DIY robot
Actor: A 6-year-old kid
Description: A kid has a number of basic LEGO bricks and other LEGO accessories at home. His mother gets a new KISMET device (a depth-camera-enabled mobile device). He is interested in building an 'obstacle avoidance' robot with it.
Pre-conditions:
- Availability of an easy plug-in phone casing/border design for the LEGO elemental brick
- A Touch Develop app for LEGO Kinect in the Windows Store
- Availability of small-sized depth sensors

Post-conditions: The kid creates a moving robot with LEGO bricks and a Windows Phone (KISMET device)

Normal Course of Events:
- A kid is interested in building an intelligent LEGO robot which can see and move around the house
- He combines different LEGO bricks and tyres to make a DIY vehicle
- He downloads the Kinect LEGO app from the Windows Store
- He selects Input Sensor (depth), Command ('Move with obstacle avoidance') and Output (LEGO servo motor) in the app and saves this configuration
- He plugs the phone onto the LEGO-built vehicle using the specially designed casing/cover and connects the LEGO servo motor cord to the phone ports
- He runs the saved configuration
- His cost-saving and easy-to-build robot is ready
- The robot moves around the entire house, avoiding obstacles using the small depth sensors

Assumptions:
- Availability of a simple-to-develop app to enable further automation of LEGOs
- The Windows Phone casing directly fits the device into the LEGO brick, or there is a phone version with grooved edges (instead of smooth ones) for fixing into the brick
- Availability of hardware ports to power the motor and provide asynchronous, duplex communication

B.1.1.5 Addressed consumer need

The recent past has seen an outburst of hacking culture and the DIY (Do-It-Yourself) revolution. This promotes the consumer need for basic building blocks with which people can create something for tomorrow.

B.1.1.6 Other Applications

- Build imaginative things like a seeing flower vase, LEGO 3d-vision helicopters, etc.
- Build interactive robots from basic LEGO bricks
- People would be able to create their own creative stands with the Windows device
- Other creative uses that cannot be imagined now
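The 'Move with obstacle avoidance' command in UC-1 could be driven by a very small decision loop over depth frames. The sketch below is a hypothetical illustration rather than the Touch Develop app itself: a depth frame is modeled as a plain 2D grid of millimeter distances, and the function and command names (`drive_command`, "turn_left", etc.) are assumptions for illustration.

```python
# Hypothetical sketch of the "Move with obstacle avoidance" step from UC-1.
# A depth frame is modeled as a 2D grid of distances in millimeters; the
# real KISMET depth stream and the motor command names are assumptions.

STOP_DISTANCE_MM = 500  # stop or turn if anything is closer than half a meter

def nearest_obstacle_mm(frame, band=0.33):
    """Closest valid reading in the central vertical band of the frame."""
    width = len(frame[0])
    lo = int(width * (0.5 - band / 2))
    hi = max(lo + 1, int(width * (0.5 + band / 2)))
    readings = [row[c] for row in frame for c in range(lo, hi)
                if row[c] > 0]  # 0 means no reading (sensor dropout)
    return min(readings) if readings else None

def drive_command(frame):
    """Map one depth frame to a simple motor command."""
    nearest = nearest_obstacle_mm(frame)
    if nearest is None or nearest > STOP_DISTANCE_MM:
        return "forward"
    # Something is close ahead: steer toward the side with more open space
    mid = len(frame[0]) // 2
    left = sum(v for row in frame for v in row[:mid] if v > 0)
    right = sum(v for row in frame for v in row[mid:] if v > 0)
    return "turn_left" if left > right else "turn_right"
```

In the real app the frame would arrive continuously from the depth sensor and the returned command would be forwarded to the LEGO servo motor over the phone port.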
B.1.1.7 Target audience
The two major audiences will be kids (1-17 years old) in the 130 countries where LEGO is available, and hobbyists interested in exploring the same. Since it will be a creative tool, there is a high possibility that people of other age groups will also be attracted to it.

B.1.1.8 Technology Considerations

- A re-usable port for plugging in, powering and running external devices like servo motors
- Power consumption
- Heat dissipation due to the small size of the KISMET device

B.1.1.9 References

During my last MIT Media Lab visit, I met Prof. Mitchel Resnick (head of the Lifelong Kindergarten group that made the LEGO Mindstorms robotics kit). He has a clear view of the future: don't make something that people can merely use; give people something with which they can create unlimited things with their imagination. That is where the realization came that technology's purpose is not only to aid but also to empower people.

B.1.2 Dimensions based retail experience

B.1.2.1 Illustration
B.1.2.2 Context

Online shopping is a growing trend. There were around 157 million digital buyers in 2013, and this number is expected to rise to 180 million by 2017 according to industry estimates. One of the major obstacles in online shopping is figuring out which size will fit you perfectly, be it shoes, t-shirts or other items. With deals like cash on delivery and one-month replacement, consumers are less reluctant to buy things the digital way. Most replacements and order failures happen due to improper size fittings, which forces companies to spend extra money on door-to-door replacements. There is an opportunity to save this expenditure by getting the buyer the right size the first time.

B.1.2.3 Feature Overview

3d scanning to know the right size of items to buy: this feature informs you, through depth scanning, of the right size of a product you wish to buy. As each store has its own custom sizes, the feature can further be integrated with online stores to automatically search for items that suit a buyer's measurements.
B.1.2.4 Use Case

Use Case ID: UC-2
Use Case Name: Buying shoes from an online retail store
Actor: Youngster
Description: A boy wishes to buy new sports shoes for his upcoming sports event. He doesn't have much time to go to the market, so he opens an online store to buy the shoes. However, he is confused by the multiple options and unsure which of them would be the perfect fit for his foot size.

Pre-conditions: A boy wants to buy sports shoes
Post-conditions: He orders shoes online using automated personalized fitting options

Normal Course of Events:
- Once the boy opens the shoe retail site, the site asks him to scan his feet to help estimate his shoe size
- A KISMET 3d-scan child window opens
- The boy scans his feet and clicks 'Done'
- The child window closes
- The site now has a 3d image of the boy's feet along with all the required dimensions
- The site suggests shoe options that exactly fit the boy's feet
- The boy chooses a shoe and orders it online

B.1.2.5 Addressed consumer need

Getting the right product through online shopping on the first attempt

B.1.2.6 Other Applications

- Multiple retail experiences like buying dresses, goggles, etc.
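The scan step of UC-2 can be sketched in miniature: reduce the scanned point cloud of the foot to a length measurement and look it up in a size chart. Everything here, including the (x, y, z) point format in centimeters, the function names, and the size table, is an illustrative assumption rather than any store's real chart or API.

```python
# Sketch of the UC-2 scan step: reduce a scanned foot point cloud to a
# length and look up a store size. The (x, y, z)-in-cm point format and
# the chart below are illustrative assumptions, not a real store's data.

def foot_length_cm(points):
    """Bounding-box extent along the heel-to-toe (y) axis of the scan."""
    ys = [p[1] for p in points]
    return max(ys) - min(ys)

# Hypothetical chart: (maximum foot length in cm, size label)
SIZE_CHART = [(24.0, "US 7"), (25.0, "US 8"), (26.0, "US 9"), (27.0, "US 10")]

def suggest_size(points):
    """Smallest chart size whose length limit still accommodates the foot."""
    length = foot_length_cm(points)
    for max_len, label in SIZE_CHART:
        if length <= max_len:
            return label
    return "larger than chart"
```

A real implementation would use the full 3d mesh (width and instep, not just length) and each retailer's own sizing table, but the lookup structure would be the same.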
B.1.3 Kinect and Cortana integrated personal assistant

B.1.3.1 Illustration

B.1.3.2 Context

Microsoft Cortana is great at the following three things:

a. Finding something, if we divide things into verbs and objects
b. Helping you communicate with people in your life or people in the world, for example on Facebook or Twitter
c. Helping you remember things
Cortana is best at handling large amounts of information or text, as it jumps directly into the data.

B.1.3.3 Feature Overview

With the help of a depth sensor, Cortana will be able to see the 3d world. It can have a facial-feature-mimicking app to give the sense of talking to a friend. The depth sensor and RGB camera will see and analyze the environment and, based on picture or environment mapping, Cortana will figure out the contextually relevant information for the individual and provide suggestions accordingly.

B.1.3.4 Use Case

Use Case ID: UC-3
Use Case Name: Walking on the street with a digital friend named 'Kinect Cortana'
Actor: A visually impaired person
Description: A visually impaired person needs a friend who is always nearby to verbally explain the surrounding environment and alert him to nearby danger.

Pre-conditions:
- Cortana (personal assistant) integration with the depth camera and RGB camera to provide contextual information through voice
- Kinect and Facebook (social network) integration to do facial recognition and search names

Post-conditions: The visually impaired person is able to walk around with Kinect Cortana as his new eyes

Normal Course of Events:
- The visually impaired person opens the 'See World' app
- He puts the KISMET device in his t-shirt's front pocket such that the Kinect depth sensor and RGB camera sit slightly above the pocket
- He starts walking casually on the road as on any other day
- Kinect Cortana does real-time environment analysis and provides audio feedback such as, "Hey James, watch out. There is a skateboard coming towards you from the front." or "I guess Rita, your new Facebook friend, is coming towards you."
- The visually impaired person can have a dialogue with Kinect Cortana. Visually impaired: "Hey Cortana, can you see and tell me what the building in front is called?" Kinect Cortana: "Sure. It is Macy's Store."

Assumptions: Kinect sensing technology works in sunlight

B.1.3.5 Addressed consumer need

- Context-based help
- 24x7 personal assistant availability

B.1.3.6 Other Applications
- Accident prevention on roads
- Real-world-based search
- Learning
- Increasing social consciousness
- Psychological therapy
B.1.3.7 Target audience
Anyone
B.1.3.8 Technology Considerations
- Kinect working in sunlight
- Advanced pattern matching and machine learning to determine context
B.1.3.9 References
Lend an eye: https://www.youtube.com/watch?v=eeFlpjuv8rs
B.1.4 Visual assistant for students
B.1.4.1 Illustration
B.1.4.2 Context
While many activities have been automated, several tasks still require manual intervention. Some of these are high-precision activities like soldering and welding. They are cumbersome and prone to mistakes, as a single mistake can ruin the entire PCB (printed circuit board) or welding material. One way to reduce these inefficiencies is real-time guidance on where and how to perform the task (like soldering or welding).
B.1.4.3 Feature Overview
Digital instructor to aid learning and provide real-time feedback. This feature of the KISMET device works in three steps:
a. Pre-feeding of the ideal design into the KISMET device
b. The depth & RGB cameras compare the real-time work in progress with the ideal design
c. Real-time audio feedback is provided on where the person is going wrong
B.1.4.4 Use Case
Use Case ID: UC-4
Use Case Name: Context-aware KISMET soldering guide
Actor: Learner
Description: A person has to quickly solder a microcontroller onto a PCB (printed circuit board). It is becoming difficult for him to read, one by one, which pin should be soldered where. He needs assistance with soldering.
Pre-conditions: The person is learning to solder
Post-conditions: The person is able to complete the soldering accurately
Normal Course of Events:
- The person feeds an image of the circuit into the 'Kinect Image Guide' app
- He fixes the PCB in a circuit board holder
- He hangs the KISMET device upside-down at a distance from the table, so that the RGB camera can clearly see the PCB below
- The person places the microcontroller on the PCB and starts the soldering process for the circuit design
- The Kinect Image Guide app maps the original circuit image against the real-time circuit view from Kinect's RGB and depth cameras
- The app gives audio instructions if the person is not soldering the correct pins
- Ideal behavior is suggested by the Kinect Image Guide app
B.1.4.5 Addressed consumer need
Shortage of real-time instructors for 1:1 personalized, quality teaching
B.1.4.6 Other Applications
- Performance assessment / quality check of items being developed
- Factories (manufacturing, product design) can use it to train people or guide the accuracy of the work being done
- Coaching institutions can build a customized learning package
B.1.4.7 Target audience
- Student
- Learner
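Steps (b) and (c) of the feature above can be sketched in a few lines. This is a hedged illustration only: the function name, the `{pin: position}` layout format, and the tolerance value are assumptions, not part of any real Kinect API; a real implementation would take pin and joint positions from computer-vision detection on the camera frames.

```python
# Sketch of the KISMET soldering guide's compare-and-feedback loop: match an
# ideal, pre-fed pin layout against solder joints observed by the camera and
# emit corrective spoken prompts. All names and the tolerance are illustrative.

TOLERANCE_MM = 1.5  # assumed acceptable deviation between ideal and actual joint

def check_solder_work(ideal_pins, observed_joints, tol=TOLERANCE_MM):
    """ideal_pins: {pin_name: (x, y)} from the pre-fed design.
    observed_joints: list of (x, y) joint centers found in the camera frame.
    Returns a list of spoken-feedback strings."""
    feedback = []
    for pin, (ix, iy) in ideal_pins.items():
        # Distance from this ideal pin to the closest observed joint.
        best = min(
            (((ox - ix) ** 2 + (oy - iy) ** 2) ** 0.5 for ox, oy in observed_joints),
            default=float("inf"),
        )
        if best > tol:
            feedback.append(f"Pin {pin} is not soldered at the correct spot.")
    if not feedback:
        feedback.append("All pins match the design. Good work!")
    return feedback
```

The returned strings would be handed to the device's text-to-speech engine as the audio instructions described above.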
B.1.5 Kinect + a small projector = Live Stencils
B.1.5.1 Illustration
B.1.5.2 Context
Stencils are a great way to learn. If a depth sensor and a projector are both integrated into the mobile device (KISMET), we can build 'Live Stencils' for learning purposes.
B.1.5.3 Feature Overview
A learning tool to build and project a stencil, and to provide real-time feedback on the quality of tracing or drawing.
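The feedback part of this tool could be sketched as a coverage check: the projected stencil is a set of points, and the camera reports which points the user's pen strokes have covered. The function names, the point representation, and the 95% completion threshold below are assumptions for illustration, not a specified design.

```python
# Minimal sketch of the Live Stencils feedback step, assuming the camera
# pipeline yields stencil points and traced points in a shared 2D space.

def tracing_progress(stencil_points, traced_points, radius=1.0):
    """Return the fraction of stencil points that have a traced point
    within `radius` (camera-space units)."""
    if not stencil_points:
        return 1.0
    covered = 0
    for sx, sy in stencil_points:
        if any((sx - tx) ** 2 + (sy - ty) ** 2 <= radius ** 2
               for tx, ty in traced_points):
            covered += 1
    return covered / len(stencil_points)

def feedback(progress, threshold=0.95):
    # Spoken encouragement once the tracing is (nearly) complete.
    return "Good Work" if progress >= threshold else "Keep tracing"
```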
B.1.5.4 Use Case
Use Case ID: UC-5
Use Case Name: Learning alphabet writing through live stencils
Actor: A 3-year-old kid
Description: -
Pre-conditions:
- Depth sensor and small projector integrated in the mobile device (phablet, tablet or smartphone)
- 'Kinect Stencil' app downloaded on the device
Post-conditions: The kid learns how to write the alphabet
Normal Course of Events:
- The KISMET device is kept on a small stand facing down towards the table
- White paper is placed on the table
- The 'learning stencil' (alphabet letters in this case) is projected over the paper
- The kid traces the projected letters on the paper and learns
- The RGB camera and depth sensor track whether the tracing is complete
- Once it is complete, the app reviews the final output and encourages the kid: "Good Work"
- The kid says "Next" or "Repeat"
- The app moves on to the next projection or stays, according to the audio command given
B.1.5.5 Addressed consumer need
Lack of intuitive ways for drawing-based learning after content digitization
B.1.5.6 Other Applications
In electronics engineering, for soldering purposes
B.1.5.7 Target audience
- Students
- Artists
B.1.5.8 Technology Considerations
- Miniaturizing projector size
B.1.5.9 References
Not directly related (cube keyboard projection: https://www.youtube.com/watch?v=rCghJvjB7rI)
B.1.6 Mobile health-checkup
B.1.6.1 Illustration
B.1.6.2 Context
This scenario addresses the need to gather health-related data for preventive measures, and then use it to build free, tracked health exercises aimed at health improvement.
B.1.6.3 Feature Overview
a. Health Tests application: It provides the capability to do mobile-based health check-ups. The application provides a list of test options, like eye scan, dental scan etc.
b. Game-based health improvement exercises: The test reports are used to create custom game exercises on the KISMET device, like revolving the eyes 10 times, moving the neck up and down etc.
B.1.6.4 Use Case
Use Case ID: UC-6
Use Case Name: Sending eye reports to the doctor & improving health
Actor: A man
Description: -
Pre-conditions: A man's eyes are hurting
Post-conditions: He gets his eyes tested and has a preliminary report while sitting at home
Normal Course of Events:
- He opens the Health Tests app
- He selects the eye scan from the test list & follows the instructions given on the device
- He brings his eyes near KISMET's RGB camera and depth sensor
- On a count of 3, 2, 1… HD eye scanning starts
- Post scan, a preliminary analysis is done by the KISMET device to find any major issue, like a stressed eye
- In parallel, the eye test reports are sent to the doctor by email
- Based on the preliminary analysis, the app finds three tailored eye exercises for the person
- The user selects an exercise to try out
- Kinect's depth camera tracks the eye gestures & guides him in real time
B.1.6.5 Addressed consumer need
Consumerization of basic health check-ups
B.1.6.6 Other Applications
-
B.1.6.7 Target audience
Everyone (this scenario has a much wider reach than the previous scenario)
B.1.7 Mood based environment creation
B.1.7.1 Illustration
B.1.7.2 Context
Ubiquitous systems are present everywhere, 24*7. They are actuated only by the person's needs; otherwise they remain hidden behind the virtual walls of the world. This need does not necessarily have to be stated explicitly in words, like saying "Xbox On". Many times the needs can be implicit too. Implicit needs are very often understood through body language and facial expressions. If we can tap these implicit messages, we can create a more intelligent, ubiquitous environment.
B.1.7.3 Feature Overview
Mood-based environment actions: Non-verbal messages are read and understood by the KISMET device in real time. Based on this knowledge, appropriate suggestions are made through the notification bar on the KISMET device, or actions are taken on the environment.
B.1.7.4 Use Case
Use Case ID: UC-7
Use Case Name: Mood based music listening
Actor: Working professional
Description: A working professional is driving home after a hectic day. He needs some refreshing music. He puts his KISMET device near the car dashboard and switches on the car radio. Slow music starts playing, but the person's expression shows that he does not like it.
Pre-conditions: The person does not like the music, as is evident from his expressions
Post-conditions: The person gets to hear music of his choice
Normal Course of Events:
- The KISMET device detects changes in facial expressions
- From those expressions, it infers that the person does not like something
- To determine this 'something', the environment is scanned to check for an external cause
- The microphone detects slow music being played from the car's radio
- Based on this inference, the KISMET device says, "You look dull, friend. Shall I put on a livelier radio station for you?"
- Seeing the proactive behavior of the KISMET device, the person's face glows with a remark, "Yes"
- KISMET scans other radio stations and switches to a better-suited station.
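The inference chain in this use case (expression → inferred dislike → environment scan → suggestion) can be sketched as a simple rule table. The expression labels, cue names, and rules below are invented for illustration; a real system would feed outputs of trained expression and audio classifiers into a richer model.

```python
# Toy sketch of the UC-7 decision step, assuming a (hypothetical) face tracker
# emits an expression label and environment analysis emits a set of cue strings.

NEGATIVE_EXPRESSIONS = {"frown", "dull", "grimace"}

def suggest_action(expression, environment_cues):
    """Return a suggestion string for the notification bar / voice, or None."""
    if expression not in NEGATIVE_EXPRESSIONS:
        return None  # person seems fine; the system stays hidden
    if "slow_music" in environment_cues:
        return "Shall I put on a livelier radio station for you?"
    if "bright_light" in environment_cues:
        return "Shall I dim the lights?"
    return "You look dull. Anything I can do?"
```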
B.1.7.5 Addressed consumer need
Difficulty in managing non-intuitive devices
B.1.7.6 References
Apple's patent on mood-based ad targeting: http://techcrunch.com/2014/01/23/apple-patent-explores-mood-based-ad-targeting/
Apple's patent on mood-based delivery system: http://appleinsider.com/articles/14/01/23/apple-investigating-mood-based-ad-delivery-system
Mood based music: http://www.moodfuse.com/
Depression detection through Kinect: http://www.technologyreview.com/view/513126/kinect-powered-depression-detector-is-amazing-and-creepy/
B.1.8 Auto-sync of multi Kinects
B.1.8.1 Illustration
B.1.8.2 Context
In the recent past, a lot of focus has been given to quick syncing between multiple devices, with television synchronization with other smart devices chief among them. With the advent and miniaturization of smart cameras and sensors, there needs to be a holistic model for quickly syncing multiple sensor and camera feeds to produce immersive experiences.
B.1.8.3 Feature Overview
Multi-KISMET device auto-sync feature: Based on a master-slave model, one device among several KISMET devices acts as the master. All other KISMET devices are slaves that send real-time information about the environment to the master device. This can help in creating a virtual 3D world out of real-world 3D data in real time.
B.1.8.4 Use Case
Use Case ID: UC-8
Use Case Name: Hollywood-style real-time CGI cinematography
Actor: Cinematographer
Description: Using multi-KISMET sync and a 3D output manipulation application, the cinematographer is able to blend the real world with the virtual world in real time. This helps in creating a draft of CGI movies by finalizing the scene view (the angle of shoot for a particular scene).
Pre-conditions:
- Development of an app for viewing the multi-KISMET synced 3D output and manipulating it
- For multiple mobile devices, the option to make one of them the 'master', which receives wireless output from all other nearby 'slave' devices
Post-conditions: A real-time cinematized CGI shot
Normal Course of Events:
- A cinematographer places multiple KISMET devices at different angles
- He sets all these devices as 'slaves' to send parametric values to a master device
- He keeps one KISMET device (mobile Kinect) in his hand, which acts as the master device
- He opens the CGI app
- He can now see the different slave views in a sliced manner, or consolidated in a 3D format
- He manipulates the real-time view by placing smart objects or removing others
- The cinematographer is able to see and understand the CGI shot from multiple angles and quickly decide which would suit the movie's requirements.
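The master's job in this master-slave model can be sketched as a merge step: each slave streams points in its own camera space, and the master transforms them into a shared world space before consolidation. The example below uses a plain per-slave translation as a stand-in for full extrinsic calibration; the device names and message shape are assumptions for the sketch.

```python
# Toy sketch of the master-side consolidation in the multi-KISMET sync model.

def merge_slave_frames(frames, offsets):
    """frames: {slave_id: [(x, y, z), ...]} latest frame per slave device.
    offsets: {slave_id: (dx, dy, dz)} translation into shared world space
    (a stand-in for real extrinsic calibration).
    Returns one merged list of world-space points, ordered by slave id."""
    world = []
    for slave_id, points in sorted(frames.items()):
        dx, dy, dz = offsets.get(slave_id, (0.0, 0.0, 0.0))
        for x, y, z in points:
            world.append((x + dx, y + dy, z + dz))
    return world
```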
This will also help in deciding where virtual objects will fit in the scene.
B.1.8.5 Addressed consumer need
- People have been facing a lot of trouble trying to sync and utilize multi-Kinect data
- Real-time cinematography: quickly viewing multiple scenes at the shooting location and making effective decisions accordingly. The multi-Kinect sync feature can provide the following additional capabilities:
  - Zoom in 3D
  - Add a virtual set in 3D
  - Crop and shift items in real time
  - Map actor intent to a CG actor
  - Add visual effects
- Increasing the area of influence for life-size interactions
- Difficulty in overlapping Kinect data with a DSLR
B.1.8.6 Other Applications
- Home security with a room-based view; the master device can also be at a remote location
B.1.8.7 Target audience
- Hollywood
- Regional movie-making enthusiasts
- 3D modelers and consultants
B.1.8.8 Technology Considerations
- Real-time Bluetooth communication model for sending high-bandwidth data like video
- Battery life sustenance during video feed / HD image sync
B.1.8.9 References
RGBD Toolkit: http://www.rgbdtoolkit.com/
B.1.9 Kinectduino
B.1.9.1 Illustration
B.1.9.2 Context
Arduino is one of the most talked-about open-source platforms for prototyping Internet of Things scenarios within your home. While it can do a brilliant job of communicating with multiple appliances and activating different experiences at home, it is cut off from real-world knowledge unless the person carries the Arduino (or another similar development platform) everywhere he goes. This makes it difficult to create context-aware environments by merging outside-home and inside-home ontologies and knowledge. One way to deal with this is to have two companions: one which provides real-world information, and another which uses this information to create a context-aware, personalized environment. The best way to get this external-world information is to integrate with a device that stays and moves around with the person in the external world.
B.1.9.3 Feature Overview
Connecting KISMET (a Kinect-enabled mobile device) with Arduino or a similar development platform: KISMET devices are mobile, like smartphones, tablets and phablets, and a user usually carries one everywhere in the external world. This provides an untapped opportunity to gather and analyze contextual data to activate personalized experiences at home. Thus, the two companions discussed in the 'Context' section above are: Arduino fueling the home, and the KISMET device understanding context from external-world information.
Internal world (home appliances powered by Arduino) + External world (seen & understood by the KISMET device) = Seamless world creation (by merging the learnings of one world with the personalization settings determined for the other)
There are two ways in which the two companions can interact:
A. Real-time (proactive): For example, if it is too sunny outside, the refrigerator (via Arduino) is informed in real time to lower its temperature and have cold water available once the person gets home.
B. Asynchronous (reactive): The person gets home after a long day of work and plugs his KISMET into the Arduino at home. Arduino takes the data dump and converts it into knowledge useful for the home setting. One example is determining the person's mood from the day's happenings; based on that, the environment is changed by Arduino, e.g. if he is upset, the curtains show lively flower animations.
B.1.9.4 Use Case
Use Case ID: UC-9
Use Case Name: Preparing for a party through real-time feedback from the IoT world
Actor: Businessman
Description: A businessman is throwing a party for his friends at home tonight. He needs to buy some food items for cooking dinner. He is not sure what items are available at home, and he doesn't want to buy duplicates. He doesn't have time to go home, check, and come back to the food store.
Pre-conditions: The businessman needs to buy food items which are not available at home
Post-conditions: He buys only those items which are not at home
Normal Course of Events:
- The person goes to the food store
- He scans a cold-drink bottle in 3D and sends the depth-based model to the Arduino connected to the refrigerator at home
- The Arduino, connected to the Azure platform, receives the 3D images and compares them with the items available in the refrigerator
- After comparing, the Arduino figures out that this item (the specific brand of cold-drink bottle) is not available at home
- The person is informed about the item's unavailability at home
- The person goes ahead and completes his shopping
B.1.9.5 Addressed consumer need
- Enabling IoT (Internet of Things) at home
- Creating a more personalized, context-aware environment at home based on happenings in the outside world
B.1.9.6 Other Applications
- Keeping the family ubiquitously connected
- Remote access to home
B.1.9.7 Target audience
Anyone
B.1.9.8 Technology Considerations
Arduino is a DIY platform. In a real deployment, more reliable devices can be used for communication via Azure.
B.1.9.9 References
Windows Phone 8 communicating with Arduino using Bluetooth: http://developer.nokia.com/community/wiki/Windows_Phone_8_communicating_with_Arduino_using_Bluetooth
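The UC-9 exchange boils down to a small request/reply protocol between the KISMET device and the home hub. The message fields, item identifiers, and inventory format below are invented for this sketch; a real system would carry the payload over Bluetooth or Azure messaging, and would match a 3D scan against inventory rather than a string key.

```python
# Illustrative sketch of the shopper -> home-hub inventory check in UC-9.

import json

FRIDGE_INVENTORY = {"milk", "butter", "orange_juice"}  # assumed home state

def handle_scan_message(raw_message, inventory=FRIDGE_INVENTORY):
    """raw_message: JSON like '{"type": "item_scan", "item": "cola_1l"}'.
    Returns a JSON reply telling the shopper whether the item is at home."""
    msg = json.loads(raw_message)
    if msg.get("type") != "item_scan":
        return json.dumps({"type": "error", "reason": "unknown message"})
    at_home = msg["item"] in inventory
    return json.dumps({"type": "scan_reply", "item": msg["item"],
                       "at_home": at_home})
```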
B.1.10 K-Enchant - A magical framework
B.1.10.1 Illustration
B.1.10.2 Context
Teaching kids is not an easy task. Innovative ways have been explored all around to capture their interest in learning and knowledge acquisition. One of the most effective among them is the creation of stories to teach kids. These stories transport them to a new world where they can relate and learn more. Fairy stories are one example. What if we could provide a framework for teachers to create 3D fairy stories which they can play in school, telling the stories using gestures?
B.1.10.3 Feature Overview
A framework for creating fairy-tale-effect-powered stories using 3D animations activated by gestures. It can be thought of as a futuristic version of PowerPoint, where teachers create a new storyline in four parts:
a. Select an animated 3D story template
b. Add the text, videos or hyperlinks (the basic storyline)
c. Add the 3D animated transitions (made with After Effects or other 3D software), like pixie dust
d. Map Kinect gestures to the transitions as triggers for activating them
B.1.10.4 Use Case
Use Case ID: UC-10
Use Case Name: Teaching a history lesson using the 3D animated storytelling framework
Actor: Teacher
Description: A teacher has to help her students learn about World War II.
Pre-conditions: The teacher has created, before the class starts, a 3D animated storyline which will react to her gestures in the class.
Post-conditions: Kids learn about World War II through an engaging, immersive experience
Normal Course of Events:
- The teacher connects her KISMET device (where she has created the Kinect student story) to the class projector
- She places the KISMET device right below the projector, with the Kinect camera facing towards her
- With her varying hand gestures and audio, different 3D animations play on the projected screen, making the kids feel that they are in that story's era. A few examples:
  - While starting the story, the teacher says "and the story starts" to trigger the pixie-dust introduction
  - She does a zoom-in gesture, and a time-travel countdown starts with parameters defined by the teacher
  - The teacher raises both hands, and drum beats start playing with sand particles overlaying the text
B.1.10.5 Addressed consumer need
The need for immersive story creation, for better understanding and memorability
B.1.10.6 Other Applications
- Interactive museums
- Elevator pitches
- Theme-based parks
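Step (d) of the framework, mapping gestures to transitions, is essentially a lookup table consulted on every recognizer event. The sketch below illustrates that idea; the class name, event labels, and transition names are made up for the example and are not part of any real Kinect SDK.

```python
# Minimal sketch of the K-Enchant trigger step: a storyline carries a
# {trigger_event: transition_name} map, and incoming gesture/voice events
# fire the matching 3D transition.

class StoryPlayer:
    def __init__(self, mapping):
        self.mapping = mapping  # {trigger_event: transition_name}
        self.played = []        # transitions fired so far, in order

    def on_event(self, event):
        """event: gesture or voice label from the (hypothetical) recognizer.
        Returns the fired transition name, or None if the event is unmapped."""
        transition = self.mapping.get(event)
        if transition is not None:
            self.played.append(transition)
        return transition
```

For the history-lesson example, the mapping might pair "voice:and the story starts" with a pixie-dust intro and "gesture:zoom_in" with the time-travel countdown.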
B.1.10.7 Target audience
- Teachers
- Story tellers
- Presenters
B.1.10.8 References
Marco Tempest - A magical tale: http://www.ted.com/talks/marco_tempest_a_magical_tale_with_augmented_reality
B.1.11 Harry Potter style interactions
B.1.11.1 Illustration
B.1.11.2 Context
The future is being built on a Wi-Fi-enabled, connected environment. Currently, each person uses roughly 2 connected devices at home, mainly phones and tablets. However, with the dawn of pervasive computing, there will soon be an outburst of more connected devices for the home. Not every such device will need huge processing power; some will just be low-power animated displays, like Wi-Fi-enabled photo frames. This would actualize Harry Potter-style animated devices too.
B.1.11.3 Feature Overview
Wi-Fi-enabled environmental accessories: This would allow form-factor-independent content transfer for the IoT world. Environmental accessories include everyday things like smart wallets, smart walls, smart photo frames etc. These accessories will be Wi-Fi-enabled, low-power displays with only one direction of communication (from the KISMET device to the accessory). The transfer process involves two simple steps:
a. Pick-up gesture: A short, animated clip is picked up from a KISMET device.
b. Drop gesture: With the KISMET device in hand, a drop gesture is directed towards the device where the looping clip should be shown. Based on the direction and the depth, a Wi-Fi connection is automatically established with the targeted device, and the content is transferred seamlessly.
B.1.11.4 Use Case
Use Case ID: UC-11
Use Case Name: Transfer of a short family video clip to a smart wallet
Actor: Husband
Description: A husband is going out of town for a few months, leaving his family at home. He deeply loves his family and wants to carry a looping, 2-second animated family clip along in his wallet.
Pre-conditions: The wallet has a low-power, flexible, Wi-Fi-enabled display
Post-conditions: The husband can see his family's clip whenever he misses them
Normal Course of Events:
- The husband opens the clip on his KISMET device
- Using the Kinect integrated in the KISMET device, he does a pick-up gesture.
- This opens the 'send to…' page
- He then does a drop gesture towards the wallet, within the Kinect camera's field of view
- Based on direction and depth, the corresponding Wi-Fi module is identified & automated content transfer is initiated
- The husband's wallet now has the animated clip running
- To save power, the clip plays only while the wallet is open
- Whenever the wallet is closed, the display shuts down too
B.1.11.5 Addressed consumer need
The need to create more intuitive displays for a trip down memory lane
B.1.11.6 Other Applications
- Finding animated wallpapers and putting them on home walls
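The "based on direction and depth" targeting in the drop gesture can be sketched as a nearest-device lookup along the gesture ray. Accessory names, positions, and the tolerance below are invented for the sketch; real devices would be registered with calibrated room positions.

```python
# Hedged sketch of drop-gesture target resolution: project the gesture ray
# out to the measured depth and pick the registered accessory closest to
# that endpoint.

import math

def resolve_target(origin, direction, depth, accessories, tol=0.5):
    """origin, direction: 3D vectors from the gesture tracker (direction unit-length).
    depth: distance reading along the gesture ray, in meters.
    accessories: {name: (x, y, z)} known device positions.
    Returns the accessory name within `tol` of the ray endpoint, or None."""
    end = tuple(o + d * depth for o, d in zip(origin, direction))
    best_name, best_dist = None, tol
    for name, pos in accessories.items():
        dist = math.dist(end, pos)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name
```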
B.1.11.7 Target audience
Anyone
B.1.11.8 Technology Requirements
- A protocol to enable automated Wi-Fi transfer based on Kinect gestures
- Small-sized, low-power displays
B.1.11.9 References
Nil
B.1.12 Self-paced K-12 education
B.1.12.1 Illustration
B.1.12.2 Context
According to the World Literacy Foundation, more than 796 million people in the world cannot read and write, and 67 million children do not have access to primary education. This untapped potential, if empowered with self-paced education, could produce many more future leaders and change agents. Several organizations worldwide are working towards providing easy access to K-12 education for all children below the age of 14.
B.1.12.3 Feature Overview
A self-paced learning framework enabled by mobile Kinect. This requires building free educational applications that use contextual information present in the environment.
B.1.12.4 Use Case
Use Case ID: UC-12
Use Case Name: Understanding monumental structures
Actor: Kid
Description: A kid goes out for a picnic with her family. Her family gives her a KISMET device so they can stay in touch while they rest in the park for some time.
Pre-conditions: The kid sees a monument and is interested in knowing more about it
Post-conditions: The kid learns about the monument based on her interest and environment
Normal Course of Events:
- The kid raises the phone with the Kinect RGB & depth camera facing the monument
- The kid points a finger at the monument and asks, "Cortana, what is this?"
- The KISMET device does a 3D scan and depth mapping
- Based on the above analysis, it crops the image region pointed at by the finger
- The context of the image is determined from other environmental factors
- An advanced Bing image search is done based on context, like GPS location
- Results are presented through audio, based on confidence level
B.1.12.5 Addressed consumer need
- Providing a pull-based mechanism instead of push
- Rural empowerment through free educational games
B.1.12.6 Other Applications
A guide for the visually impaired
B.1.12.7 Target audience
Students/Learners
B.1.12.8 Technology Requirements
Seamless integration of RGB and depth sensors with Bing
B.1.12.9 References
Microsoft's Project Adam
B.1.13 Kinect enabled real-time, road safety warnings
B.1.13.1 Illustration
B.1.13.2 Context
As the world switches to digital, people are more focused on their devices than on the road. Absorbed in reading or watching something on their mobile device, they become less conscious of the
  • 39. P a g e | 39 KISMET | Project Document environment while walking. This makes them accident prone on the road and other similar places. Is there a way they can be safeguarded while walking? B.1.13.3 Feature Overview Environment based message alert model: RGB camera & depth sensor are placed on the back of the mobile device. Other way can be thought of as rotating miniaturized Kinect on top of mobile device. This (rotating sensors & camera) will provide the flexibility of integrating multiple scenarios based on the directional need. Once there is some emergency, alert messages are sent on the notification bar of the device. B.1.13.4 Use Case Use Case ID: UC-13 Use Case Name: Preventing road accident Actor: College Student Description: A college student is walking on the road. He is busy in texting his friends and his eyes are always on the smart phone. Pre-conditions: A speeding car is coming towards the student. There is also an open pot hole 200mts away which he must escape. Post-conditions: The student gets saved from being hit and falling into the pot hole. Normal Course of Events: - RGB Camera & RGB sensor scans the 3d world - Background app on the device analyzes the environment - Discrepancies and danger items are identified in the forward walk path - Notification bar is used to inform the student about the dangers in front B.1.13.5 Addressed consumer need Personal safety on the road B.1.13.6 Other Applications - B.1.13.7 Target audience Anyone B.1.13.8 Technology Requirements
The KISMET device must work in daylight
B.1.14 3d modelling and gesture enabled print
B.1.14.1 Illustration
B.1.14.2 Context
3D modelling and design have often been left to the mercy of high-end software, which is often complicated and has a slow learning curve due to the complexities of 3D-world manipulation. People thus require a new way of designing the future. NUI (Natural User Interface) shows a promising future in this regard.
B.1.14.3 Feature Overview
- Gesture-based 3D modelling
- Gesture-based panoramic shot building through 2D data stitching
B.1.14.4 Use Case
Use Case ID: UC-14
Use Case Name: 3D scanning of a face
Actor: A man named 'X'
Description: X is interested in 3D printing his face. He doesn't have much knowledge of costly 3D scanners, nor does he know how to use Autodesk-like 3D software.
Pre-conditions:
- An application which stitches 2D images asynchronously using depth data, with a gesture to define the border and take the snapshot
- A wireless messaging model to transfer the 3D model for 3D printing
Post-conditions: A 3D model of the person's face, for 3D printing or for use in a virtual world
Normal Course of Events:
- X downloads the 3D scan app
- He puts the KISMET device at a distance
- He then defines the four corners of the real-world region of his face which he wants to capture
- The app utilizes depth data to map only the relevant face data from one side
- X repeats the same process from other angles
- The app automatically stitches the 2D snapshots together to make the 3D model
- He uses the export button to wirelessly send the face model for 3D printing
B.1.14.5 Addressed consumer need
- Cost-effective 3D modelling
- The steep learning curve of 3D applications
B.1.14.6 Other Applications
Exporting real-world characters into the digital world, e.g. for Second Life or a 3D avatar for Skype
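The "map only relevant face data" step in UC-14 can be sketched as a two-condition filter: keep pixels that fall inside the user-defined corner rectangle and are closer than a depth cutoff, which is how depth data separates the face from the background. The grid format, coordinates, and threshold below are assumptions for the sketch.

```python
# Illustrative sketch of the depth-based face cutout for one capture angle.

def crop_by_corners_and_depth(depth_grid, top_left, bottom_right, max_depth):
    """depth_grid: 2D list of depth values in meters (row-major).
    top_left, bottom_right: (row, col) corners chosen by gesture.
    Returns {(row, col): depth} for pixels kept in the face snapshot."""
    (r0, c0), (r1, c1) = top_left, bottom_right
    kept = {}
    for r in range(r0, r1 + 1):
        for c in range(c0, c1 + 1):
            d = depth_grid[r][c]
            if d <= max_depth:  # foreground (the face), not the wall behind
                kept[(r, c)] = d
    return kept
```

The per-angle results would then be the inputs to the stitching step that assembles the 3D model.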
B.1.14.7 Target audience
Multiple segments, including families
B.1.14.8 Technology Considerations
- Near-field sensor miniaturization
- Depth camera usage for background cutout and border definition (includes automated green-screen editing)
B.1.15 Kinect Active Health
B.1.15.1 Illustration
B.1.15.2 Context
With the advent of digitization, people tend to spend more time in front of electronic devices. Excessive usage of these devices often brings health problems like back pain, dry eyes and fatigue. With the integration of Kinect's capabilities into a mobile device, active health feedback can be provided to the user. Let us see how.
B.1.15.3 Feature Overview
Unconscious health-chart preparation for the user: KISMET's RGB camera and depth sensors track real-time health information about the person in the background while he works on other things on the mobile device. Whenever an emergency is found, a notification message pops up on the device giving real-time feedback. If the user wants to consciously see his health report, he can quickly open the graphical health-chart app prepared from Kinect's data. The KISMET-based health chart can be navigated through a calendar.
The user's health is determined using some of the following parameters:
- Body gestures
- Smiles
- Voice changes
- Eye movement
- Heart beat
- Facial changes
- Environmental factors
B.1.15.4 Use Case
Use Case ID: UC-15
Use Case Name: Providing active health feedback to the mobile user
Actor: Mobile user
Description: So far, tracking and quantifying real-time health has been difficult, as sensors could not be present everywhere. However, with the possibility of a depth sensor integrated into the mobile device, there is now a means to track and understand patterns of deteriorating or improving health.
Pre-conditions: A person using a KISMET device has been working continuously for the past 8 hours. By not eating, he is compromising his health for work.
Post-conditions: He gets to see a detailed health report in real time
Normal Course of Events:
- The mobile user has been working continuously on the mobile device for the past 8 hours
- The mobile user gets a notification message on the KISMET device. It says, "Kinect Message: You have an increased heartbeat. Take rest.
To know more, click here."
- The user clicks on the message to see the trend
- The detailed graphical report looks like:
  - 5th July 2014: 20% voice congestion (deviation from the original voice)
  - 6th July 2014: Four instances of stress detected (bowed gestures)
  - 7th July 2014: 3 hours of wrong sitting posture
  - 8th July 2014: 8 hours of dry eyes
  - 9th July 2014: No smile detected the entire day
  - 10th July 2014: Increased heart rate in the last hour
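A calendar-navigable health chart like the one above implies a backend that logs dated sensor observations and summarizes them per day. The class name, event wording, and date format below are illustrative assumptions, not a specified design.

```python
# Sketch of the health-chart backend implied by UC-15: observations are
# recorded as dated events, and the calendar view asks for a daily summary.

from collections import defaultdict

class HealthLog:
    def __init__(self):
        self.events = defaultdict(list)  # date string -> list of event labels

    def record(self, date, event):
        self.events[date].append(event)

    def daily_report(self, date):
        evts = self.events.get(date)
        if not evts:
            return f"{date}: no issues detected"
        return f"{date}: " + "; ".join(evts)
```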
B.1.15.5 Addressed consumer need
Preventing health issues caused by excessive usage of electronic devices
B.1.15.6 Other Applications
These reports can be used by doctors to suggest health improvement measures
B.1.15.7 Target audience
Extensive users of electronic devices
B.1.15.8 Technology Considerations
Building a Kinect-integrated calendar app for seeing date-wise or accumulated health over a time span
B.1.15.9 References
Kinect's heart rate monitor: http://thenextweb.com/microsoft/2013/05/22/the-new-xbox-one-kinect-tracks-your-heart-rate-happiness-hands-and-hollers/
B.2 Accessory based scenarios
B.2.1 Virtual OneNote Surface
B.2.1.1 Illustration
B.2.1.2 Context
Dual effort is required when whiteboard information written with pen markers has to be converted into digital format. To eliminate this problem, interactive whiteboards have been developed and are used in several places. However, they are very costly, around $1,000, which is beyond the reach of the average consumer. Thus, there is a need for a cost-effective solution for translating whiteboard text into digital format.
B.2.1.3 Feature Overview
- Real-world area to OneNote mapping for writing: Kinect maps the four corners of any real-world surface to a OneNote page. In order to do so, the user is required to calibrate the four corners of the real-world surface using hand-gesture locking.
- Real-time transfer of content being written on the board to OneNote: To support the transfer of content to OneNote on a real-time basis, one of two methods can be adopted.
Method 1: Replacing the whiteboard with a magic slate and using a wireless pen to write on the slate
The age-old magic slate for kids helps in writing glowing text. A big and sophisticated version of the same can be used for writing on the board. This will only be used for visual feedback to the user. To do the real data transfer, a wireless pen mouse (which opens OneNote on a click) would have to be designed for writing on the magic slate. The slider at the bottom would erase the written text without using any consumable material like a marker.
Method 2: Applying a ZnS coating on any surface and using a wireless UV pen to glow text and transfer content
As per the photoluminescence property, whenever UV light is passed onto a ZnS-coated surface, the area where the light passes starts glowing for a specific time. Some people have explored writing text with it. The time for which the text stays depends on the combination of ZnS being used. It is still a concept and might not work in daylight scenarios; the authenticity of its working needs to be tested.
- Use of #tags for special commands: Using special #tags while writing on the board will help perform certain operations in OneNote, like opening a new section named after the text following the #tag.
B.2.1.4 Use Case
Wireless pen mouse to transfer the content digitally to OneNote; visual feedback to the person writing on the board through the magic slate or ZnS
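The corner-calibration step above amounts to computing a perspective mapping from the four gesture-locked whiteboard corners to the corners of a OneNote page. A minimal sketch of that mapping (the pixel coordinates are hypothetical examples, not from a real device):

```python
import numpy as np

def homography(src, dst):
    """Solve for the 3x3 perspective transform mapping src -> dst,
    given four point correspondences (standard DLT formulation)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of A (smallest singular vector).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def to_page(H, x, y):
    """Map a whiteboard pixel into OneNote page coordinates."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Four whiteboard corners as locked by hand gesture (hypothetical pixels),
# mapped to an 800x600 OneNote page.
board = [(102, 88), (1180, 95), (1175, 640), (98, 655)]
page = [(0, 0), (800, 0), (800, 600), (0, 600)]
H = homography(board, page)
```

Every stroke coordinate reported by the wireless pen would then be pushed through `to_page` before being written into the note.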
Use Case ID: UC-16
Use Case Name: Auto-transfer of whiteboard content to OneNote
Actor: Working Professional (WP)
Description: WP is hosting a meeting and wants to explain some concepts on the whiteboard. After the explanation, he wants the text to be auto-transferred to OneNote and sent to all meeting attendees
Pre-conditions:
- Setting up one of the methods (as mentioned above) for transferring content to OneNote and providing visual feedback to the user
- A KISMET app to compare the text coming from the wireless pen mouse with the visual data from the RGB camera for a sanity check
Post-conditions: A OneNote copy of the text written on the whiteboard, for sharing and receiving feedback
Normal Course of Events:
- WP arrives 5 minutes prior to the meeting and puts his KISMET device on a stand facing the whiteboard
- He starts the 'OneNote Text Transfer' app to calibrate the four corners of the whiteboard for mapping to OneNote
- He goes to one corner of the board, points to the corner with a finger and locks the corner with another gesture
- WP repeats the same process for the other corners
- Once the other employees join the meeting, he starts the text recording function of the 'OneNote Text Transfer' app
- Data written on the board is sent to OneNote in real time using the wireless pen, and visual feedback is provided to the user using one of the two methods mentioned in the section above
- The KISMET OneNote compare feature checks the data in between against visuals from the RGB camera for integrity
- After the meeting, WP shares this digital note copy with the meeting attendees
B.2.1.5 Addressed consumer need
Difficulty in converting text and drawings written on a whiteboard into a digital format for future reference and usage
B.2.1.6 Other Applications
This is a limited-use scenario
B.2.1.7 Target audience
- Business professionals
- Teaching professionals
B.2.1.8 References
- Light Writer: http://www.gijsvanbon.nl/kryt.html
- Interactive UV t-shirts: http://www.thisiswhyimbroke.com/interactive-uv-light-shirts
B.2.2 Five senses Kinect
B.2.2.1 Illustration
B.2.2.2 Context
Instead of focusing on just two senses (mainly sight and hearing), the feature set of next-generation Kinects should be extended to enable the other three senses too, namely touch, smell and taste. While taste still seems to be in the early stages of exploration, there has been good progress with smell sensors and haptic feedback.
B.2.2.3 Feature Overview
Kinect Enabled Haptic Cover: The KISMET device will have a specially designed cover to transfer haptic messages. It is a flexible cover with small pins inside which can increase or decrease their height to create 3d shapes. Thus, the cover will change shape based on the different heights of the protruded pins.
B.2.2.4 Use Case
Use Case ID: UC-17
Use Case Name: Remote feel of a 3d object
Actor: Woman named 'A'
Description: A wants to communicate the feel of a real-world object remotely to her friend at a geographically different location
Pre-conditions:
- Availability of a Kinect enabled haptic cover with A
Post-conditions: People at two different geographic locations can feel physical shapes present at the other location.
Normal Course of Events:
- A puts a real-world object in front of the KISMET device
- Kinect's depth sensor maps 3d data of the object at location A
- Based on the protrusion resolution parameter of the haptic cover, a rough mapping of the 3d data is done on the haptic cover at location B
B.2.2.5 Addressed consumer need
Providing immersive experiences by communicating all five real senses to a remote location
B.2.2.6 Other Applications
- An SDK for developers to build haptic applications, like braille for the visually impaired or feeling remote items
- Skype integration
- Playing haptic tic-tac-toe or chess between people at different locations
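The "protrusion resolution" mapping in UC-17 can be sketched as a downsampling step that turns a depth image into a grid of pin heights. This is only an illustration — the grid size and height range stand in for the cover's real parameters:

```python
def depth_to_pins(depth, rows, cols, max_height=10):
    """Downsample a depth image into a rows x cols grid of pin heights.

    `depth` is a list of lists of millimetre readings from the depth
    sensor. Nearer surfaces protrude more; 0 means fully retracted.
    `rows`/`cols` play the role of the cover's protrusion resolution.
    """
    h, w = len(depth), len(depth[0])
    near, far = min(map(min, depth)), max(map(max, depth))
    span = (far - near) or 1  # avoid division by zero for flat scenes
    pins = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # Average the depth cell this pin covers.
            ys = range(r * h // rows, (r + 1) * h // rows)
            xs = range(c * w // cols, (c + 1) * w // cols)
            cell = [depth[y][x] for y in ys for x in xs]
            avg = sum(cell) / len(cell)
            row.append(round((far - avg) / span * max_height))
        pins.append(row)
    return pins
```

The resulting grid would be sent to the remote cover, where each value drives one pin's actuator.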
B.2.2.7 Target audience
Geographically distributed individuals
B.2.2.8 Technology Requirements
Building a self-adjusting pin matrix in the form of a haptic cover to map a 3d object is a complex feature and would require feasibility analysis
B.2.2.9 References
inFORM – MIT Media Lab: http://www.tested.com/art/makers/459049-mit-media-labs-inform-shapeshifting-table/
B.2.3 Automated Indoor Mapping Personal Assistant
B.2.3.1 Illustration
B.2.3.2 Context
Home is a space where people prefer to put personalized items. This is beyond the reach of GPS. For outside 'app-ification and device-ification' models to enter these houses, it is important to gather and understand the
contextual information within these spaces. To enable this, the real-world house needs to be mapped into a virtual world where people can play, explore and personalize.
B.2.3.3 Feature Overview
Indoor mapping with a robotic devices plug-in: The KISMET device would quickly plug into quad-copters and other indoor robotic devices like Roomba. Once plugged in, as the robot moves through the entire house, it will map out a 3d view of the interior to be used for various purposes.
B.2.3.4 Use Case
Use Case ID: UC-18
Use Case Name: Re-designing the home with an automated indoor 3d mapping platform
Actor: College student
Description: A college student wants to change the interior of the house and thus requires a 3d model of it to play with
Pre-conditions:
- The student has a quad-copter or a robotic device like Roomba
- Availability of a hardware accessory enabling 360-degree rotation of the KISMET device
- A Windows Store app to build live 3d geometry and export it as a .3dx or .obj file for further manipulation or consumption
- A touch-based app to place and manipulate smart objects on top of this 3d geometry
Post-conditions: The student finalizes his new house look and shares it with family for approval
Normal Course of Events:
- The student plugs his KISMET device (depth sensor enabled mobile device) into a Roomba robot (an obstacle-avoiding, floor-moving device). The present-day Roomba has an additional accessory protruding out, to which KISMET is connected and which can rotate 360 degrees.
- He downloads the Indoor Mapping app from the Windows Store
- He switches on the robot
- He starts the app to begin making a 3d model of the house
- The device autonomously maps the house
- Once done, it comes back to the user based on face detection
- The app outputs a 3d object (.3dx, .obj etc.)
- The student uses the touch-based 3d manipulation feature of the KISMET device to cut and replace new virtual furniture in the house
B.2.3.5 Addressed consumer need
- Autonomous mapping
- A 3d map of homes with exact dimensions for real-time visualizations, for example while shopping for home furniture
- Access to contextual information at home to make it more safe and secure
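The app's .obj export step could work by back-projecting each depth pixel through a pinhole camera model and writing the points out as Wavefront vertices. A minimal sketch, where the field of view is an assumed stand-in for real calibrated sensor intrinsics:

```python
import io
import math

def depth_to_obj(depth, fov_deg=70.0):
    """Back-project a depth image into 3D points and emit a minimal
    Wavefront .obj vertex list (one 'v x y z' line per valid pixel).

    `depth` is a list of lists of millimetre readings; `fov_deg` is an
    assumed horizontal field of view standing in for real intrinsics.
    """
    h, w = len(depth), len(depth[0])
    f = (w / 2) / math.tan(math.radians(fov_deg) / 2)  # focal length, px
    out = io.StringIO()
    for y in range(h):
        for x in range(w):
            z = depth[y][x]
            if z <= 0:  # no depth reading at this pixel
                continue
            X = (x - w / 2) * z / f
            Y = (y - h / 2) * z / f
            out.write(f"v {X:.3f} {Y:.3f} {z:.3f}\n")
    return out.getvalue()
```

Successive frames from the moving robot would be registered against each other before export; this sketch covers only the single-frame back-projection.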
B.2.3.6 Other Applications
- Augmented-reality based games for the entire house rather than a limited location, e.g. hunting for 5 virtual ducks in your home.
B.2.3.7 Target audience
Multiple segments – housewives, kids, interior designers, sociologists, psychologists, data scientists etc.
B.2.3.8 Technology Considerations
Matching time-of-flight capture with the quad-copter's or robot's moving speed
B.2.3.9 References
Automated Indoor Mapping: http://www-video.eecs.berkeley.edu/research/indoor/
B.2.4 Kinect enabled smart casing – a self-protection device
B.2.4.1 Illustration
B.2.4.2 Context
A few years back, a survey of around 2,000 iPhone users on broken phones was conducted. It found that Americans had spent an estimated $5.9 billion on repairing and replacing broken phones over the past five years. Accidental drops were found to be the most common reason for breakage, with submersion in water (tub, sink and toilet) being the second. For Windows Phone too, the statistics would have
been similar. This shows a major gap: smart devices cannot take care of themselves. Is there a way a miniaturized form of Kinect or other sensors could solve this issue?
B.2.4.3 Feature Overview
Automated shock-resistant mobile cover/pouch: The smart protector, in the form of a pouch or metallic casing, gets activated based on one of two options:
1. In case some object is falling on top of the device – detected using depth mapping and calculating velocity on a real-time basis
2. In case the device itself is falling to the ground – this can simply be done using accelerometer integration
B.2.4.4 Use Case
Use Case ID: UC-19
Use Case Name: Protecting the device in rain
Actor: Working woman
Description: A working woman is waiting at the bus stop for the bus. As it is drizzling, she is holding an umbrella with one hand. She has to send some urgent mails, so she is working on her smartphone with the other hand.
Pre-conditions: A raindrop from the umbrella is about to fall on the device and has a high probability of damaging the phone
Post-conditions: The KISMET device protects itself from getting damaged
Normal Course of Events:
- The KISMET device detects the possibility of something about to fall on it. Almost instantly, the water droplet starts moving towards the device.
- The depth camera detects the speed of the falling object and the distance remaining between the moving object and the device.
- As soon as the moving object crosses the safety distance, the device detects danger
- It signals the safety cover to deploy instantly
- The safety cover covers the entire device
- The water droplet falls on the cover, thereby protecting the device
B.2.4.5 Addressed consumer need
Accidental breakage of phones or smart devices
B.2.4.6 Other Applications
Nil. This feature has limited usage
B.2.4.7 Target audience
Anyone
B.2.4.8 Technology Requirements
Hardware (protection cover) activation based on depth sensor data points
B.2.5 Plug & play Kinect components
B.2.5.1 Illustration
B.2.5.2 Context
Plug & Play Kinect Components and Sensors: The smartphone interior is designed in the shape shown below. The four corner spaces have magnetic clutch points where different sensors or Kinect components can be attached as per requirements. Thus, instead of a complete Kinect bundle, you can use basic Kinect components like the depth camera or microphone, or integrate additional sensors like temperature with Kinect, based on your needs.
B.2.5.3 Feature Overview
Use-based plug-in sensors to provide smart variations: Each magnetic clutch point will have different end-points to enable activation and seamless data flow with sensors: Power, Input, Output and Ground.
B.2.5.4 Use Case
Use Case ID: UC-20
Use Case Name: Field diagnostics of diseases by rural health workers with an integrated microscope
Actor: Rural health worker/volunteer
Description: Since people living in rural areas of Africa and India don't have adequate access to healthcare and medication, their diseases often remain undiagnosed or misdiagnosed. Health workers are often provided with a mobile device for effective health management of the nearby area. Now, through the availability of a plug-and-use portable microscope piece, health workers will be able to diagnose diseases early and provide proper medication to citizens.
Pre-conditions:
- All the rural health workers are given a KISMET device (a Kinect enabled mobile device like a smartphone, tablet or phablet). The device has plug & use sensor outlets on its four corners
- A smart microscope sensor is provided to them for plugging into the KISMET device
Post-conditions: Real-time disease diagnostics in rural areas
Normal Course of Events:
- Health workers go to rural areas for regular inspection
- They see a person with a high fever and suspect he has malaria.
- He needs immediate medication, but the health worker can't give medication without proper detection. She takes a blood sample.
- The health worker takes out her KISMET device
- She unclutches the camera from her device & places the microscope piece on the side
- She opens the microscope app & starts testing for the disease using 500x zoom, depth and optical focus
- Through her quick diagnostics, she confirms the person has malaria
- The person's medication is started in real time.
B.2.5.5 Addressed consumer need
- Sensor personalization based on needs
- Handy access to sensors
B.2.5.6 Other Applications
- Pollution tracking
- Home-based diagnostics
The given use case can be extended to make personal diagnostics common to all households. People would be able to conduct health tests at home and instantly share them with a doctor to get real-time feedback
B.2.5.7 Target audience
- Rural markets
- People requiring daily use of multiple sensors for testing, benchmarking and data collection, e.g. in environmental science, medicine, photography
B.2.5.8 Technology Considerations
- Distributed power and data flow model for quick plug and use of sensor packages
For the specific use case (UC-20):
- Embedding polymer optics in a portable sensor
- Optical and depth based focus
- Knob for changing magnification levels: 140x-2100x
- Multi-view modes – greyscale, inverse, emboss
- Illumination package based on the type of microscopy – light field, dark field, fluorescence etc.
- Panning, changing the 3d field of view
B.2.5.9 References
a. 50-cent microscope: https://www.youtube.com/watch?v=xUzZ_01N0eE
b. Portable microscope: https://www.youtube.com/watch?v=wPdCBebQEuY
B.2.8 Holo-Kinect
B.2.8.1 Illustration
B.2.8.2 Context
Holography & 3d projections have been of great interest to people as they bring a realism factor to the static world. Kinect enabled mobile devices can help take this field of exploration into the hands of consumers.
B.2.8.3 Feature Overview
- Holographic projection device casing: The flap case of the device is converted into a holographic projection display for 3d viewing and manipulation.
- Holographic prism accessories for the KISMET device: A holographic prism kept on top of the KISMET device can support Kinect gestures for smart object manipulation in 3d.
B.2.8.4 Use Case
Use Case ID: UC-21
Use Case Name: Designing an art piece for the house using the holographic display cover
Actor: Artist
Description: An artist is interested in designing and printing a 3d art piece for his house. He uses the holographic projection cover with the KISMET device to carve his piece and 3d print it with gestures
Pre-conditions:
- Availability of the holographic flap case with the artist
- Availability of an app for 3d gesture based designing with holographic viewing
Post-conditions: The artist 3d prints the art piece and keeps it at his house
Normal Course of Events:
- The artist inserts the KISMET device in the holographic case
- He opens the holographic projection app
- He imports a 3d design
- The figure is changed by 3d operations through finger gestures
- Happy with the final design, the artist sends it for 3d printing
B.2.8.5 Addressed consumer need
- Easy 3d interactions & manipulations
- Safety while driving: Many people turn to their cellphones to read/see while driving, which is not safe and increases the chance of accidents. This problem can be minimized by projecting the phone's display holographically on the car's front glass.
B.2.8.6 Other Applications
This can also be used in places where the person cannot always look at the smart device (phone, tablet or phablet), one example being while driving. An enhancement of this would help in providing creative 3d car dashboards.
B.2.8.7 Target audience
Multiple segments: 3d artists, designers, architects, young generations looking for cool interactions, car dealers etc.
B.2.8.8 Technology Considerations
- Near-field detection
- Extended list of finger based gestures
B.2.8.9 References
- Creative use on a car's dashboard (heads-up car display): https://www.youtube.com/watch?v=Qg8dq9FlX78
- Holho: http://www.holhocollection.com/
B.2.9 A fashion accessory
B.2.9.1 Illustration
B.2.9.2 Context
As technology options increase in number, people often choose the product with which they can make a style statement. Not many products meet that criterion.
B.2.9.3 Feature Overview
Separating the depth sensors & Kinect's RGB camera from the main mobile device to make a fashion accessory: One of the main requirements for Kinect to collect and utilize real-world data is having the sensors placed at visible locations. It would be difficult to always carry the phone in hand with the Kinect camera and sensors pointing at the environment. Thus, another approach to tackling this issue is to separate the Kinect capabilities from the mobile device and make Kinect wearable as a fashion accessory which can be flaunted over the dress. This would help in getting real-time data without you having to bother about carrying it around. Some options for a wearable Kinect include an earring, a fashion locket, a fashion belt, a stylish bag and a belt buckle. This can provide wireless input to the KISMET device for further manipulation.
B.2.9.4 Use Case
Use Case ID: UC-22
Use Case Name: Searching for Facebook friends at a party and informing on a smart watch
Actor: Model
Description: A model goes to a party. However, she hardly knows anyone there. She is wearing the depth sensor & RGB camera as her earring. She has synchronized the smart earring with her smart watch.
Pre-conditions: The model is alone at the party
Post-conditions: She finds mutual friends and hangs out with them
Normal Course of Events:
- The Kinect camera in the earring takes an image of the people around and, using depth sensing, crops the faces of individual people
- It then syncs with the KISMET device to share this data and search for mutual Facebook friends
- Based on facial matching algorithms, mutual friends are identified.
- This information is sent wirelessly to the smart watch, like "Hey! The person towards your right is Arjun's friend."
- This provides a conversation starter for the model to interact with people at the party.
B.2.9.5 Addressed consumer need
Making technology cool and fashionable to wear
B.2.9.6 Other Applications
- Style-changing bags: A bag's visual appearance can change between three styles – office, party or casual – making the same bag adaptive to different environments. For example, if there is a party happening around, the Kinect sensor detects it and changes the bag cover into an animated glowing-flower bag. This is done using LEDs or flexible OLED displays.
- Kinect enabled living bags: Smart pouches in the bag where the KISMET can be placed. These pouches will have end-points to sensors in the bag. They can also be solar charging bags. Other options can include weather
indicator sensors built into the bag providing real-time suggestions. Inputs are provided based on the changing environment outside or the things inside the bag.
- Next generation home shopping: Example – a person scans his feet. The depth and RGB cameras calculate the feet size automatically and, from among the choices available, provide the best fitting size of shoes online.
B.2.9.7 Target audience
The young generation (people who are 15-25 years old)
B.2.10 viSparsh – a haptic belt for the visually impaired
B.2.10.1 Illustration
B.2.10.2 Context
Worldwide, 285 million people are visually impaired, with 90% of them living in developing countries. Empowering such a huge population to lead a normal life through assistive technologies can help the world by utilizing their undiscovered potential. 'viSparsh' is a small step towards that empowerment.
B.2.10.3 Feature Overview
The focus of the viSparsh project is to create a haptic (touch-based feedback) waist belt for the blind, with a simple goal: that a blind person can avoid obstacles that are on and above ground height. The belt has an infrared optic sensor that projects thousands of dots over a 3-12 feet distance to determine what obstacles are present. Based on the proximity and relative direction of the obstacle to the wearer, a series of vibrators on the belt alert the wearer to the presence of obstacles around them. The system is lightweight, low-cost (~Rs 5000), easy to use and non-conspicuous, and also frees up the ears and hands of the blind person. It provides a 160-degree view of the environment and is complementary to the limited reach of the walking stick. It also does not require any additional infrastructure in the environment.
B.2.10.4 Use Case
Use Case ID: UC-23
Use Case Name: Guiding the visually impaired to walk on the road
Actor: Visually impaired person
Description: For ages, the visually impaired have walked using canes. However, canes are not very effective at identifying above-ground objects, as the cane searches for obstacles on the floor. Canes also make the disability much more visible, and people develop biases even before talking. This proves to be a reason for the visually impaired getting cut off from society. Thus, they need intuitive technologies, which is where viSparsh comes to the rescue.
Pre-conditions: The visually impaired person needs to cross the road
Post-conditions: He crosses the road successfully, avoiding on- and above-ground obstacles.
Normal Course of Events:
- The visually impaired person wears the viSparsh waist belt
- He starts walking on the road
- The belt detects obstacles on the front side (using the Microsoft Kinect sensor)
- The information is sent for processing
- Based on the analysis, the visually impaired person (wearing the belt) is informed about the distance and direction of the obstacle using vibrations on three inner sides of the belt (front, right and left)
- For example, if the obstacle is on the right side, only the right side of the belt vibrates, and the intensity of the vibrations increases as the obstacle comes closer, and vice-versa.
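The direction-and-intensity mapping described above can be sketched as follows. The working range and the three-zone split are illustrative assumptions, not the actual viSparsh firmware:

```python
def motor_levels(depth_row, max_mm=3600, min_mm=900):
    """Map a horizontal strip of depth readings onto (left, front, right)
    vibration intensities in 0..100.

    A closer obstacle gives a stronger vibration; the belt's 3-12 feet
    working range is approximated here as 900-3600 mm. The strip is split
    into three equal zones, one per vibrator side.
    """
    n = len(depth_row)
    zones = (depth_row[: n // 3],
             depth_row[n // 3 : 2 * n // 3],
             depth_row[2 * n // 3 :])
    levels = []
    for zone in zones:
        nearest = min(zone)
        if nearest > max_mm:
            levels.append(0)  # nothing within the working range
        else:
            nearest = max(nearest, min_mm)  # clamp very close readings
            levels.append(round((max_mm - nearest) / (max_mm - min_mm) * 100))
    return tuple(levels)  # (left, front, right)
```

With an obstacle near the right edge of the view, only the right element is driven, and its level rises as the obstacle approaches — matching the behaviour described in the use case.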
B.2.10.5 Addressed consumer need
Unavailability of a sophisticated tool for helping the visually impaired navigate
B.2.10.6 Other Applications
Converting the belt form factor to the size of a buckle which can be inserted into any belt to keep track of real-world information and provide it as required.
B.2.10.7 Target audience
The visually impaired
B.2.10.8 Technology Considerations
1. The weight and size of the belt are the biggest challenges that we are facing in this prototype.
2. Though we can design a new casing for Kinect by removing the black plastic casing and some of the unwanted components inside it, the size does not reduce drastically, and we also don't have enough expertise in giving it a product shape.
3. User experience: A user sitting down with the device on can also cause problems for the device. In addition, in some scenarios, such as for Indian women who wear a 'saree' (an Indian attire), a belt may not serve the right purpose.
4. The belt still does not work outdoors due to infrared interference from sunlight.
B.2.10.9 References
Project video: https://www.youtube.com/watch?v=KTmGZ6sElnU
CNN IBN news: https://www.youtube.com/watch?v=F2xGf-Cr6nI
Blog: http://visparsh.blogspot.in/
B.3 Extras
B.3.1 Kinect Green
B.3.1.1 Illustration
B.3.1.2 Context
The world needs to move towards more eco-friendly ways to support and promote technology. Using green technology is one of the ways. What if we could provide smart utilization of readily available energy sources – light, heat, vibrations, generation of power from body heat, or use of solar computing add-ons?
B.3.1.3 Feature Overview
Promoting the use of green and clean technology through add-ons, initiatives and partnerships to make a sustainable future:
a. Solar-power-enabled casing for providing power to Kinect – an option to power the KISMET device with solar panels
b. Building a real-time global awareness ecological chart – Since all KISMET device users will have Kinect power in their hands, they can become our change agents to track the green health of a place through smart sensors. This region-specific sensor data (like air pollution) is uploaded to the cloud to create a global awareness chart. This can be used by governments and organizations to make intelligent decisions about a sustainable future. Incentivizing people to use more green-technology ways can help enable this vision. One option for incentivizing could be 'Reuse, Reduce and Recycle to gain game points'.
B.3.1.4 Use Case
Use Case ID: UC-24
Use Case Name: Underwater pollution check
Actor: Student
Description: A specific range of IR light can help detect pollution in aquatic life
Pre-conditions: An underwater pollution check needs to be done
Post-conditions: The pollution level is checked, the data uploaded and game points earned
Normal Course of Events:
- The student goes underwater with his KISMET device
- He emits IR light and opens the pollution-check application on the device
- The pollution-check application reads the following parameters:
  . Pollution level in aquatic life
  . RGB and depth image of the scene
  . GPS co-ordinates of the location
- The student uploads this data to an online global awareness chart
- Post verification of the data, the student wins extra XBOX game points
B.3.1.5 Addressed consumer need
The need to move to a sustainable future
B.3.1.6 Other Applications
- Powering the KISMET device through solar panels
B.3.1.7 Target audience
Everyone
B.3.1.8 Technology Considerations
- Exploring green-technology ways of powering Kinect and the KISMET device
- Waterproof mobile device
- Embedding additional sensors in Kinect
B.3.1.9 References
Mid-range IR sensor for detecting pollution in aquatic life: http://www.mdpi.com/1424-8220/9/8/6232/pdf
B.3.2 Minority report design for a transparent world
B.3.2.1 Illustration
B.3.2.2 Context
Transparent glass screens are the next generation of displays. A lot of research is already going into actualizing smart mirror/glass concepts, as seen in Corning's video 'A Day Made of Glass'.
B.3.2.3 Feature Overview
Interactive Heads-Up Display (HUD) design for the transparent world: Often the first thing people do in the morning is look into their mirror or check their mobile for updates. What if we combined the two activities and built a new Minority Report-like design for the glass or mirror? This requires a Kinect application to integrate with Microsoft products like PowerPoint, Outlook, SkyDrive etc. The KISMET device can be wirelessly connected
to a projector projecting on the transparent screen (using a holographic projection sheet coated on the transparent screen). The KISMET device is placed right below the screen and tracks user gestures to show and open related content on the glass.
B.3.2.4 Use Case
Use Case ID: UC-25
Use Case Name: Getting morning updates on a smart mirror
Actor: User
Description: The user is in the habit of checking the day's calendar first thing in the morning while brushing his teeth. However, such multi-tasking involves some risks, like the mobile device getting wet or broken (as you are not fully conscious right after you wake up). Thus, he needs NUI based interaction to get the required updates.
Pre-conditions: The user needs morning updates
Post-conditions: He gets to check his day's calendar, emails and world news on the smart mirror
Normal Course of Events:
- The user wakes up and walks up to the sink (with a smart mirror on top)
- He puts the KISMET device in a stand just below the mirror
- KISMET automatically makes a wireless connection with the back-end projector projecting on the mirror and starts tracking gestures
- Once the connection is established, the application running on KISMET pulls up the calendar and shows it on the mirror in a simple-to-navigate view
- The user uses gestures to navigate the calendar, check emails or send a reply
B.3.2.5 Addressed consumer need
NUI based, real-world integration with Microsoft products
B.3.2.6 Other Applications
- Setting up interactive installations/3d art
B.3.2.7 Target audience
Everyone
B.3.2.8 Technology Requirements
Seamless wireless connection and data transfer between the mobile device and projector
B.3.2.9 References
Productivity Future Vision:
https://www.youtube.com/watch?v=t5X2PxtvMsU&list=PLdJTVYW_WV_RkaTk5SbTVyPZRKSOty84w
A Day Made of Glass 2: https://www.youtube.com/watch?v=X-GXO_urMow
A Day Made of Glass: https://www.youtube.com/watch?v=6Cf7IL_eZ38&feature=kp
Holographic sheet for displaying information on a transparent screen: http://en.wikipedia.org/wiki/Holographic_screen
DIY projection TV: http://www.practical-home-theater-guide.com/build-a-projection-tv.html
Phone into projector: http://content.photojojo.com/diy/turn-your-phone-into-a-photo-projector-for-1/
B.3.3 One Microsoft gesture story for home: New XBOX SmartGlass
B.3.3.1 Illustration
B.3.3.2 Context
The Microsoft Office Division (MOD) has a sub-team named 'CXE (Common Experience)' which looks after the shared experiences across all Microsoft Office products. The popular 'ribbon' features and functionalities are part of that and are standardized across MOD. As the future is carved out and people move towards augmented reality and gesture-driven technologies, a time will soon arrive when the entire Microsoft language ecosystem has to be represented with a gesture story. Thus, there is a need to define standardized NUI based experiences for this upcoming world and to truly depict the One Microsoft gesture story.
B.3.3.3 Feature Overview
Enablement of Microsoft products for the NUI world in the following two ways:
1. Shared experience NUI platform: Defining a standardized set of features (including gestures and audio commands) that are common to all Microsoft products, like air keyboard, air scroll, zoom in/out etc. This can be termed the 'Common Gesture Experience'
2. Product specific NUI experience: Each Microsoft product can define its own gestures for transitions etc., with a sign for each product, like 'x' for Xbox, 'b' for Bing and 'p' for PowerPoint.
Embedding Kinect's capabilities into the mobile device will in this way pave the way for next generation XBOX SmartGlass functionalities.
B.3.3.4 Use Case
Use Case ID: UC-26
Use Case Name: On-the-go productivity at home
Actor: Working Woman (WW)
Description: A woman has organized a party at home. While cooking, she needs to get some urgent office work signed off and sent to others on the team.
Pre-conditions: The working woman needs to reply to some emails while cooking
Post-conditions: She sends the emails using NUI on the KISMET device
Normal Course of Events:
- She is preparing food in the kitchen
- She has kept her KISMET device (Kinect-enabled mobile device) on the kitchen table on a kickstand
- She gets an email alert on her device, which she cannot pick up to read as her hands are occupied
- As there is a lot of noise around, with multiple people coming in, she cannot use Cortana to open, read and reply to the mail
- She uses an air-gesture password along with facial recognition to unlock the device
- Next she makes a two-full-circle revolution gesture to let the device know that she is starting a command gesture. This prevents accidental gesture tracking
- She now makes the alphabet 'E' gesture to open email, followed by a gesture to transfer the content to a bigger TV screen nearby
- She zooms in on the big screen with the next gesture and scrolls through the mail to read it
- Lastly she opens the air keyboard (again with a gesture), types the reply, "Approved", and makes a thumbs-up gesture to send the mail.

B.3.3.5 Addressed consumer need

Standardization and customization of gestures for use with all Microsoft products

B.3.3.6 Other Applications

Home automation to interact with the IoT world (lights, fans, TV) using shared gestures

B.3.3.7 Target audience

Working professionals
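The activation step in UC-26 (two full circles before any command gesture is honored) can be sketched as a small state machine. This is a hypothetical illustration, not a real Kinect SDK API: the gesture names, the command mapping, and the `GestureCommandRecognizer` class are all assumptions made for the sketch.

```python
# Hypothetical sketch of the 'Common Gesture Experience' gating described
# above: an activation gesture (two full circles) must be observed before
# any command gesture is acted on, preventing accidental triggering.
# Gesture names and commands are illustrative assumptions.

COMMON_GESTURES = {
    "letter_E": "open_email",
    "swipe_to_screen": "cast_to_tv",
    "pinch_out": "zoom_in",
    "thumbs_up": "send",
}

class GestureCommandRecognizer:
    ACTIVATION = "full_circle"
    ACTIVATION_COUNT = 2  # two full circles arm the recognizer

    def __init__(self):
        self._circles_seen = 0
        self._armed = False

    def feed(self, gesture):
        """Consume one detected gesture; return a command name or None."""
        if not self._armed:
            if gesture == self.ACTIVATION:
                self._circles_seen += 1
                if self._circles_seen >= self.ACTIVATION_COUNT:
                    self._armed = True
            else:
                self._circles_seen = 0  # any other gesture resets arming
            return None
        return COMMON_GESTURES.get(gesture)

recognizer = GestureCommandRecognizer()
stream = ["wave", "full_circle", "full_circle", "letter_E", "thumbs_up"]
commands = [c for g in stream if (c := recognizer.feed(g))]
print(commands)  # ['open_email', 'send'] - 'wave' before arming is ignored
```

In this sketch the stray "wave" gesture before the two circles produces no command, which is exactly the accidental-tracking protection the use case calls for.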
B.3.4 Kinect Personified

B.3.4.1 Illustration

B.3.4.2 Context

Everyone wants a personalized device. However, it often happens that, accidentally or intentionally, your device gets used by another person. Even if you are open to sharing the device, you may not be interested in sharing personal data such as messages, the phone directory or specific applications.

B.3.4.3 Feature Overview

Multiple personalization views for the same device: this is an enhanced version of the Kids Corner view launched in earlier Windows smartphone devices. The main difference is that instead of you having to manually switch on the Kids Corner section, the RGB camera does facial tracking, followed by a random audio test, to decide which view is appropriate for a particular user. This also covers scenarios where the device detects that someone behind you is peeping in to see the data on your device without your being aware of it. This is done through contextual depth mapping of not just who is in front but also who is behind and how they are behaving.
B.3.4.4 Use Case

Use Case ID: UC-27
Use Case Name: Privacy Protection from a stranger
Actors: Mobile User (MU) and a Stranger (S)
Description: A stranger is interested in getting some private and personal data of a user. He has somehow got hold of the user's password. His initial plan is to access the smart device once the user is not around.
Pre-conditions: A stranger intends to breach the user's data privacy
Post-conditions: The smart KISMET device does not allow data access to unauthorized users
Normal Course of Events:
- The mobile user goes to the washroom, leaving his KISMET device (mobile Kinect device) on his desk in a locked state
- Seeing the right opportunity, the stranger (S) quickly goes to access the user's device. Since S is smart, he also brings along a printed facial mask of the user to misguide the device in case facial detection is performed
- While doing the facial check with the depth camera, the device detects an inconsistency: the flat printed mask lacks the 3D facial contours of a real face
- To confirm the identity of the user, the device generates a random line and asks the stranger to repeat that line in his voice
- Unaware of this feature, S tries to mimic the original user's voice but fails to unlock the device
- Since the text is different every time, S cannot use a pre-recorded voice sample of the original user
Alternative Courses:
- Unable to get access to the required data, the stranger (S) thinks of peeping from behind while the user is accessing the data
- S stands behind the user's shoulder without the user's knowledge
- Standing at a distance on his toes, S tries to view the data from behind
- The depth camera in the KISMET device, constantly scanning the background, sees an additional face
- The depth camera studies the facial characteristics of S in the background and finds something suspicious
- A background app on the device matches the face against known relatives and finds no correlation
- The user is instantly informed about the probable privacy breach through a notification-bar message

B.3.4.5 Addressed consumer need

This eliminates a major cause of data being mishandled through use by another person. A simple example: your one-year-old kid unknowingly picks up your mobile device and accidentally deletes your phone book.

B.3.4.6 Target audience

Family
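The two anti-spoofing checks in UC-27 can be sketched under assumptions the document does not pin down: (1) a printed mask is nearly flat, so depth values across the face region vary far less than on a real 3D face, and (2) the spoken challenge line is chosen at random, so a pre-recorded sample cannot anticipate it. The flatness threshold and the phrase list below are made-up illustrative values, not measured figures.

```python
# Illustrative sketch of UC-27's liveness checks. Assumptions:
# a printed mask yields a near-flat depth map, while a real face shows
# millimetre-scale depth variation; the 5 mm threshold is invented.

import random
import statistics

FLATNESS_THRESHOLD_MM = 5.0  # assumed cutoff between mask and real face

def looks_like_real_face(face_depths_mm):
    """Reject near-flat depth samples such as a printed facial mask."""
    return statistics.pstdev(face_depths_mm) > FLATNESS_THRESHOLD_MM

def new_challenge_phrase(rng=random):
    """Pick a fresh line the user must speak; different on each attempt,
    so a pre-recorded voice sample cannot match."""
    phrases = [
        "the quick brown fox jumps over the lazy dog",
        "seven silver swans swam silently seaward",
        "pack my box with five dozen liquor jugs",
    ]
    return rng.choice(phrases)

# A flat printout (all depths near 600 mm) fails; a contoured face passes.
mask_depths = [600.0, 600.5, 599.8, 600.2]
face_depths = [600.0, 585.0, 620.0, 640.0, 575.0]
print(looks_like_real_face(mask_depths))  # False
print(looks_like_real_face(face_depths))  # True
```

A production system would of course use a full depth image and a trained liveness model rather than a variance cutoff, but the sketch captures why a flat mask fails the depth check in the scenario above.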
C. Appendix

Use Case 1: Creating obstacle detection DIY robot
Use Case 2: Buying shoes from online retail store
Use Case 3: Walking on the street with a digital friend named 'Kinect Cortana'
Use Case 4: Context aware KISMET soldering guide
Use Case 5: Learning alphabet writing through live stencils
Use Case 6: Sending eye reports to doctor & improving health
Use Case 7: Mood based music listening
Use Case 8: Hollywood style real-time CGI cinematography
Use Case 9: Preparing for party through real-time feedback from IoT world
Use Case 10: Teaching history lesson using 3d animated story telling framework
Use Case 11: Transfer of short family video clip to smart wallet
Use Case 12: Understanding monumental structures
Use Case 13: Preventing road accident
Use Case 14: 3d scanning of face
Use Case 15: Providing active health feedback to the mobile user
Use Case 16: Auto-transfer of whiteboard content to OneNote
Use Case 17: Remote feel of 3d object
Use Case 18: Re-designing the home with automated indoor 3d mapping platform
Use Case 19: Protecting device in rain
Use Case 20: Field diagnostics of diseases by rural health workers with integrated microscope
Use Case 21: Designing an art piece for the house using holographic display cover
Use Case 22: Searching for Facebook friends in the party and informing on smart watch
Use Case 23: Guiding visually impaired to walk on the road
Use Case 24: Underwater pollution check
Use Case 25: Getting morning updates on smart mirror
Use Case 26: On the go productivity at home
Use Case 27: Privacy Protection from a stranger