Spatial Gestures using a Tactile-Proprioceptive Display
Eelke Folmer & Tony Morelli - TEI’12, Kingston
Player-Game Interaction Lab
University of Nevada, Reno

Spatial Gestures in NUI’s

No Display / Unable to see

Non-Visual NUI’s

  item A
  item B
  item C

  visual impairment          mobile contexts

Limitations:
» no spatial gestures
» rely on visuospatial memory

Tactile-Proprioceptive display

Turn the human body into a display
Proprioception
» human ability to sense the orientation of limbs
» augment haptic feedback with proprioceptive information

Example

[Figure: vibration frequency plotted as a function of pointing error, with the error axis running toward 0]

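The example slide encodes a single pointing error as a vibration frequency. A minimal sketch of one plausible mapping, where vibration speeds up as the hand nears the target (the linear ramp, the 0-250 Hz range, and the 500 px maximum error are illustrative assumptions, not the paper's parameters):

```python
def error_to_frequency(error_px, max_error_px=500.0,
                       min_hz=0.0, max_hz=250.0):
    """Map a pointing error (in pixels) to a vibration frequency.

    Smaller error -> higher frequency, so the signal intensifies
    as the hand approaches the target. The linear ramp and the
    0-250 Hz range are illustrative assumptions.
    """
    error_px = min(max(error_px, 0.0), max_error_px)  # clamp to valid range
    closeness = 1.0 - error_px / max_error_px          # 1.0 on target
    return min_hz + closeness * (max_hz - min_hz)

# On target: maximum frequency; at or beyond max error: silent.
print(error_to_frequency(0))    # 250.0
print(error_to_frequency(500))  # 0.0
```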
Study 1: 2D target acquisition

             linear       multilinear
  Y error:   frequency    frequency
  X error:   band         pulse delay

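Study 1 compares two ways of packing a 2D error into one vibrotactile signal. A sketch of the multilinear scheme, where the Y error drives vibration frequency and the X error drives the delay between pulses (the 250 Hz ceiling and the 0.05-1.0 s delay range are illustrative assumptions):

```python
def multilinear_encoding(x_error_px, y_error_px, max_error_px=500.0):
    """Encode a 2D pointing error as one vibrotactile signal
    (multilinear scheme): Y error -> vibration frequency,
    X error -> delay between pulses. Numeric ranges are
    illustrative assumptions, not the study's parameters.
    """
    def closeness(error):
        error = min(abs(error), max_error_px)
        return 1.0 - error / max_error_px  # 1.0 on target, 0.0 at max error

    freq_hz = 250.0 * closeness(y_error_px)                # higher = closer in Y
    delay_s = 0.05 + 0.95 * (1.0 - closeness(x_error_px))  # shorter = closer in X
    return freq_hz, delay_s

# On target: fastest vibration and shortest pulse delay.
print(multilinear_encoding(0, 0))
print(multilinear_encoding(500, 500))
```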
Study 1: procedure & results

Space Invaders-like game
Between-subjects study with 16 subjects
Corrected search time (significant difference):
» linear: 51.7 ms/pixel
» multilinear: 40.3 ms/pixel
No significant difference in error

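Search times are reported per pixel of target distance rather than raw, so trials with near and far targets stay comparable. A sketch of how such a corrected metric could be computed (the trial format is a hypothetical reconstruction; the paper's exact correction may differ):

```python
import math

def corrected_search_time(trials):
    """Average search time per pixel of target distance.

    trials: list of (time_ms, (start_x, start_y), (target_x, target_y)).
    Dividing by straight-line distance normalizes away target
    placement. Hypothetical reconstruction of the metric.
    """
    per_pixel = []
    for time_ms, start, target in trials:
        dist = math.dist(start, target)  # Euclidean distance in pixels
        if dist > 0:
            per_pixel.append(time_ms / dist)
    return sum(per_pixel) / len(per_pixel)

print(corrected_search_time([(5000, (0, 0), (100, 0)),
                             (3000, (0, 0), (0, 100))]))  # 40.0
```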
Study 2: Spatial Gesture

Multilinear scanning
Correct gesture if:
» within 150 pixels of target
» Z-axis decrease of 20 cm
» less than 5% error in each axis of rotation
8 subjects (none participated in Study 1)
Corrected search time: 45.9 ms/pixel
Aiming accuracy: 21.4°

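The three acceptance criteria for a gesture in Study 2 translate directly into a predicate. A minimal sketch (representing per-axis rotation errors as fractions, e.g. 0.03 for 3%, is an assumption):

```python
def is_correct_gesture(target_error_px, z_drop_cm, rotation_errors):
    """Check the three acceptance criteria from Study 2:
    end within 150 px of the target, move at least 20 cm toward
    the sensor along Z, and keep rotation error under 5% on every
    axis. rotation_errors holds per-axis fractions (assumed form).
    """
    return (target_error_px <= 150
            and z_drop_cm >= 20
            and all(e < 0.05 for e in rotation_errors))

print(is_correct_gesture(120, 25, [0.01, 0.02, 0.04]))  # True
print(is_correct_gesture(200, 25, [0.01, 0.02, 0.04]))  # False: too far off target
```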
Potential Applications

» navigation
» low-cost motor rehabilitation
» exergames for users who are blind

Current/Future Work

[Figures: 3D scanning (direction of error found along the X and Y axes); extension of Fitts’s law]

3D target selection
Two-handed scanning
Model for non-visual pointing

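The future-work item on extending Fitts’s law refers to the standard pointing model; in its common Shannon formulation, the movement time T to acquire a target of width W at distance D is

```latex
T = a + b \log_2\!\left(\frac{D}{W} + 1\right)
```

where a and b are empirically fitted constants. This is shown only as the familiar baseline; the slides do not give the authors' extended non-visual form.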
props & questions

This research was supported by NSF Grant IIS-1118074.
Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.


Editor's Notes

  1. Hi, My name is Eelke Folmer and I'm here to present the work I did with my grad student Tony Morelli on using a tactile proprioceptive display to perform spatial gestures. \n\n
  2. Spatial gestures are an essential feature of natural user interfaces where a touch or gesture activates or manipulates an on screen object. For example, dragging a file into a folder. \nSpatial interaction relies upon being able to visually acquire the position of an object.\n
  3. Spatial gestures are an essential feature of natural user interfaces where a touch or gesture activates or manipulates an on screen object. For example, dragging a file into a folder. \nSpatial interaction relies upon being able to visually acquire the position of an object.\n
  4. Spatial gestures are an essential feature of natural user interfaces where a touch or gesture activates or manipulates an on screen object. For example, dragging a file into a folder. \nSpatial interaction relies upon being able to visually acquire the position of an object.\n
  5. Spatial gestures are an essential feature of natural user interfaces where a touch or gesture activates or manipulates an on screen object. For example, dragging a file into a folder. \nSpatial interaction relies upon being able to visually acquire the position of an object.\n
  6. Spatial gestures are an essential feature of natural user interfaces where a touch or gesture activates or manipulates an on screen object. For example, dragging a file into a folder. \nSpatial interaction relies upon being able to visually acquire the position of an object.\n
  7. Spatial gestures are an essential feature of natural user interfaces where a touch or gesture activates or manipulates an on screen object. For example, dragging a file into a folder. \nSpatial interaction relies upon being able to visually acquire the position of an object.\n
  8. So you can imagine this pretty difficult if you are unable see or if you don’t have a display. \n\n
  9. In recent years several non visual NUI’s have been developed. \n\n- For example, audio based interfaces have been developed that allow blind users to scroll lists and select items on a touch screen devices but those don’t use spatial gestures. \n\n- To increase input spaces on mobile devices, screenless mobile interfaces have been developed. \nUsers interact with imaginary objects or shortcuts that are defined in a plane in front of them. \nThough some of these techniques may allow spatial gestures, no spatial feedback is provided and the user must keep track of the locations of objects which may be hard if there are a large number of objects.\n\nTo address the shortcoming of existing non-visual NUI’s we present a novel ear and eye free display technique that allows you to acquire the location of an object in a 2D display defined in front of the user, which users can then manipulate using a spatial gesture. \n\n\n
  10. In recent years several non visual NUI’s have been developed. \n\n- For example, audio based interfaces have been developed that allow blind users to scroll lists and select items on a touch screen devices but those don’t use spatial gestures. \n\n- To increase input spaces on mobile devices, screenless mobile interfaces have been developed. \nUsers interact with imaginary objects or shortcuts that are defined in a plane in front of them. \nThough some of these techniques may allow spatial gestures, no spatial feedback is provided and the user must keep track of the locations of objects which may be hard if there are a large number of objects.\n\nTo address the shortcoming of existing non-visual NUI’s we present a novel ear and eye free display technique that allows you to acquire the location of an object in a 2D display defined in front of the user, which users can then manipulate using a spatial gesture. \n\n\n
  11. In recent years several non visual NUI’s have been developed. \n\n- For example, audio based interfaces have been developed that allow blind users to scroll lists and select items on a touch screen devices but those don’t use spatial gestures. \n\n- To increase input spaces on mobile devices, screenless mobile interfaces have been developed. \nUsers interact with imaginary objects or shortcuts that are defined in a plane in front of them. \nThough some of these techniques may allow spatial gestures, no spatial feedback is provided and the user must keep track of the locations of objects which may be hard if there are a large number of objects.\n\nTo address the shortcoming of existing non-visual NUI’s we present a novel ear and eye free display technique that allows you to acquire the location of an object in a 2D display defined in front of the user, which users can then manipulate using a spatial gesture. \n\n\n
  12. In recent years several non visual NUI’s have been developed. \n\n- For example, audio based interfaces have been developed that allow blind users to scroll lists and select items on a touch screen devices but those don’t use spatial gestures. \n\n- To increase input spaces on mobile devices, screenless mobile interfaces have been developed. \nUsers interact with imaginary objects or shortcuts that are defined in a plane in front of them. \nThough some of these techniques may allow spatial gestures, no spatial feedback is provided and the user must keep track of the locations of objects which may be hard if there are a large number of objects.\n\nTo address the shortcoming of existing non-visual NUI’s we present a novel ear and eye free display technique that allows you to acquire the location of an object in a 2D display defined in front of the user, which users can then manipulate using a spatial gesture. \n\n\n
  13. In recent years several non visual NUI’s have been developed. \n\n- For example, audio based interfaces have been developed that allow blind users to scroll lists and select items on a touch screen devices but those don’t use spatial gestures. \n\n- To increase input spaces on mobile devices, screenless mobile interfaces have been developed. \nUsers interact with imaginary objects or shortcuts that are defined in a plane in front of them. \nThough some of these techniques may allow spatial gestures, no spatial feedback is provided and the user must keep track of the locations of objects which may be hard if there are a large number of objects.\n\nTo address the shortcoming of existing non-visual NUI’s we present a novel ear and eye free display technique that allows you to acquire the location of an object in a 2D display defined in front of the user, which users can then manipulate using a spatial gesture. \n\n\n
  14. In recent years several non visual NUI’s have been developed. \n\n- For example, audio based interfaces have been developed that allow blind users to scroll lists and select items on a touch screen devices but those don’t use spatial gestures. \n\n- To increase input spaces on mobile devices, screenless mobile interfaces have been developed. \nUsers interact with imaginary objects or shortcuts that are defined in a plane in front of them. \nThough some of these techniques may allow spatial gestures, no spatial feedback is provided and the user must keep track of the locations of objects which may be hard if there are a large number of objects.\n\nTo address the shortcoming of existing non-visual NUI’s we present a novel ear and eye free display technique that allows you to acquire the location of an object in a 2D display defined in front of the user, which users can then manipulate using a spatial gesture. \n\n\n
20. Along the lines of recent work that turns the body into an input space, we explore turning the body into a display. But instead of using your body to communicate information to someone else, we communicate information to users through their own body. To do that, we use a largely unexplored output modality called proprioception: the human ability to sense the orientation of one's limbs, which allows you, for example, to touch your nose with your eyes closed.

Recent work by my lab and others shows that you can augment haptic feedback with proprioceptive information to create a significantly larger information space that can be accessed in an ear- and eye-free manner and used to point out targets around the user. Let me illustrate this with an example.
21. In previous work we created a bowling game for blind users.
Users find the location of the bowling pins by scanning their environment with a handheld, orientation-aware device that is capable of providing haptic feedback. Directional vibrotactile feedback, for example frequency, guides the user to point the device at the pins: the higher the frequency, the closer you are to the target. The target direction is thus conveyed to users through their own arm, which allows them to perform a gesture towards the target.

So some preliminary research on tactile-proprioceptive displays has been conducted, but it has only explored 1D target acquisition.
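The "higher frequency, closer to target" mapping can be sketched as a small function. The specific frequency range and angular field width below are illustrative assumptions; the talk only states that frequency increases as the device points closer to the target.

```python
# Sketch of directional vibrotactile feedback: map the angular error between
# where the device points and where the target is to a vibration frequency.
MIN_HZ, MAX_HZ = 10.0, 250.0   # assumed vibration frequency range
FIELD_DEG = 90.0               # assumed angular width of the scan field

def feedback_frequency(device_deg: float, target_deg: float) -> float:
    """Closer to the target -> higher vibration frequency."""
    error = min(abs(device_deg - target_deg), FIELD_DEG)
    closeness = 1.0 - error / FIELD_DEG   # 1.0 on target, 0.0 at the field edge
    return MIN_HZ + closeness * (MAX_HZ - MIN_HZ)
```

Pointing straight at the target yields the maximum frequency; sweeping away from it drops the frequency smoothly toward the minimum, which is what lets the user home in by scanning.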
33. In the first study we explore 2D target acquisition.

We implemented a tactile-proprioceptive display using the Sony Move controller, which can provide two different types of directional vibrotactile feedback and can be tracked by an external camera with very high accuracy.

The size of the display is constrained by the reach of your arm.

Two different scanning techniques were defined:
- In linear scanning, users first find a band defined around the target's X coordinate, inside which directional vibrotactile feedback is provided; the target's Y coordinate is then found using frequency.
- In multilinear scanning, directional vibrotactile feedback is provided on both axes simultaneously, using frequency and pulse delay.

Preliminary experience showed that multilinear scanning was much harder to perform, so it was of interest which technique would yield the best performance.
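The two scanning techniques can be contrasted in a short sketch. The band width, distance range, and frequency/delay ranges below are hypothetical parameters chosen for illustration, not values from the study; only the encoding scheme (band plus frequency for linear, frequency plus pulse delay for multilinear) comes from the talk.

```python
from dataclasses import dataclass

@dataclass
class Feedback:
    frequency_hz: float   # higher = closer on the encoded axis
    pulse_delay_s: float  # shorter = closer; 0.0 = continuous vibration

BAND_W = 50.0   # assumed half-width of the band around the target's X (pixels)
MAX_D = 500.0   # assumed maximum on-screen distance (pixels)

def closeness(d: float) -> float:
    """1.0 on target, falling to 0.0 at MAX_D away."""
    return max(0.0, 1.0 - min(abs(d), MAX_D) / MAX_D)

def linear(cursor, target) -> Feedback:
    """Linear scanning: vibrate only inside the band around the target's X;
    once inside, frequency encodes distance on Y."""
    dx, dy = cursor[0] - target[0], cursor[1] - target[1]
    if abs(dx) > BAND_W:                 # outside the band: no feedback at all
        return Feedback(0.0, 0.0)
    return Feedback(10.0 + 240.0 * closeness(dy), 0.0)

def multilinear(cursor, target) -> Feedback:
    """Multilinear scanning: both axes at once -- frequency encodes X,
    pulse delay encodes Y."""
    dx, dy = cursor[0] - target[0], cursor[1] - target[1]
    return Feedback(10.0 + 240.0 * closeness(dx),
                    1.0 * (1.0 - closeness(dy)))  # 1 s delay far away, 0 s on target
```

The sketch makes the trade-off visible: linear scanning gives one cue at a time but is silent until the band is found, while multilinear scanning always provides feedback on both axes at the cost of the user decoding two channels simultaneously.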
40. We conducted a between-subjects study with 16 CS students.
Subjects played an augmented-reality-like Space Invaders game in which they had to shoot 40 aliens.
Results showed a significant difference in search time corrected for distance, with multilinear scanning being significantly faster. No difference in errors was found.
42. The second study explored spatial interaction.

Subjects had to scan to the location of a balloon and pop it using a thrust gesture. A gesture was correct if it was within 150 pixels of the target and had less than 5% rotation error along each axis of the controller. A user study with 8 users found an aiming error of 21 degrees.
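The two correctness criteria (within 150 pixels of the target, under 5% rotation error per controller axis) translate directly into a validation check. This is a minimal sketch, assuming rotation error is expressed as a signed fraction per axis; the function and parameter names are hypothetical.

```python
import math

def gesture_hits(gesture_xy, target_xy, rot_err,
                 max_dist: float = 150.0, max_rot: float = 0.05) -> bool:
    """Check the study's two criteria: gesture endpoint within max_dist
    pixels of the target, and rotation error below max_rot on every axis."""
    dist = math.hypot(gesture_xy[0] - target_xy[0],
                      gesture_xy[1] - target_xy[1])
    return dist <= max_dist and all(abs(e) < max_rot for e in rot_err)
```

A gesture that lands close enough but twists the controller too far on any single axis still counts as a miss, which is why both conditions are combined with a logical AND.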
45. Potential applications of our technique include: navigation, where the Y coordinate can indicate the distance to a target; low-cost motor rehabilitation; and exercise games for users who are blind.
46. Current and future work focuses on extending this technique to 3D target selection.

We are further interested in whether this non-visual pointing task can somehow be modeled.
47.