This presentation was on Empathic Mixed Reality, in which we applied Mixed Reality technology to Empathic Computing in our studies. We shared an overview of our research and selected findings. This talk was given at ETRI and KAIST in Daejeon, South Korea, on 24 May 2017.
9. Education:
2011-15 Ph.D. in Computer Science (University of Canterbury, New Zealand)
2006-08 M.Sc. in Computer Science (Asian Institute of Technology (AIT), Thailand)
2000-03 B.Sc. in Physics/Computer Science (University of Canterbury, New Zealand)
Research Experience:
Jun 2016-present Research Fellow (ECL, University of South Australia, Australia)
Summer 2014 Research Intern (MIC group, Microsoft Research, WA)
Summer 2013 Visiting Scholar (MxR Lab, Institute for Creative Technologies, USC, CA)
2011-14 Research Assistant (HIT Lab NZ, University of Canterbury)
Additional Experience:
2015-16 Unity Director (QuiverVision, Japan)
2009-10 Computer Forensic Specialist (Royal Thai Police, Thailand)
2005-09 Forensic Scientist (Royal Thai Police, Thailand)
Thammathip Piumsomboon, Ph.D.
13. Trends in Technology
Contents by Prof. Mark Billinghurst
https://medium.com/@marknb00/the-coming-age-of-empathic-computing-617caefc7016
14. Contents by Prof. Mark Billinghurst
https://medium.com/@marknb00/the-coming-age-of-empathic-computing-617caefc7016
Interaction Technology: a trend from explicit input toward implicit input via physiological sensing (e.g., Emotiv, Empatica).
15. Contents by Prof. Mark Billinghurst
https://medium.com/@marknb00/the-coming-age-of-empathic-computing-617caefc7016
Content Capture: a timeline (1850-2010) of capture technology, from 2D static photos through film, live video, panorama, and 360 video to 3D image/space capture (e.g., Matterport, Google Project Tango), with experiences becoming more immersive, live, and realistic over time.
16. Contents by Prof. Mark Billinghurst
https://medium.com/@marknb00/the-coming-age-of-empathic-computing-617caefc7016
Networking Speeds: network innovation (e.g., 5G) has carried communication from text to audio to video, and toward natural communication.
18. “Seeing with the Eyes of another,
Listening with the Ears of another,
and Feeling with the Heart of another.”
- Alfred Adler
What is empathy?
Contents by Prof. Mark Billinghurst
https://medium.com/@marknb00/the-coming-age-of-empathic-computing-617caefc7016
19. What is Empathic Computing?
Contents by Prof. Mark Billinghurst
https://medium.com/@marknb00/the-coming-age-of-empathic-computing-617caefc7016
21. How to achieve Empathic Computing?
1. Understanding: Systems that can understand your feelings and emotions
2. Experiencing: Systems that help you better experience the world of others
3. Sharing: Systems that help you better share the experience of others
- Prof. Mark Billinghurst
22. What is Mixed Reality?
Empathic Computing
Mixed Reality
23. Milgram and Kishino’s Mixed Reality on the Reality-Virtuality Continuum
P. Milgram and F. Kishino, "A taxonomy of mixed reality visual displays," IEICE Transactions on Information and Systems, vol. 77, pp. 1321-1329, 1994.
24. F. Steinicke, G. Bruder, K. Rothaus, and K. Hinrichs, "Poster: A virtual body for augmented virtuality by chroma-keying of egocentric videos," in IEEE Symposium on 3D User Interfaces (3DUI), 2009.
Microsoft HoloLens
26. Why apply MR to Empathic Computing?
1. Understanding: Systems that can understand your feelings and emotions
2. Experiencing: Systems that help you better experience the world of others
3. Sharing: Systems that help you better share the experience of others
- Prof. Mark Billinghurst
Sensors
VR
AR
27. Why apply MR to Empathic Computing?
The affordances of MR interfaces align well with the requirements of Empathic Computing:
1. MR naturally supports collaboration in 3D environments (real/virtual).
2. MR platforms are highly personalized, making it easy to capture personal data (via embedded sensors) and the user's environment (context).
3. Captured data can be shared with and experienced by a remote person in MR, enabling them to feel as if they are there.
28. 3. Through Heart and Eyes:
Sharing What You Feel and Interacting with What You See
29. 3. Through Heart and Eyes: Sharing What You Feel and Interacting with What You See
3.1 Sharing Where You Gaze
3.2 Sharing What You Feel
3.3 Interacting with What You See
3.4 Enhancing Your Collaboration
30. 3.1 Sharing Where You Gaze
3.2 Sharing What You Feel
3.3 Interacting with What You See
3.4 Enhancing Your Collaboration
32. T. Piumsomboon, A. Dey, B. Ens, G. Lee, and M. Billinghurst, "CoVAR: Mixed-Platform Remote Collaboration Between Augmented and Virtual Realities with Shared Collaboration Cues," under review.
By providing shared collaboration cues?
• Our collaboration cues
• CoVAR: System Overview
• Experimental Setup
• Variables
• Summary
41. 3.1 Sharing Where You Gaze
Summary
• Collaboration cues from field of view (FoV) and gaze are crucial for improving collaboration.
• The head-gaze condition (FoV + head-ray) was found the most useful, since head-gaze was the default interaction method even in the eye-gaze condition (to avoid a confounding factor). This exploits the implicit nature of gaze as a shared interaction and communication cue.
T. Piumsomboon, A. Dey, B. Ens, G. Lee, and M. Billinghurst, "CoVAR: Mixed-Platform Remote Collaboration Between Augmented and Virtual Realities with Shared Collaboration Cues," under review.
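To make the shared-cue idea concrete, here is a minimal sketch (not the CoVAR implementation; every name and value in it is hypothetical) of packaging a head-gaze ray and FoV cue each frame and sending it to a remote collaborator, who would render a ray plus frustum at the sender's pose:

```python
import json
import socket
from dataclasses import dataclass, asdict

@dataclass
class HeadGazeCue:
    """One frame of head-gaze data shared with a remote collaborator."""
    position: tuple      # head position in the shared world frame (x, y, z)
    forward: tuple       # unit vector along the head-gaze ray
    fov_degrees: float   # horizontal field of view, for rendering the FoV frustum

def send_cue(sock, addr, cue):
    """Broadcast the cue as JSON over UDP; the remote client draws a ray
    from `position` along `forward` and a frustum of `fov_degrees`."""
    sock.sendto(json.dumps(asdict(cue)).encode("utf-8"), addr)

# Example: an AR user's head pose captured this frame (illustrative values).
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
cue = HeadGazeCue(position=(0.0, 1.6, 0.0),
                  forward=(0.0, -0.1, 0.995),
                  fov_degrees=30.0)
send_cue(sock, ("192.0.2.1", 9000), cue)
```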
42. 3.2 Sharing What You Feel
3.3 Interacting with What You See
3.4 Enhancing Your Collaboration
3.1 Sharing Where You Gaze
44. By measuring and sharing physiological cues?
We know:
• VR can trigger emotional response
• Heart-rate can be an indicator of emotional response
• Sharing physiological feedback increases positive affect
A. Dey, T. Piumsomboon, Y. Lee, and M. Billinghurst, "Effects of Sharing Physiological States of Players in a Collaborative Virtual Reality Gameplay," in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17), Denver, Colorado, USA, 2017.
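As an illustration of what such a sharing pipeline involves, here is a minimal sketch (hypothetical, not the study's implementation) of smoothing a raw heart-rate stream and deriving a pulse period that a collaborator's view could map onto a visual cue, such as a pulsing heart icon:

```python
from collections import deque

class HeartRateSharer:
    """Smooth raw heart-rate samples and expose values that a remote
    collaborator's view can map onto a physiological visualization."""

    def __init__(self, window=5):
        self.samples = deque(maxlen=window)  # sliding window of recent BPM readings

    def add_sample(self, bpm):
        self.samples.append(bpm)

    def smoothed_bpm(self):
        # A moving average suppresses sensor noise before sharing.
        return sum(self.samples) / len(self.samples) if self.samples else None

    def pulse_period_seconds(self):
        # Duration of one heartbeat, usable to drive a pulsing visual cue.
        bpm = self.smoothed_bpm()
        return 60.0 / bpm if bpm else None

sharer = HeartRateSharer()
for reading in (72, 74, 71, 90, 88):   # raw sensor stream (illustrative values)
    sharer.add_sample(reading)
print(sharer.smoothed_bpm(), sharer.pulse_period_seconds())
```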
51. Data Collected
• Raw heart rate
• Positive and Negative Affect Schedule (PANAS)
• Subjective questionnaire (four-point Likert scale)
• Relative head orientation
3.2 Sharing What You Feel
Hypotheses
When heart-rate feedback is shown:
• Observers will feel more connected to the active player
• Observers will report more positive affect
• Collaborators will interact more
The scary game:
• Will trigger more subjective understanding of emotions
Participants
26 (13 in each group), 7 female
Age: M = 30.5, SD = 5.2
52. 3.2 Sharing What You Feel
Raw heart-rate
• No significant difference
• Slightly higher heart-rate in scary zombie game
53. 3.2 Sharing What You Feel
Positive and Negative Affect Schedule (PANAS)
• Significant effect of game
• The scary zombie game produced more positive and negative affect
• No significant effect of heart-rate visualization (p = .15)
54. Relative head orientation
• Significant effect of gaming experience
• The joyous game produced more aligned head orientations than the scary game
3.2 Sharing What You Feel
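For reference, relative head orientation can be computed as the angle between the two players' head-forward vectors; the sketch below is our own illustration of that measure, not the study's code:

```python
import math

def relative_head_orientation(forward_a, forward_b):
    """Angle in degrees between two users' head-forward unit vectors;
    0 means perfectly aligned head orientations."""
    dot = sum(a * b for a, b in zip(forward_a, forward_b))
    dot = max(-1.0, min(1.0, dot))  # guard against floating-point drift
    return math.degrees(math.acos(dot))

# Two collaborators looking in roughly the same direction (illustrative values).
print(relative_head_orientation((0.0, 0.0, 1.0), (0.1, 0.0, 0.995)))  # ~5.7 degrees
```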
55. Summary
• Game type had a significant effect on PANAS
• Heart-rate feedback showed promise of being effective
3.2 Sharing What You Feel
A. Dey, T. Piumsomboon, Y. Lee, and M. Billinghurst, "Effects of Sharing Physiological States of Players in a Collaborative Virtual Reality Gameplay," in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17), Denver, Colorado, USA, 2017.
60. T. Piumsomboon, G. Lee, R. W. Lindeman, and M. Billinghurst, "Exploring natural eye-gaze-based interaction for immersive virtual reality," in 2017 IEEE Symposium on 3D User Interfaces (3DUI), 2017, pp. 36-39.
a. Examples of eye-gaze + gestures interaction for Mixed Reality
b. Examples of natural eye-gaze-based interaction for immersive Virtual Reality
By using our natural inputs and designing around our natural behaviour?
61. a. Examples of eye-gaze + gestures interaction for Mixed Reality
64. b. Examples of natural eye-gaze-based interaction for immersive Virtual Reality
65. Overview of our eye-gaze-based interaction
• Duo-Reticles
• Radial Pursuit
• Nod and Roll
Initial Study
• Variables
• Results
3.3 Interacting with What You See
T. Piumsomboon, G. Lee, R. W. Lindeman, and M. Billinghurst, "Exploring natural eye-gaze-based interaction for immersive virtual reality," in 2017 IEEE Symposium on 3D User Interfaces (3DUI), 2017, pp. 36-39.
66. HTC Vive + Pupil Labs Eye Tracker
3.3 Interacting with What You See
67. Hardware Setup
• A laptop PC running our software on Unity version 5.4.1f1
• HTC Vive kit + a pair of Pupil Labs eye trackers with a binocular mount
• An iMac running Pupil Labs Capture software
3.3 Interacting with What You See
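For context on how gaze data flows from this setup into an application, Pupil Labs Capture streams gaze over a ZeroMQ/msgpack network interface. The sketch below subscribes to that stream, assuming Pupil Capture's default Pupil Remote port; treat the details as assumptions to check against your installed version:

```python
import zmq          # pip install pyzmq
import msgpack      # pip install msgpack

ctx = zmq.Context()

# Ask Pupil Remote (default port 50020) for the subscription port.
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")
req.send_string("SUB_PORT")
sub_port = req.recv_string()

# Subscribe to gaze datums on the announced port.
sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

for _ in range(10):  # read a few gaze samples
    topic, payload = sub.recv_multipart()
    gaze = msgpack.unpackb(payload, raw=False)
    # norm_pos: gaze point in normalized (0..1) scene-camera coordinates.
    print(gaze["norm_pos"], gaze["confidence"])
```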
68. 3.3 Interacting with What You See
Overview
Type of Eye Movement → Interaction Technique
Eye Saccade → Duo-Reticles
Smooth Pursuit → Radial Pursuit
Vestibulo-Ocular Reflex (VOR) → Nod and Roll
Vergence → None Tested
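To illustrate how these eye-movement types might be told apart at runtime, here is a rough velocity-based classifier. This is a sketch with assumed thresholds, not the paper's method; real classification is considerably more involved:

```python
def classify_eye_movement(eye_velocity, head_velocity,
                          saccade_threshold=300.0, pursuit_threshold=30.0):
    """Rough classifier over angular velocities in degrees/second.
    - Saccade: very fast eye movement (drives Duo-Reticles)
    - VOR: eye counter-rotates against a moving head (drives Nod and Roll)
    - Smooth pursuit: slow, steady eye movement (drives Radial Pursuit)
    """
    if abs(eye_velocity) >= saccade_threshold:
        return "saccade"
    if abs(head_velocity) > pursuit_threshold and eye_velocity * head_velocity < 0:
        return "vestibulo-ocular reflex"
    if 1.0 < abs(eye_velocity) <= pursuit_threshold:
        return "smooth pursuit"
    return "fixation"

# Head turning right while the eye rotates left to hold a target: VOR.
print(classify_eye_movement(eye_velocity=-40.0, head_velocity=45.0))
```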
71. 3.3 Interacting with What You See
Duo-Reticles (DR): an Inertial Reticle (IR) and a Real-time Reticle (RR).
(A-1) As RR and IR become aligned, the alignment timer counts down; (A-2, A-3) selection is completed.
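The countdown logic on this slide can be sketched as follows; the parameter values are our assumptions, not the paper's:

```python
class DuoReticles:
    """Duo-Reticles selection sketch: the Real-time Reticle (RR) follows
    gaze instantly, the Inertial Reticle (IR) lags behind; when they stay
    aligned, a countdown runs and selection completes at zero."""

    def __init__(self, align_radius=0.05, dwell_time=0.6, smoothing=0.1):
        self.align_radius = align_radius   # max RR-IR distance counted as aligned
        self.dwell_time = dwell_time       # seconds of alignment required
        self.smoothing = smoothing         # IR lerp factor per frame (its inertia)
        self.ir = (0.0, 0.0)               # Inertial Reticle position
        self.timer = dwell_time

    def update(self, rr, dt):
        """rr: current gaze point; dt: frame time. True when selection completes."""
        # IR drifts toward RR slowly, so it settles only when gaze is steady.
        self.ir = tuple(i + self.smoothing * (r - i) for i, r in zip(self.ir, rr))
        dist = sum((r - i) ** 2 for r, i in zip(rr, self.ir)) ** 0.5
        if dist <= self.align_radius:
            self.timer -= dt               # aligned: count down toward selection
            if self.timer <= 0:
                self.timer = self.dwell_time
                return True
        else:
            self.timer = self.dwell_time   # misaligned: reset the countdown
        return False

dr = DuoReticles()
for frame in range(120):                   # hold a steady gaze at one point
    if dr.update((0.2, 0.1), 1.0 / 60.0):
        print("selection completed")
        break
```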
76. Nod and Roll – Video 1
3.3 Interacting with What You See
77. 3.3 Interacting with What You See
Nod and Roll: a Head-gaze Reticle (HR) and a Real-time Reticle (RR) (panels C-1 to C-3).
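A minimal sketch of the underlying VOR-based detection follows; the thresholds are our assumptions, not the paper's values. During the vestibulo-ocular reflex, the eyes hold the target (so RR barely moves) while the head pitches down (so HR dips), which is what the detector looks for:

```python
class NodAndRoll:
    """Detect a VOR 'nod': head pitch changes beyond a threshold while the
    gaze point (Real-time Reticle) stays on the target."""

    def __init__(self, pitch_threshold=10.0, gaze_tolerance=0.03):
        self.pitch_threshold = pitch_threshold  # degrees of downward head pitch
        self.gaze_tolerance = gaze_tolerance    # max RR drift still counted as fixated
        self.start_pitch = None
        self.start_rr = None

    def update(self, head_pitch, rr):
        if self.start_pitch is None:
            self.start_pitch, self.start_rr = head_pitch, rr
            return False
        drift = sum((a - b) ** 2 for a, b in zip(rr, self.start_rr)) ** 0.5
        if drift > self.gaze_tolerance:
            # Gaze left the target: not VOR, restart tracking.
            self.start_pitch, self.start_rr = head_pitch, rr
            return False
        # Eyes fixed on the target while the head pitched down: a nod.
        return head_pitch - self.start_pitch <= -self.pitch_threshold

detector = NodAndRoll()
for pitch in (0.0, -3.0, -7.0, -12.0):      # head nodding down, gaze held steady
    if detector.update(pitch, rr=(0.20, 0.10)):
        print("nod detected: trigger interaction")
```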
78. Nod and Roll – Video 2
3.3 Interacting with What You See
79. 3.3 Interacting with What You See
Study Independent Variable: Interaction Technique
Part 1 Duo-Reticles vs Gaze-Dwell 1 (GD1)
Part 2 Radial Pursuit vs Gaze-Dwell 2 (GD2)
Part 3 Explorative
Initial Study
81. 3.3 Interacting with What You See
Study Dependent Variables
Part | Objective Measures | Subjective Measures
Part 1 (DR) | Task completion time, # errors | Usability ratings, semi-structured interview
Part 2 (RP) | Task completion time, # errors | Usability ratings, semi-structured interview
Part 3 (NR) | None | Usability ratings, semi-structured interview
82. 3.3 Interacting with What You See
Usability Ratings (medians on a 7-point scale; p-values compare GD1 vs DR in Part 1 and GD2 vs RP in Part 2)
Statement | GD1 | DR | p (Part 1) | GD2 | RP | p (Part 2) | NR
I preferred this technique | 5 | 5 | .02 | 5 | 5 | .12 | -
I felt tired using it | 5 | 4 | .09 | 5 | 5 | .03 | 4
It was frustrating to use | 5 | 5 | .14 | 5 | 5 | .30 | 3
It was fun to use | 5 | 5 | .14 | 5 | 6 | .07 | 6
I need to concentrate to use it | 6 | 6 | .33 | 6 | 6 | .09 | 5
It was easy for me to use | 6 | 6 | .07 | 6 | 6 | .07 | 5
I felt satisfied using it | 5 | 6 | .14 | 5 | 4 | .03 | 5
It felt natural to use | 3 | 2 | .17 | 2 | 3 | .07 | 4
I could interact precisely | 2 | 6 | .23 | 3 | 5 | .17 | 4
▪ No performance difference (as expected)
▪ Most participants preferred Duo-Reticles over Gaze-Dwell 1
▪ Radial Pursuit was more satisfying and less fatiguing than Gaze-Dwell 2
83. T. Piumsomboon, G. Lee, R. W. Lindeman, and M. Billinghurst, "Exploring natural eye-gaze-based interaction for immersive virtual reality," in 2017 IEEE Symposium on 3D User Interfaces (3DUI), 2017, pp. 36-39.
3.3 Interacting with What You See
Summary
• Three novel eye-gaze-based interaction techniques inspired by natural eye movements
• An initial study found positive results supporting our approach
▪ Similar performance to Gaze-Dwell, but a superior user experience
• We will continue to apply the same principles to improve the user experience of eye gaze in immersive VR
84. 3.4 Enhancing Your Collaboration
3.1 Sharing Where You Gaze
3.2 Sharing What You Feel
3.3 Interacting with What You See
86. T. Piumsomboon and M. Billinghurst, "CoVAR: Collaborative Virtual and Augmented Reality System for Remote Collaboration," ongoing research.
a. Examples of VR user body scaling
b. An example of a VR user snapping to the AR user's perspective
By utilizing the virtuality of the collaboration?
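As an illustration (hypothetical API, not CoVAR's code), both ideas reduce to simple transform operations on the VR user: body scaling changes the user's scale in the shared world, and perspective snapping copies the AR user's pose onto the VR user's viewpoint:

```python
from dataclasses import dataclass

@dataclass
class UserTransform:
    position: tuple   # (x, y, z) in the shared world frame
    rotation: tuple   # orientation, e.g., Euler angles in degrees
    scale: float      # 1.0 = life size

def scale_vr_user(user, factor):
    """Body scaling: shrink the VR user to a 'miniature' (factor < 1) or
    grow them to a 'giant' (factor > 1), changing their vantage point."""
    user.scale *= factor
    return user

def snap_to_perspective(vr_user, ar_user):
    """Perspective snapping: move the VR user's viewpoint onto the AR
    user's pose so both momentarily share the same view of the scene."""
    vr_user.position = ar_user.position
    vr_user.rotation = ar_user.rotation
    return vr_user

vr = UserTransform(position=(2.0, 1.6, 0.0), rotation=(0, 90, 0), scale=1.0)
ar = UserTransform(position=(0.0, 1.7, 1.0), rotation=(0, 180, 0), scale=1.0)
scale_vr_user(vr, 0.1)          # VR user becomes a 10%-scale "miniature"
snap_to_perspective(vr, ar)     # VR user now looks through the AR user's eyes
print(vr)
```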
98. Project: Visualization of Physiological Data
HAO CHEN
PhD Student
Investigating how to visualise physiological data for players so that they can perceive it more effectively. In particular, we are exploring multi-sensory (visual, audio, and haptic) visualization of physiological data. The goal of this project is to make VR experiences more empathetic and higher in presence.
Contact:
Arindam.Dey@unisa.edu.au