1. Visuo-Haptic Augmented Reality
Runtime and Authoring Environment
for Medical Training
Ulrich Eck
Supervisor: Dr. Christian Sandor, University of South Australia
Co-Supervisor: Dr. Hamid Laga, University of South Australia
Associate Supervisor: Prof. Nassir Navab, TU Munich
14/02/2013, Research Proposal Presentation
Thank you for joining the presentation of my research proposal with the title:
Visuo-haptic Augmented Reality Runtime and Authoring Environment for Medical Training.
My supervisor is: Dr. Christian Sandor, my co-supervisor is: Dr. Hamid Laga,
and my associate supervisor is: Prof. Nassir Navab from TU Munich.
2. Visuo-Haptic Augmented Reality
Applications
Previous Work: Rapid Prototyping | Vision: Medical Training
[Figure: the User Interface Development System comprises an Authoring Environment and a Runtime Environment (Haptics, Augmented Reality, Simulation); left: a rapid prototyping demo from previous work; right: a mockup of medical training]
I'm starting my presentation with an overview on the proposed research project.
The movie on the left shows one example of my previous work on VHAR applications for rapid prototyping: the user paints on a virtual shoe with a haptic device.
We evaluated such a painting application with children as young as six years and found that the user interface is an intuitive way to interact with virtual objects.
My vision is to enable and motivate developers and researchers in the domain of medical procedures to use VHAR user interfaces for their training simulators.
A mockup of such a training scenario is shown on the right.
In order to enable developers to build their applications with VHAR user interfaces, a new User Interface Development System is needed.
It consists of a runtime environment, which provides the required functionality, and an authoring environment, which allows developers to create and modify content and behavior
interactively.
In order to motivate researchers to use this technology in the medical domain, I will design and build medical training application prototypes and evaluate them with domain experts.
So let's take a closer look at what VHAR is, at related work in this young field of research, and at the challenges of building applications with this user interface technology.
3. VHAR Properties
See and touch digital information:
embedded in the real world
Precisely co-located with haptic devices
Improved performance and realism for manual tasks
[P. Rhienmora et al., VR 2010]
The important properties of VHAR are:
- It enables users to see and touch digital information that is embedded in the real world.
- Haptic feedback and visual output are precisely co-located.
- Previous research has shown that VHAR improves user performance and realism for manual tasks.
Some VHAR systems have been developed as part of research projects during the last decade.
I picked two interesting examples ...
4. Related Work
[Sandor, C. et al., IEICE 2007]
In 2007, Sandor and colleagues presented a VHAR system that allows users to see and touch a virtual car, which is tracked via a real-world object.
5. Related Work
[Harders, M. et al., TVCG 2009]
In 2009, Harders and colleagues presented a prototype for a medical training simulator
with physics-based simulation of soft tissue cutting.
Building such systems is challenging ...
6. Challenges in VHAR
Accurate co-location of visual rendering and haptic
interaction
Precise calibration of every component and
complete system
Low latency, realtime operation
The visual rendering and haptic interaction need to be accurately co-located.
Precise calibration of every component and the complete system is a necessary precondition.
Furthermore, VHAR applications need to run in realtime with low latency.
But there is more ...
7. More Challenges ...
Model representation and transformation:
Model simplification (haptic rendering requires
simpler geometry than visual rendering)
Simulation of deformable bodies (mass-spring
systems, finite element method)
Most VHAR applications have been built using
shared-data, multi-threaded architectures, which is
difficult to get right
Virtual models are needed in multiple representations at the same time for haptic rendering, physics-based simulation, and visual rendering.
Deformations of virtual objects need to be synchronized between all representations in realtime.
Finally, VHAR applications have been built using shared-data, multi-threaded architectures, which is difficult to get right.
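To illustrate the mass-spring simulation mentioned on the slide, here is a minimal explicit-Euler integration step. This is a hypothetical sketch, not the proposed runtime's simulation engine; real simulators typically use more stable integrators such as implicit Euler.

```python
import math

def mass_spring_step(pos, vel, springs, k, mass, damping, dt):
    """One explicit-Euler step of a 3D mass-spring system (illustrative sketch).

    pos, vel : lists of [x, y, z] per particle
    springs  : list of (i, j, rest_length) tuples
    """
    forces = [[0.0, 0.0, 0.0] for _ in pos]
    for i, j, rest in springs:
        d = [pos[j][a] - pos[i][a] for a in range(3)]
        length = math.sqrt(sum(c * c for c in d))
        if length > 1e-9:
            # Hooke's law: force proportional to elongation, along the spring
            scale = k * (length - rest) / length
            for a in range(3):
                forces[i][a] += scale * d[a]
                forces[j][a] -= scale * d[a]
    for p in range(len(pos)):
        for a in range(3):
            acc = (forces[p][a] - damping * vel[p][a]) / mass
            vel[p][a] += acc * dt
            pos[p][a] += vel[p][a] * dt
    return pos, vel

# Two particles joined by a spring stretched past its rest length of 1.0:
pos = [[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]]
vel = [[0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
pos, vel = mass_spring_step(pos, vel, [(0, 1, 1.0)],
                            k=10.0, mass=1.0, damping=0.1, dt=0.01)
# The stretched spring pulls both particles toward each other.
```

Even this toy version touches every spring and particle per step; at simulation rates of 100-200 Hz and with thousands of elements, keeping all representations synchronized becomes the bottleneck described above.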
In order to simplify and promote the development of such user interfaces, I want to answer the following research questions:
8. Research Questions
Is it possible to design and implement a widely applicable
VHAR runtime?
What is a suitable VHAR application authoring environment for
the stakeholders: programmers, designers, usability
engineers, and users?
What are measurable benefits of applications with VHAR user
interfaces in general, and specifically for medical training
simulators?
Is it possible to design and implement a widely applicable VHAR runtime?
What is a suitable VHAR application authoring environment for the stakeholders: programmers, designers, usability engineers, and users?
What are measurable benefits of applications with VHAR user interfaces in general, and specifically for medical training simulators?
9. Approach Overview
[Figure: Applications (Medical, ...) are controlled by the Authoring Environment and executed by the Runtime Environment (Haptics, Augmented Reality, Simulation); together they form the User Interface Development System, which applications use]
In order to answer these questions, I will first develop a runtime environment, which can be used to execute VHAR applications.
An authoring environment, which uses the runtime, enables users to interactively create and modify content and behavior.
Finally, these applications will be evaluated to show measurable benefits in task performance.
Let's take a closer look at a VHAR system ...
10. VHAR System Decomposition
[Figure: VHAR system decomposition into Capture (video), Tracking (tracker), Simulation (world model, simulation engine), Haptic Rendering (haptic device, collision detection, force response), and Visual Rendering (graphics engine, control algorithms)]
[Eck, U., Honours Thesis 2012]
This simplified decomposition shows the main components of a typical VHAR application.
The haptic device sends sensor readings to the haptic rendering component and receives feedback forces.
The haptic rendering component determines collisions with virtual objects and calculates feedback forces based on the penetration depth.
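The penetration-based force computation can be sketched as a simple penalty model, shown below for a virtual sphere. This is an illustrative simplification with hypothetical names; real haptic renderers typically use proxy or god-object methods and run at 1 kHz.

```python
def penalty_force(tool_pos, sphere_center, sphere_radius, stiffness):
    """Feedback force for a haptic tool penetrating a virtual sphere.

    Returns a force pushing the tool out along the surface normal,
    proportional to penetration depth (F = k * d); zero outside the sphere.
    """
    d = [t - c for t, c in zip(tool_pos, sphere_center)]
    dist = sum(c * c for c in d) ** 0.5
    depth = sphere_radius - dist
    if depth <= 0.0 or dist < 1e-9:
        return [0.0, 0.0, 0.0]          # no contact
    normal = [c / dist for c in d]      # outward surface normal
    return [stiffness * depth * n for n in normal]

# Tool 2 mm inside a 10 mm sphere: force of roughly 1 N along +x.
f = penalty_force([0.008, 0.0, 0.0], [0.0, 0.0, 0.0], 0.010, stiffness=500.0)
```

The stiffness constant trades off contact realism against stability; at 1 kHz update rates, values that feel rigid can still be integrated stably.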
The virtual object’s behavior is simulated based on physical laws.
Cameras on the head-mounted display capture the environment, which is used as a background for the rendered objects and for tracking.
The user's viewpoint and the poses of other real objects are tracked in the video using fiducial markers and potentially fused with poses received from an external tracking system.
These poses are then used as input to the simulation engine and for visual rendering.
11. Simplify VHAR Development with a
User Interface Development System
Similar approach as early graphical user interface
development systems [Myers, B., IEEE Software 1989]
Dataflow process network architecture with support
for parallel execution [Lee, E., Proc. of IEEE 1995]
Runtime environment, that connects all components and
manages optimal scheduling of tasks on multi-core CPU
systems with multiple GPUs [Hermann et al., Euro-Par 2010]
Authoring environment, that enables developers to create and
modify content and behaviour at runtime [MacWilliams, 2005]
Building such applications is difficult.
I propose to create a UIDS for VHAR applications, similar to early research in graphical user interfaces.
As shown by Myers in 1989, a UIDS can simplify the development of GUIs by providing appropriate communication patterns and clean APIs to developers.
The dominant communication pattern in VHAR applications is streams with varying update rates and throughput.
A suitable architecture for a set of processing nodes connected via streams is the dataflow process network architecture.
The dataflow architecture, which requires side-effect free processing components, decouples algorithms from communication and schedules.
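A minimal flavour of such a dataflow process network, with side-effect-free nodes connected by queues and executed on worker threads, could look like this toy sketch (illustrative only, not the proposed runtime):

```python
import queue
import threading

class Node(threading.Thread):
    """A dataflow node: repeatedly pulls a token from its input queue,
    applies a pure function, and pushes the result downstream."""

    def __init__(self, func, inbox, outbox):
        super().__init__(daemon=True)
        self.func, self.inbox, self.outbox = func, inbox, outbox

    def run(self):
        while True:
            token = self.inbox.get()
            if token is None:           # sentinel: shut down and forward it
                self.outbox.put(None)
                return
            self.outbox.put(self.func(token))

# Pipeline: source -> double -> increment -> sink
q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
Node(lambda x: x * 2, q1, q2).start()
Node(lambda x: x + 1, q2, q3).start()

for token in [1, 2, 3]:
    q1.put(token)
q1.put(None)

results = []
while (t := q3.get()) is not None:
    results.append(t)
# results == [3, 5, 7]
```

Because each node only reads its inputs and writes its outputs, nodes can be scheduled on any thread or core without shared-state races, which is exactly the property the dataflow architecture exploits.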
The runtime environment implements the dataflow network and provides default implementations for all required components.
The authoring environment will use the flexible runtime to provide an interactive development system for creating and modifying content and behavior.
Let's take a closer look at the concurrency of VHAR systems ...
12. Concurrency of VHAR Subsystems
[Figure: continuous-time devices coupled to discrete-time subsystems: external tracker → sensor fusion (100 Hz); camera → tracking/computer vision (30 Hz); human ↔ haptic device ↔ haptic rendering (1 kHz) ↔ simulation (100-200 Hz); display ← visual rendering (60 Hz)]
Parallel scheduling of haptic rendering, simulation, sensor fusion, visual rendering, and computer vision
Distribution of workload on multi-core CPUs and multiple GPUs
Efficient data exchange between concurrent subsystems
Meet latency requirements for realtime operation
The diagram shows physical devices which provide an interface to the real world, from analog to digital as well as from continuous time to discrete time.
All devices operate at different update rates and are normally not synchronized.
The update rates vary from 30Hz to 1kHz and the packet sizes range from small pose updates to large image buffers or geometric models.
In order to achieve maximum performance, all processing needs to be scheduled optimally for execution on all available CPU cores and GPUs.
Furthermore, the cost of communication between components needs to be taken into account, to achieve minimum latency and realtime operation.
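One common pattern for cheap data exchange between subsystems that run at different rates is a "latest value" mailbox, where a fast consumer never blocks on a slow producer. The sketch below is a toy illustration of the idea, not the proposed implementation:

```python
import threading

class LatestValue:
    """Single-slot mailbox for subsystems running at different rates.

    Writers overwrite the slot; readers always get the most recent
    value without blocking, so a 1 kHz haptic loop is never stalled
    by a 30 Hz tracking loop.
    """

    def __init__(self, initial=None):
        self._value = initial
        self._lock = threading.Lock()

    def write(self, value):
        with self._lock:
            self._value = value

    def read(self):
        with self._lock:
            return self._value

pose = LatestValue(initial=(0.0, 0.0, 0.0))
pose.write((0.1, 0.0, 0.2))   # e.g. tracking thread, 30 Hz
p = pose.read()               # e.g. haptic thread, 1 kHz
```

The trade-off is that intermediate values may be skipped; for pose streams that is usually acceptable, whereas event streams need a queue instead.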
A unique feature of the runtime environment will be the dynamic optimization of execution schedules.
13. Self Optimizing Dataflow Network
[Figure: a dataflow specification with static inputs (timing requirements: deadlines, latency; quality requirements: jitter, error; execution requirements: CPU, GPU, ...) feeds a scheduling algorithm that iteratively maps the dataflow network optimally to available resources (CPU1, GPU1, CPU2) via intra-process and inter-process connectors; the runtime environment reports dynamic inputs back (per node: processing time and total error; per edge: cost of communication)]
This diagram shows the dynamic schedule optimization for dataflow process networks.
The dataflow graph connects processing nodes and provides information on static requirements such as timing, quality attributes, or execution context.
A scheduling algorithm segments the graph into a partition for every available CPU core and GPU.
At runtime, the dataflow runtime provides dynamic information about actual processing times, accumulated errors, and communication costs.
This information is then used to iteratively optimize the graph segmentation until an optimal solution has been found.
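A greedy flavour of such a mapping, using measured node costs and edge communication costs, might be sketched as follows. All names and numbers are hypothetical, and a real scheduler would iterate with fresh runtime measurements rather than making a single pass:

```python
def assign_partitions(nodes, edges, resources):
    """Greedily map dataflow nodes to compute resources.

    nodes:     {name: measured processing time}
    edges:     {(src, dst): communication cost paid when src and dst
                land on different resources}
    resources: list of resource names (CPU cores, GPUs)

    Each node goes to the resource minimizing that resource's load plus
    the cross-resource communication the placement introduces.
    """
    load = {r: 0.0 for r in resources}
    placement = {}
    # Place expensive nodes first so they spread across resources.
    for name in sorted(nodes, key=nodes.get, reverse=True):
        def cost(r):
            comm = sum(c for (a, b), c in edges.items()
                       if name in (a, b)
                       and placement.get(b if a == name else a, r) != r)
            return load[r] + nodes[name] + comm
        best = min(resources, key=cost)
        placement[name] = best
        load[best] += nodes[name]
    return placement

nodes = {"capture": 5.0, "tracking": 8.0, "haptics": 9.0, "render": 7.0}
edges = {("capture", "tracking"): 2.0, ("tracking", "render"): 1.0,
         ("haptics", "render"): 1.5}
placement = assign_partitions(nodes, edges, ["cpu0", "cpu1"])
```

An iterative optimizer would then repeatedly move single nodes between partitions, accept moves that lower the combined load and communication cost, and stop when no move improves the schedule.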
14. Authoring Environment
Interactive creation and modification of content and behavior
Support for development at runtime [MacWilliams, Thesis 2005]
Live code editor with just-in-time compiler [Victor, B., Cusec 2012; Storer, J., Projucer 2012]
The proposed authoring environment allows developers to create and modify content and behavior interactively.
This can be done in two ways:
- Using the development at runtime process as presented by MacWilliams in 2005
- And via a live coding environment as demonstrated by Bret Victor in 2012
A short video for both approaches follows..
15. Development at Runtime
[Sandor et al., ISMAR 2005]
This clip shows a system developed by Sandor and colleagues in 2005, where users can define the behavior of interaction components at runtime.
16. Live Coding
[Victor, B., Cusec 2012]
This clip shows a live code editor developed by Bret Victor in 2012.
I'm planning to create such an environment either for C++ using the LLVM/Clang compiler suite, or using a JIT-compiled language such as Racket, Clojure, Julia, or PyPy.
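For the interpreted-language option, the core of a crude hot-reload mechanism can be built from the standard library alone, as in the sketch below. This is a hypothetical illustration; a real live coding editor recompiles incrementally and preserves application state across edits.

```python
import importlib
import os

def reload_if_changed(module, last_mtime):
    """Reload `module` if its source file changed; return (module, mtime).

    Called periodically from the application's main loop, this swaps in
    freshly edited behavior without restarting the process.
    """
    mtime = os.path.getmtime(module.__file__)
    if mtime != last_mtime:
        module = importlib.reload(module)   # re-executes the module body
    return module, mtime

# Usage inside a render/update loop (hypothetical module name `behavior`):
#   behavior, stamp = reload_if_changed(behavior, stamp)
```

The hard part that this sketch omits, and that the LLVM/Clang route shares, is migrating live objects to the newly loaded definitions without glitches in the running haptic and visual loops.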
Once such a UIDS for VHAR exists, many applications can be built ...
17. Applications for VHAR
Medical procedures [Coles, T., PhD Thesis 2011]
Training [Knoerlein, B., PhD Thesis 2011]
Rapid prototyping [Sandor et al., IEICE 2007]
[Eck, Honours Thesis 2012]
many applications with haptic interaction
VHAR user interfaces can improve the user’s experience and performance in many domains, but this has not been studied extensively.
As previously shown, medical procedures, training, and rapid prototyping are good candidates.
Many haptic-enabled applications can benefit from VHAR user interfaces.
Although there are many options, I will focus on medical training scenarios during my research project.
18. State of the Art for Medical Simulation
[Ullrich and Kuhlen, VGC 2012]
This clip shows a typical setup of a VR-based medical training application.
In this demo, users can practice palpation and needle insertion using two haptic devices.
The visual output is presented on a 3D screen, but the haptic interaction is not co-located with the visual appearance.
I think the user interface should be improved ...
19. Improve User Interfaces with VHAR
[Figure: state of the art (left) vs. my vision (right)]
[Ullrich and Kuhlen, VGC 2012]
As seen before, the haptic interaction and visual rendering are not co-located in current state of the art medical simulators.
I propose the use of VHAR user interfaces for medical procedures ...
20. VHAR can Improve Medical Training
Benefits:
Reduced cognitive load
Improved realism
Greater flexibility than mockup based simulators
Problem:
Formal evaluation of benefits is missing
because they can reduce cognitive load, improve realism, and provide greater flexibility than mockup based simulators.
But, in order to successfully deploy VHAR-enabled medical simulators, their benefits need to be formally evaluated.
21. Show Benefits of VHAR User Interfaces
Develop medical training prototypes using UIDS
Collaboration with TU Munich and German Space
Agency DLR
Evaluation with medical experts
Within-subject user study comparing task
performance to haptic-enabled VR training
Medical training simulators are complex to build and need to be evaluated with medical experts.
Therefore, we have set up collaborations with TU Munich and the German Space Agency DLR.
This collaboration will help me
- to define appropriate scenarios for the evaluation of VHAR user interfaces in medical training,
- to build them using best-of-breed components for tracking, sensor fusion, collision detection, and haptic rendering,
- and to evaluate them with domain experts in hospitals in Munich.
I plan to perform two within-subject user studies comparing task performance of VHAR-enabled simulators with haptic-enabled VR simulators.
I’m summarizing the expected research contributions as follows ..
22. Expected Contributions
Creation of the first widely applicable dataflow kernel for VHAR
applications and a reusable and extensible VHAR runtime
environment
The design and prototype implementation of the first
integrated development and authoring environment for VHAR,
which supports the development at runtime process
Show measurable benefits of VHAR user interfaces through
evaluation of VHAR-enabled applications with a focus on
medical training
The creation of the first widely applicable dataflow kernel for VHAR applications and a reusable and extensible VHAR runtime environment
The design and prototype implementation of the first integrated development and authoring environment for VHAR, which supports the development at runtime process.
To show measurable benefits of VHAR user interfaces through evaluation of VHAR-enabled applications with a focus on medical training.
23. Research Plan and Collaborations
[Figure: research plan and collaborations with DLR, MVL, and TUM. 2013: VHAR runtime prototype, integrating collision detection and haptics with rigid bodies (DLR) and tracking with sensor fusion (TUM); torque/force visualisations with a TorqueViz evaluation; medical VHAR requirements specified in consultation; authoring environment prototype. 2014: medical VHAR prototype 1 built in collaboration and evaluated; collisions and haptics with deformable bodies; improved authoring and runtime environments. 2015: medical VHAR prototype 2 built in collaboration and evaluated]
An approximate timeline and an overview of our collaborations with TUM and DLR are shown in this diagram.
24. References
Selected References (14 out of 128):
Coles, T.R., 2011. Investigating Augmented Reality Visuo-Haptic Techniques for Medical Training. Wales: Bangor University.
Eck, U., 2011. HARP: A Framework for Visuo-Haptic Augmented Reality Research Projects. Adelaide: University of
South Australia.
Harders, M. et al., 2009. Calibration, Registration, and Synchronization for High Precision Augmented Reality Haptics.
IEEE Transactions on Visualization and Computer Graphics, 15(1), pp.138–149.
Hermann, E. et al., 2010. Multi-GPU and multi-CPU parallelization for interactive physics simulations. Euro-Par 2010-
Parallel Processing, pp.235–246.
Lee, E.A. & Parks, T.M., 1995. Dataflow Process Networks. Proceedings of the IEEE, 83(5), pp.773–801.
MacWilliams, A., 2005. A Decentralized Adaptive Architecture for Ubiquitous Augmented Reality Systems. Technische
Universität München.
Myers, B.A., 1989. User-Interface Tools: Introduction and Survey. IEEE Software, 6(1), pp.15–23.
Rhienmora, P. et al., 2010. Augmented Reality Haptics System for Dental Surgical Skills Training. In Proceedings of
the 17th ACM Symposium on Virtual Reality Software and Technology. Hong Kong: ACM, pp. 97–98.
Sandor, C. et al., 2005. Immersive Mixed-Reality Configuration of Hybrid User Interfaces. In Proceedings of the IEEE and ACM International Symposium on Mixed and Augmented Reality. Vienna, Austria, pp. 110–113.
Sandor, C. et al., 2007. Exploring Visuo-Haptic Mixed Reality. IEICE.
Sandor, C., 2010. Talk at TEDxAdelaide: The Ultimate Display. Last accessed on 20 November 2012.
Storer, J., 2012. Projucer Demo. youtu.be. Available at: http://youtu.be/imkVkRg-geI [Accessed December 6, 2012].
Ullrich, S. & Kuhlen, T., 2012. Haptic Palpation for Medical Simulation in Virtual Environments. Visualization and
Computer Graphics, pp.1–9.
Victor, B., 2012. Inventing on Principle. worrydream.com. Available at: http://worrydream.com/#!/InventingOnPrinciple
[Accessed February 6, 2013].
These are the references used in this presentation - 14 out of 128 citations in my research proposal.
25. Applications
[Figure: summary slide repeating the overview: previous work on rapid prototyping, the vision of medical training, and the User Interface Development System with its Authoring Environment and Runtime Environment (Haptics, Augmented Reality, Simulation)]
Questions?
Thank You!
Expected Contributions:
Creation of the first widely applicable dataflow kernel for VHAR applications and a reusable and extensible VHAR
runtime environment
The design and prototype implementation of the first integrated development and authoring environment for VHAR,
which supports the development at runtime process
Show measurable benefits of VHAR user interfaces through evaluation of VHAR-enabled applications with a focus
on medical training
Thank you for listening. Any questions?