This document summarizes AR interaction techniques, including 2D and 3D interaction tasks, AR interfaces as data browsers, 3D AR interfaces, augmented surfaces and tangible interfaces, and architecture for AR interaction design. Some key points discussed are:
- 2D interaction tasks include selection, text entry, quantification, and positioning, while 3D tasks include navigation, selection, manipulation, and system control.
- AR interfaces can function as data browsers by registering virtual objects to the real world for manipulation and browsing of information.
- Tangible interfaces use physical objects as controls for virtual objects to support collaboration and establish shared meaning.
- Designing AR systems requires considering both physical and virtual interface components, display elements, and interaction metaphors.
3. AR Interaction
Designing AR System = Interface Design
Using different input and output technologies
Objective is a high quality of user experience
Ease of use and learning
Performance and satisfaction
4. Interaction Tasks
2D (from [Foley]):
Selection, Text Entry, Quantify, Position
3D (from [Bowman]):
Navigation (Travel/Wayfinding)
Selection
Manipulation
System Control/Data Input
AR: 2D + 3D Tasks and.. more specific tasks?
[Foley] Foley, J. D., V. Wallace & P. Chan. "The Human Factors of Computer Graphics Interaction Techniques." IEEE Computer Graphics and Applications (Nov.): 13-48, 1984.
[Bowman] Bowman, D., E. Kruijff, J. LaViola, I. Poupyrev. 3D User Interfaces: Theory and Practice. Addison Wesley, 2005.
5. AR Interfaces as Data Browsers
2D/3D virtual objects are
registered in 3D
“VR in Real World”
Interaction
2D/3D virtual viewpoint control
Applications
Visualization, training
6. AR Information Browsers
Information is registered to
real-world context
Hand held AR displays
Interaction
Manipulation of a window
into information space
Applications
Context-aware information displays
Rekimoto, et al. 1997
8. Current AR Information Browsers
Mobile AR
GPS + compass
Many Applications
Layar
Wikitude
Acrossair
PressLite
Yelp
AR Car Finder
…
9. Junaio
AR Browser from Metaio
http://www.junaio.com/
AR browsing
GPS + compass
2D/3D object placement
Photos/live video
Community viewing
11. Advantages and Disadvantages
Important class of AR interfaces
Wearable computers
AR simulation, training
Limited interactivity
Modification of virtual
content is difficult
Rekimoto, et al. 1997
12. 3D AR Interfaces
Virtual objects displayed in 3D
physical space and manipulated
HMDs and 6DOF head-tracking
6DOF hand trackers for input
Interaction
Viewpoint control
Traditional 3D user interface
interaction: manipulation, selection, etc.
Kiyokawa, et al. 2000
15. Advantages and Disadvantages
Important class of AR interfaces
Entertainment, design, training
Advantages
User can interact with 3D virtual
objects anywhere in space
Natural, familiar interaction
Disadvantages
Usually no tactile feedback
User has to use different devices for
virtual and physical objects
Oshima, et al. 2000
16. Augmented Surfaces and
Tangible Interfaces
Basic principles
Virtual objects are
projected on a surface
Physical objects are used
as controls for virtual objects
Support for collaboration
18. Tangible User Interfaces (Ishii 97)
Create digital shadows
for physical objects
Foreground
graspable UI
Background
ambient interfaces
19. Tangible Interfaces - Ambient
Dangling String
Jeremijenko 1995
Ambient ethernet monitor
Relies on peripheral cues
Ambient Fixtures
Dahley, Wisneski, Ishii 1998
Use natural material qualities
for information display
25. Lessons from Tangible Interfaces
Physical objects make us smart
Norman’s “Things that Make Us Smart”
encode affordances, constraints
Objects aid collaboration
establish shared meaning
Objects increase understanding
serve as cognitive artifacts
26. TUI Limitations
Difficult to change object properties
can’t tell state of digital data
Limited display capabilities
projection screen = 2D
dependent on physical display surface
Separation between object and display
ARgroove
27. Advantages and Disadvantages
Advantages
Natural - users hands are used for interacting
with both virtual and real objects.
- No need for special purpose input devices
Disadvantages
Spatial gap
- Interaction is limited only to 2D surface
• Full 3D interaction and manipulation is difficult
- Separation between interaction object and display
29. Back to the Real World
AR overcomes limitation of TUIs
enhance display possibilities
merge task/display space
provide public and private views
TUI + AR = Tangible AR
Apply TUI methods to AR interface design
30. Space vs. Time - Multiplexed
Space-multiplexed
Many devices each with one function
- Quicker to use, more intuitive, clutter
- Real Toolbox
Time-multiplexed
One device with many functions
- Space efficient
- mouse
31. Tangible AR: Tiles (Space Multiplexed)
Tile semantics
data tiles
operation tiles
Operation on tiles
proximity
spatial arrangements
space-multiplexed
34. Object Based Interaction: MagicCup
Intuitive Virtual Object Manipulation
on a Table-Top Workspace
Time multiplexed
Multiple Markers
- Robust Tracking
Tangible User Interface
- Intuitive Manipulation
Stereo Display
- Good Presence
37. Tangible AR: Time-multiplexed Interaction
Use of natural physical object manipulations to
control virtual objects
VOMAR Demo
Catalog book:
- Turn over the page
Paddle operation:
- Push, shake, incline, hit, scoop
39. Advantages and Disadvantages
Advantages
Natural interaction with virtual and physical tools
- No need for special purpose input devices
Spatial interaction with virtual objects
- 3D manipulation with virtual objects anywhere in physical
space
Disadvantages
Requires Head Mounted Display
40. Wrap-up
Browsing Interfaces
simple (conceptually!), unobtrusive
(conceptually!)
3D AR Interfaces
expressive, creative, require attention
Tangible Interfaces
Embedded into conventional environments
Tangible AR
Combines TUI input + AR display
42. Interface Design Path
1/ Prototype Demonstration
2/ Adoption of Interaction Techniques from other
interface metaphors (Augmented Reality)
3/ Development of new interface metaphors
appropriate to the medium (Virtual Reality)
4/ Development of formal theoretical models for
predicting and modeling user actions (Desktop WIMP)
43. AR Design Space
Reality <-> Augmented Reality <-> Virtual Reality
Physical Design <-> Virtual Design
44. AR is a mixture of physical affordance and
virtual affordance
Physical
Tangible controllers and objects
Virtual
Virtual graphics and audio
45. AR Design Principles
Interface Components
Physical components
Display elements
- Visual/audio
Interaction metaphors
Physical Elements (Input) -> Interaction Metaphor -> Display Elements (Output)
46. Tangible AR Metaphor
AR overcomes limitation of TUIs
enhance display possibilities
merge task/display space
provide public and private views
TUI + AR = Tangible AR
Apply TUI methods to AR interface design
47. Tangible AR Design Principles
Tangible AR Interfaces use TUI principles
Physical controllers for moving virtual content
Support for spatial 3D interaction techniques
Support for multi-handed interaction
Match object affordances to task requirements
Support parallel activity with multiple objects
Allow collaboration between multiple users
48. Case Study 1: 3D AR Lens
Goal: Develop a lens based AR interface
MagicLenses
Developed at Xerox PARC in 1993
View a region of the workspace differently to the rest
Overlap MagicLenses to create composite effects
50. AR Lens Design Principles
Physical Components
Lens handle
- Virtual lens attached to real object
Display Elements
Lens view
- Reveal layers in dataset
Interaction Metaphor
Physically holding lens
51. 3D AR Lenses: Model Viewer
Displays models made up of multiple parts
Each part can be shown or hidden through the lens
Allows the user to peer inside the model
Maintains focus + context
54. Techniques based on AR Lenses
Object Selection
Select objects by targeting them with the lens
Information Filtering
Show different representations through the lens
Hide certain content to reduce clutter, look inside things
56. Case Study 2: LevelHead
Physical Components
Real blocks
Display Elements
Virtual person and rooms
Interaction Metaphor
Blocks are rooms
58. Case Study 3: AR Chemistry (Fjeld 2002)
Tangible AR chemistry education
59. Goal: An AR application to test molecular
structure in chemistry
Physical Components
Real book, rotation cube, scoop, tracking markers
Display Elements
AR atoms and molecules
Interaction Metaphor
Build your own molecule
62. Case Study 4: Transitional Interfaces
Goal: An AR interface supporting transitions
from reality to virtual reality
Physical Components
Real book
Display Elements
AR and VR content
Interaction Metaphor
Book pages hold virtual scenes
63. Milgram’s Continuum (1994)
Mixed Reality (MR)
Reality (Tangible Interfaces) <-> Augmented Reality (AR) <-> Augmented Virtuality (AV) <-> Virtuality (Virtual Reality)
Central Hypothesis
The next generation of interfaces will support transitions
along the Reality-Virtuality continuum
64. Transitions
Interfaces of the future will need to support
transitions along the RV continuum
Augmented Reality is preferred for:
co-located collaboration
Immersive Virtual Reality is preferred for:
experiencing the world immersively (egocentric)
sharing views
remote collaboration
65. The MagicBook
Design Goals:
Allows user to move smoothly between reality
and virtual reality
Support collaboration
67. Features
Seamless transition between Reality and Virtuality
Reliance on real decreases as virtual increases
Supports egocentric and exocentric views
User can pick appropriate view
Computer becomes invisible
Consistent interface metaphors
Virtual content seems real
Supports collaboration
68. Design alternatives for
common user tasks in Tangible AR
User Tasks -> Interface Design
Viewpoint Control:
- Camera on HMD
- Fixed camera: top view, front view, mirroring
- Handheld camera
Selection:
- Statically paired virtual and physical objects
- Dynamically paired virtual and physical objects: paddle, pointer
3D Manipulation:
- Direct mapping of whole 6DOF
- Filtered/distorted mapping: snapping, non-linear mapping
- Multiplexed mapping: rotation from one, position from another
- Location and pose based: proximity, spatial configuration
- Gestures with props: tilt, shake
Event & Command Buttons:
- Keyboard & mouse
- Menu, 2D/3D GUI with tracking objects as pointers
- Occlusion based interaction
- Custom hardware devices
69. Design Tips for Tangible AR
Using metaphors from the real world
Take advantage of parallel interactions
Use timers to prevent accidents
Interaction volume – user, tracking, whitespace
What happens when tracking gets lost or object
is out of view?
Problems in visualization
Field of view, occlusions
71. Keyboard and Mouse Interaction
Traditional input techniques
OSG provides a framework for handling keyboard
and mouse input events (osgGA)
1. Subclass osgGA::GUIEventHandler
2. Handle events:
• Mouse up / down / move / drag / scroll-wheel
• Key up / down
3. Add instance of new handler to the viewer
72. Keyboard and Mouse Interaction
Create your own event handler class
class KeyboardMouseEventHandler : public osgGA::GUIEventHandler {
public:
KeyboardMouseEventHandler() : osgGA::GUIEventHandler() { }
virtual bool handle(const osgGA::GUIEventAdapter& ea,osgGA::GUIActionAdapter& aa,
osg::Object* obj, osg::NodeVisitor* nv) {
switch (ea.getEventType()) {
// Possible events we can handle
case osgGA::GUIEventAdapter::PUSH: break;
case osgGA::GUIEventAdapter::RELEASE: break;
case osgGA::GUIEventAdapter::MOVE: break;
case osgGA::GUIEventAdapter::DRAG: break;
case osgGA::GUIEventAdapter::SCROLL: break;
case osgGA::GUIEventAdapter::KEYUP: break;
case osgGA::GUIEventAdapter::KEYDOWN: break;
}
return false;
}
};
Add it to the viewer to receive events
viewer.addEventHandler(new KeyboardMouseEventHandler());
73. Keyboard Interaction
Handle W,A,S,D keys to move an object
case osgGA::GUIEventAdapter::KEYDOWN: {
switch (ea.getKey()) {
case 'w': // Move forward 5mm
localTransform->preMult(osg::Matrix::translate(0, -5, 0));
return true;
case 's': // Move back 5mm
localTransform->preMult(osg::Matrix::translate(0, 5, 0));
return true;
case 'a': // Rotate 10 degrees left
localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(10.0f), osg::Z_AXIS));
return true;
case 'd': // Rotate 10 degrees right
localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(-10.0f), osg::Z_AXIS));
return true;
case ' ': // Reset the transformation
localTransform->setMatrix(osg::Matrix::identity());
return true;
}
break;
localTransform = new osg::MatrixTransform();
localTransform->addChild(osgDB::readNodeFile("media/car.ive"));
arTransform->addChild(localTransform.get());
75. Mouse Interaction
Mouse is pointing device…
Use mouse to select objects in an AR scene
OSG provides methods for ray-casting and
intersection testing
Return an osg::NodePath (the path from the hit
node all the way back to the root)
76. Mouse Interaction
Compute the list of nodes under the clicked position
Invoke an action on nodes that are hit, e.g. select, delete
case osgGA::GUIEventAdapter::PUSH:
osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
osgUtil::LineSegmentIntersector::Intersections intersections;
// Clear previous selections
for (unsigned int i = 0; i < targets.size(); i++) {
targets[i]->setSelected(false);
}
// Find new selection based on click position
if (view && view->computeIntersections(ea.getX(), ea.getY(), intersections)) {
for (osgUtil::LineSegmentIntersector::Intersections::iterator iter = intersections.begin();
iter != intersections.end(); iter++) {
if (Target* target = dynamic_cast<Target*>(iter->nodePath.back())) {
std::cout << "HIT!" << std::endl;
target->setSelected(true);
return true;
}
}
}
break;
79. Single Marker Techniques: Proximity
Use distance from camera to marker as
input parameter
e.g. Lean in close to examine
Can use the osg::LOD class to show
different content at different depth
ranges
Image: OpenSG Consortium
80. Single Marker Techniques: Proximity
// Load some models
osg::ref_ptr<osg::Node> farNode = osgDB::readNodeFile("media/far.osg");
osg::ref_ptr<osg::Node> closerNode = osgDB::readNodeFile("media/closer.osg");
osg::ref_ptr<osg::Node> nearNode = osgDB::readNodeFile("media/near.osg");
// Use a Level-Of-Detail node to show each model at different distance ranges.
osg::ref_ptr<osg::LOD> lod = new osg::LOD();
lod->addChild(farNode.get(), 500.0f, 10000.0f); // Show the "far" node from 50cm to 10m away
lod->addChild(closerNode.get(), 200.0f, 500.0f); // Show the "closer" node from 20cm to 50cm away
lod->addChild(nearNode.get(), 0.0f, 200.0f); // Show the "near" node from 0cm to 20cm away
arTransform->addChild(lod.get());
Define depth ranges for each node
Add as many as you want
Ranges can overlap
82. Multiple Marker Concepts
Interaction based on the relationship between
markers
e.g. When the distance between two markers
decreases below threshold invoke an action
Tangible User Interface
Applications:
Memory card games
File operations
83. Multiple Marker Proximity
Virtual Camera
-> Transform A -> Switch A -> Model A1 / Model A2
-> Transform B -> Switch B -> Model B1 / Model B2
Distance > Threshold
84. Multiple Marker Proximity
Virtual Camera
-> Transform A -> Switch A -> Model A1 / Model A2
-> Transform B -> Switch B -> Model B1 / Model B2
Distance <= Threshold
85. Multiple Marker Proximity
Use a node callback to test for proximity and update the relevant nodes
virtual void operator()(osg::Node* node, osg::NodeVisitor* nv) {
if (mMarkerA != NULL && mMarkerB != NULL && mSwitchA != NULL && mSwitchB != NULL) {
if (mMarkerA->valid() && mMarkerB->valid()) {
osg::Vec3 posA = mMarkerA->getTransform().getTrans();
osg::Vec3 posB = mMarkerB->getTransform().getTrans();
osg::Vec3 offset = posA - posB;
float distance = offset.length();
if (distance <= mThreshold) {
if (mSwitchA->getNumChildren() > 1) mSwitchA->setSingleChildOn(1);
if (mSwitchB->getNumChildren() > 1) mSwitchB->setSingleChildOn(1);
} else {
if (mSwitchA->getNumChildren() > 0) mSwitchA->setSingleChildOn(0);
if (mSwitchB->getNumChildren() > 0) mSwitchB->setSingleChildOn(0);
}
}
}
traverse(node,nv);
}
87. Paddle Interaction
Use one marker as a tool for selecting and
manipulating objects (tangible user interface)
Another marker provides a frame of reference
A grid of markers can alleviate problems with occlusion
MagicCup (Kato et al.), VOMAR (Kato et al.)
88. Paddle Interaction
Often useful to adopt a local coordinate system
Allows the camera to move without disrupting Tlocal
Places the paddle in the same coordinate system as the content on the grid
Simplifies interaction
osgART computes Tlocal using the osgART::LocalTransformationCallback
89. Tilt and Shake Interaction
Detect types of paddle movement:
Tilt
- gradual change in orientation
Shake
- short, sudden changes in translation
90. More Information
• Mark Billinghurst
– mark.billinghurst@hitlabnz.org
• Gun Lee
– gun.lee@hitlabnz.org
• Websites
– www.hitlabnz.org
92. Required Code
Calculating Camera Position
Range to marker
Loading Multiple Patterns/Models
Interaction between objects
Proximity
Relative position/orientation
Occlusion
Stencil buffering
Multi-marker tracking
94. Local vs. Global Interactions
Local
Actions determined from single camera to marker
transform
- shaking, appearance, relative position, range
Global
Actions determined from two relationships
- marker to camera, world to camera coords.
- Marker transform determined in world coordinates
• object tilt, absolute position, absolute rotation, hitting
95. Range-based Interaction
Sample File: RangeTest.c
/* get the camera transformation */
arGetTransMat(&marker_info[k], marker_center,
marker_width, marker_trans);
/* find the range */
Xpos = marker_trans[0][3];
Ypos = marker_trans[1][3];
Zpos = marker_trans[2][3];
range = sqrt(Xpos*Xpos+Ypos*Ypos+Zpos*Zpos);
96. Loading Multiple Patterns
Sample File: LoadMulti.c
Uses object.c to load
Object Structure
typedef struct {
char name[256];
int id;
int visible;
double marker_coord[4][2];
double trans[3][4];
double marker_width;
double marker_center[2];
} ObjectData_T;
97. Finding Multiple Transforms
Create object list
ObjectData_T *object;
Read in objects - in init( )
read_ObjData( char *name, int *objectnum );
Find Transform - in mainLoop( )
for( i = 0; i < objectnum; i++ ) {
..Check patterns
..Find transforms for each marker
}
98. Drawing Multiple Objects
Send the object list to the draw function
draw( object, objectnum );
Draw each object individually
for( i = 0; i < objectnum; i++ ) {
if( object[i].visible == 0 ) continue;
argConvGlpara(object[i].trans, gl_para);
draw_object( object[i].id, gl_para);
}
99. Proximity Based Interaction
Sample File – CollideTest.c
Detect distance between markers
checkCollisions(object[0],object[1], DIST)
If distance < collide distance
Then change the model/perform interaction
100. Multi-marker Tracking
Sample File – multiTest.c
Multiple markers to establish a
single coordinate frame
Reading in a configuration file
Tracking from sets of markers
Careful camera calibration
101. MultiMarker Configuration File
Sample File - Data/multi/marker.dat
Contains a list of all the patterns and their exact positions

#the number of patterns to be recognized
6

#marker 1
Data/multi/patt.a                  (pattern file)
40.0                               (pattern width)
0.0 0.0                            (coordinate origin)
1.0000 0.0000 0.0000 -100.0000     (pattern transform,
0.0000 1.0000 0.0000 50.0000        relative to global origin)
0.0000 0.0000 1.0000 0.0000
…
102. Camera Transform Calculation
Include <AR/arMulti.h>
Link to libARMulti.lib
In mainLoop()
Detect markers as usual
arDetectMarkerLite(dataPtr, thresh,
&marker_info, &marker_num)
Use MultiMarker Function
if( (err=arMultiGetTransMat(marker_info,
marker_num, config)) < 0 ) {
argSwapBuffers();
return;
}
104. Paddle Interaction Code
Sample File – PaddleDemo.c
Get paddle marker location + draw paddle before drawing
background model
paddleGetTrans(paddleInfo, marker_info,
marker_flag, marker_num, &cparam);
/* draw the paddle */
if( paddleInfo->active ){
draw_paddle( paddleInfo);
}
draw_paddle uses a Stencil Buffer to increase realism
105. Paddle Interaction Code II
Sample File – paddleDrawDemo.c
Finds the paddle position relative to global coordinate frame:
setBlobTrans(Num,paddle_trans[3][4],base_trans[3][4])
Sample File – paddleTouch.c
Finds the paddle position:
findPaddlePos(&curPadPos,paddleInfo->trans,config->trans);
Checks for collisions:
checkCollision(&curPaddlePos, myTarget[i].pos, 20.0)
106. General Tangible AR Library
command_sub.c, command_sub.h
Contains functions for recognizing a range of
different paddle motions:
int check_shake( );
int check_punch( );
int check_incline( );
int check_pickup( );
int check_push( );
Eg: to check angle between paddle and base
check_incline(paddle->trans, base->trans, &ang)