The document discusses interaction design for augmented reality (AR) applications. It covers various AR interaction techniques including browsing interfaces, 3D interfaces, tangible interfaces using physical objects, and augmented surfaces that project virtual content onto physical surfaces. Keyboard, mouse, touch, gesture, and proximity interactions are discussed. The advantages and disadvantages of different approaches are summarized. Effective AR interaction requires considering input and output technologies to create an intuitive user experience.
3. AR Interaction
Designing AR System = Interface Design
Using different input and output technologies
The objective is a high-quality user experience
Ease of use and learning
Performance and satisfaction
7. More terminology
Interaction Device = input/output device of the user interface
Interaction Style = category of similar interaction techniques
Interaction Paradigm
Modality (human sense)
Usability
8. Back to AR
You can see spatially registered AR content…
but how can you interact with it?
9. Interaction Tasks
2D (from [Foley]):
Selection, Text Entry, Quantify, Position
3D (from [Bowman]):
Navigation (Travel/Wayfinding)
Selection
Manipulation
System Control/Data Input
AR: 2D + 3D Tasks and.. more specific tasks?
[Foley] Foley, J. D., V. Wallace & P. Chan. “The Human Factors of Computer Graphics Interaction Techniques.” IEEE Computer Graphics and Applications (Nov.): 13-48, 1984.
[Bowman] Bowman, D., E. Kruijff, J. LaViola & I. Poupyrev. 3D User Interfaces: Theory and Practice. Addison-Wesley, 2005.
10. AR Interfaces as Data Browsers
2D/3D virtual objects are registered in 3D
“VR in Real World”
Interaction
2D/3D virtual viewpoint control
Applications
Visualization, training
11. AR Information Browsers
Information is registered to real-world context
Hand-held AR displays
Interaction
Manipulation of a window into information space
Applications
Context-aware information displays
Rekimoto, et al. 1997
13. Current AR Information Browsers
Mobile AR
GPS + compass
Many Applications
Layar
Wikitude
Acrossair
PressLite
Yelp
AR Car Finder
…
14. Junaio
AR Browser from Metaio
http://www.junaio.com/
AR browsing
GPS + compass
2D/3D object placement
Photos/live video
Community viewing
18. Advantages and Disadvantages
Important class of AR interfaces
Wearable computers
AR simulation, training
Limited interactivity
Modification of virtual content is difficult
Rekimoto, et al. 1997
19. 3D AR Interfaces
Virtual objects displayed and manipulated in 3D physical space
HMDs and 6DOF head-tracking
6DOF hand trackers for input
Interaction
Viewpoint control
Traditional 3D user interface interaction: manipulation, selection, etc.
Kiyokawa, et al. 2000
22. Advantages and Disadvantages
Important class of AR interfaces
Entertainment, design, training
Advantages
User can interact with 3D virtual objects everywhere in space
Natural, familiar interaction
Disadvantages
Usually no tactile feedback
User has to use different devices for virtual and physical objects
Oshima, et al. 2000
23. Augmented Surfaces and Tangible Interfaces
Basic principles
Virtual objects are projected on a surface
Physical objects are used as controls for virtual objects
Support for collaboration
31. Other Examples
Triangles (Gorbert 1998)
Triangle-based storytelling
ActiveCube (Kitamura 2000-)
Cubes with sensors
32. Lessons from Tangible Interfaces
Physical objects make us smart
Norman’s “Things that Make Us Smart”
encode affordances, constraints
Objects aid collaboration
establish shared meaning
Objects increase understanding
serve as cognitive artifacts
33. TUI Limitations
Difficult to change object properties
can’t tell state of digital data
Limited display capabilities
projection screen = 2D
dependent on physical display surface
Separation between object and display
ARgroove
34. Advantages and Disadvantages
Advantages
Natural - users’ hands are used for interacting with both virtual and real objects
- No need for special-purpose input devices
Disadvantages
Interaction is limited to the 2D surface
- Full 3D interaction and manipulation is difficult
36. Back to the Real World
AR overcomes limitations of TUIs
enhance display possibilities
merge task/display space
provide public and private views
TUI + AR = Tangible AR
Apply TUI methods to AR interface design
37. Space-multiplexed
Many devices, each with one function
- Quicker to use, more intuitive, but more clutter
- e.g. a real toolbox
Time-multiplexed
One device with many functions
- Space efficient
- e.g. the mouse
46. Advantages and Disadvantages
Advantages
Natural interaction with virtual and physical tools
- No need for special-purpose input devices
Spatial interaction with virtual objects
- 3D manipulation of virtual objects anywhere in physical space
Disadvantages
Requires Head Mounted Display
47. Wrap-up
Browsing Interfaces
simple (conceptually!), unobtrusive
3D AR Interfaces
expressive, creative, require attention
Tangible Interfaces
Embedded into conventional environments
Tangible AR
Combines TUI input + AR display
48. AR User Interface: Categorization
Traditional Desktop: keyboard, mouse, joystick (with or without 2D/3D GUI)
Specialized/VR Device: 3D VR devices, specially designed devices
49. AR User Interface: Categorization
Tangible Interface: using physical objects
Hand/Touch Interface: using pose and gesture of hand and fingers
Body Interface: using movement of the body
52. Keyboard and Mouse Interaction
Traditional input techniques
OSG provides a framework for handling keyboard
and mouse input events (osgGA)
1. Subclass osgGA::GUIEventHandler
2. Handle events:
• Mouse up / down / move / drag / scroll-wheel
• Key up / down
3. Add instance of new handler to the viewer
53. Keyboard and Mouse Interaction
Create your own event handler class
class KeyboardMouseEventHandler : public osgGA::GUIEventHandler {
public:
KeyboardMouseEventHandler() : osgGA::GUIEventHandler() { }
virtual bool handle(const osgGA::GUIEventAdapter& ea,osgGA::GUIActionAdapter& aa,
osg::Object* obj, osg::NodeVisitor* nv) {
switch (ea.getEventType()) {
// Possible events we can handle
case osgGA::GUIEventAdapter::PUSH: break;
case osgGA::GUIEventAdapter::RELEASE: break;
case osgGA::GUIEventAdapter::MOVE: break;
case osgGA::GUIEventAdapter::DRAG: break;
case osgGA::GUIEventAdapter::SCROLL: break;
case osgGA::GUIEventAdapter::KEYUP: break;
case osgGA::GUIEventAdapter::KEYDOWN: break;
}
return false;
}
};
Add it to the viewer to receive events
viewer.addEventHandler(new KeyboardMouseEventHandler());
54. Keyboard Interaction
Handle W,A,S,D keys to move an object
case osgGA::GUIEventAdapter::KEYDOWN: {
switch (ea.getKey()) {
case 'w': // Move forward 5mm
localTransform->preMult(osg::Matrix::translate(0, -5, 0));
return true;
case 's': // Move back 5mm
localTransform->preMult(osg::Matrix::translate(0, 5, 0));
return true;
case 'a': // Rotate 10 degrees left
localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(10.0f), osg::Z_AXIS));
return true;
case 'd': // Rotate 10 degrees right
localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(-10.0f), osg::Z_AXIS));
return true;
case ' ': // Reset the transformation
localTransform->setMatrix(osg::Matrix::identity());
return true;
}
break;
}
Setup elsewhere (e.g. when building the scene): the transform the keys modify
localTransform = new osg::MatrixTransform();
localTransform->addChild(osgDB::readNodeFile("media/car.ive"));
arTransform->addChild(localTransform.get());
56. Mouse Interaction
The mouse is a pointing device…
Use the mouse to select objects in an AR scene
OSG provides methods for ray-casting and intersection testing
Returns an osg::NodePath (the path from the hit node all the way back to the root)
(Diagram: a ray is cast from the clicked point on the projection plane (screen) into the scene)
57. Mouse Interaction
Compute the list of nodes under the clicked position
Invoke an action on nodes that are hit, e.g. select, delete
case osgGA::GUIEventAdapter::PUSH: {
osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
osgUtil::LineSegmentIntersector::Intersections intersections;
// Clear previous selections
for (unsigned int i = 0; i < targets.size(); i++) {
targets[i]->setSelected(false);
}
// Find new selection based on click position
if (view && view->computeIntersections(ea.getX(), ea.getY(), intersections)) {
for (osgUtil::LineSegmentIntersector::Intersections::iterator iter = intersections.begin();
iter != intersections.end(); iter++) {
if (Target* target = dynamic_cast<Target*>(iter->nodePath.back())) {
std::cout << "HIT!" << std::endl;
target->setSelected(true);
return true;
}
}
}
break;
}
60. Single Marker Techniques: Proximity
Use the distance from camera to marker as an input parameter
e.g. lean in close to examine
Can use the osg::LOD class to show different content at different depth ranges
Image: OpenSG Consortium
61. Single Marker Techniques: Proximity
// Load some models
osg::ref_ptr<osg::Node> farNode = osgDB::readNodeFile("media/far.osg");
osg::ref_ptr<osg::Node> closerNode = osgDB::readNodeFile("media/closer.osg");
osg::ref_ptr<osg::Node> nearNode = osgDB::readNodeFile("media/near.osg");
// Use a Level-Of-Detail node to show each model at different distance ranges.
osg::ref_ptr<osg::LOD> lod = new osg::LOD();
lod->addChild(farNode.get(), 500.0f, 10000.0f); // Show the "far" node from 50cm to 10m away
lod->addChild(closerNode.get(), 200.0f, 500.0f); // Show the "closer" node from 20cm to 50cm away
lod->addChild(nearNode.get(), 0.0f, 200.0f); // Show the "near" node from 0cm to 20cm away
arTransform->addChild(lod.get());
Define depth ranges for each node
Add as many as you want
Ranges can overlap
63. Multiple Marker Concepts
Interaction based on the relationship between markers
e.g. When the distance between two markers decreases below a threshold, invoke an action
Tangible User Interface
Applications:
Memory card games
File operations
64. Multiple Marker Proximity
Virtual
Camera
Transform A Transform B
Distance > Threshold
Switch A Switch B
Model Model Model Model
A1 A2 B1 B2
65. Multiple Marker Proximity
Virtual
Camera
Transform A Transform B
Distance <= Threshold
Switch A Switch B
Model Model Model Model
A1 A2 B1 B2
66. Multiple Marker Proximity
Use a node callback to test for proximity and update the relevant nodes
virtual void operator()(osg::Node* node, osg::NodeVisitor* nv) {
if (mMarkerA != NULL && mMarkerB != NULL && mSwitchA != NULL && mSwitchB != NULL) {
if (mMarkerA->valid() && mMarkerB->valid()) {
osg::Vec3 posA = mMarkerA->getTransform().getTrans();
osg::Vec3 posB = mMarkerB->getTransform().getTrans();
osg::Vec3 offset = posA - posB;
float distance = offset.length();
if (distance <= mThreshold) {
if (mSwitchA->getNumChildren() > 1) mSwitchA->setSingleChildOn(1);
if (mSwitchB->getNumChildren() > 1) mSwitchB->setSingleChildOn(1);
} else {
if (mSwitchA->getNumChildren() > 0) mSwitchA->setSingleChildOn(0);
if (mSwitchB->getNumChildren() > 0) mSwitchB->setSingleChildOn(0);
}
}
}
traverse(node,nv);
}
68. Paddle Interaction
Use one marker as a tool for selecting and manipulating objects (tangible user interface)
Another marker provides a frame of reference
A grid of markers can alleviate problems with occlusion
MagicCup (Kato et al) VOMAR (Kato et al)
69. Paddle Interaction
Often useful to adopt a local coordinate system
Allows the camera to move without disrupting Tlocal
Places the paddle in the same coordinate system as the content on the grid
Simplifies interaction
osgART computes Tlocal using the osgART::LocalTransformationCallback
70. Tilt and Shake Interaction
Detect types of paddle movement:
Tilt
- gradual change in orientation
Shake
- short, sudden changes in translation
74. Local vs. Global Interactions
Local
Actions determined from a single camera-to-marker transform
- shaking, appearance, relative position, range
Global
Actions determined from two relationships
- marker-to-camera and world-to-camera coordinates
- Marker transform determined in world coordinates
• object tilt, absolute position, absolute rotation, hitting
75. Range-based Interaction
Sample File: RangeTest.c
/* get the camera transformation */
arGetTransMat(&marker_info[k], marker_center,
marker_width, marker_trans);
/* find the range */
Xpos = marker_trans[0][3];
Ypos = marker_trans[1][3];
Zpos = marker_trans[2][3];
range = sqrt(Xpos*Xpos+Ypos*Ypos+Zpos*Zpos);
77. Finding Multiple Transforms
Create object list
ObjectData_T *object;
Read in objects - in init( )
read_ObjData( char *name, int *objectnum );
Find Transform – in mainLoop( )
for( i = 0; i < objectnum; i++ ) {
..Check patterns
..Find transforms for each marker
}
78. Drawing Multiple Objects
Send the object list to the draw function
draw( object, objectnum );
Draw each object individually
for( i = 0; i < objectnum; i++ ) {
if( object[i].visible == 0 ) continue;
argConvGlpara(object[i].trans, gl_para);
draw_object( object[i].id, gl_para);
}
79. Proximity Based Interaction
Sample File – CollideTest.c
Detect distance between markers
checkCollisions(object[0],object[1], DIST)
If distance < collide distance
Then change the model/perform interaction
80. Multi-marker Tracking
Sample File – multiTest.c
Multiple markers to establish a single coordinate frame
Reading in a configuration file
Tracking from sets of markers
Careful camera calibration
81. MultiMarker Configuration File
Sample File - Data/multi/marker.dat
Contains a list of all the patterns and their exact positions:

#the number of patterns to be recognized
6

#marker 1
Data/multi/patt.a                 (pattern file)
40.0                              (pattern width)
0.0 0.0                           (coordinate origin)
1.0000 0.0000 0.0000 -100.0000    (pattern transform,
0.0000 1.0000 0.0000 50.0000       relative to global origin)
0.0000 0.0000 1.0000 0.0000
…
82. Camera Transform Calculation
Include <AR/arMulti.h>
Link to libARMulti.lib
In mainLoop()
Detect markers as usual
arDetectMarkerLite(dataPtr, thresh, &marker_info, &marker_num);
Use MultiMarker Function
if( (err = arMultiGetTransMat(marker_info, marker_num, config)) < 0 ) {
argSwapBuffers();
return;
}
84. Paddle Interaction Code
Sample File – PaddleDemo.c
Get the paddle marker location and draw the paddle before drawing the background model
paddleGetTrans(paddleInfo, marker_info, marker_flag, marker_num, &cparam);
/* draw the paddle */
if( paddleInfo->active ){
draw_paddle( paddleInfo);
}
draw_paddle uses a Stencil Buffer to increase realism
85. Paddle Interaction Code II
Sample File – paddleDrawDemo.c
Finds the paddle position relative to global coordinate frame:
setBlobTrans(Num,paddle_trans[3][4],base_trans[3][4])
Sample File – paddleTouch.c
Finds the paddle position:
findPaddlePos(&curPadPos,paddleInfo->trans,config->trans);
Checks for collisions:
checkCollision(&curPaddlePos,myTarget[i].pos,20.0)
86. General Tangible AR Library
command_sub.c, command_sub.h
Contains functions for recognizing a range of different paddle motions:
int check_shake( );
int check_punch( );
int check_incline( );
int check_pickup( );
int check_push( );
Eg: to check angle between paddle and base
check_incline(paddle->trans, base->trans, &ang)