Lecture 5 from the COSC 426 Graduate course on Augmented Reality. This lecture talks about AR development tools and interaction styles. Taught by Mark Billinghurst from the HIT Lab NZ at the University of Canterbury. August 9th 2013
3. Data Gathering Techniques (1)
Questionnaires
Looking for specific information
Qualitative and quantitative results
Good for getting data from a large, dispersed group
Interviews
Good for exploring issues, using props
Structured, unstructured or semi-structured
But time-consuming, and it is difficult to visit everyone
4. Data Gathering Techniques (2)
Workshops or focus groups
Group interviews/activities
Good at gaining a consensus view and/or highlighting areas
of conflict
Observations
Spending time with users in day to day tasks
Good for understanding task context
requires time and can result in a huge amount of data
Documentation
Procedures and rules written down in manuals
5. Elaboration and Reduction
Elaborate - generate solutions. These are the opportunities
Reduce - decide on the ones worth pursuing
Repeat - elaborate and reduce again on those solutions
Source: Laseau,P. (1980) Graphic Thinking for Architects & Designers. John Wiley and Sons
6. Tools for Effective Design
Personas
Scenarios
Storyboards
Wireframes and Mock-ups
Prototypes
7. Persona Technique
• Personas are a design tool to help visualize who you are
designing for and imagine how a person will use the product
• A persona is an archetype that represents the behavior and
goals of a group of users
• Based on insights and observations from customer research
• Not real people, but synthesised from real user characteristics
• Bring them to life with a name, characteristics, goals, background
• Develop multiple personas
8. Gunther the Ad Guy
Gunther is from Germany. He travels extensively for work, and
as an advertising executive he needs to present concepts to
clients quickly and easily. He is very well-versed in new
technologies and wishes he had easier portable solutions for
his presentations…
9. Scenarios
Usage Scenarios are narrative descriptions of how the product
meets the needs of a persona
Short (2 pages max)
Focus on unmet needs of persona
Concrete story
Set of stories around essential tasks, problems...
Use to test ideas
10. Storyboarding
Sequence of sketches showing use of system in
everyday use context
Concrete example
Easier (faster) to grasp than text based stories
Means of communication with users and system
developers
Sketches, not drawings...
Use to test interaction and make sure design works
11. Sketching is about design
Sketching is not about drawing
It is about design.
Sketching is a tool to help you:
- express
- develop, and
- communicate design ideas
Sketching is part of a process:
- idea generation,
- design elaboration
- design choices,
- engineering
12. Sketch vs. Prototype
Sketch / Prototype
Invite / Attend
Suggest / Describe
Explore / Refine
Question / Answer
Propose / Test
Provoke / Resolve
Tentative, non-committal / Specific depiction
The primary differences are in the intent
13. Types of Prototypes
Low Fidelity – quick and dirty, easy access
materials like cardboard and paper.
High Fidelity – more involved electronic
versions similar in materials to final product.
14. RAPID Prototyping
Fast and inexpensive
Identifies problems before they’re coded
Elicits more and better feedback from users
Helps developers think creatively
Gets users and other stakeholders involved early
Fosters teamwork and communication
Avoids opinion wars
Helps decide design directions
15. Paper Prototyping (Low Fidelity)
Quick and simple means of sketching interfaces
Use office materials
Easier to criticize, quick to change
Creative process (develop in team)
Can also use for usability test (focus on flow of
interaction rather than visuals)
Used a lot to test out concepts before real design begins.
17. High-fidelity prototyping
• Uses materials that you would expect to be in the
final product.
• Prototype looks more like the final system than a
low-fidelity version.
• For a high-fidelity software prototype, common
environments include Macromedia Director, Visual
Basic, and Smalltalk.
• Danger that users think they have a full
system… and overlook the compromises made
18. Rapid Prototyping
Speed development time with quick hardware mockups
handheld device connected to PC
LCD screen, USB phone keypad, Camera
Can use PC development tools for rapid development
Flash, Visual Basic, etc
22. ARToolKit (Kato 1998)
Open source – computer vision based AR tracking
http://artoolkit.sourceforge.net/
23. ARToolKit Structure
Three key libraries:
AR32.lib – ARToolKit image processing functions
ARgsub32.lib – ARToolKit graphics functions
ARvideo.lib – DirectShow video capture class
24. Software
Cross platform
Windows, Mac, Linux, IRIX, Symbian, iPhone, etc
Additional basic libraries
Video capture library (Video4Linux, VisionSDK)
OpenGL, GLUT
Requires a rendering library
Open VRML, Open Inventor, osgART, etc
25. Additional Software
ARToolKit just provides tracking
For an AR application you’ll need more software
High level rendering library
Open VRML, Open Inventor, osgART, etc
Audio Library
Fmod, etc
Peripheral support
26. What does ARToolKit Calculate?
Position of markers in the camera coordinates
Pose of markers in the camera coordinates
Output format
3x4 matrix format to represent the
transformation matrix from the marker
coordinates to the camera coordinates
32. Ex. 2: Detecting a Marker
Program : sample2.c
Key points
Threshold value
Important external variables
arDebug – keep thresholded image
arImage – pointer for thresholded image
arImageProcMode – use 50% image for image
processing
- AR_IMAGE_PROC_IN_FULL
- AR_IMAGE_PROC_IN_HALF
33. Sample2.c – marker detection
/* detect the markers in the video frame */
if (arDetectMarker(dataPtr, thresh, &marker_info, &marker_num) < 0) {
    cleanup();
    exit(0);
}
for (i = 0; i < marker_num; i++) {
    argDrawSquare(marker_info[i].vertex, 0, 0);
}
34. Making a pattern template
Use the utility program mk_patt.exe:
Show the pattern to the camera
Put the corner of the red line segments on the
top-left vertex of the marker
The pattern is stored as a template in a file
The 1:2:1 ratio determines the pattern region used
35. Ex. 4 – Getting 3D information
Program : sample4.c
Key points
Definition of a real marker
Transformation matrix
- Rotation component
- Translation component
37. Finding the Camera Position
This function sets the transformation matrix from marker
to camera into marker_trans[3][4]:
arGetTransMat(&marker_info[k], marker_center,
              marker_width, marker_trans);
You can read the position information from the values of
marker_trans:
Xpos = marker_trans[0][3];
Ypos = marker_trans[1][3];
Zpos = marker_trans[2][3];
39. Ex. 5- Virtual Object Display
Program : sample5.c
Key points
OpenGL parameter setting
Setup of projection matrix
Setup of modelview matrix
40. Appending your own OpenGL code
Set the camera parameters as the OpenGL projection matrix:
argDrawMode3D();
argDraw3dCamera( 0, 0 );
Set the transformation matrix from the marker to the camera to
the OpenGL ModelView matrix.
argConvGlpara(marker_trans, gl_para);
glMatrixMode(GL_MODELVIEW);
glLoadMatrixd( gl_para );
After calling these functions, your OpenGL objects are
drawn in the real marker coordinates.
41. ARToolKit in the World
Hundreds of projects
Large research community
43. FLARToolKit
Flash AS3 Version of the ARToolKit
(ported from NyARToolkit, the Java version of ARToolKit)
Enables augmented reality in the browser
Uses Papervision3D as its 3D engine
available at http://saqoosha.net/
dual license, GPL and commercial license
45. private function mainEnter(e:Event):void {
    /* Capture video frame */
    capture.draw(vid);
    /* Detect marker */
    if (detector.detectMarkerLite(raster, 80) && detector.getConfidence() > 0.5) {
        // Get the transformation matrix for the current marker position
        detector.getTransformMatrix(trans);
        // Translate and rotate the mainContainer so it looks right
        mainContainer.setTransformMatrix(trans);
        // Render the Papervision3D scene
        renderer.render();
    }
}
49. Other Languages
NyARToolKit
http://nyatla.jp/nyartoolkit/wp/
AS3, Java, C#, Processing, Unity, etc
openFrameworks
http://www.openframeworks.cc/
https://sites.google.com/site/ofauckland/examples/8-artoolkit-example
Support for other libraries
- Kinect, Audio, Physics, etc
50. void testApp::update() { // capture video and detect markers
    mov.update();
    if (mov.isFrameNew()) {
        img.setFromPixels(mov.getPixels(), ofGetWidth(), ofGetHeight());
        gray = img;
        tracker.setFromPixels(gray.getPixels());
    }
}
//--------------------------------------------------------------
void testApp::draw() { // draw AR objects
    ofSetColor(0xffffff);
    mov.draw(0, 0);
    for (int i = 0; i < tracker.markers.size(); i++) {
        ARMarkerInfo &m = tracker.markers[i];
        tracker.loadMarkerModelViewMatrix(m);
        ofSetColor(255, 255, 0, 100);
        ofCircle(0, 0, 25);
        ofSetColor(0);
        ofDrawBitmapString(ofToString(m.id), 0, 0);
    }
}
51. Low Level Mobile AR Tools
Vuforia Tracking Library (Qualcomm)
Vuforia.com
iOS, Android
Computer vision based tracking
Marker tracking, 3D objects, frame
markers
Integration with Unity
Interaction, model loading
52. OSGART Programming Library
Integration of ARToolKit with a High-Level
Rendering Engine (OpenSceneGraph)
OSGART= OpenSceneGraph + ARToolKit
Supporting Geometric + Photometric Registration
53. osgART:Features
C++ (but also Python, Lua, etc).
Multiple Video Input supports:
Direct (Firewire/USB Camera), Files, Network by
ARvideo, PtGrey, CVCam, VideoWrapper, etc.
Benefits of Open Scene Graph
Rendering Engine, Plug-ins, etc
54. mARx Plug-in
3D Studio Max Plug-in
Can model and view AR content at the same time
62. What is a Scene Graph?
Tree-like structure for organising a virtual world
e.g. VRML
Hierarchy of nodes that define:
Groups (and Switches, Sequences etc…)
Transformations
Projections
Geometry
…
And states and attributes that define:
Materials and textures
Lighting and blending
…
64. Benefits of a Scene Graph
Performance
Structuring data facilitates
optimization
- Culling, state management, etc…
Abstraction
Underlying graphics pipeline is hidden
Low-level programming (“how do I display this?”) is
replaced with high-level concepts (“what do I want
to display?”)
Image: sgi
65. About Open Scene Graph
http://www.openscenegraph.org/
Open-source scene graph implementation
Based on OpenGL
Object-oriented C++ following design pattern principles
Used for simulation, games, research, and industrial projects
Active development community
Maintained by Robert Osfield
~2000 mailing list subscribers
Documentation project: www.osgbooks.com
Uses the OSG Public License (similar to LGPL)
66. About Open Scene Graph (2)
Pirates of the XXI Century
Flightgear
3DVRII Research Institute
EOR
SCANeR
VRlab Umeå University
67. Open Scene Graph Features
Plugins for loading and saving
3D: 3D Studio (.3ds), OpenFlight (.flt), Wavefront (.obj)…
2D: .png, .jpg, .bmp, QuickTime movies
NodeKits to extend functionality
e.g. osgShadow
Cross-platform support for:
Window management (osgViewer)
Threading (OpenThreads)
68. Open Scene Graph Architecture
Plugins read and write 2D image and 3D model files
NodeKits extend core functionality, exposing higher-level node types
Scene graph and rendering functionality
Inter-operability with other environments, e.g. Python
69. Some Open Scene Graph Demos
You may want to get the OSG data package:
Via SVN: http://www.openscenegraph.org/svn/osg/OpenSceneGraph-Data/trunk
osgviewer osgmotionblur osgparticle
osgreflect osgdistortion osgfxbrowser
70. Learning OSG
Check out the Quick Start Guide
Free PDF download at http://osgbooks.com/, Physical copy $13US
Join the mailing list: http://www.openscenegraph.org/projects/osg/wiki/MailingLists
Browse the website: http://www.openscenegraph.org/projects/osg
Use the forum: http://forum.openscenegraph.org
Study the examples
Read the source?
72. What is osgART?
osgART adds AR to Open Scene Graph
Further developed and enhanced by:
Julian Looser
Hartmut Seichter
Raphael Grasset
Current version 2.0, Open Source
http://www.osgart.org
73. osgART Approach: Basic Scene Graph
Root
Transform
3D Object
[  0.988  -0.031   -0.145  0 ]
[ -0.048   0.857   -0.512  0 ]
[  0.141   0.513    0.846  0 ]
[ 10.939  29.859 -226.733  1 ]
To add Video see-through AR:
Integrate live video
Apply correct projection matrix
Update tracked transformations in
realtime
75. osgART Approach: AR Scene Graph
Video
Geode
Root
Transform
3D Object
Virtual
Camera
Video
Layer
76. osgART Approach: AR Scene Graph
Video
Geode
Root
Transform
3D Object
Virtual
Camera
Projection matrix from tracker calibration
Transformation matrix updated from marker tracking in realtime
Video Layer
Full-screen quad with live texture updated from video source
Orthographic projection
78. osgART Architecture
Like any video see-through AR library, osgART requires video
input and tracking capabilities.
ARLibrary
Application
Video Source
e.g. DirectShow
Tracking Module
(libAR.lib)
79. osgART Architecture
osgART uses a plugin architecture so that video sources and tracking
technologies can be plugged in as necessary
osgART
Application
Tracker plugins:
- ARToolKit4
- ARToolkitPlus
- MXRToolKit
- ARLib
- bazAR (work in progress)
- ARTag (work in progress)
Video plugins:
- OpenCVVideo
- VidCapture
- CMU1394
- PointGrey SDK
- VidereDesign
- VideoWrapper
- VideoInput
- VideoSource
- DSVL
- Intranel RTSP
80. Basic osgART Tutorial
Develop a working osgART application from scratch.
Use ARToolKit 2.72 library for tracking and video
capture
81. osgART Tutorial 1: Basic OSG Viewer
Start with the standard Open Scene Graph Viewer
We will modify this to do AR!
82. osgART Tutorial 1: Basic OSG Viewer
The basic osgViewer…
#include <osgViewer/Viewer>
#include <osgViewer/ViewerEventHandlers>

int main(int argc, char* argv[]) {
    // Create a viewer
    osgViewer::Viewer viewer;
    // Create a root node
    osg::ref_ptr<osg::Group> root = new osg::Group;
    // Attach root node to the viewer
    viewer.setSceneData(root.get());
    // Add relevant event handlers to the viewer
    viewer.addEventHandler(new osgViewer::StatsHandler);
    viewer.addEventHandler(new osgViewer::WindowSizeHandler);
    viewer.addEventHandler(new osgViewer::ThreadingHandler);
    viewer.addEventHandler(new osgViewer::HelpHandler);
    // Run the viewer and exit the program when the viewer is closed
    return viewer.run();
}
83. osgART Tutorial 2: Adding Video
Add a video plugin
Load, configure, start video capture…
Add a video background
Create, link to video, add to scene-graph
84. osgART Tutorial 2: Adding Video
New code to load and configure a Video Plugin:
// Preload the video plugin
int _video_id = osgART::PluginManager::getInstance()->load("osgart_video_artoolkit2");
// Load a video plugin
osg::ref_ptr<osgART::Video> video =
    dynamic_cast<osgART::Video*>(osgART::PluginManager::getInstance()->get(_video_id));
// Check if an instance of the video stream could be created
if (!video.valid()) {
    // Without video an AR application can not work. Quit if none found.
    osg::notify(osg::FATAL) << "Could not initialize video plugin!" << std::endl;
    exit(-1);
}
// Open the video. This will not yet start the video stream but will
// get information about the format of the video which is essential
// for the connected tracker.
video->open();
85. osgART Tutorial 2: Adding Video
New code to add a live video background. In the main function:

osg::ref_ptr<osg::Group> videoBackground = createImageBackground(video.get());
videoBackground->getOrCreateStateSet()->setRenderBinDetails(0, "RenderBin");
root->addChild(videoBackground.get());
video->start();

The helper function:

osg::Group* createImageBackground(osg::Image* video) {
    osgART::VideoLayer* _layer = new osgART::VideoLayer();
    _layer->setSize(*video);
    osgART::VideoGeode* _geode = new osgART::VideoGeode(osgART::VideoGeode::USE_TEXTURE_2D, video);
    addTexturedQuad(*_geode, video->s(), video->t());
    _layer->addChild(_geode);
    return _layer;
}
86. osgART Tutorial 3: Tracking
Add a Tracker plugin
Load, configure, link to video
Add a Marker to track
Load, activate
Tracked node
Create, link with marker via tracking callbacks
Print out the tracking data
87. osgART Tutorial 3: Tracking
Load a tracking plugin and associate it with the video plugin:

int _tracker_id = osgART::PluginManager::getInstance()->load("osgart_tracker_artoolkit2");
osg::ref_ptr<osgART::Tracker> tracker =
    dynamic_cast<osgART::Tracker*>(osgART::PluginManager::getInstance()->get(_tracker_id));
if (!tracker.valid()) {
    // Without a tracker an AR application can not work. Quit if none found.
    osg::notify(osg::FATAL) << "Could not initialize tracker plugin!" << std::endl;
    exit(-1);
}
// get the tracker calibration object
osg::ref_ptr<osgART::Calibration> calibration = tracker->getOrCreateCalibration();
// load a calibration file
if (!calibration->load("data/camera_para.dat")) {
    // the calibration file was missing or couldn't be loaded
    osg::notify(osg::FATAL) << "Non-existing or incompatible calibration file" << std::endl;
    exit(-1);
}
// set the image source for the tracker
tracker->setImage(video.get());
osgART::TrackerCallback::addOrSet(root.get(), tracker.get());
// create the virtual camera and add it to the scene
osg::ref_ptr<osg::Camera> cam = calibration->createCamera();
root->addChild(cam.get());
88. osgART Tutorial 3: Tracking
Load a marker and activate it:

osg::ref_ptr<osgART::Marker> marker = tracker->addMarker("single;data/patt.hiro;80;0;0");
if (!marker.valid()) {
    // Without a marker an AR application can not work. Quit if none found.
    osg::notify(osg::FATAL) << "Could not add marker!" << std::endl;
    exit(-1);
}
marker->setActive(true);

Associate it with a transformation node (via event callbacks) and add the
transformation node to the virtual camera node:

osg::ref_ptr<osg::MatrixTransform> arTransform = new osg::MatrixTransform();
osgART::attachDefaultEventCallbacks(arTransform.get(), marker.get());
cam->addChild(arTransform.get());

Add a debug callback to print out information about the tracked marker:

osgART::addEventCallback(arTransform.get(), new osgART::MarkerDebugCallback(marker.get()));
90. osgART Tutorial 4: Adding Content
Now put the tracking data to use!
Add content to the tracked transform
Basic cube code
arTransform->addChild(osgART::testCube());
arTransform->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
91. osgART Tutorial 5: Adding 3D Model
Open Scene Graph can load some 3D formats directly:
e.g. Wavefront (.obj), OpenFlight (.flt), 3D Studio (.3ds), COLLADA
Others need to be converted
Support for some formats is much better than others
e.g. OpenFlight good, 3ds hit and miss.
Recommend native .osg and .ive formats
.osg – ASCII representation of scene graph
.ive – Binary OSG file. Can contain textures.
osgExp : Exporter for 3DS Max is a good choice
http://sourceforge.net/projects/osgmaxexp
Otherwise .3ds files from TurboSquid can work
92. osgART Tutorial 5: Adding 3D Model
Replace the simple cube with a 3D model. Models are loaded using the
osgDB::readNodeFile() function:

std::string filename = "media/hollow_cube.osg";
arTransform->addChild(osgDB::readNodeFile(filename));

Note: scale is important. Units are in mm.
Pipeline: 3D Studio Max → export to .osg → osgART
93. osgART Tutorial 6: Multiple Markers
Repeat the process so far to track more than
one marker simultaneously
94. osgART Tutorial 6: Multiple Markers
Load and activate two markers:

osg::ref_ptr<osgART::Marker> markerA = tracker->addMarker("single;data/patt.hiro;80;0;0");
markerA->setActive(true);
osg::ref_ptr<osgART::Marker> markerB = tracker->addMarker("single;data/patt.kanji;80;0;0");
markerB->setActive(true);

Create two transformations, attach callbacks, and add models:

osg::ref_ptr<osg::MatrixTransform> arTransformA = new osg::MatrixTransform();
osgART::attachDefaultEventCallbacks(arTransformA.get(), markerA.get());
arTransformA->addChild(osgDB::readNodeFile("media/hitl_logo.osg"));
arTransformA->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
cam->addChild(arTransformA.get());
osg::ref_ptr<osg::MatrixTransform> arTransformB = new osg::MatrixTransform();
osgART::attachDefaultEventCallbacks(arTransformB.get(), markerB.get());
arTransformB->addChild(osgDB::readNodeFile("media/gist_logo.osg"));
arTransformB->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
cam->addChild(arTransformB.get());
96. Basic osgART Tutorial: Summary
Standard OSG Viewer
Addition of Video
Addition of Tracking
Addition of basic 3D graphics
Addition of 3D Model
Multiple Markers
99. AR Interaction
Designing AR System = Interface Design
Using different input and output technologies
Objective is a high quality of user experience
Ease of use and learning
Performance and satisfaction
103. More terminology
Interaction Device = input/output of the user interface
Interaction Style = category of similar interaction techniques
Interaction Paradigm
Modality (human sense)
Usability
104. Back to AR
You can see spatially registered AR..
how can you interact with it?
105. Interaction Tasks
2D (from [Foley]):
Selection, Text Entry, Quantify, Position
3D (from [Bowman]):
Navigation (Travel/Wayfinding)
Selection
Manipulation
System Control/Data Input
AR: 2D + 3D Tasks and.. more specific tasks?
[Foley] Foley, J. D., V. Wallace & P. Chan. The Human Factors of Computer Graphics Interaction
Techniques. IEEE Computer Graphics and Applications (Nov.): 13-48. 1984.
[Bowman] D. Bowman, E. Kruijff, J. LaViola, I. Poupyrev. 3D User Interfaces: Theory and Practice.
Addison Wesley, 2005.
106. AR Interfaces as Data Browsers
2D/3D virtual objects are
registered in 3D
“VR in Real World”
Interaction
2D/3D virtual viewpoint
control
Applications
Visualization, training
107. AR Information Browsers
Information is registered to
real-world context
Hand held AR displays
Interaction
Manipulation of a window
into information space
Applications
Context-aware information displays
Rekimoto, et al. 1997
109. Current AR Information Browsers
Mobile AR
GPS + compass
Many Applications
Layar
Wikitude
Acrossair
PressLite
Yelp
AR Car Finder
…
110. Junaio
AR Browser from Metaio
http://www.junaio.com/
AR browsing
GPS + compass
2D/3D object placement
Photos/live video
Community viewing
114. Advantages and Disadvantages
Important class of AR interfaces
Wearable computers
AR simulation, training
Limited interactivity
Modification of virtual
content is difficult
Rekimoto, et al. 1997
115. 3D AR Interfaces
Virtual objects displayed and manipulated
in 3D physical space
HMDs and 6DOF head-tracking
6DOF hand trackers for input
Interaction
Viewpoint control
Traditional 3D user interface
interaction: manipulation, selection,
etc.
Kiyokawa, et al. 2000
118. Advantages and Disadvantages
Important class of AR interfaces
Entertainment, design, training
Advantages
User can interact with 3D virtual
object everywhere in space
Natural, familiar interaction
Disadvantages
Usually no tactile feedback
User has to use different devices for
virtual and physical objects
Oshima, et al. 2000
119. Augmented Surfaces and
Tangible Interfaces
Basic principles
Virtual objects are
projected on a surface
Physical objects are used
as controls for virtual
objects
Support for collaboration
127. Other Examples
Triangles (Gorbert 1998)
Triangular based story telling
ActiveCube (Kitamura 2000-)
Cubes with sensors
128. Lessons from Tangible Interfaces
Physical objects make us smart
Norman’s “Things that Make Us Smart”
encode affordances, constraints
Objects aid collaboration
establish shared meaning
Objects increase understanding
serve as cognitive artifacts
129. TUI Limitations
Difficult to change object properties
can’t tell state of digital data
Limited display capabilities
projection screen = 2D
dependent on physical display surface
Separation between object and display
ARgroove
130. Advantages and Disadvantages
Advantages
Natural - users' hands are used for interacting
with both virtual and real objects.
- No need for special purpose input devices
Disadvantages
Interaction is limited to the 2D surface
- Full 3D interaction and manipulation is difficult
132. Back to the Real World
AR overcomes limitation of TUIs
enhance display possibilities
merge task/display space
provide public and private views
TUI + AR = Tangible AR
Apply TUI methods to AR interface design
133. Space-multiplexed
Many devices each with one function
- Quicker to use, more intuitive, but causes clutter
- Real Toolbox
Time-multiplexed
One device with many functions
- Space efficient
- mouse
142. Advantages and Disadvantages
Advantages
Natural interaction with virtual and physical tools
- No need for special purpose input devices
Spatial interaction with virtual objects
- 3D manipulation with virtual objects anywhere in physical
space
Disadvantages
Requires Head Mounted Display
143. Wrap-up
Browsing Interfaces
simple (conceptually!), unobtrusive
3D AR Interfaces
expressive, creative, require attention
Tangible Interfaces
Embedded into conventional environments
Tangible AR
Combines TUI input + AR display
144. AR User Interface: Categorization
Traditional Desktop: keyboard, mouse,
joystick (with or without 2D/3D GUI)
Specialized/VR Device: 3D VR devices,
specially design device
145. AR User Interface: Categorization
Tangible Interface : using physical object Hand/
Touch Interface : using pose and gesture of hand,
fingers
Body Interface: using movement of body
150. Project Assignment
Design/Related work exercise
Individual
Each person find 2 relevant papers/videos/websites
Write two page literature review
As a team - prototype design
Sketch out the user interface of application
Design the interaction flow/Screen mockups
3 minute Presentation in class August 16th