COSC 426: Augmented Reality
Mark Billinghurst
mark.billinghurst@hitlabnz.org
August 9th 2013
Lecture 5: AR Tools and Interaction
Interaction Design Process
Data Gathering Techniques (1)
  Questionnaires
  Looking for specific information
  Qualitative and quantitative results
  Good for getting data from a large, dispersed group
  Interviews
  Good for exploring issues, using props
  Structured, unstructured or semi-structured
  But time consuming, and it is difficult to visit everyone
Data Gathering Techniques (2)
  Workshops or focus groups
  Group interviews/activities
  Good at gaining a consensus view and/or highlighting areas
of conflict
  Observations
  Spending time with users in day to day tasks
  Good for understanding task context
  Requires time and can result in a huge amount of data
  Documentation
  Procedures and rules written down in manuals
Elaboration and Reduction
  Elaborate - generate solutions. These are the opportunities
  Reduce - decide on the ones worth pursuing
  Repeat - elaborate and reduce again on those solutions
Source: Laseau,P. (1980) Graphic Thinking for Architects & Designers. John Wiley and Sons
Tools for Effective Design
 Personas
 Scenarios
 Storyboards
 Wireframes and Mock-ups
 Prototypes
Persona Technique
•  Personas are a design tool to help visualize who you are
designing for and imagine how a person will use the product
•  A persona is an archetype that represents the behavior and
goals of a group of users
•  Based on insights and observations from customer research
•  Not real people, but synthesised from real user characteristics
•  Bring them to life with a name, characteristics, goals, background
•  Develop multiple personas
Gunther the Ad Guy
Gunther is from Germany. He travels
extensively for work, and as an
advertising executive he needs to
present concepts to clients quickly
and easily. He is very well-versed
in new technologies and wishes he
had easier portable solutions for
his presentations…
Scenarios
Usage Scenarios are narrative descriptions of how the product
meets the needs of a persona
Short (2 pages max)
Focus on unmet needs of persona
Concrete story
Set of stories around essential tasks, problems...
Use to test ideas
Storyboarding
Sequence of sketches showing use of system in
everyday use context
Concrete example
Easier (faster) to grasp than text based stories
Means of communication with users and system
developers
Sketches, not drawings...
Use to test interaction and make sure design works
Sketching is about design
Sketching is not about drawing
It is about design.
Sketching is a tool to help you:
-  express
-  develop, and
-  communicate design ideas
Sketching is part of a process:
-  idea generation,
-  design elaboration
-  design choices,
-  engineering
Sketch vs. Prototype
Sketch → Prototype
Invite → Attend
Suggest → Describe
Explore → Refine
Question → Answer
Propose → Test
Provoke → Resolve
Tentative, non-committal → Specific depiction
The primary differences are in the intent
Types of Prototypes
Low Fidelity – quick and dirty, easy access
materials like cardboard and paper.
High Fidelity – more involved electronic
versions similar in materials to final product.
RAPID Prototyping
  Fast and inexpensive
  Identifies problems before they’re coded
  Elicits more and better feedback from users
  Helps developers think creatively
  Gets users and other stakeholders involved early
  Fosters teamwork and communication
  Avoids opinion wars
  Helps decide design directions
Paper Prototyping (Low Fidelity)
Quick and simple means of sketching interfaces
Use office materials
Easier to criticize, quick to change
Creative process (develop in team)
Can also use for usability test (focus on flow of
interaction rather than visuals)
Used a lot to test out concepts before real design begins.
Paper Prototyping
High-fidelity prototyping
•  Uses materials that you would expect to be in the
final product.
•  Prototype looks more like the final system than a
low-fidelity version.
•  For a high-fidelity software prototype, common
environments include Macromedia Director, Visual
Basic, and Smalltalk.
•  Danger that users think they have a full
system… and do not see the compromises
Rapid Prototyping
  Speed development time with quick hardware mockups
  handheld device connected to PC
  LCD screen, USB phone keypad, Camera
  Can use PC development tools for rapid development
  Flash, Visual Basic, etc
AR Tools
experiences
applications
tools
components
Sony CSL © 2004
Building Compelling AR Experiences
Tracking, Display
Authoring
AR Authoring Tools
  Low Level Software Libraries
  osgART, Studierstube, MXRToolKit
  Plug-ins to existing software
  DART (Macromedia Director), mARx, Unity,
  Stand Alone
  AMIRE, BuildAR, Metaio Creator etc
  Rapid Prototyping Tools
  Flash, OpenFrameworks, Processing, Arduino, etc
  Next Generation
  iaTAR (Tangible AR)
ARToolKit (Kato 1998)
  Open source – computer vision based AR tracking
  http://artoolkit.sourceforge.net/
ARToolKit Structure
  Three key libraries:
  AR32.lib – ARToolKit image processing functions
  ARgsub32.lib – ARToolKit graphics functions
  ARvideo.lib – DirectShow video capture class
Software
  Cross platform
  Windows, Mac, Linux, IRIX, Symbian, iPhone, etc
  Additional basic libraries
  Video capture library (Video4Linux, VisionSDK)
  OpenGL, GLUT
  Requires a rendering library
  Open VRML, Open Inventor, osgART, etc
Additional Software
  ARToolKit just provides tracking
  For an AR application you’ll need more software
  High level rendering library
  Open VRML, Open Inventor, osgART, etc
  Audio Library
  Fmod, etc
  Peripheral support
What does ARToolKit Calculate?	
  Position of markers in the camera coordinates
  Pose of markers in the camera coordinates
  Output format
  3x4 matrix format to represent the
transformation matrix from the marker
coordinates to the camera coordinates
Coordinate Systems
From Marker To Camera
  Rotation & Translation
TCM : 4x4 transformation matrix
from marker coord. to camera coord.
An ARToolKit Application
  Initialization
  Load camera and pattern parameters
  Main Loop
  Step1. Image capture and display
  Step2. Marker detection
  Step3. Marker identification
  Step4. Getting pose information
  Step5. Object Interactions/Simulation
  Step6. Display virtual objects
  End Application
  Camera shut down
Sample1.c Main Function
main()
{
    init();
    argMainLoop( mouseEvent, keyEvent, mainLoop );
}
Sample1.c - mainLoop Function
if( (dataPtr = (ARUint8 *)arVideoGetImage()) == NULL ) {
    arUtilSleep(2);
    return;
}
argDrawMode2D();
argDispImage( dataPtr, 0, 0 );
arVideoCapNext();
argSwapBuffers();
Ex. 2: Detecting a Marker
  Program : sample2.c
  Key points
  Threshold value
  Important external variables
  arDebug – keep thresholded image
  arImage – pointer for thresholded image
  arImageProcMode – use 50% image for image
processing
-  AR_IMAGE_PROC_IN_FULL
-  AR_IMAGE_PROC_IN_HALF
Sample2.c – marker detection
/* detect the markers in the video frame */
if( arDetectMarker(dataPtr, thresh,
                   &marker_info, &marker_num) < 0 ) {
    cleanup();
    exit(0);
}
for( i = 0; i < marker_num; i++ ) {
    argDrawSquare(marker_info[i].vertex, 0, 0);
}
Making a pattern template
  Use of utility program:
mk_patt.exe
  Show the pattern
  Align the corner of the red line
segments with the top-left
vertex of the marker
  Pattern stored as a
template in a file
  1:2:1 ratio determines the
pattern region used
Ex. 4 – Getting 3D information
  Program : sample4.c
  Key points
 Definition of a real marker
 Transformation matrix
-  Rotation component
-  Translation component
Sample4.c – Transformation matrix
double marker_center[2] = {0.0, 0.0};
double marker_width = 80.0;
double marker_trans[3][4];

arGetTransMat(&marker_info[i], marker_center,
              marker_width, marker_trans);
Finding the Camera Position
This function sets the transformation matrix from marker
to camera into marker_trans[3][4]:

arGetTransMat(&marker_info[k], marker_center,
              marker_width, marker_trans);

The position information is in the last column of
marker_trans:

Xpos = marker_trans[0][3];
Ypos = marker_trans[1][3];
Zpos = marker_trans[2][3];
ARToolKit Coordinate Frame
Ex. 5- Virtual Object Display
  Program : sample5.c
  Key points
 OpenGL parameter setting
 Setup of projection matrix
 Setup of modelview matrix
Appending your own OpenGL code
Set the camera parameters as the OpenGL projection matrix.
argDrawMode3D();
argDraw3dCamera( 0, 0 );
Set the transformation matrix from the marker to the camera to
the OpenGL ModelView matrix.
argConvGlpara(marker_trans, gl_para);
glMatrixMode(GL_MODELVIEW);
glLoadMatrixd( gl_para );
After calling these functions, your OpenGL objects are
drawn in the real marker coordinates.
ARToolKit in the World
  Hundreds of projects
  Large research community
ARToolKit Family
ARToolKit ARToolKit NFT
ARToolKit (Symbian)
NyToolKit
- Java, C#,
- Android, WM
JARToolKit (Java)
FLARToolKit (Flash)
FLARManager (Flash)
FLARToolKit
  Flash AS3 Version of the ARToolKit
(was ported from NyARToolkit the Java Version of the ARToolkit)
  enables augmented reality in the browser
  uses Papervision3D as its 3D engine
  available at http://saqoosha.net/
  dual license, GPL and commercial license
FLARToolkit
Papervision 3D
Adobe Flash
AR Application Components
private function mainEnter(e:Event):void {
    /* Capture video frame */
    capture.draw(vid);
    /* Detect marker */
    if (detector.detectMarkerLite(raster, 80) && detector.getConfidence() > 0.5) {
        // Get the transformation matrix for the current marker position
        detector.getTransformMatrix(trans);
        // Translate and rotate the mainContainer so it looks right
        mainContainer.setTransformMatrix(trans);
        // Render the Papervision scene
        renderer.render();
    }
}
FLARToolKit Examples
Papervision 3D
  http://www.papervision3d.org/
  Flash-based 3D-Engine
  Supports
  import of 3D Models
  texturing
  animation
  scene graph
  alternatives: Away3d, Sandy,…
Source Packages
  "Original" FLARToolkit (Libspark, Saqoosha) (
http://www.libspark.org/svn/as3/FLARToolKit/trunk/ )
  Start-up-guides
  Saqoosha (http://saqoosha.net/en/flartoolkit/start-up-guide/ )
  Miko Haapoja (http://www.mikkoh.com/blog/?p=182 )
  "Frameworks"
  Squidder MultipleMarker – Example (
http://www.squidder.com/2009/03/06/flar-how-to-multiple-instances-of-multiple-
markers/ )
  FLARManager (http://words.transmote.com/wp/flarmanager/ )
Other Languages
  NyARToolKit
  http://nyatla.jp/nyartoolkit/wp/
  AS3, Java, C#, Processing, Unity, etc
  openFrameworks
  http://www.openframeworks.cc/
  https://sites.google.com/site/ofauckland/examples/8-artoolkit-example
  Support for other libraries
-  Kinect, Audio, Physics, etc
void testApp::update(){ // capture video and detect markers
    mov.update();
    if (mov.isFrameNew()) {
        img.setFromPixels(mov.getPixels(), ofGetWidth(), ofGetHeight());
        gray = img;
        tracker.setFromPixels(gray.getPixels());
    }
}

//--------------------------------------------------------------
void testApp::draw(){ // draw AR objects
    ofSetColor(0xffffff);
    mov.draw(0, 0);
    for (int i = 0; i < tracker.markers.size(); i++) {
        ARMarkerInfo &m = tracker.markers[i];
        tracker.loadMarkerModelViewMatrix(m);
        ofSetColor(255, 255, 0, 100);
        ofCircle(0, 0, 25);
        ofSetColor(0);
        ofDrawBitmapString(ofToString(m.id), 0, 0);
    }
}
Low Level Mobile AR Tools
  Vuforia Tracking Library (Qualcomm)
  Vuforia.com
  iOS, Android
  Computer vision based tracking
  Marker tracking, 3D objects, frame
markers
  Integration with Unity
  Interaction, model loading
OSGART Programming Library
  Integration of ARToolKit with a High-Level
Rendering Engine (OpenSceneGraph)
OSGART= OpenSceneGraph + ARToolKit
  Supporting Geometric + Photometric Registration
osgART: Features
  C++ (but also Python, Lua, etc).
  Multiple Video Input supports:
  Direct (Firewire/USB Camera), Files, Network by
ARvideo, PtGrey, CVCam, VideoWrapper, etc.
  Benefits of Open Scene Graph
  Rendering Engine, Plug-ins, etc
mARx Plug-in
  3D Studio Max Plug-in
  Can model and view AR content at the same time
BuildAR
  http://www.buildar.co.nz/
  Stand alone application
  Visual interface for AR model viewing application
  Enables non-programmers to build AR scenes
Metaio Creator
  Drag and drop AR
  http://www.metaio.com/creator/
Total Immersion D’Fusion Studio
  Complete commercial authoring platform
  http://www.t-immersion.com/
  Multi-platform
  Markerless tracking
  Scripting
  Face tracking
  Finger tracking
  Kinect support
Others
  AR-Media
  http://www.inglobetechnologies.com/
  Google sketch-up plug-in
  LinceoVR
  http://linceovr.seac02.it/
  AR/VR authoring package
  Libraries
  JARToolKit, MXRToolKit, ARLib, Goblin XNA
More Libraries
  JARToolKit
  MRToolKit, MXRToolKit, ARLib, OpenVIDIA
  DWARF, Goblin XNA
  AMIRE
  D’Fusion
Advanced Authoring: iaTAR (Lee 2004)
  Immersive AR Authoring
  Using real objects to create AR applications
osgART
Developing Augmented Reality
Applications with osgART
What is a Scene Graph?
  Tree-like structure for organising a virtual world
  e.g. VRML
  Hierarchy of nodes that define:
  Groups (and Switches, Sequences etc…)
  Transformations
  Projections
  Geometry
  …
  And states and attributes that define:
  Materials and textures
  Lighting and blending
  …
Scene Graph Example
Benefits of a Scene Graph
  Performance
  Structuring data facilitates
optimization
-  Culling, state management, etc…
  Abstraction
  Underlying graphics pipeline is
hidden
  Low-level programming (“how do I
display this?”) replaced with high-
level concepts (“what do I want to
display?”)
Image: sgi
About Open Scene Graph
  http://www.openscenegraph.org/
  Open-source scene graph implementation
  Based on OpenGL
  Object-oriented C++ following design pattern principles
  Used for simulation, games, research, and industrial projects
  Active development community
  Maintained by Robert Osfield
  ~2000 mailing list subscribers
  Documentation project: www.osgbooks.com
  Uses the OSG Public License (similar to LGPL)
About Open Scene Graph (2)
Example projects: Pirates of the XXI Century, Flightgear, 3DVRII Research Institute, EOR, SCANeR, VRlab Umeå University
Open Scene Graph Features
  Plugins for loading and saving
  3D: 3D Studio (.3ds), OpenFlight (.flt), Wavefront (.obj)…
  2D: .png, .jpg, .bmp, QuickTime movies
  NodeKits to extend functionality
  e.g. osgShadow
  Cross-platform support for:
  Window management (osgViewer)
  Threading (OpenThreads)
Open Scene Graph Architecture
  Plugins: read and write 2D image and 3D model files
  NodeKits: extend core functionality, exposing higher-level node types
  Core: scene graph and rendering functionality
  Wrappers: inter-operability with other environments, e.g. Python
Some Open Scene Graph Demos
  You may want to get the OSG data package:
  Via SVN: http://www.openscenegraph.org/svn/osg/OpenSceneGraph-Data/trunk
osgviewer, osgmotionblur, osgparticle, osgreflect, osgdistortion, osgfxbrowser
Learning OSG
  Check out the Quick Start Guide
  Free PDF download at http://osgbooks.com/, Physical copy $13US
  Join the mailing list: http://www.openscenegraph.org/projects/osg/wiki/MailingLists
  Browse the website: http://www.openscenegraph.org/projects/osg
  Use the forum: http://forum.openscenegraph.org
  Study the examples
  Read the source? 
osgART
What is osgART?
  osgART adds AR to Open Scene Graph
  Further developed and enhanced by:
  Julian Looser
  Hartmut Seichter
  Raphael Grasset
  Current version 2.0, Open Source
  http://www.osgart.org
osgART Approach: Basic Scene Graph
Root
Transform
3D Object
[  0.988  -0.031   -0.145  0
  -0.048   0.857   -0.512  0
   0.141   0.513    0.846  0
  10.939  29.859 -226.733  1 ]
  To add Video see-through AR:
  Integrate live video
  Apply correct projection matrix
  Update tracked transformations in
realtime
osgART Approach: AR Scene Graph
Root
Transform
3D Object
osgART Approach: AR Scene Graph
Video
Geode
Root
Transform
3D Object
Virtual
Camera
Video
Layer
osgART Approach: AR Scene Graph
Video
Geode
Root
Transform
3D Object
Virtual
Camera
Projection matrix from tracker calibration
Transformation matrix updated from marker tracking in realtime
Video Layer: full-screen quad with live texture updated from the video source, orthographic projection
osgART Architecture
  Like any video see-through AR library, osgART requires video
input and tracking capabilities.
ARLibrary
Application
Video Source
e.g. DirectShow
Tracking Module
(libAR.lib)
osgART Architecture
  osgART uses a plugin architecture so that video sources and tracking
technologies can be plugged in as necessary
osgART
Application
Tracker plugins: ARToolKit4, ARToolKitPlus, MXRToolKit, ARLib,
bazAR (work in progress), ARTag (work in progress)
Video plugins: OpenCVVideo, VidCapture, CMU1394, PointGrey SDK,
VidereDesign, VideoWrapper, VideoInput, VideoSource, DSVL, Intranel RTSP
Basic osgART Tutorial
  Develop a working osgART application from scratch.
  Use ARToolKit 2.72 library for tracking and video
capture
osgART Tutorial 1: Basic OSG Viewer
  Start with the standard Open Scene Graph Viewer
  We will modify this to do AR!
osgART Tutorial 1: Basic OSG Viewer
  The basic osgViewer…
#include <osgViewer/Viewer>
#include <osgViewer/ViewerEventHandlers>

int main(int argc, char* argv[]) {
    // Create a viewer
    osgViewer::Viewer viewer;
    // Create a root node
    osg::ref_ptr<osg::Group> root = new osg::Group;
    // Attach root node to the viewer
    viewer.setSceneData(root.get());
    // Add relevant event handlers to the viewer
    viewer.addEventHandler(new osgViewer::StatsHandler);
    viewer.addEventHandler(new osgViewer::WindowSizeHandler);
    viewer.addEventHandler(new osgViewer::ThreadingHandler);
    viewer.addEventHandler(new osgViewer::HelpHandler);
    // Run the viewer and exit the program when the viewer is closed
    return viewer.run();
}
osgART Tutorial 2: Adding Video
  Add a video plugin
  Load, configure, start video capture…
  Add a video background
  Create, link to video, add to scene-graph
osgART Tutorial 2: Adding Video
  New code to load and configure a Video Plugin:
// Preload the video plugin
int _video_id = osgART::PluginManager::getInstance()->load("osgart_video_artoolkit2");

// Load a video plugin
osg::ref_ptr<osgART::Video> video =
    dynamic_cast<osgART::Video*>(osgART::PluginManager::getInstance()->get(_video_id));

// Check if an instance of the video stream could be created
if (!video.valid()) {
    // Without video an AR application cannot work. Quit if none found.
    osg::notify(osg::FATAL) << "Could not initialize video plugin!" << std::endl;
    exit(-1);
}

// Open the video. This will not yet start the video stream but will
// get information about the format of the video, which is essential
// for the connected tracker.
video->open();
osgART Tutorial 2: Adding Video
  New code to add a live video background
osg::ref_ptr<osg::Group> videoBackground = createImageBackground(video.get());
videoBackground->getOrCreateStateSet()->setRenderBinDetails(0, "RenderBin");
root->addChild(videoBackground.get());
video->start();

osg::Group* createImageBackground(osg::Image* video) {
    osgART::VideoLayer* _layer = new osgART::VideoLayer();
    _layer->setSize(*video);
    osgART::VideoGeode* _geode = new osgART::VideoGeode(osgART::VideoGeode::USE_TEXTURE_2D, video);
    addTexturedQuad(*_geode, video->s(), video->t());
    _layer->addChild(_geode);
    return _layer;
}
  In the main function…
osgART Tutorial 3: Tracking
  Add a Tracker plugin
  Load, configure, link to video
  Add a Marker to track
  Load, activate
  Tracked node
  Create, link with marker via tracking callbacks
  Print out the tracking data
osgART Tutorial 3: Tracking
int _tracker_id = osgART::PluginManager::getInstance()->load("osgart_tracker_artoolkit2");

osg::ref_ptr<osgART::Tracker> tracker =
    dynamic_cast<osgART::Tracker*>(osgART::PluginManager::getInstance()->get(_tracker_id));

if (!tracker.valid()) {
    // Without a tracker an AR application cannot work. Quit if none found.
    osg::notify(osg::FATAL) << "Could not initialize tracker plugin!" << std::endl;
    exit(-1);
}

// Get the tracker calibration object
osg::ref_ptr<osgART::Calibration> calibration = tracker->getOrCreateCalibration();

// Load a calibration file
if (!calibration->load("data/camera_para.dat")) {
    // The calibration file was missing or could not be loaded
    osg::notify(osg::FATAL) << "Non-existing or incompatible calibration file" << std::endl;
    exit(-1);
}

// Set the image source for the tracker
tracker->setImage(video.get());
osgART::TrackerCallback::addOrSet(root.get(), tracker.get());

// Create the virtual camera and add it to the scene
osg::ref_ptr<osg::Camera> cam = calibration->createCamera();
root->addChild(cam.get());
  Load a tracking plugin and associate it with the video plugin
osgART Tutorial 3: Tracking
osg::ref_ptr<osgART::Marker> marker = tracker->addMarker("single;data/patt.hiro;80;0;0");
if (!marker.valid()) {
    // Without a marker an AR application cannot work. Quit if none found.
    osg::notify(osg::FATAL) << "Could not add marker!" << std::endl;
    exit(-1);
}
marker->setActive(true);

osg::ref_ptr<osg::MatrixTransform> arTransform = new osg::MatrixTransform();
osgART::attachDefaultEventCallbacks(arTransform.get(), marker.get());
cam->addChild(arTransform.get());
  Load a marker and activate it
  Associate it with a transformation node (via event callbacks)
  Add the transformation node to the virtual camera node
osgART::addEventCallback(arTransform.get(), new osgART::MarkerDebugCallback(marker.get()));
  Add a debug callback to print out information about the tracked marker
osgART Tutorial 3: Tracking
  Tracking information is
output to console
osgART Tutorial 4: Adding Content
  Now put the tracking data to use!
  Add content to the tracked transform
  Basic cube code
arTransform->addChild(osgART::testCube());
arTransform->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
osgART Tutorial 5: Adding 3D Model
  Open Scene Graph can load some 3D formats directly:
  e.g. Wavefront (.obj), OpenFlight (.flt), 3D Studio (.3ds), COLLADA
  Others need to be converted
  Support for some formats is much better than others
  e.g. OpenFlight good, 3ds hit and miss.
  Recommend native .osg and .ive formats
  .osg – ASCII representation of scene graph
  .ive – Binary OSG file. Can contain textures.
  osgExp : Exporter for 3DS Max is a good choice
  http://sourceforge.net/projects/osgmaxexp
  Otherwise .3ds files from TurboSquid can work
osgART Tutorial 5: Adding 3D Model
std::string filename = "media/hollow_cube.osg";
arTransform->addChild(osgDB::readNodeFile(filename));
  Replace the simple cube with a 3D model
  Models are loaded using the osgDB::readNodeFile() function
  Note: Scale is important. Units are in mm.
Workflow: 3D Studio Max → export to .osg → load in osgART
osgART Tutorial 6: Multiple Markers
  Repeat the process so far to track more than
one marker simultaneously
osgART Tutorial 6: Multiple Markers
osg::ref_ptr<osg::MatrixTransform> arTransformA = new osg::MatrixTransform();
osgART::attachDefaultEventCallbacks(arTransformA.get(), markerA.get());
arTransformA->addChild(osgDB::readNodeFile("media/hitl_logo.osg"));
arTransformA->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
cam->addChild(arTransformA.get());
osg::ref_ptr<osg::MatrixTransform> arTransformB = new osg::MatrixTransform();
osgART::attachDefaultEventCallbacks(arTransformB.get(), markerB.get());
arTransformB->addChild(osgDB::readNodeFile("media/gist_logo.osg"));
arTransformB->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
cam->addChild(arTransformB.get());
  Load and activate two markers
osg::ref_ptr<osgART::Marker> markerA = tracker->addMarker("single;data/patt.hiro;80;0;0");
markerA->setActive(true);
osg::ref_ptr<osgART::Marker> markerB = tracker->addMarker("single;data/patt.kanji;80;0;0");
markerB->setActive(true);
  Create two transformations, attach callbacks, and add models
  Repeat the process so far to track more than one marker
osgART Tutorial 6: Multiple Markers
Basic osgART Tutorial: Summary
Standard OSG Viewer → Addition of Video → Addition of Tracking →
Addition of basic 3D graphics → Addition of 3D Model → Multiple Markers
AR Interaction
experiences
applications
tools
components
Sony CSL © 2004
Building Compelling AR Experiences
Tracking, Display
Authoring
Interaction
AR Interaction
  Designing AR System = Interface Design
  Using different input and output technologies
  Objective is a high quality of user experience
  Ease of use and learning
  Performance and satisfaction
User Interface and Tool
  Human  User Interface/Tool  Machine/Object
  Human Machine Interface
© Andreas Dünser
Tools
User
Interface
User Interface: Characteristics
  Input: mono or multimodal
  Output: mono or multisensorial
  Technique/Metaphor/Paradigm
© Andreas Dünser
Input
Output
Sensation of
movement
Metaphor:
“Push” to accelerate
“Turn” to rotate
Human Computer Interface
  Human  User Interface Computer System
  Human Computer Interface=
Hardware +| Software
  Computer is everywhere now HCI electronic
devices, Home Automation, Transport vehicles, etc 
© Andreas Dünser
More terminology
  Interaction Device= Input/Output of User
Interface
  Interaction Style= category of similar
interaction techniques
  Interaction Paradigm
  Modality (human sense)
  Usability
Back to AR
  You can see spatially registered AR…
but how can you interact with it?
Interaction Tasks
  2D (from [Foley]):
  Selection, Text Entry, Quantify, Position
  3D (from [Bowman]):
  Navigation (Travel/Wayfinding)
  Selection
  Manipulation
  System Control/Data Input
  AR: 2D + 3D Tasks and.. more specific tasks?
[Foley] Foley, J. D., V. Wallace & P. Chan. The Human Factors of Computer Graphics Interaction Techniques. IEEE Computer Graphics and Applications (Nov.): 13-48, 1984.
[Bowman] D. Bowman, E. Kruijff, J. LaViola, I. Poupyrev. 3D User Interfaces: Theory and Practice. Addison Wesley, 2005.
AR Interfaces as Data Browsers
  2D/3D virtual objects are
registered in 3D
  “VR in Real World”
  Interaction
  2D/3D virtual viewpoint
control
  Applications
  Visualization, training
AR Information Browsers
  Information is registered to
real-world context
  Hand held AR displays
  Interaction
  Manipulation of a window
into information space
  Applications
  Context-aware information displays
Rekimoto, et al. 1997
Architecture
Current AR Information Browsers
  Mobile AR
  GPS + compass
  Many Applications
  Layar
  Wikitude
  Acrossair
  PressLite
  Yelp
  AR Car Finder
  …
Junaio
  AR Browser from Metaio
  http://www.junaio.com/
  AR browsing
  GPS + compass
  2D/3D object placement
  Photos/live video
  Community viewing
Web Interface
Adding Models in Web Interface
Advantages and Disadvantages
  Important class of AR interfaces
  Wearable computers
  AR simulation, training
  Limited interactivity
  Modification of virtual
content is difficult
Rekimoto, et al. 1997
3D AR Interfaces
  Virtual objects displayed in 3D
physical space and manipulated
  HMDs and 6DOF head-tracking
  6DOF hand trackers for input
  Interaction
  Viewpoint control
  Traditional 3D user interface
interaction: manipulation, selection,
etc.
Kiyokawa, et al. 2000
AR 3D Interaction
AR Graffiti
www.nextwall.net
Advantages and Disadvantages
  Important class of AR interfaces
  Entertainment, design, training
  Advantages
  User can interact with 3D virtual
object everywhere in space
  Natural, familiar interaction
  Disadvantages
  Usually no tactile feedback
  User has to use different devices for
virtual and physical objects
Oshima, et al. 2000
Augmented Surfaces and
Tangible Interfaces
  Basic principles
  Virtual objects are
projected on a surface
  Physical objects are used
as controls for virtual
objects
  Support for collaboration
Augmented Surfaces
  Rekimoto, et al. 1998
  Front projection
  Marker-based tracking
  Multiple projection surfaces
Tangible User Interfaces (Ishii 97)
  Create digital shadows
for physical objects
  Foreground
  graspable UI
  Background
  ambient interfaces
Tangible Interfaces - Ambient
  Dangling String
  Jeremijenko 1995
  Ambient ethernet monitor
  Relies on peripheral cues
  Ambient Fixtures
  Dahley, Wisneski, Ishii 1998
  Use natural material qualities
for information display
Tangible Interface: ARgroove
  Collaborative Instrument
  Exploring Physically Based Interaction
  Map physical actions to Midi output
-  Translation, rotation
-  Tilt, shake
ARgroove in Use
Visual Feedback
  Continuous Visual Feedback is Key
  Single Virtual Image Provides:
  Rotation
  Tilt
  Height
i/O Brush (Ryokai, Marti, Ishii)
Other Examples
  Triangles (Gorbert 1998)
  Triangular based story telling
  ActiveCube (Kitamura 2000-)
  Cubes with sensors
Lessons from Tangible Interfaces
  Physical objects make us smart
  Norman’s “Things that Make Us Smart”
  encode affordances, constraints
  Objects aid collaboration
  establish shared meaning
  Objects increase understanding
  serve as cognitive artifacts
TUI Limitations
  Difficult to change object properties
  can’t tell state of digital data
  Limited display capabilities
  projection screen = 2D
  dependent on physical display surface
  Separation between object and display
  ARgroove
Advantages and Disadvantages
  Advantages
  Natural - users' hands are used for interacting
with both virtual and real objects.
-  No need for special purpose input devices
  Disadvantages
  Interaction is limited only to 2D surface
-  Full 3D interaction and manipulation is difficult
Orthogonal Nature of AR Interfaces
Back to the Real World
  AR overcomes limitation of TUIs
  enhance display possibilities
  merge task/display space
  provide public and private views
  TUI + AR = Tangible AR
  Apply TUI methods to AR interface design
  Space-multiplexed
  Many devices each with one function
-  Quicker to use, more intuitive, but more clutter
-  Real Toolbox
  Time-multiplexed
  One device with many functions
-  Space efficient
-  mouse
Tangible AR: Tiles (Space Multiplexed)
  Tiles semantics
  data tiles
  operation tiles
  Operation on tiles
  proximity
  spatial arrangements
  space-multiplexed
Space-multiplexed Interface
Data authoring in Tiles
Proximity-based Interaction
Object Based Interaction: MagicCup
  Intuitive Virtual Object Manipulation
on a Table-Top Workspace
  Time multiplexed
  Multiple Markers
-  Robust Tracking
  Tangible User Interface
-  Intuitive Manipulation
  Stereo Display
-  Good Presence
Our system
  Main table, Menu table, Cup interface
Tangible AR: Time-multiplexed Interaction
  Use of natural physical object manipulations to
control virtual objects
  VOMAR Demo
  Catalog book:
-  Turn over the page
  Paddle operation:
-  Push, shake, incline, hit, scoop
VOMAR Interface
Advantages and Disadvantages
  Advantages
  Natural interaction with virtual and physical tools
-  No need for special purpose input devices
  Spatial interaction with virtual objects
-  3D manipulation with virtual objects anywhere in physical
space
  Disadvantages
  Requires Head Mounted Display
Wrap-up
  Browsing Interfaces
  simple (conceptually!), unobtrusive
  3D AR Interfaces
  expressive, creative, require attention
  Tangible Interfaces
  Embedded into conventional environments
  Tangible AR
  Combines TUI input + AR display
AR User Interface: Categorization
  Traditional Desktop: keyboard, mouse,
joystick (with or without 2D/3D GUI)
  Specialized/VR Device: 3D VR devices,
specially designed devices
AR User Interface: Categorization
  Tangible Interface: using physical objects
  Hand/Touch Interface: using pose and gesture of hands and fingers
  Body Interface: using movement of body
AR User Interface: Categorization
  Speech Interface: voice, speech control
  Multimodal Interface: Gesture + Speech
  Haptic Interface: haptic feedback
  Eye Tracking, Physiological, Brain-Computer Interfaces...
Resources
Websites
  Software Download
  http://artoolkit.sourceforge.net/
  ARToolKit Documentation
  http://www.hitl.washington.edu/artoolkit/
  ARToolKit Forum
  http://www.hitlabnz.org/wiki/Forum
  ARToolworks Inc
  http://www.artoolworks.com/
  ARToolKit Plus
  http://studierstube.icg.tu-graz.ac.at/handheld_ar/artoolkitplus.php
  osgART
  http://www.osgart.org/
  FLARToolKit
  http://www.libspark.org/wiki/saqoosha/FLARToolKit/
  FLARManager
  http://words.transmote.com/wp/flarmanager/
Project Assignment
  Design/Related work exercise
  Individual
  Each person finds 2 relevant papers/videos/websites
  Write a two-page literature review
  As a team - prototype design
  Sketch out the user interface of application
  Design the interaction flow/Screen mockups
  3 minute Presentation in class August 16th

 
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
 
Dev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebDev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio Web
 

2013 Lecture 5: AR Tools and Interaction

technologies and wishes he had easier portable solutions for his presentations.
Scenarios
  Usage scenarios are narrative descriptions of how the product meets the needs of a persona
  Short (2 pages max)
  Focus on unmet needs of the persona
  Concrete story
  Set of stories around essential tasks, problems, etc.
  Use to test ideas

Storyboarding
  Sequence of sketches showing the system in its everyday use context
  Concrete example
  Easier (faster) to grasp than text-based stories
  Means of communication with users and system developers
  Sketches, not drawings
  Use to test the interaction and make sure the design works
Sketching is about design
  Sketching is not about drawing, it is about design
  Sketching is a tool to help you:
    - express,
    - develop, and
    - communicate design ideas
  Sketching is part of a process:
    - idea generation
    - design elaboration
    - design choices
    - engineering

Sketch vs. Prototype
  Sketch                     Prototype
  Invite                     Attend
  Suggest                    Describe
  Explore                    Refine
  Question                   Answer
  Propose                    Test
  Provoke                    Resolve
  Tentative, non-committal   Specific depiction
  The primary differences are in the intent

Types of Prototypes
  Low fidelity – quick and dirty; easy-access materials like cardboard and paper
  High fidelity – more involved electronic versions, similar in materials to the final product
RAPID Prototyping
  Fast and inexpensive
  Identifies problems before they're coded
  Elicits more and better feedback from users
  Helps developers think creatively
  Gets users and other stakeholders involved early
  Fosters teamwork and communication
  Avoids opinion wars
  Helps decide design directions

Paper Prototyping (Low Fidelity)
  Quick and simple means of sketching interfaces
  Uses office materials
  Easier to criticize, quick to change
  Creative process (developed in a team)
  Can also be used for usability testing (focus on the flow of interaction rather than visuals)
  Used a lot to test out concepts before real design begins

High-Fidelity Prototyping
  Uses materials that you would expect to be in the final product
  Prototype looks more like the final system than a low-fidelity version
  Common environments for a high-fidelity software prototype include Macromedia Director, Visual Basic, and Smalltalk
  Danger that users think they have a full system, so make the compromises clear

Rapid Prototyping
  Speed development time with quick hardware mock-ups
    Handheld device connected to a PC
    LCD screen, USB phone keypad, camera
  Can use PC development tools for rapid development
    Flash, Visual Basic, etc.
Building Compelling AR Experiences
  Layered stack: experiences, applications, tools (authoring), components (tracking, display)
  (diagram: Sony CSL © 2004)

AR Authoring Tools
  Low-level software libraries
    osgART, Studierstube, MXRToolKit
  Plug-ins to existing software
    DART (Macromedia Director), mARx, Unity
  Stand-alone
    AMIRE, BuildAR, Metaio Creator, etc.
  Rapid prototyping tools
    Flash, openFrameworks, Processing, Arduino, etc.
  Next generation
    iaTAR (Tangible AR)
ARToolKit (Kato 1998)
  Open source, computer-vision-based AR tracking
  http://artoolkit.sourceforge.net/

ARToolKit Structure
  Three key libraries:
    AR32.lib – ARToolKit image processing functions
    ARgsub32.lib – ARToolKit graphics functions
    ARvideo.lib – DirectShow video capture class

Software
  Cross-platform
    Windows, Mac, Linux, IRIX, Symbian, iPhone, etc.
  Additional basic libraries
    Video capture library (Video4Linux, VisionSDK)
    OpenGL, GLUT
  Requires a rendering library
    Open VRML, Open Inventor, osgART, etc.

Additional Software
  ARToolKit just provides tracking
  For an AR application you'll need more software
    High-level rendering library: Open VRML, Open Inventor, osgART, etc.
    Audio library: Fmod, etc.
    Peripheral support
What does ARToolKit Calculate?
  Position and pose of markers in camera coordinates
  Output format
    3x4 matrix representing the transformation from marker coordinates to camera coordinates

From Marker To Camera
  Rotation & translation
  TCM: 4x4 transformation matrix from marker coordinates to camera coordinates
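Conceptually, the matrix maps a marker-coordinate point pm into camera coordinates pc via pc = R·pm + t, where the 3x3 block is the rotation and the fourth column the translation. A minimal sketch of that mapping (markerToCamera is an illustrative helper, not part of the ARToolKit API):

```cpp
#include <cassert>

// Illustrative helper (not an ARToolKit call): applies the 3x4
// marker-to-camera matrix T, as filled in by arGetTransMat(), to a
// point pm given in marker coordinates, writing the camera-coordinate
// result into pc. Each row is a dot product plus the translation term.
void markerToCamera(const double T[3][4], const double pm[3], double pc[3])
{
    for (int i = 0; i < 3; ++i) {
        pc[i] = T[i][0] * pm[0] + T[i][1] * pm[1] + T[i][2] * pm[2] + T[i][3];
    }
}
```

With an identity rotation and a translation of (5, 0, -100), the marker point (10, 0, 0) lands at (15, 0, -100) in camera coordinates.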
An ARToolKit Application
  Initialization
    Load camera and pattern parameters
  Main loop
    Step 1: Image capture and display
    Step 2: Marker detection
    Step 3: Marker identification
    Step 4: Getting pose information
    Step 5: Object interactions/simulation
    Step 6: Display virtual objects
  End application
    Camera shut down
Sample1.c Main Function

  main()
  {
      init();
      argMainLoop( mouseEvent, keyEvent, mainLoop );
  }

Sample1.c – mainLoop Function

  /* grab a video frame */
  if( (dataPtr = (ARUint8 *)arVideoGetImage()) == NULL ) {
      arUtilSleep(2);
      return;
  }
  argDrawMode2D();
  argDispImage( dataPtr, 0, 0 );
  arVideoCapNext();
  argSwapBuffers();
Ex. 2: Detecting a Marker
  Program: sample2.c
  Key points
    Threshold value
    Important external variables
      arDebug – keep thresholded image
      arImage – pointer to thresholded image
      arImageProcMode – use 50% image for image processing
        AR_IMAGE_PROC_IN_FULL
        AR_IMAGE_PROC_IN_HALF

Sample2.c – Marker Detection

  /* detect the markers in the video frame */
  if( arDetectMarker(dataPtr, thresh, &marker_info, &marker_num) < 0 ) {
      cleanup();
      exit(0);
  }
  for( i = 0; i < marker_num; i++ ) {
      argDrawSquare(marker_info[i].vertex, 0, 0);
  }

Making a Pattern Template
  Use the utility program mk_patt.exe
  Show the pattern
  Put the corner of the red line segments on the top-left vertex of the marker
  Pattern stored as a template in a file
  1:2:1 ratio determines the pattern region used

Ex. 4: Getting 3D Information
  Program: sample4.c
  Key points
    Definition of a real marker
    Transformation matrix
      Rotation component
      Translation component
Sample4.c – Transformation Matrix

  double marker_center[2] = {0.0, 0.0};
  double marker_width = 80.0;
  double marker_trans[3][4];

  arGetTransMat(&marker_info[i], marker_center, marker_width, marker_trans);

Finding the Camera Position
  This function sets the transformation matrix from marker to camera into marker_trans[3][4]:

  arGetTransMat(&marker_info[k], marker_center, marker_width, marker_trans);

  You can read the position information from the values of marker_trans[3][4]:

  Xpos = marker_trans[0][3];
  Ypos = marker_trans[1][3];
  Zpos = marker_trans[2][3];
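Since the fourth column holds the marker origin expressed in camera coordinates, the camera-to-marker distance follows directly from its Euclidean norm. A small sketch (markerDistance is an illustrative helper, not part of the ARToolKit API):

```cpp
#include <cmath>

// Illustrative helper (not an ARToolKit call): the translation column
// of marker_trans holds the marker origin in camera coordinates, so
// its Euclidean norm is the camera-to-marker distance, in the same
// units as marker_width (millimetres).
double markerDistance(const double trans[3][4])
{
    const double x = trans[0][3];
    const double y = trans[1][3];
    const double z = trans[2][3];
    return std::sqrt(x * x + y * y + z * z);
}
```

For example, a marker straight ahead at translation (0, 0, -200) is 200 mm from the camera.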
Ex. 5: Virtual Object Display
  Program: sample5.c
  Key points
    OpenGL parameter setting
    Setup of projection matrix
    Setup of modelview matrix

Appending Your Own OpenGL Code
  Set the camera parameters into the OpenGL projection matrix:

  argDrawMode3D();
  argDraw3dCamera( 0, 0 );

  Set the transformation matrix from the marker to the camera into the OpenGL modelview matrix:

  argConvGlpara(marker_trans, gl_para);
  glMatrixMode(GL_MODELVIEW);
  glLoadMatrixd( gl_para );

  After calling these functions, your OpenGL objects are drawn in the real marker coordinates.
ARToolKit in the World
  Hundreds of projects
  Large research community

ARToolKit Family
  ARToolKit
  ARToolKit NFT
  ARToolKit (Symbian)
  NyARToolKit – Java, C#, Android, Windows Mobile
  JARToolKit (Java)
  FLARToolKit (Flash)
  FLARManager (Flash)

FLARToolKit
  Flash (AS3) version of ARToolKit (ported from NyARToolkit, the Java version of ARToolKit)
  Enables augmented reality in the browser
  Uses Papervision3D as its 3D engine
  Available at http://saqoosha.net/
  Dual license: GPL and commercial
FLARToolKit Main Loop

  private function mainEnter(e:Event):void {
      /* Capture video frame */
      capture.draw(vid);

      /* Detect marker */
      if (detector.detectMarkerLite(raster, 80) && detector.getConfidence() > 0.5) {
          // Get the transformation matrix for the current marker position
          detector.getTransformMatrix(trans);
          // Translate and rotate the mainContainer so it looks right
          mainContainer.setTransformMatrix(trans);
          // Render the Papervision3D scene
          renderer.render();
      }
  }
Papervision 3D
  http://www.papervision3d.org/
  Flash-based 3D engine
  Supports
    Import of 3D models
    Texturing
    Animation
    Scene graph
  Alternatives: Away3D, Sandy, ...

Source Packages
  "Original" FLARToolKit (Libspark, Saqoosha)
    http://www.libspark.org/svn/as3/FLARToolKit/trunk/
  Start-up guides
    Saqoosha: http://saqoosha.net/en/flartoolkit/start-up-guide/
    Mikko Haapoja: http://www.mikkoh.com/blog/?p=182
  Frameworks
    Squidder multiple-marker example: http://www.squidder.com/2009/03/06/flar-how-to-multiple-instances-of-multiple-markers/
    FLARManager: http://words.transmote.com/wp/flarmanager/

Other Languages
  NyARToolKit
    http://nyatla.jp/nyartoolkit/wp/
    AS3, Java, C#, Processing, Unity, etc.
  openFrameworks
    http://www.openframeworks.cc/
    https://sites.google.com/site/ofauckland/examples/8-artoolkit-example
    Support for other libraries: Kinect, audio, physics, etc.
  void testApp::update() {
      // capture video and detect markers
      mov.update();
      if (mov.isFrameNew()) {
          img.setFromPixels(mov.getPixels(), ofGetWidth(), ofGetHeight());
          gray = img;
          tracker.setFromPixels(gray.getPixels());
      }
  }

  void testApp::draw() {
      // draw AR objects
      ofSetColor(0xffffff);
      mov.draw(0, 0);
      for (int i = 0; i < tracker.markers.size(); i++) {
          ARMarkerInfo &m = tracker.markers[i];
          tracker.loadMarkerModelViewMatrix(m);
          ofSetColor(255, 255, 0, 100);
          ofCircle(0, 0, 25);
          ofSetColor(0);
          ofDrawBitmapString(ofToString(m.id), 0, 0);
      }
  }
Low-Level Mobile AR Tools
  Vuforia tracking library (Qualcomm)
    Vuforia.com
    iOS, Android
  Computer-vision-based tracking
    Marker tracking, 3D objects, frame markers
  Integration with Unity
    Interaction, model loading

osgART Programming Library
  Integration of ARToolKit with a high-level rendering engine (OpenSceneGraph)
  osgART = OpenSceneGraph + ARToolKit
  Supports geometric and photometric registration

osgART: Features
  C++ (but also Python, Lua, etc.)
  Multiple video input support:
    Direct (FireWire/USB camera), files, network
    via ARvideo, PtGrey, CVCam, VideoWrapper, etc.
  Benefits of OpenSceneGraph
    Rendering engine, plug-ins, etc.

mARx Plug-in
  3D Studio Max plug-in
  Can model and view AR content at the same time
BuildAR
  http://www.buildar.co.nz/
  Stand-alone application
  Visual interface for an AR model viewing application
  Enables non-programmers to build AR scenes

Metaio Creator
  Drag-and-drop AR
  http://www.metaio.com/creator/

Total Immersion D'Fusion Studio
  Complete commercial authoring platform
  http://www.t-immersion.com/
  Multi-platform
  Markerless tracking
  Scripting
  Face tracking
  Finger tracking
  Kinect support

Others
  AR-media
    http://www.inglobetechnologies.com/
    Google SketchUp plug-in
  LinceoVR
    http://linceovr.seac02.it/
    AR/VR authoring package
  Libraries
    JARToolKit, MXRToolKit, ARLib, Goblin XNA

More Libraries
  JARToolKit
  MRToolKit, MXRToolKit, ARLib, OpenVIDIA
  DWARF, Goblin XNA
  AMIRE
  D'Fusion

Advanced Authoring: iaTAR (Lee 2004)
  Immersive AR authoring
  Using real objects to create AR applications
What is a Scene Graph?
  Tree-like structure for organising a virtual world
    e.g. VRML
  Hierarchy of nodes that define:
    Groups (and switches, sequences, etc.)
    Transformations
    Projections
    Geometry
  And states and attributes that define:
    Materials and textures
    Lighting and blending

Benefits of a Scene Graph
  Performance
    Structuring data facilitates optimization (culling, state management, etc.)
  Abstraction
    Underlying graphics pipeline is hidden
    Low-level programming ("how do I display this?") is replaced with high-level concepts ("what do I want to display?")
  (Image: SGI)
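The tree structure described above can be sketched minimally. This is an illustrative toy structure, not the OpenSceneGraph API; it only shows the parent/child hierarchy and a depth-first traversal, which is the basis for culling, state sorting and drawing:

```cpp
#include <memory>
#include <string>
#include <vector>

// Toy scene-graph node (illustrative, not osg::Node): a named node
// that owns a list of children.
struct Node {
    std::string name;
    std::vector<std::shared_ptr<Node>> children;
    explicit Node(std::string n) : name(std::move(n)) {}
    void addChild(const std::shared_ptr<Node>& c) { children.push_back(c); }
};

// Count the nodes in the subtree rooted at n with a depth-first walk.
int countNodes(const Node& n)
{
    int total = 1;  // count this node
    for (const auto& c : n.children) total += countNodes(*c);
    return total;
}
```

A root with a transform child and a geometry leaf under it, for example, gives a three-node tree, mirroring the Root → Transform → 3D Object graphs used in the osgART slides below.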
About OpenSceneGraph
  http://www.openscenegraph.org/
  Open-source scene graph implementation
  Based on OpenGL
  Object-oriented C++ following design-pattern principles
  Used for simulation, games, research, and industrial projects
  Active development community
    Maintained by Robert Osfield
    ~2000 mailing-list subscribers
  Documentation project: www.osgbooks.com
  Uses the OSG Public License (similar to LGPL)

About OpenSceneGraph (2)
  Example projects: Pirates of the XXI Century, Flightgear, 3DVRII (Research Institute EOR), SCANeR, VRlab (Umeå University)

OpenSceneGraph Features
  Plug-ins for loading and saving
    3D: 3D Studio (.3ds), OpenFlight (.flt), Wavefront (.obj), ...
    2D: .png, .jpg, .bmp, QuickTime movies
  NodeKits to extend functionality
    e.g. osgShadow
  Cross-platform support for:
    Window management (osgViewer)
    Threading (OpenThreads)

OpenSceneGraph Architecture
  Plug-ins read and write 2D image and 3D model files
  NodeKits extend core functionality, exposing higher-level node types
  Core: scene graph and rendering functionality
  Wrappers: inter-operability with other environments, e.g. Python

Some OpenSceneGraph Demos
  You may want to get the OSG data package:
    Via SVN: http://www.openscenegraph.org/svn/osg/OpenSceneGraph-Data/trunk
  osgviewer, osgmotionblur, osgparticle, osgreflect, osgdistortion, osgfxbrowser

Learning OSG
  Check out the Quick Start Guide
    Free PDF download at http://osgbooks.com/, physical copy $13 US
  Join the mailing list: http://www.openscenegraph.org/projects/osg/wiki/MailingLists
  Browse the website: http://www.openscenegraph.org/projects/osg
  Use the forum: http://forum.openscenegraph.org
  Study the examples
  Read the source?
What is osgART?
  osgART adds AR to OpenSceneGraph
  Further developed and enhanced by:
    Julian Looser
    Hartmut Seichter
    Raphael Grasset
  Current version 2.0, open source
  http://www.osgart.org

osgART Approach: Basic Scene Graph
  Root -> Transform -> 3D Object
  The transform holds a 4x4 matrix, e.g.:
    [  0.988   -0.031   -0.145   0 ]
    [ -0.048    0.857   -0.512   0 ]
    [  0.141    0.513    0.846   0 ]
    [ 10.939   29.859 -226.733   1 ]
  To add video see-through AR:
    Integrate live video
    Apply the correct projection matrix
    Update tracked transformations in real time
osgART Approach: AR Scene Graph
  The basic graph (Root -> Transform -> 3D Object) is extended with a video layer and a virtual camera:
  Root
    Video Layer (orthographic projection)
      Video Geode – full-screen quad with a live texture updated from the video source
    Virtual Camera – projection matrix from the tracker calibration
      Transform – transformation matrix updated from marker tracking in real time
        3D Object
osgART Architecture
  Like any video see-through AR library, osgART requires video input and tracking capabilities
  Application -> AR library -> video source (e.g. DirectShow) + tracking module (libAR.lib)

osgART Architecture (plug-ins)
  osgART uses a plug-in architecture so that video sources and tracking technologies can be plugged in as necessary
  Tracker plug-ins: ARToolKit4, ARToolKitPlus, MXRToolKit, ARLib, bazAR (work in progress), ARTag (work in progress)
  Video plug-ins: OpenCVVideo, VidCapture, CMU1394, PointGrey SDK, VidereDesign, VideoWrapper, VideoInput, VideoSource, DSVL, Intranel RTSP
Basic osgART Tutorial
  Develop a working osgART application from scratch
  Use the ARToolKit 2.72 library for tracking and video capture

osgART Tutorial 1: Basic OSG Viewer
  Start with the standard OpenSceneGraph viewer
  We will modify this to do AR!
osgART Tutorial 1: Basic OSG Viewer
  The basic osgViewer...

  #include <osgViewer/Viewer>
  #include <osgViewer/ViewerEventHandlers>

  int main(int argc, char* argv[]) {
      // Create a viewer
      osgViewer::Viewer viewer;
      // Create a root node
      osg::ref_ptr<osg::Group> root = new osg::Group;
      // Attach root node to the viewer
      viewer.setSceneData(root.get());
      // Add relevant event handlers to the viewer
      viewer.addEventHandler(new osgViewer::StatsHandler);
      viewer.addEventHandler(new osgViewer::WindowSizeHandler);
      viewer.addEventHandler(new osgViewer::ThreadingHandler);
      viewer.addEventHandler(new osgViewer::HelpHandler);
      // Run the viewer and exit the program when the viewer is closed
      return viewer.run();
  }
osgART Tutorial 2: Adding Video
  Add a video plug-in
    Load, configure, start video capture...
  Add a video background
    Create, link to video, add to scene graph

osgART Tutorial 2: Adding Video
  New code to load and configure a video plug-in:

  // Preload the video plug-in
  int _video_id = osgART::PluginManager::getInstance()->load("osgart_video_artoolkit2");

  // Load a video plug-in
  osg::ref_ptr<osgART::Video> video =
      dynamic_cast<osgART::Video*>(osgART::PluginManager::getInstance()->get(_video_id));

  // Check if an instance of the video stream could be created
  if (!video.valid()) {
      // Without video an AR application can not work. Quit if none found.
      osg::notify(osg::FATAL) << "Could not initialize video plugin!" << std::endl;
      exit(-1);
  }

  // Open the video. This will not yet start the video stream but will
  // get information about the format of the video which is essential
  // for the connected tracker.
  video->open();

osgART Tutorial 2: Adding Video
  New code to add a live video background. In the main function:

  osg::ref_ptr<osg::Group> videoBackground = createImageBackground(video.get());
  videoBackground->getOrCreateStateSet()->setRenderBinDetails(0, "RenderBin");
  root->addChild(videoBackground.get());
  video->start();

  The helper:

  osg::Group* createImageBackground(osg::Image* video) {
      osgART::VideoLayer* _layer = new osgART::VideoLayer();
      _layer->setSize(*video);
      osgART::VideoGeode* _geode = new osgART::VideoGeode(osgART::VideoGeode::USE_TEXTURE_2D, video);
      addTexturedQuad(*_geode, video->s(), video->t());
      _layer->addChild(_geode);
      return _layer;
  }
osgART Tutorial 3: Tracking
  Add a tracker plug-in
    Load, configure, link to video
  Add a marker to track
    Load, activate
  Tracked node
    Create, link with marker via tracking callbacks
  Print out the tracking data

osgART Tutorial 3: Tracking
  Load a tracking plug-in and associate it with the video plug-in:

  int _tracker_id = osgART::PluginManager::getInstance()->load("osgart_tracker_artoolkit2");
  osg::ref_ptr<osgART::Tracker> tracker =
      dynamic_cast<osgART::Tracker*>(osgART::PluginManager::getInstance()->get(_tracker_id));
  if (!tracker.valid()) {
      // Without a tracker an AR application can not work. Quit if none found.
      osg::notify(osg::FATAL) << "Could not initialize tracker plugin!" << std::endl;
      exit(-1);
  }

  // get the tracker calibration object
  osg::ref_ptr<osgART::Calibration> calibration = tracker->getOrCreateCalibration();

  // load a calibration file
  if (!calibration->load("data/camera_para.dat")) {
      // the calibration file was missing or could not be loaded
      osg::notify(osg::FATAL) << "Non existing or incompatible calibration file" << std::endl;
      exit(-1);
  }

  // set the image source for the tracker
  tracker->setImage(video.get());
  osgART::TrackerCallback::addOrSet(root.get(), tracker.get());

  // create the virtual camera and add it to the scene
  osg::ref_ptr<osg::Camera> cam = calibration->createCamera();
  root->addChild(cam.get());

osgART Tutorial 3: Tracking
  Load a marker and activate it:

  osg::ref_ptr<osgART::Marker> marker = tracker->addMarker("single;data/patt.hiro;80;0;0");
  if (!marker.valid()) {
      // Without a marker an AR application can not work. Quit if none found.
      osg::notify(osg::FATAL) << "Could not add marker!" << std::endl;
      exit(-1);
  }
  marker->setActive(true);

  Associate it with a transformation node (via event callbacks) and add that node to the virtual camera node:

  osg::ref_ptr<osg::MatrixTransform> arTransform = new osg::MatrixTransform();
  osgART::attachDefaultEventCallbacks(arTransform.get(), marker.get());
  cam->addChild(arTransform.get());

  Add a debug callback to print out information about the tracked marker:

  osgART::addEventCallback(arTransform.get(), new osgART::MarkerDebugCallback(marker.get()));

osgART Tutorial 3: Tracking
  Tracking information is output to the console
osgART Tutorial 4: Adding Content
  Now put the tracking data to use!
  Add content to the tracked transform
  Basic cube code:

  arTransform->addChild(osgART::testCube());
  arTransform->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");

osgART Tutorial 5: Adding a 3D Model
  OpenSceneGraph can load some 3D formats directly:
    e.g. Wavefront (.obj), OpenFlight (.flt), 3D Studio (.3ds), COLLADA
  Others need to be converted
  Support for some formats is much better than others
    e.g. OpenFlight good, .3ds hit and miss
  Recommend the native .osg and .ive formats
    .osg – ASCII representation of the scene graph
    .ive – binary OSG file; can contain textures
  osgExp: exporter for 3DS Max is a good choice
    http://sourceforge.net/projects/osgmaxexp
  Otherwise .3ds files from TurboSquid can work

osgART Tutorial 5: Adding a 3D Model
  Replace the simple cube with a 3D model
  Models are loaded using the osgDB::readNodeFile() function:

  std::string filename = "media/hollow_cube.osg";
  arTransform->addChild(osgDB::readNodeFile(filename));

  Note: scale is important; units are in mm
  Pipeline: 3D Studio Max -> export to .osg -> osgART
osgART Tutorial 6: Multiple Markers
  Repeat the process so far to track more than one marker simultaneously

osgART Tutorial 6: Multiple Markers
  Load and activate two markers:

  osg::ref_ptr<osgART::Marker> markerA = tracker->addMarker("single;data/patt.hiro;80;0;0");
  markerA->setActive(true);
  osg::ref_ptr<osgART::Marker> markerB = tracker->addMarker("single;data/patt.kanji;80;0;0");
  markerB->setActive(true);

  Create two transformations, attach callbacks, and add models:

  osg::ref_ptr<osg::MatrixTransform> arTransformA = new osg::MatrixTransform();
  osgART::attachDefaultEventCallbacks(arTransformA.get(), markerA.get());
  arTransformA->addChild(osgDB::readNodeFile("media/hitl_logo.osg"));
  arTransformA->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
  cam->addChild(arTransformA.get());

  osg::ref_ptr<osg::MatrixTransform> arTransformB = new osg::MatrixTransform();
  osgART::attachDefaultEventCallbacks(arTransformB.get(), markerB.get());
  arTransformB->addChild(osgDB::readNodeFile("media/gist_logo.osg"));
  arTransformB->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
  cam->addChild(arTransformB.get());

Basic osgART Tutorial: Summary
  Standard OSG viewer
  Addition of video
  Addition of tracking
  Addition of basic 3D graphics
  Addition of a 3D model
  Multiple markers
Building Compelling AR Experiences
  Layered stack: experiences, applications (interaction), tools (authoring), components (tracking, display)
  (diagram: Sony CSL © 2004)

AR Interaction
  Designing an AR system = interface design
  Using different input and output technologies
  Objective is a high-quality user experience
    Ease of use and learning
    Performance and satisfaction

User Interface and Tool
  Human <-> User Interface/Tool <-> Machine/Object
  Human-Machine Interface
  (© Andreas Dünser)

User Interface: Characteristics
  Input: mono- or multimodal
  Output: mono- or multisensorial
  Technique/metaphor/paradigm
    e.g. sensation of movement; metaphors such as "push" to accelerate, "turn" to rotate
  (© Andreas Dünser)

Human Computer Interface
  Human <-> User Interface <-> Computer System
  Human-Computer Interface = hardware + software
  The computer is everywhere now: HCI covers electronic devices, home automation, transport vehicles, etc.
  (© Andreas Dünser)

More Terminology
  Interaction device = input/output of the user interface
  Interaction style = category of similar interaction techniques
  Interaction paradigm
  Modality (human sense)
  Usability
Back to AR
  You can see spatially registered AR... how can you interact with it?

Interaction Tasks
  2D (from [Foley]):
    Selection, text entry, quantify, position
  3D (from [Bowman]):
    Navigation (travel/wayfinding)
    Selection
    Manipulation
    System control/data input
  AR: 2D + 3D tasks and... more specific tasks?
  [Foley] Foley, J. D., V. Wallace & P. Chan. The Human Factors of Computer Graphics Interaction Techniques. IEEE Computer Graphics and Applications (Nov.): 13-48, 1984.
  [Bowman] D. Bowman, E. Kruijff, J. LaViola, I. Poupyrev. 3D User Interfaces: Theory and Practice. Addison Wesley, 2005.

AR Interfaces as Data Browsers
  2D/3D virtual objects are registered in 3D
    "VR in the real world"
  Interaction
    2D/3D virtual viewpoint control
  Applications
    Visualization, training

AR Information Browsers
  Information is registered to a real-world context
    Hand-held AR displays
  Interaction
    Manipulation of a window into an information space
  Applications
    Context-aware information displays
  (Rekimoto, et al. 1997)
Current AR Information Browsers
  Mobile AR
    GPS + compass
  Many applications
    Layar, Wikitude, Acrossair, PressLite, Yelp, AR Car Finder, ...

Junaio
  AR browser from Metaio
  http://www.junaio.com/
  AR browsing
    GPS + compass
    2D/3D object placement
    Photos/live video
    Community viewing

Adding Models in Web Interface

Advantages and Disadvantages
  Important class of AR interfaces
    Wearable computers
    AR simulation, training
  Limited interactivity
    Modification of virtual content is difficult
  (Rekimoto, et al. 1997)
3D AR Interfaces
  Virtual objects displayed in 3D physical space and manipulated
    HMDs and 6 DOF head tracking
    6 DOF hand trackers for input
  Interaction
    Viewpoint control
    Traditional 3D user interface interaction: manipulation, selection, etc.
  (Kiyokawa, et al. 2000)

Advantages and Disadvantages
  Important class of AR interfaces
    Entertainment, design, training
  Advantages
    User can interact with 3D virtual objects anywhere in space
    Natural, familiar interaction
  Disadvantages
    Usually no tactile feedback
    User has to use different devices for virtual and physical objects
  (Oshima, et al. 2000)

Augmented Surfaces and Tangible Interfaces
  Basic principles
    Virtual objects are projected on a surface
    Physical objects are used as controls for virtual objects
    Support for collaboration

Augmented Surfaces
  Rekimoto, et al. 1998
  Front projection
  Marker-based tracking
  Multiple projection surfaces

Tangible User Interfaces (Ishii 97)
  Create digital shadows for physical objects
  Foreground
    Graspable UI
  Background
    Ambient interfaces

Tangible Interfaces – Ambient
  Dangling String
    Jeremijenko 1995
    Ambient Ethernet monitor
    Relies on peripheral cues
  Ambient Fixtures
    Dahley, Wisneski, Ishii 1998
    Use natural material qualities for information display
  • 123. Tangible Interface: ARgroove   Collaborative Instrument   Exploring Physically Based Interaction   Map physical actions to Midi output -  Translation, rotation -  Tilt, shake
  • 125. Visual Feedback   Continuous Visual Feedback is Key   Single Virtual Image Provides:   Rotation   Tilt   Height
  • 126. i/O Brush (Ryokai, Marti, Ishii)
  • 127. Other Examples   Triangles (Gorbert 1998)   Triangular based story telling   ActiveCube (Kitamura 2000-)   Cubes with sensors
  • 128. Lessons from Tangible Interfaces   Physical objects make us smart   Norman’s “Things that Make Us Smart”   encode affordances, constraints   Objects aid collaboration   establish shared meaning   Objects increase understanding   serve as cognitive artifacts
  • 129. TUI Limitations   Difficult to change object properties   can’t tell state of digital data   Limited display capabilities   projection screen = 2D   dependent on physical display surface   Separation between object and display   ARgroove
  • 130. Advantages and Disadvantages   Advantages   Natural - users' hands are used for interacting with both virtual and real objects -  No need for special-purpose input devices   Disadvantages   Interaction is limited to the 2D surface -  Full 3D interaction and manipulation is difficult
  • 131. Orthogonal Nature of AR Interfaces
  • 132. Back to the Real World   AR overcomes limitations of TUIs   enhance display possibilities   merge task/display space   provide public and private views   TUI + AR = Tangible AR   Apply TUI methods to AR interface design
  • 133.   Space-multiplexed   Many devices, each with one function -  Quicker to use, more intuitive, but more clutter -  e.g. a real toolbox   Time-multiplexed   One device with many functions -  Space efficient -  e.g. the mouse
  • 134. Tangible AR: Tiles (Space Multiplexed)   Tiles semantics   data tiles   operation tiles   Operation on tiles   proximity   spatial arrangements   space-multiplexed
  • 137. Object Based Interaction: MagicCup   Intuitive Virtual Object Manipulation on a Table-Top Workspace   Time multiplexed   Multiple Markers -  Robust Tracking   Tangible User Interface -  Intuitive Manipulation   Stereo Display -  Good Presence
  • 138. Our system   Main table, Menu table, Cup interface
  • 140. Tangible AR: Time-multiplexed Interaction   Use of natural physical object manipulations to control virtual objects   VOMAR Demo   Catalog book: -  Turn over the page   Paddle operation: -  Push, shake, incline, hit, scoop
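VOMAR maps natural paddle motions (incline, shake, etc.) to commands on virtual objects. As a rough illustration of how such gestures might be classified from tracking data (the thresholds, gesture names, and the speed-history heuristic are my assumptions, not VOMAR's published implementation):

```python
def classify_paddle(tilt_deg, speed_history,
                    tilt_threshold=30, shake_threshold=0.5):
    """Classify a paddle gesture from its tilt angle (degrees) and recent
    speed samples (m/s). Tilting past the threshold slides the carried
    object off; sustained fast motion is read as a shake (delete)."""
    if abs(tilt_deg) > tilt_threshold:
        return "incline"
    if len(speed_history) >= 4 and all(s > shake_threshold for s in speed_history[-4:]):
        return "shake"
    return "idle"
```

A fuller recogniser would also detect push (approach velocity toward an object) and scoop (downward-then-upward trajectory), but the structure is the same: continuous pose data thresholded into discrete commands.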
  • 142. Advantages and Disadvantages   Advantages   Natural interaction with virtual and physical tools -  No need for special purpose input devices   Spatial interaction with virtual objects -  3D manipulation with virtual objects anywhere in physical space   Disadvantages   Requires Head Mounted Display
  • 143. Wrap-up   Browsing Interfaces   simple (conceptually!), unobtrusive   3D AR Interfaces   expressive, creative, require attention   Tangible Interfaces   Embedded into conventional environments   Tangible AR   Combines TUI input + AR display
  • 144. AR User Interface: Categorization   Traditional Desktop: keyboard, mouse, joystick (with or without 2D/3D GUI)   Specialized/VR Device: 3D VR devices, specially designed devices
  • 145. AR User Interface: Categorization   Tangible Interface: using physical objects   Hand/Touch Interface: using pose and gesture of hands and fingers   Body Interface: using movement of the body
  • 146. AR User Interface: Categorization   Speech Interface: voice, speech control   Multimodal Interface: gesture + speech   Haptic Interface: haptic feedback   Eye Tracking, Physiological, Brain-Computer Interface...
  • 148. Websites   Software Download   http://artoolkit.sourceforge.net/   ARToolKit Documentation   http://www.hitl.washington.edu/artoolkit/   ARToolKit Forum   http://www.hitlabnz.org/wiki/Forum   ARToolworks Inc   http://www.artoolworks.com/
  • 149.   ARToolKit Plus   http://studierstube.icg.tu-graz.ac.at/handheld_ar/artoolkitplus.php   osgART   http://www.osgart.org/   FLARToolKit   http://www.libspark.org/wiki/saqoosha/FLARToolKit/   FLARManager   http://words.transmote.com/wp/flarmanager/
  • 150. Project Assignment   Design/Related work exercise   Individual   Each person find 2 relevant papers/videos/websites   Write two page literature review   As a team - prototype design   Sketch out the user interface of application   Design the interaction flow/Screen mockups   3 minute Presentation in class August 16th