Lecture 4. AR Interaction

         Mark Billinghurst
   mark.billinghurst@hitlabnz.org
             Gun Lee
       gun.lee@hitlabnz.org
             Aug 2011
  COSC 426: Augmented Reality
Building Compelling AR Experiences

          experiences

          applications   Interaction


             tools       Authoring


          components     Tracking, Display



                                       Sony CSL © 2004
AR Interaction
 Designing AR System = Interface Design
   Using different input and output technologies
 Objective is a high quality of user experience
   Ease of use and learning
   Performance and satisfaction
Interaction Tasks
     2D (from [Foley]):
           Selection, Text Entry, Quantify, Position
     3D (from [Bowman]):
           Navigation (Travel/Wayfinding)
           Selection
           Manipulation
           System Control/Data Input
     AR: 2D + 3D tasks, and... more specific tasks?

[Foley] Foley, J. D., V. Wallace & P. Chan. "The Human Factors of Computer Graphics Interaction Techniques." IEEE Computer
Graphics and Applications (Nov.): 13-48, 1984.
[Bowman] Bowman, D., E. Kruijff, J. LaViola, I. Poupyrev. 3D User Interfaces: Theory and Practice. Addison Wesley, 2005.
AR Interfaces as Data Browsers
2D/3D virtual objects are
registered in 3D
  “VR in Real World”
Interaction
  2D/3D virtual viewpoint control
Applications
  Visualization, training
AR Information Browsers
Information is registered to
real-world context
   Hand held AR displays
Interaction
   Manipulation of a window
   into information space
Applications
   Context-aware information displays


                                        Rekimoto, et al. 1997
Architecture
Current AR Information Browsers
Mobile AR
  GPS + compass
Many Applications
  Layar
  Wikitude
  Acrossair
  PressLite
  Yelp
  AR Car Finder
  …
Junaio
AR Browser from Metaio
  http://www.junaio.com/
AR browsing
  GPS + compass
  2D/3D object placement
  Photos/live video
  Community viewing
Advantages and Disadvantages

Important class of AR interfaces
  Wearable computers
  AR simulation, training
Limited interactivity
  Modification of virtual
  content is difficult
                               Rekimoto, et al. 1997
3D AR Interfaces
Virtual objects displayed in 3D
physical space and manipulated
  HMDs and 6DOF head-tracking
  6DOF hand trackers for input
Interaction
  Viewpoint control
  Traditional 3D user interface           Kiyokawa, et al. 2000
  interaction: manipulation, selection,
  etc.
AR 3D Interaction
AR Graffiti




www.nextwall.net
Advantages and Disadvantages
Important class of AR interfaces
   Entertainment, design, training
Advantages
   User can interact with 3D virtual
   objects everywhere in space
   Natural, familiar interaction
Disadvantages
   Usually no tactile feedback
   User has to use different devices for
   virtual and physical objects
                                           Oshima, et al. 2000
Augmented Surfaces and
Tangible Interfaces
 Basic principles
   Virtual objects are
   projected on a surface
   Physical objects are used
   as controls for virtual objects
   Support for collaboration
Augmented Surfaces
Rekimoto, et al. 1998
  Front projection
  Marker-based tracking
  Multiple projection surfaces
Tangible User Interfaces (Ishii 97)
Create digital shadows
for physical objects
Foreground
  graspable UI
Background
  ambient interfaces
Tangible Interfaces - Ambient
 Dangling String
   Jeremijenko 1995
   Ambient ethernet monitor
   Relies on peripheral cues
 Ambient Fixtures
   Dahley, Wisneski, Ishii 1998
   Use natural material qualities
    for information display
Tangible Interface: ARgroove
Collaborative Instrument
Exploring Physically Based Interaction
   Map physical actions to Midi output
    - Translation, rotation
    - Tilt, shake
ARgroove in Use
Visual Feedback
Continuous Visual Feedback is Key
Single Virtual Image Provides:
  Rotation
  Tilt
  Height
i/O Brush (Ryokai, Marti, Ishii)
Other Examples
Triangles (Gorbert 1998)
  Triangle-based storytelling
ActiveCube (Kitamura 2000-)
  Cubes with sensors
Lessons from Tangible Interfaces
Physical objects make us smart
  Norman’s “Things that Make Us Smart”
  encode affordances, constraints
Objects aid collaboration
  establish shared meaning
Objects increase understanding
  serve as cognitive artifacts
TUI Limitations

Difficult to change object properties
  can’t tell state of digital data
Limited display capabilities
  projection screen = 2D
  dependent on physical display surface
Separation between object and display
  ARgroove
Advantages and Disadvantages
Advantages
  Natural - users hands are used for interacting
  with both virtual and real objects.
   - No need for special purpose input devices

Disadvantages
  Spatial gap
   - Interaction is limited only to 2D surface
      • Full 3D interaction and manipulation is difficult
   - Separation between interaction object and display
Orthogonal Nature of AR Interfaces
Back to the Real World

AR overcomes limitation of TUIs
  enhance display possibilities
  merge task/display space
  provide public and private views


TUI + AR = Tangible AR
  Apply TUI methods to AR interface design
Space vs. Time - Multiplexed
Space-multiplexed
  Many devices, each with one function
   - Quicker to use, more intuitive, but causes clutter
   - Real Toolbox

Time-multiplexed
   One device with many functions
    - Space efficient
    - mouse
Tangible AR: Tiles (Space Multiplexed)
 Tiles semantics
   data tiles
   operation tiles
 Operation on tiles
   proximity
   spatial arrangements
   space-multiplexed
Space-multiplexed Interface




   Data authoring in Tiles
Proximity-based Interaction
Object Based Interaction: MagicCup
 Intuitive Virtual Object Manipulation
 on a Table-Top Workspace
   Time multiplexed
   Multiple Markers
    - Robust Tracking
   Tangible User Interface
    - Intuitive Manipulation
   Stereo Display
    - Good Presence
MagicCup system




Main table, Menu table, Cup interface
Tangible AR: Time-multiplexed Interaction
  Use of natural physical object manipulations to
  control virtual objects
  VOMAR Demo
    Catalog book:
     - Turn over the page
    Paddle operation:
     - Push, shake, incline, hit, scoop
VOMAR Interface
Advantages and Disadvantages
Advantages
  Natural interaction with virtual and physical tools
   - No need for special purpose input devices
  Spatial interaction with virtual objects
   - 3D manipulation with virtual objects anywhere in physical
     space

Disadvantages
  Requires Head Mounted Display
Wrap-up
Browsing Interfaces
  simple (conceptually!), unobtrusive
3D AR Interfaces
  expressive, creative, require attention
Tangible Interfaces
  Embedded into conventional environments
Tangible AR
  Combines TUI input + AR display
Designing AR
Applications
Interface Design Path
1/ Prototype Demonstration
2/ Adoption of Interaction Techniques from other
  interface metaphors             Augmented Reality
3/ Development of new interface metaphors
  appropriate to the medium
                                     Virtual Reality
4/ Development of formal theoretical models for
  predicting and modeling user actions
                                     Desktop WIMP
AR Design Space

    Reality                            Virtual Reality

                  Augmented Reality




Physical Design                       Virtual Design
AR is mixture of physical affordance and
virtual affordance
Physical
  Tangible controllers and objects
Virtual
  Virtual graphics and audio
AR Design Principles
Interface Components
 Physical components
 Display elements
  - Visual/audio
 Interaction metaphors
          Physical                   Display
          Elements    Interaction    Elements
           (Input)     Metaphor      (Output)
Tangible AR Metaphor
AR overcomes limitation of TUIs
  enhance display possibilities
  merge task/display space
  provide public and private views


TUI + AR = Tangible AR
  Apply TUI methods to AR interface design
Tangible AR Design Principles
Tangible AR Interfaces use TUI principles
  Physical controllers for moving virtual content
  Support for spatial 3D interaction techniques
  Support for multi-handed interaction
  Match object affordances to task requirements
  Support parallel activity with multiple objects
  Allow collaboration between multiple users
Case Study 1: 3D AR Lens
Goal: Develop a lens based AR interface
  MagicLenses
    Developed at Xerox PARC in 1993
     View a region of the workspace differently to the rest
    Overlap MagicLenses to create composite effects
3D MagicLenses
MagicLenses extended to 3D (Viega et al. 96)
  Volumetric and flat lenses
AR Lens Design Principles
Physical Components
  Lens handle
   - Virtual lens attached to real object
Display Elements
  Lens view
   - Reveal layers in dataset
Interaction Metaphor
  Physically holding lens
3D AR Lenses: Model Viewer
Displays models made up of multiple parts
Each part can be shown or hidden through the lens
Allows the user to peer inside the model
Maintains focus + context
AR Lens Demo
AR FlexiLens




Real handles/controllers with flexible AR lens
Techniques based on AR Lenses
Object Selection
  Select objects by targeting them with the lens
Information Filtering
  Show different representations through the lens
  Hide certain content to reduce clutter, look inside things
Case Study 2 : LevelHead




Block based game
Case Study 2: LevelHead
Physical Components
  Real blocks
Display Elements
  Virtual person and rooms
Interaction Metaphor
  Blocks are rooms
Case Study 3: AR Chemistry (Fjeld 2002)
Tangible AR chemistry education
Goal: An AR application to test molecular
  structure in chemistry
 Physical Components
   Real book, rotation cube, scoop, tracking markers
 Display Elements
   AR atoms and molecules
 Interaction Metaphor
   Build your own molecule
AR Chemistry Input Devices
Case Study 4: Transitional Interfaces
Goal: An AR interface supporting transitions
 from reality to virtual reality
 Physical Components
    Real book
 Display Elements
    AR and VR content
 Interaction Metaphor
     Book pages hold virtual scenes
Milgram’s Continuum (1994)
                      Mixed Reality (MR)


Reality        Augmented           Augmented            Virtuality
(Tangible      Reality (AR)        Virtuality (AV)      (Virtual
Interfaces)                                             Reality)


  Central Hypothesis
       The next generation of interfaces will support transitions
       along the Reality-Virtuality continuum
Transitions
Interfaces of the future will need to support
transitions along the RV continuum
Augmented Reality is preferred for:
  co-located collaboration
Immersive Virtual Reality is preferred for:
  experiencing world immersively (egocentric)
  sharing views
  remote collaboration
The MagicBook
Design Goals:
  Allows user to move smoothly between reality
  and virtual reality
  Support collaboration
MagicBook Metaphor
Features
Seamless transition between Reality and Virtuality
  Reliance on real decreases as virtual increases
Supports egocentric and exocentric views
  User can pick appropriate view
Computer becomes invisible
  Consistent interface metaphors
  Virtual content seems real
Supports collaboration
Design alternatives for
      common user tasks in Tangible AR
 User Tasks                           Interface Design
 Viewpoint      Camera on HMD
 Control        Fixed camera - top view, front view, mirroring
                Handheld camera
 Selection      Statically paired virtual and physical objects
                Dynamically paired virtual and physical objects - paddle, pointer
 3D             Direct mapping of whole 6DOF
 Manipulation   Filtered/distorted mapping - snapping, non-linear mapping
                Multiplexed mapping - rotation from one, position from another
                Location and pose based - proximity, spatial configuration
                Gestures with props - tilt, shake
 Event &        Menu,         Keyboard & mouse
 Command        Buttons       2D/3D GUI with tracking objects as pointers
                              Occlusion based interaction
                              Custom hardware devices
Design Tips for Tangible AR
Using metaphors from the real world
Take advantage of parallel interactions
Use timers to prevent accidents
Interaction volume – user, tracking, whitespace
What happens when tracking gets lost or object
is out of view?
Problems in visualization
  Field of view, occlusions
OSGART:
From Registration to Interaction
Keyboard and Mouse Interaction
Traditional input techniques
OSG provides a framework for handling keyboard
and mouse input events (osgGA)
 1. Subclass osgGA::GUIEventHandler
 2. Handle events:
    •   Mouse up / down / move / drag / scroll-wheel
    •   Key up / down
 3. Add instance of new handler to the viewer
Keyboard and Mouse Interaction
          Create your own event handler class
class KeyboardMouseEventHandler : public osgGA::GUIEventHandler {

public:
   KeyboardMouseEventHandler() : osgGA::GUIEventHandler() { }

     virtual bool handle(const osgGA::GUIEventAdapter& ea,osgGA::GUIActionAdapter& aa,
        osg::Object* obj, osg::NodeVisitor* nv) {

         switch (ea.getEventType()) {
            // Possible events we can handle
            case osgGA::GUIEventAdapter::PUSH: break;
            case osgGA::GUIEventAdapter::RELEASE: break;
            case osgGA::GUIEventAdapter::MOVE: break;
            case osgGA::GUIEventAdapter::DRAG: break;
            case osgGA::GUIEventAdapter::SCROLL: break;
            case osgGA::GUIEventAdapter::KEYUP: break;
         }

         return false;
     }
};


          Add it to the viewer to receive events
viewer.addEventHandler(new KeyboardMouseEventHandler());
Keyboard Interaction
         Handle W,A,S,D keys to move an object
case osgGA::GUIEventAdapter::KEYDOWN: {

   switch (ea.getKey()) {
      case 'w': // Move forward 5mm
         localTransform->preMult(osg::Matrix::translate(0, -5, 0));
         return true;
      case 's': // Move back 5mm
         localTransform->preMult(osg::Matrix::translate(0, 5, 0));
         return true;
      case 'a': // Rotate 10 degrees left
         localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(10.0f), osg::Z_AXIS));
         return true;
      case 'd': // Rotate 10 degrees right
         localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(-10.0f), osg::Z_AXIS));
         return true;
      case ' ': // Reset the transformation
         localTransform->setMatrix(osg::Matrix::identity());
         return true;
   }

break;


localTransform = new osg::MatrixTransform();
localTransform->addChild(osgDB::readNodeFile("media/car.ive"));
arTransform->addChild(localTransform.get());
Keyboard Interaction Demo
Mouse Interaction
The mouse is a pointing device…
Use mouse to select objects in an AR scene
OSG provides methods for ray-casting and
intersection testing
  Return an osg::NodePath (the path from the hit
  node all the way back to the root)


[Diagram: a pick ray cast from the projection plane (screen) into the scene]
Mouse Interaction
   Compute the list of nodes under the clicked position
   Invoke an action on nodes that are hit, e.g. select, delete
case osgGA::GUIEventAdapter::PUSH:

   osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
   osgUtil::LineSegmentIntersector::Intersections intersections;

   // Clear previous selections
   for (unsigned int i = 0; i < targets.size(); i++) {
      targets[i]->setSelected(false);
   }

   // Find new selection based on click position
   if (view && view->computeIntersections(ea.getX(), ea.getY(), intersections)) {
      for (osgUtil::LineSegmentIntersector::Intersections::iterator iter = intersections.begin();
         iter != intersections.end(); iter++) {

           if (Target* target = dynamic_cast<Target*>(iter->nodePath.back())) {
              std::cout << "HIT!" << std::endl;
              target->setSelected(true);
              return true;
           }
       }
   }

   break;
Mouse Interaction Demo
Proximity Techniques
Interaction based on
  the distance between a marker and the camera
  the distance between multiple markers
Single Marker Techniques: Proximity
Use distance from camera to marker as
input parameter
   e.g. Lean in close to examine
Can use the osg::LOD class to show
different content at different depth
ranges                                  Image: OpenSG Consortium
Single Marker Techniques: Proximity
// Load some models
osg::ref_ptr<osg::Node> farNode = osgDB::readNodeFile("media/far.osg");
osg::ref_ptr<osg::Node> closerNode = osgDB::readNodeFile("media/closer.osg");
osg::ref_ptr<osg::Node> nearNode = osgDB::readNodeFile("media/near.osg");

// Use a Level-Of-Detail node to show each model at different distance ranges.
osg::ref_ptr<osg::LOD> lod = new osg::LOD();
lod->addChild(farNode.get(), 500.0f, 10000.0f);      // Show the "far" node from 50cm to 10m away
lod->addChild(closerNode.get(), 200.0f, 500.0f);     // Show the "closer" node from 20cm to 50cm away
lod->addChild(nearNode.get(), 0.0f, 200.0f);         // Show the "near" node from 0cm to 20cm away

arTransform->addChild(lod.get());




   Define depth ranges for each node
   Add as many as you want
   Ranges can overlap
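The range test that osg::LOD applies each frame can be sketched independently of OSG. The following is a minimal illustrative stand-in (the `RangedChild` struct and `activeChildren` function are invented for this sketch; the real osg::LOD selects scene-graph children, not strings): each child carries a [min, max) distance range and is active when the eye-to-node distance falls inside it.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical stand-in for osg::LOD's per-frame range test: each
// child is paired with a [min, max) distance range, and a child is
// shown when the camera-to-node distance falls inside that range.
struct RangedChild {
    std::string name;
    float minRange, maxRange;  // same units as the marker transform (mm here)
};

// Return the names of all children whose range contains the distance.
// Ranges may overlap, so several children can be active at once.
std::vector<std::string> activeChildren(const std::vector<RangedChild>& children,
                                        float distance) {
    std::vector<std::string> active;
    for (const RangedChild& c : children) {
        if (distance >= c.minRange && distance < c.maxRange)
            active.push_back(c.name);
    }
    return active;
}
```

With the ranges from the slide above (in millimetres), a marker 300 mm from the camera activates only the "closer" model.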
Single Marker Proximity Demo
Multiple Marker Concepts
Interaction based on the relationship between
markers
  e.g. when the distance between two markers
  drops below a threshold, invoke an action
  Tangible User Interface
Applications:
A l
  Memory card g
          y     games
  File operations
Multiple Marker Proximity

                           Virtual
                           Camera
    Transform A                         Transform B

                                                                Distance > Threshold

        Switch A                             Switch B




Model              Model             Model              Model
 A1                 A2                B1                 B2
Multiple Marker Proximity

                           Virtual
                           Camera
    Transform A                         Transform B

                                                                Distance <= Threshold

        Switch A                             Switch B




Model              Model             Model              Model
 A1                 A2                B1                 B2
Multiple Marker Proximity
        Use a node callback to test for proximity and update the relevant nodes

virtual void operator()(osg::Node* node, osg::NodeVisitor* nv) {

    if (mMarkerA != NULL && mMarkerB != NULL && mSwitchA != NULL && mSwitchB != NULL) {
       if (mMarkerA->valid() && mMarkerB->valid()) {

             osg::Vec3 posA =   mMarkerA->getTransform().getTrans();
             osg::Vec3 posB =   mMarkerB->getTransform().getTrans();
             osg::Vec3 offset   = posA - posB;
             float distance =   offset.length();

             if (distance <= mThreshold) {
                if (mSwitchA->getNumChildren()   > 1) mSwitchA->setSingleChildOn(1);
                if (mSwitchB->getNumChildren()   > 1) mSwitchB->setSingleChildOn(1);
             } else {
                if (mSwitchA->getNumChildren()   > 0) mSwitchA->setSingleChildOn(0);
                if (mSwitchB->getNumChildren()   > 0) mSwitchB->setSingleChildOn(0);
             }

         }

    }

    traverse(node,nv);

}
Multiple Marker Proximity
Paddle Interaction
 Use one marker as a tool for selecting and
 manipulating objects (tangible user interface)
 Another marker provides a frame of reference
      A grid of markers can alleviate problems with occlusion




MagicCup (Kato et al)    VOMAR (Kato et al)
Paddle Interaction
Often useful to adopt a local coordinate system

                                                      Allows the camera
                                                      to move without
                                                      disrupting Tlocal

                                                      Places the paddle in
                                                      the same coordinate
                                                      system as the
                                                      content on the grid
                                                          Simplifies interaction
osgART computes Tlocal using the osgART::LocalTransformationCallback
Tilt and Shake Interaction

Detect types of paddle movement:
  Tilt
   - gradual change in orientation
  Shake
   - short, sudden changes in translation
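One way to operationalise this distinction is to inspect a short window of tracked paddle poses: tilt shows up as a gradual, cumulative change in orientation, while shake shows up as large frame-to-frame translation jumps that reverse direction. The sketch below is an illustrative assumption; the thresholds, the window handling, and the names `isTilt`/`isShake` are invented and are not the library's check_incline()/check_shake():

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical classification of paddle movement from a short history
// of tracked poses. Thresholds and window sizes are made up for
// illustration, not taken from the Tangible AR library.
struct PaddleSample {
    float tiltDeg;   // paddle orientation relative to the base, degrees
    float x, y, z;   // paddle translation, millimetres
};

// Tilt: a gradual, cumulative change in orientation across the window.
bool isTilt(const std::vector<PaddleSample>& window, float minTotalDeg = 20.0f) {
    if (window.size() < 2) return false;
    float total = window.back().tiltDeg - window.front().tiltDeg;
    return std::fabs(total) >= minTotalDeg;
}

// Shake: short, sudden changes in translation - large frame-to-frame
// jumps that repeatedly reverse direction along an axis (x here).
bool isShake(const std::vector<PaddleSample>& window, float minJumpMm = 30.0f) {
    int reversals = 0;
    float prevDx = 0.0f;
    for (std::size_t i = 1; i < window.size(); ++i) {
        float dx = window[i].x - window[i - 1].x;
        if (std::fabs(dx) >= minJumpMm && dx * prevDx < 0.0f) ++reversals;
        if (std::fabs(dx) >= minJumpMm) prevDx = dx;
    }
    return reversals >= 2;  // at least two direction changes = shaking
}
```

A slow 30-degree rotation with no translation registers as a tilt, while an oscillating 40 mm back-and-forth translation registers as a shake.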
More Information
• Mark Billinghurst
  – mark.billinghurst@hitlabnz.org
• Gun Lee
  – gun.lee@hitlabnz.org
• Websites
  – www.hitlabnz.org
Building Tangible AR Interfaces
        with ARToolKit
Required Code
Calculating Camera Position
  Range to marker
Loading Multiple Patterns/Models
Interaction between objects
  Proximity
  Relative position/orientation
Occlusion
  Stencil buffering
  Multi-marker tracking
Tangible AR Coordinate Frames
Local vs. Global Interactions
Local
  Actions determined from single camera to marker
  transform
   - shaking, appearance, relative position, range
Global
  Actions determined from two relationships
   - marker to camera, world to camera coords.
   - Marker transform determined in world coordinates
        • object tilt, absolute position, absolute rotation, hitting
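The "global" case composes the two tracked relationships: if the camera sees both the world base (e.g. a multi-marker set) and the marker, the marker's pose in world coordinates is inverse(T_camera←world) * T_camera←marker. A minimal sketch of that algebra, using ARToolKit's 3x4 [R|t] matrix layout (the function names here are illustrative, not ARToolKit API):

```cpp
#include <cassert>
#include <cmath>

// Rigid-transform algebra for the "global" case: both the marker and
// the world base are tracked relative to the camera, and the marker's
// pose in world coordinates is
//   T_world<-marker = inverse(T_camera<-world) * T_camera<-marker
// Matrices use ARToolKit's 3x4 [R|t] layout.
typedef double Mat34[3][4];

// Inverse of a rigid transform [R|t] is [R^T | -R^T t].
void invertRigid(const Mat34 in, Mat34 out) {
    for (int r = 0; r < 3; ++r)
        for (int c = 0; c < 3; ++c)
            out[r][c] = in[c][r];                   // transpose the rotation
    for (int r = 0; r < 3; ++r)
        out[r][3] = -(out[r][0] * in[0][3] +
                      out[r][1] * in[1][3] +
                      out[r][2] * in[2][3]);        // -R^T t
}

// out = a * b, treating both as 4x4 with an implicit [0 0 0 1] row.
void composeRigid(const Mat34 a, const Mat34 b, Mat34 out) {
    for (int r = 0; r < 3; ++r) {
        for (int c = 0; c < 4; ++c) {
            out[r][c] = a[r][0] * b[0][c] + a[r][1] * b[1][c] + a[r][2] * b[2][c];
            if (c == 3) out[r][c] += a[r][3];       // homogeneous last column
        }
    }
}

// Marker pose in world coordinates from two camera-relative poses.
void markerInWorld(const Mat34 camFromWorld, const Mat34 camFromMarker, Mat34 out) {
    Mat34 worldFromCam;
    invertRigid(camFromWorld, worldFromCam);
    composeRigid(worldFromCam, camFromMarker, out);
}
```

For example, a camera 500 mm in front of the base that sees the marker 100 mm to the base's right recovers a world-space marker position of (100, 0, 0).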
Range-based Interaction
 Sample Fil R
 S   l File: RangeTest.c
                  T t

/* get the camera transformation */
        h              f     i
arGetTransMat(&marker_info[k], marker_center,
  marker_width, marker_trans);
  marker width marker trans);

/
/* find the range */
                   /
Xpos = marker_trans[0][3];
Ypos = marker_trans[1][3];
  p                [ ][ ]
Zpos = marker_trans[2][3];
range = sqrt(Xpos*Xpos+Ypos*Ypos+Zpos*Zpos);
Loading Multiple Patterns
Sample F l L dM l
S   l File: LoadMulti.c
  Uses object.c to load
Object Structure
typedef struct {
  char       name[256];
  int        id;
  int        visible;
  double     marker_coord[4][2];
  double     trans[3][4];
  double     marker_width;
  double     marker_center[2];
} ObjectData_T;
Finding Multiple Transforms
 Create object list
ObjectData_T      *object;

 Read in objects - in init( )
read_ObjData( char *name, int *objectnum );

 Find Transform - in mainLoop( )
for( i = 0; i < objectnum; i++ ) {
    ..Check patterns
    ..Find transforms for each marker
  }
Drawing Multiple Objects
 Send the object list to the draw function
draw( object, objectnum );

 Draw each object individually
for( i = 0; i < objectnum; i++ ) {
   if( object[i].visible == 0 ) continue;
   argConvGlpara(object[i].trans, gl_para);
   draw_object( object[i].id, gl_para);
}
Proximity Based Interaction

Sample File – CollideTest.c
Detect distance between markers
checkCollisions(object[0],object[1], DIST)
If distance < collide distance,
then change the model / perform interaction
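The distance test behind this interaction is simple to state on its own. A hedged sketch (the `MarkerPose` struct and `withinCollideDistance` function are illustrative, not the actual CollideTest.c code): take the translation column of each marker's 3x4 transform and compare the Euclidean distance against the collision threshold.

```cpp
#include <cassert>
#include <cmath>

// Illustrative sketch of a marker-to-marker proximity test: compare
// the distance between the translation columns of two ARToolKit-style
// [R|t] transforms against a collision threshold.
struct MarkerPose {
    double trans[3][4];  // marker-to-camera transform, as from arGetTransMat()
};

bool withinCollideDistance(const MarkerPose& a, const MarkerPose& b, double dist) {
    double dx = a.trans[0][3] - b.trans[0][3];
    double dy = a.trans[1][3] - b.trans[1][3];
    double dz = a.trans[2][3] - b.trans[2][3];
    return std::sqrt(dx * dx + dy * dy + dz * dz) < dist;
}
```

Two markers 30 mm apart collide under a 50 mm threshold but not under a 20 mm one.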
Multi-marker Tracking
Sample File – multiTest.c
Multiple markers to establish a
single coordinate frame
  Reading in a configuration file
  Tracking from sets of markers
  Careful camera calibration
MultiMarker Configuration File
Sample File - Data/multi/marker.dat
Contains a list of all the patterns and their exact
positions

    #the number of patterns to be recognized
    6
    #marker 1
    Data/multi/patt.a                  <- Pattern File
    40.0                               <- Pattern Width
    0.0 0.0                            <- Coordinate Origin
    1.0000 0.0000 0.0000 -100.0000     <- Pattern Transform
    0.0000 1.0000 0.0000 50.0000          Relative to Global
    0.0000 0.0000 1.0000 0.0000           Origin
    …
Camera Transform Calculation
Include <AR/arMulti.h>
Link to libARMulti.lib
In mainLoop()
  Detect markers as usual
  arDetectMarkerLite(dataPtr, thresh,
   &marker_info, &marker_num)
  Use MultiMarker Function
  if( (err = arMultiGetTransMat(marker_info,
          marker_num, config)) < 0 ) {
           argSwapBuffers();
           return;
     }
Paddle-based Interaction




Tracking single marker relative to multi-marker set
  - paddle contains single marker
Paddle Interaction Code
  Sample File – PaddleDemo.c
  Get paddle marker location + draw paddle before drawing
  background model
  paddleGetTrans(paddleInfo, marker_info,
      marker_flag, marker_num, &cparam);

  /* draw the paddle */
  if( paddleInfo->active ){
     draw_paddle( paddleInfo );
  }
draw_paddle
draw_paddle uses a stencil buffer to increase realism
Paddle Interaction Code II
Sample File – paddleDrawDemo.c
Finds the paddle position relative to global coordinate frame:
setBlobTrans(Num,paddle_trans[3][4],base_trans[3][4])
Sample File – paddleTouch.c
Finds the paddle position:
findPaddlePos(&curPadPos,paddleInfo->trans,config->trans);
Checks for collisions:
checkCollision(&curPaddlePos,myTarget[i].pos,20.0)
General Tangible AR Library
command_sub.c, command_sub.h
Contains functions for recognizing a range of
different paddle motions:

int   check_shake( );
int   check_punch( );
int   check_incline( );
int   check_pickup( );
int   check_push( );

E.g. to check the angle between paddle and base:
check_incline(paddle->trans, base->trans, &ang)
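As an illustration of what a function like check_incline() might compute (this is an assumption about the approach, not the library's implementation), an incline angle between paddle and base can be taken as the angle between the z-axes of their [R|t] transforms:

```cpp
#include <cassert>
#include <cmath>

// Illustrative sketch: derive an incline angle between the paddle and
// the base by comparing the z-axes (third rotation column) of their
// 3x4 [R|t] transforms. This is an assumed approach, not the actual
// check_incline() code from command_sub.c.
double inclineAngleDeg(const double paddle[3][4], const double base[3][4]) {
    const double kPi = 3.14159265358979323846;
    double dot = 0.0;
    for (int r = 0; r < 3; ++r)
        dot += paddle[r][2] * base[r][2];  // dot product of the two z-axes
    if (dot > 1.0) dot = 1.0;              // clamp against rounding error
    if (dot < -1.0) dot = -1.0;
    return std::acos(dot) * 180.0 / kPi;
}
```

A paddle lying flat on the base gives 0 degrees; one rotated 90 degrees about the base's x-axis gives 90 degrees.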

 
Augmented Reality ppt
Augmented Reality pptAugmented Reality ppt
Augmented Reality ppt
 

Andere mochten auch

COSC 426 Lect. 7: Evaluating AR Applications
COSC 426 Lect. 7: Evaluating AR ApplicationsCOSC 426 Lect. 7: Evaluating AR Applications
COSC 426 Lect. 7: Evaluating AR ApplicationsMark Billinghurst
 
Designing Mobile AR Applications
Designing Mobile AR ApplicationsDesigning Mobile AR Applications
Designing Mobile AR ApplicationsMark Billinghurst
 
The Glass Class Lecture 6: Interface Guidelines
The Glass Class Lecture 6:  Interface GuidelinesThe Glass Class Lecture 6:  Interface Guidelines
The Glass Class Lecture 6: Interface GuidelinesMark Billinghurst
 
VRCAI 2011 Billinghurst Keynote
VRCAI 2011 Billinghurst KeynoteVRCAI 2011 Billinghurst Keynote
VRCAI 2011 Billinghurst KeynoteMark Billinghurst
 

Andere mochten auch (6)

Science Fair
Science FairScience Fair
Science Fair
 
Hands and Speech in Space
Hands and Speech in SpaceHands and Speech in Space
Hands and Speech in Space
 
COSC 426 Lect. 7: Evaluating AR Applications
COSC 426 Lect. 7: Evaluating AR ApplicationsCOSC 426 Lect. 7: Evaluating AR Applications
COSC 426 Lect. 7: Evaluating AR Applications
 
Designing Mobile AR Applications
Designing Mobile AR ApplicationsDesigning Mobile AR Applications
Designing Mobile AR Applications
 
The Glass Class Lecture 6: Interface Guidelines
The Glass Class Lecture 6:  Interface GuidelinesThe Glass Class Lecture 6:  Interface Guidelines
The Glass Class Lecture 6: Interface Guidelines
 
VRCAI 2011 Billinghurst Keynote
VRCAI 2011 Billinghurst KeynoteVRCAI 2011 Billinghurst Keynote
VRCAI 2011 Billinghurst Keynote
 

Ähnlich wie COSC 426 lect. 4: AR Interaction

426 lecture6b: AR Interaction
426 lecture6b: AR Interaction426 lecture6b: AR Interaction
426 lecture6b: AR InteractionMark Billinghurst
 
426 lecture 7: Designing AR Interfaces
426 lecture 7: Designing AR Interfaces426 lecture 7: Designing AR Interfaces
426 lecture 7: Designing AR InterfacesMark Billinghurst
 
2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR PrototypingMark Billinghurst
 
COMP 4010 Lecture 9 AR Interaction
COMP 4010 Lecture 9 AR InteractionCOMP 4010 Lecture 9 AR Interaction
COMP 4010 Lecture 9 AR InteractionMark Billinghurst
 
2013 Lecture 6: AR User Interface Design Guidelines
2013 Lecture 6: AR User Interface Design Guidelines2013 Lecture 6: AR User Interface Design Guidelines
2013 Lecture 6: AR User Interface Design GuidelinesMark Billinghurst
 
COMP 4010: Lecture11 AR Interaction
COMP 4010: Lecture11 AR InteractionCOMP 4010: Lecture11 AR Interaction
COMP 4010: Lecture11 AR InteractionMark Billinghurst
 
Building Usable AR Interfaces
Building Usable AR InterfacesBuilding Usable AR Interfaces
Building Usable AR InterfacesMark Billinghurst
 
COSC 426 Lect. 1 - Introduction to AR
COSC 426 Lect. 1 - Introduction to ARCOSC 426 Lect. 1 - Introduction to AR
COSC 426 Lect. 1 - Introduction to ARMark Billinghurst
 
2016 AR Summer School Lecture3
2016 AR Summer School Lecture32016 AR Summer School Lecture3
2016 AR Summer School Lecture3Mark Billinghurst
 
Natural Interfaces for Augmented Reality
Natural Interfaces for Augmented RealityNatural Interfaces for Augmented Reality
Natural Interfaces for Augmented RealityMark Billinghurst
 
COMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR InteractionCOMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR InteractionMark Billinghurst
 
HCI 3e - Ch 20: Ubiquitous computing and augmented realities
HCI 3e - Ch 20:  Ubiquitous computing and augmented realitiesHCI 3e - Ch 20:  Ubiquitous computing and augmented realities
HCI 3e - Ch 20: Ubiquitous computing and augmented realitiesAlan Dix
 
Unleashing the Potentials of Immersive Augmented Reality for Software Enginee...
Unleashing the Potentials of Immersive Augmented Reality for Software Enginee...Unleashing the Potentials of Immersive Augmented Reality for Software Enginee...
Unleashing the Potentials of Immersive Augmented Reality for Software Enginee...Leonel Merino
 
The Reality of Augmented Reality: Are we there yet?
The Reality of Augmented Reality: Are we there yet?The Reality of Augmented Reality: Are we there yet?
The Reality of Augmented Reality: Are we there yet?Mark Billinghurst
 
Multi touch table by vinay jain
Multi touch table by vinay jainMulti touch table by vinay jain
Multi touch table by vinay jainVinay Jain
 
426 lecture1: Introduction to AR
426 lecture1: Introduction to AR426 lecture1: Introduction to AR
426 lecture1: Introduction to ARMark Billinghurst
 

Ähnlich wie COSC 426 lect. 4: AR Interaction (20)

426 lecture6b: AR Interaction
426 lecture6b: AR Interaction426 lecture6b: AR Interaction
426 lecture6b: AR Interaction
 
426 lecture 7: Designing AR Interfaces
426 lecture 7: Designing AR Interfaces426 lecture 7: Designing AR Interfaces
426 lecture 7: Designing AR Interfaces
 
Can You See What I See?
Can You See What I See?Can You See What I See?
Can You See What I See?
 
2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping
 
COMP 4010 Lecture 9 AR Interaction
COMP 4010 Lecture 9 AR InteractionCOMP 4010 Lecture 9 AR Interaction
COMP 4010 Lecture 9 AR Interaction
 
Tangible A
Tangible  ATangible  A
Tangible A
 
2013 Lecture 6: AR User Interface Design Guidelines
2013 Lecture 6: AR User Interface Design Guidelines2013 Lecture 6: AR User Interface Design Guidelines
2013 Lecture 6: AR User Interface Design Guidelines
 
COMP 4010: Lecture11 AR Interaction
COMP 4010: Lecture11 AR InteractionCOMP 4010: Lecture11 AR Interaction
COMP 4010: Lecture11 AR Interaction
 
Building Usable AR Interfaces
Building Usable AR InterfacesBuilding Usable AR Interfaces
Building Usable AR Interfaces
 
COSC 426 Lect. 1 - Introduction to AR
COSC 426 Lect. 1 - Introduction to ARCOSC 426 Lect. 1 - Introduction to AR
COSC 426 Lect. 1 - Introduction to AR
 
2016 AR Summer School Lecture3
2016 AR Summer School Lecture32016 AR Summer School Lecture3
2016 AR Summer School Lecture3
 
Natural Interfaces for Augmented Reality
Natural Interfaces for Augmented RealityNatural Interfaces for Augmented Reality
Natural Interfaces for Augmented Reality
 
COMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR InteractionCOMP 4010 Lecture9 AR Interaction
COMP 4010 Lecture9 AR Interaction
 
Mobile AR for Urban Design
Mobile AR for Urban DesignMobile AR for Urban Design
Mobile AR for Urban Design
 
UVR2011(icat2011)
UVR2011(icat2011)UVR2011(icat2011)
UVR2011(icat2011)
 
HCI 3e - Ch 20: Ubiquitous computing and augmented realities
HCI 3e - Ch 20:  Ubiquitous computing and augmented realitiesHCI 3e - Ch 20:  Ubiquitous computing and augmented realities
HCI 3e - Ch 20: Ubiquitous computing and augmented realities
 
Unleashing the Potentials of Immersive Augmented Reality for Software Enginee...
Unleashing the Potentials of Immersive Augmented Reality for Software Enginee...Unleashing the Potentials of Immersive Augmented Reality for Software Enginee...
Unleashing the Potentials of Immersive Augmented Reality for Software Enginee...
 
The Reality of Augmented Reality: Are we there yet?
The Reality of Augmented Reality: Are we there yet?The Reality of Augmented Reality: Are we there yet?
The Reality of Augmented Reality: Are we there yet?
 
Multi touch table by vinay jain
Multi touch table by vinay jainMulti touch table by vinay jain
Multi touch table by vinay jain
 
426 lecture1: Introduction to AR
426 lecture1: Introduction to AR426 lecture1: Introduction to AR
426 lecture1: Introduction to AR
 

Mehr von Mark Billinghurst

Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsMark Billinghurst
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024Mark Billinghurst
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented RealityMark Billinghurst
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesMark Billinghurst
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the MetaverseMark Billinghurst
 
Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseMark Billinghurst
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationMark Billinghurst
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseMark Billinghurst
 
2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VRMark Billinghurst
 
2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR SystemsMark Billinghurst
 
Novel Interfaces for AR Systems
Novel Interfaces for AR SystemsNovel Interfaces for AR Systems
Novel Interfaces for AR SystemsMark Billinghurst
 
2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR InteractionMark Billinghurst
 
2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR Technology2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR TechnologyMark Billinghurst
 
2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: PerceptionMark Billinghurst
 
2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XRMark Billinghurst
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsMark Billinghurst
 
Empathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseEmpathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseMark Billinghurst
 
Comp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsComp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsMark Billinghurst
 

Mehr von Mark Billinghurst (20)

Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented Reality
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR Experiences
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the Metaverse
 
Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the Metaverse
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader Metaverse
 
2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR
 
2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems
 
ISS2022 Keynote
ISS2022 KeynoteISS2022 Keynote
ISS2022 Keynote
 
Novel Interfaces for AR Systems
Novel Interfaces for AR SystemsNovel Interfaces for AR Systems
Novel Interfaces for AR Systems
 
2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction
 
2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR Technology2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR Technology
 
2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception
 
2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive Analytics
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
 
Empathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseEmpathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole Metaverse
 
Comp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsComp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research Directions
 

Kürzlich hochgeladen

Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreternaman860154
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationSafe Software
 
Azure Monitor & Application Insight to monitor Infrastructure & Application
Azure Monitor & Application Insight to monitor Infrastructure & ApplicationAzure Monitor & Application Insight to monitor Infrastructure & Application
Azure Monitor & Application Insight to monitor Infrastructure & ApplicationAndikSusilo4
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersThousandEyes
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationRidwan Fadjar
 
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking MenDelhi Call girls
 
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphSIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphNeo4j
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerThousandEyes
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitecturePixlogix Infotech
 
Maximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxMaximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxOnBoard
 
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptxHampshireHUG
 
WhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure service
WhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure serviceWhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure service
WhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure servicePooja Nehwal
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationMichael W. Hawkins
 
Slack Application Development 101 Slides
Slack Application Development 101 SlidesSlack Application Development 101 Slides
Slack Application Development 101 Slidespraypatel2
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machinePadma Pradeep
 
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationFrom Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationSafe Software
 
The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024Rafal Los
 
Salesforce Community Group Quito, Salesforce 101
Salesforce Community Group Quito, Salesforce 101Salesforce Community Group Quito, Salesforce 101
Salesforce Community Group Quito, Salesforce 101Paola De la Torre
 
Scaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationScaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationRadu Cotescu
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)Gabriella Davis
 

Kürzlich hochgeladen (20)

Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreter
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
 
Azure Monitor & Application Insight to monitor Infrastructure & Application
Azure Monitor & Application Insight to monitor Infrastructure & ApplicationAzure Monitor & Application Insight to monitor Infrastructure & Application
Azure Monitor & Application Insight to monitor Infrastructure & Application
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 Presentation
 
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
 
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphSIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected Worker
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC Architecture
 
Maximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxMaximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptx
 
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
 
WhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure service
WhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure serviceWhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure service
WhatsApp 9892124323 ✓Call Girls In Kalyan ( Mumbai ) secure service
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day Presentation
 
Slack Application Development 101 Slides
Slack Application Development 101 SlidesSlack Application Development 101 Slides
Slack Application Development 101 Slides
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machine
 
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationFrom Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
 
The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024
 
Salesforce Community Group Quito, Salesforce 101
Salesforce Community Group Quito, Salesforce 101Salesforce Community Group Quito, Salesforce 101
Salesforce Community Group Quito, Salesforce 101
 
Scaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationScaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organization
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)
 

COSC 426 lect. 4: AR Interaction

  • 1. Lecture 4. AR Interaction Mark Billinghurst mark.billinghurst@hitlabnz.org Gun Lee gun.lee@hitlabnz.org Aug 2011 COSC 426: Augmented Reality
  • 2. Building Compelling AR Experiences: experiences; applications (Interaction); tools (Authoring); components (Tracking, Display). Sony CSL © 2004
  • 3. AR Interaction. Designing an AR system = interface design, using different input and output technologies. The objective is a high quality of user experience: ease of use and learning, performance and satisfaction.
  • 4. Interaction Tasks. 2D (from [Foley]): Selection, Text Entry, Quantify, Position. 3D (from [Bowman]): Navigation (Travel/Wayfinding), Selection, Manipulation, System Control/Data Input. AR: 2D + 3D tasks and... more specific tasks? [Foley] Foley, J. D., V. Wallace & P. Chan. The Human Factors of Computer Graphics Interaction Techniques. IEEE Computer Graphics and Applications (Nov.): 13-48, 1984. [Bowman] D. Bowman, E. Kruijff, J. LaViola, I. Poupyrev. 3D User Interfaces: Theory and Practice. Addison-Wesley, 2005.
  • 5. AR Interfaces as Data Browsers. 2D/3D virtual objects are registered in 3D: "VR in the Real World". Interaction: 2D/3D virtual viewpoint control. Applications: visualization, training.
  • 6. AR Information Browsers. Information is registered to real-world context; handheld AR displays. Interaction: manipulation of a window into information space. Applications: context-aware information displays. Rekimoto, et al. 1997
  • 8. Current AR Information Browsers. Mobile AR: GPS + compass. Many applications: Layar, Wikitude, Acrossair, PressLite, Yelp, AR Car Finder, ...
  • 9. Junaio. AR browser from Metaio, http://www.junaio.com/. AR browsing: GPS + compass, 2D/3D object placement, photos/live video, community viewing.
  • 10.
  • 11. Advantages and Disadvantages. Important class of AR interfaces: wearable computers; AR simulation, training. Limited interactivity: modification of virtual content is difficult. Rekimoto, et al. 1997
  • 12. 3D AR Interfaces. Virtual objects displayed in 3D physical space and manipulated: HMDs and 6DOF head-tracking; 6DOF hand trackers for input. Interaction: viewpoint control; traditional 3D user interface interaction (manipulation, selection, etc.). Kiyokawa, et al. 2000
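The core manipulation technique on slide 12 (grabbing a virtual object with a tracked 6DOF hand device) can be sketched in a few lines. This is an illustrative, position-only sketch, not code from the lecture; a full 6DOF version would compose the rotation offset as well. The idea: record the hand-to-object offset at grab time so the object does not jump, then keep that offset constant while held.

```python
# Illustrative sketch of "grab" manipulation with a tracked hand.
# Position-only for brevity; names (Grabbable, grab/update/release)
# are assumptions for illustration, not from the lecture.

class Grabbable:
    def __init__(self, position):
        self.position = list(position)  # virtual object's world position
        self._offset = None             # hand-to-object offset while held

    def grab(self, hand_position):
        # Record the offset at grab time so the object doesn't jump.
        self._offset = [o - h for o, h in zip(self.position, hand_position)]

    def update(self, hand_position):
        if self._offset is not None:    # held: follow the tracked hand
            self.position = [h + o for h, o in zip(hand_position, self._offset)]

    def release(self):
        self._offset = None

cube = Grabbable([1.0, 0.0, 0.0])
cube.grab([0.0, 0.0, 0.0])      # hand at origin; offset = (1, 0, 0)
cube.update([0.5, 0.2, 0.0])    # move the hand; the cube follows
print(cube.position)            # [1.5, 0.2, 0.0]
```

After `release()`, further hand motion no longer moves the object, which matches the select-then-manipulate pattern of traditional 3D user interfaces.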
  • 14. AR Graffiti. www.nextwall.net
  • 15. Advantages and Disadvantages. Important class of AR interfaces: entertainment, design, training. Advantages: the user can interact with 3D virtual objects anywhere in space; natural, familiar interaction. Disadvantages: usually no tactile feedback; the user has to use different devices for virtual and physical objects. Oshima, et al. 2000
  • 16. Augmented Surfaces and Tangible Interfaces. Basic principles: virtual objects are projected on a surface; physical objects are used as controls for virtual objects; support for collaboration.
  • 17. Augmented Surfaces. Rekimoto, et al. 1998. Front projection; marker-based tracking; multiple projection surfaces.
  • 18. Tangible User Interfaces (Ishii 97). Create digital shadows for physical objects. Foreground: graspable UI. Background: ambient interfaces.
  • 19. Tangible Interfaces - Ambient. Dangling String (Jeremijenko 1995): ambient ethernet monitor; relies on peripheral cues. Ambient Fixtures (Dahley, Wisneski, Ishii 1998): use natural material qualities for information display.
  • 20. Tangible Interface: ARgroove. Collaborative instrument exploring physically based interaction. Map physical actions to MIDI output: translation, rotation; tilt, shake.
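The ARgroove mapping on slide 20 (physical marker motion driving MIDI output) can be sketched as a simple pose-to-controller function. The specific CC assignments, value ranges, and parameter names below are assumptions for illustration; the lecture does not specify ARgroove's actual mapping.

```python
# Hypothetical sketch of an ARgroove-style mapping from tracked marker
# pose parameters to MIDI continuous-controller values (0-127).

def to_cc(value, lo, hi):
    """Clamp a pose parameter into [lo, hi] and scale it to MIDI 0-127."""
    value = max(lo, min(hi, value))
    return round((value - lo) / (hi - lo) * 127)

def pose_to_midi(height_cm, tilt_deg, rotation_deg):
    # Each physical action drives one controller (assignments assumed).
    return {
        "cc_volume": to_cc(height_cm, 0.0, 30.0),      # raise marker -> louder
        "cc_filter": to_cc(tilt_deg, -45.0, 45.0),     # tilt -> filter sweep
        "cc_pitch":  to_cc(rotation_deg, 0.0, 360.0),  # rotate -> pitch
    }

print(pose_to_midi(15.0, 0.0, 180.0))
# {'cc_volume': 64, 'cc_filter': 64, 'cc_pitch': 64}
```

Clamping before scaling keeps noisy tracking data inside the valid MIDI range, which matters for the continuous visual and audio feedback the next slide emphasizes.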
  • 22. Visual Feedback. Continuous visual feedback is key. A single virtual image provides: rotation, tilt, height.
  • 23. i/O Brush (Ryokai, Marti, Ishii)
  • 24. Other Examples. Triangles (Gorbert 1998): triangle-based storytelling. ActiveCube (Kitamura 2000-): cubes with sensors.
  • 25. Lessons from Tangible Interfaces. Physical objects make us smart (Norman's "Things that Make Us Smart"): they encode affordances and constraints. Objects aid collaboration: establish shared meaning. Objects increase understanding: serve as cognitive artifacts.
  • 26. TUI Limitations. Difficult to change object properties: can't tell the state of digital data. Limited display capabilities: projection screen = 2D, dependent on physical display surface. Separation between object and display: ARgroove.
  • 27. Advantages and Disadvantages. Advantages: natural - the user's hands are used for interacting with both virtual and real objects; no need for special-purpose input devices. Disadvantages: spatial gap - interaction is limited to the 2D surface (full 3D interaction and manipulation is difficult); separation between interaction object and display.
  • 28. Orthogonal Nature of AR Interfaces
  • 29. Back to the Real World. AR overcomes limitations of TUIs: enhance display possibilities; merge task/display space; provide public and private views. TUI + AR = Tangible AR: apply TUI methods to AR interface design.
  • 30. Space vs. Time - Multiplexed. Space-multiplexed: many devices, each with one function - quicker to use, more intuitive, clutter - real toolbox. Time-multiplexed: one device with many functions - space efficient - mouse.
• 31. Tangible AR: Tiles (Space Multiplexed) Tile semantics - data tiles - operation tiles Operation on tiles - proximity - spatial arrangements - space-multiplexed
  • 32. Space-multiplexed Interface Data authoring in Tiles
• 34. Object Based Interaction: MagicCup Intuitive Virtual Object Manipulation on a Table-Top Workspace Time multiplexed Multiple Markers - Robust Tracking Tangible User Interface - Intuitive Manipulation Stereo Display - Good Presence
  • 35. MagicCup system Main table, Menu table, Cup interface
  • 36.
• 37. Tangible AR: Time-multiplexed Interaction Use of natural physical object manipulations to control virtual objects VOMAR Demo Catalog book: - Turn over the page Paddle operation: - Push, shake, incline, hit, scoop
  • 39. Advantages and Disadvantages Advantages Natural interaction with virtual and physical tools - No need for special purpose input devices Spatial interaction with virtual objects - 3D manipulation with virtual objects anywhere in physical space Disadvantages Requires Head Mounted Display
• 40. Wrap-up Browsing Interfaces - simple (conceptually!), unobtrusive 3D AR Interfaces - expressive, creative, require attention Tangible Interfaces - embedded into conventional environments Tangible AR - combines TUI input + AR display
• 42. Interface Design Path 1/ Prototype Demonstration 2/ Adoption of Interaction Techniques from other interface metaphors (Augmented Reality) 3/ Development of new interface metaphors appropriate to the medium (Virtual Reality) 4/ Development of formal theoretical models for predicting and modeling user actions (Desktop WIMP)
• 43. AR Design Space Reality - Augmented Reality - Virtual Reality Physical Design - Virtual Design
• 44. AR is a mixture of physical affordance and virtual affordance Physical - tangible controllers and objects Virtual - virtual graphics and audio
• 45. AR Design Principles Interface Components - Physical components - Display elements (visual/audio) - Interaction metaphors (diagram: Physical Elements / Interaction Metaphor / Display Elements; Input -> Output)
• 46. Tangible AR Metaphor AR overcomes limitations of TUIs - enhance display possibilities - merge task/display space - provide public and private views TUI + AR = Tangible AR Apply TUI methods to AR interface design
• 47. Tangible AR Design Principles Tangible AR Interfaces use TUI principles - Physical controllers for moving virtual content - Support for spatial 3D interaction techniques - Support for multi-handed interaction - Match object affordances to task requirements - Support parallel activity with multiple objects - Allow collaboration between multiple users
• 48. Case Study 1: 3D AR Lens Goal: Develop a lens based AR interface MagicLenses - Developed at Xerox PARC in 1993 - View a region of the workspace differently to the rest - Overlap MagicLenses to create composite effects
• 49. 3D MagicLenses MagicLenses extended to 3D (Viega et al. 1996) Volumetric and flat lenses
• 50. AR Lens Design Principles Physical Components - Lens handle - Virtual lens attached to real object Display Elements - Lens view - Reveal layers in dataset Interaction Metaphor - Physically holding lens
  • 51. 3D AR Lenses: Model Viewer Displays models made up of multiple parts Each part can be shown or hidden through the lens Allows the user to peer inside the model Maintains focus + context
  • 53. AR FlexiLens Real handles/controllers with flexible AR lens
• 54. Techniques based on AR Lenses Object Selection - Select objects by targeting them with the lens Information Filtering - Show different representations through the lens - Hide certain content to reduce clutter, look inside things
  • 55. Case Study 2 : LevelHead Block based game
• 56. Case Study 2: LevelHead Physical Components - Real blocks Display Elements - Virtual person and rooms Interaction Metaphor - Blocks are rooms
  • 57.
  • 58. Case Study 3: AR Chemistry (Fjeld 2002) Tangible AR chemistry education
• 59. Goal: An AR application to test molecular structure in chemistry Physical Components - Real book, rotation cube, scoop, tracking markers Display Elements - AR atoms and molecules Interaction Metaphor - Build your own molecule
  • 61.
• 62. Case Study 4: Transitional Interfaces Goal: An AR interface supporting transitions from reality to virtual reality Physical Components - Real book Display Elements - AR and VR content Interaction Metaphor - Book pages hold virtual scenes
• 63. Milgram’s Continuum (1994) Mixed Reality (MR): Reality (Tangible Interfaces) - Augmented Reality (AR) - Augmented Virtuality (AV) - Virtuality (Virtual Reality) Central Hypothesis: The next generation of interfaces will support transitions along the Reality-Virtuality continuum
• 64. Transitions Interfaces of the future will need to support transitions along the RV continuum Augmented Reality is preferred for: - co-located collaboration Immersive Virtual Reality is preferred for: - experiencing a world immersively (egocentric) - sharing views - remote collaboration
• 65. The MagicBook Design Goals: - Allow the user to move smoothly between reality and virtual reality - Support collaboration
• 67. Features Seamless transition between Reality and Virtuality - Reliance on the real world decreases as the virtual increases Supports egocentric and exocentric views - User can pick the appropriate view Computer becomes invisible - Consistent interface metaphors - Virtual content seems real Supports collaboration
• 68. Design alternatives for common user tasks in Tangible AR
  Viewpoint Control:
    - Camera on HMD
    - Fixed camera (top view, front view, mirroring)
    - Handheld camera
  Selection:
    - Statically paired virtual and physical objects
    - Dynamically paired virtual and physical objects (paddle, pointer)
  3D Manipulation:
    - Direct mapping of whole 6DOF
    - Filtered/distorted mapping (snapping, non-linear mapping)
    - Multiplexed mapping (rotation from one, position from another)
    - Location and pose based (proximity, spatial configuration)
    - Gestures with props (tilt, shake)
  Event & Command:
    - Keyboard & mouse
    - Menu, 2D/3D GUI with tracking objects as pointers
    - Buttons
    - Occlusion based interaction
    - Custom hardware devices
• 69. Design Tips for Tangible AR Use metaphors from the real world Take advantage of parallel interactions Use timers to prevent accidents Interaction volume - user, tracking, whitespace What happens when tracking gets lost or an object is out of view? Problems in visualization - field of view, occlusions
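The "use timers to prevent accidents" tip can be sketched as a dwell timer: a trigger condition (e.g. the paddle hovering over a target) must hold continuously for a minimum time before the action fires, so a marker briefly passing by does not activate anything. The class name and threshold below are illustrative, not from the lecture code.

```cpp
#include <cassert>

// Minimal dwell-timer sketch (hypothetical helper, not from the lecture code).
// The action fires only after the trigger condition has held continuously for
// holdTime seconds, preventing accidental activation in a Tangible AR interface.
class DwellTimer {
public:
    explicit DwellTimer(double holdTime) : mHoldTime(holdTime), mElapsed(0.0) {}

    // Call once per frame with the condition state and the frame time in seconds.
    // Returns true when the action should fire.
    bool update(bool conditionHeld, double dt) {
        if (!conditionHeld) {
            mElapsed = 0.0;   // any interruption (e.g. tracking lost) resets the timer
            return false;
        }
        mElapsed += dt;
        return mElapsed >= mHoldTime;
    }

private:
    double mHoldTime;
    double mElapsed;
};
```

This also addresses the "what happens when tracking gets lost" question above: losing the marker simply resets the timer instead of firing a spurious event.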
• 71. Keyboard and Mouse Interaction Traditional input techniques OSG provides a framework for handling keyboard and mouse input events (osgGA) 1. Subclass osgGA::GUIEventHandler 2. Handle events: • Mouse up / down / move / drag / scroll-wheel • Key up / down 3. Add instance of new handler to the viewer
• 72. Keyboard and Mouse Interaction Create your own event handler class:

class KeyboardMouseEventHandler : public osgGA::GUIEventHandler {
public:
  KeyboardMouseEventHandler() : osgGA::GUIEventHandler() { }
  virtual bool handle(const osgGA::GUIEventAdapter& ea, osgGA::GUIActionAdapter& aa,
                      osg::Object* obj, osg::NodeVisitor* nv) {
    switch (ea.getEventType()) {
      // Possible events we can handle
      case osgGA::GUIEventAdapter::PUSH: break;
      case osgGA::GUIEventAdapter::RELEASE: break;
      case osgGA::GUIEventAdapter::MOVE: break;
      case osgGA::GUIEventAdapter::DRAG: break;
      case osgGA::GUIEventAdapter::SCROLL: break;
      case osgGA::GUIEventAdapter::KEYUP: break;
      case osgGA::GUIEventAdapter::KEYDOWN: break;
    }
    return false;
  }
};

Add it to the viewer to receive events:

viewer.addEventHandler(new KeyboardMouseEventHandler());
• 73. Keyboard Interaction Handle W,A,S,D keys to move an object:

case osgGA::GUIEventAdapter::KEYDOWN: {
  switch (ea.getKey()) {
    case 'w': // Move forward 5mm
      localTransform->preMult(osg::Matrix::translate(0, -5, 0));
      return true;
    case 's': // Move back 5mm
      localTransform->preMult(osg::Matrix::translate(0, 5, 0));
      return true;
    case 'a': // Rotate 10 degrees left
      localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(10.0f), osg::Z_AXIS));
      return true;
    case 'd': // Rotate 10 degrees right
      localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(-10.0f), osg::Z_AXIS));
      return true;
    case ' ': // Reset the transformation
      localTransform->setMatrix(osg::Matrix::identity());
      return true;
  }
  break;
}

Setting up the transform:

localTransform = new osg::MatrixTransform();
localTransform->addChild(osgDB::readNodeFile("media/car.ive"));
arTransform->addChild(localTransform.get());
• 75. Mouse Interaction Mouse is a pointing device… Use the mouse to select objects in an AR scene OSG provides methods for ray-casting and intersection testing Returns an osg::NodePath (the path from the hit node all the way back to the root) (diagram: ray from the projection plane (screen) into the scene)
• 76. Mouse Interaction Compute the list of nodes under the clicked position Invoke an action on nodes that are hit, e.g. select, delete:

case osgGA::GUIEventAdapter::PUSH: {
  osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
  osgUtil::LineSegmentIntersector::Intersections intersections;
  // Clear previous selections
  for (unsigned int i = 0; i < targets.size(); i++) {
    targets[i]->setSelected(false);
  }
  // Find new selection based on click position
  if (view && view->computeIntersections(ea.getX(), ea.getY(), intersections)) {
    for (osgUtil::LineSegmentIntersector::Intersections::iterator iter = intersections.begin();
         iter != intersections.end(); iter++) {
      if (Target* target = dynamic_cast<Target*>(iter->nodePath.back())) {
        std::cout << "HIT!" << std::endl;
        target->setSelected(true);
        return true;
      }
    }
  }
  break;
}
• 78. Proximity Techniques Interaction based on - the distance between a marker and the camera - the distance between multiple markers
• 79. Single Marker Techniques: Proximity Use distance from camera to marker as input parameter - e.g. lean in close to examine Can use the osg::LOD class to show different content at different depth ranges Image: OpenSG Consortium
• 80. Single Marker Techniques: Proximity

// Load some models
osg::ref_ptr<osg::Node> farNode = osgDB::readNodeFile("media/far.osg");
osg::ref_ptr<osg::Node> closerNode = osgDB::readNodeFile("media/closer.osg");
osg::ref_ptr<osg::Node> nearNode = osgDB::readNodeFile("media/near.osg");

// Use a Level-Of-Detail node to show each model at different distance ranges.
osg::ref_ptr<osg::LOD> lod = new osg::LOD();
lod->addChild(farNode.get(), 500.0f, 10000.0f);  // Show the "far" node from 50cm to 10m away
lod->addChild(closerNode.get(), 200.0f, 500.0f); // Show the "closer" node from 20cm to 50cm away
lod->addChild(nearNode.get(), 0.0f, 200.0f);     // Show the "near" node from 0cm to 20cm away
arTransform->addChild(lod.get());

Define depth ranges for each node Add as many as you want Ranges can overlap
• 82. Multiple Marker Concepts Interaction based on the relationship between markers - e.g. when the distance between two markers decreases below a threshold, invoke an action Tangible User Interface Applications: - Memory card games - File operations
• 83. Multiple Marker Proximity (scene graph: Virtual Camera parents Transform A and Transform B, each holding a Switch with two models; Distance > Threshold: Switch A shows Model A1, Switch B shows Model B1)
• 84. Multiple Marker Proximity (scene graph: same structure; Distance <= Threshold: Switch A shows Model A2, Switch B shows Model B2)
• 85. Multiple Marker Proximity Use a node callback to test for proximity and update the relevant nodes:

virtual void operator()(osg::Node* node, osg::NodeVisitor* nv) {
  if (mMarkerA != NULL && mMarkerB != NULL && mSwitchA != NULL && mSwitchB != NULL) {
    if (mMarkerA->valid() && mMarkerB->valid()) {
      osg::Vec3 posA = mMarkerA->getTransform().getTrans();
      osg::Vec3 posB = mMarkerB->getTransform().getTrans();
      osg::Vec3 offset = posA - posB;
      float distance = offset.length();
      if (distance <= mThreshold) {
        if (mSwitchA->getNumChildren() > 1) mSwitchA->setSingleChildOn(1);
        if (mSwitchB->getNumChildren() > 1) mSwitchB->setSingleChildOn(1);
      } else {
        if (mSwitchA->getNumChildren() > 0) mSwitchA->setSingleChildOn(0);
        if (mSwitchB->getNumChildren() > 0) mSwitchB->setSingleChildOn(0);
      }
    }
  }
  traverse(node, nv);
}
• 87. Paddle Interaction Use one marker as a tool for selecting and manipulating objects (tangible user interface) Another marker provides a frame of reference - a grid of markers can alleviate problems with occlusion MagicCup (Kato et al.) VOMAR (Kato et al.)
• 88. Paddle Interaction Often useful to adopt a local coordinate system - Allows the camera to move without disrupting Tlocal - Places the paddle in the same coordinate system as the content on the grid - Simplifies interaction osgART computes Tlocal using the osgART::LocalTransformationCallback
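The idea behind Tlocal can be sketched without osgART: given the camera-space poses of the grid (base) and the paddle, the paddle's pose in grid coordinates is inverse(Tbase) * Tpaddle. The sketch below uses a plain rotation-plus-translation struct mirroring the ARToolKit trans[3][4] layout; the struct and function names are illustrative, not the osgART API.

```cpp
#include <cassert>
#include <cmath>

// Rigid transform as a row-major 3x3 rotation plus translation, mirroring the
// ARToolKit trans[3][4] layout. Names are illustrative, not osgART API.
struct Pose {
    double R[3][3];
    double t[3];
};

// Paddle pose in base (grid) coordinates: Tlocal = inverse(Tbase) * Tpaddle.
// For a rigid transform, inverse(T) has rotation R^T and translation -R^T t,
// so Rlocal = Rbase^T * Rpaddle and tlocal = Rbase^T * (tpaddle - tbase).
Pose localPose(const Pose& base, const Pose& paddle) {
    Pose out;
    double d[3];
    for (int i = 0; i < 3; i++) d[i] = paddle.t[i] - base.t[i];
    for (int i = 0; i < 3; i++) {
        out.t[i] = 0.0;
        for (int k = 0; k < 3; k++) out.t[i] += base.R[k][i] * d[k];  // R^T * d
        for (int j = 0; j < 3; j++) {
            out.R[i][j] = 0.0;
            for (int k = 0; k < 3; k++) out.R[i][j] += base.R[k][i] * paddle.R[k][j];
        }
    }
    return out;
}
```

Because both poses are measured relative to the same camera, the camera term cancels out, which is exactly why the camera can move without disrupting Tlocal.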
• 89. Tilt and Shake Interaction Detect types of paddle movement: Tilt - gradual change in orientation Shake - short, sudden changes in translation
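The tilt/shake distinction above can be sketched as simple classifiers over per-frame motion deltas: shake as several direction reversals of fast translation within a short window, tilt as a small but sustained per-frame orientation change. The class, window size, and thresholds below are all hypothetical, chosen only to illustrate the idea, not the osgART implementation.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <deque>

// Illustrative shake/tilt classification (hypothetical thresholds).
// Shake: several reversals of translation direction within a short window.
// Tilt: per-frame orientation change that is small but sustained.
class PaddleMotion {
public:
    explicit PaddleMotion(std::size_t window) : mWindow(window) {}

    // Feed the per-frame translation delta along one axis (mm).
    bool isShake(double dx) {
        mDeltas.push_back(dx);
        if (mDeltas.size() > mWindow) mDeltas.pop_front();
        int reversals = 0;
        for (std::size_t i = 1; i < mDeltas.size(); i++) {
            // Count direction reversals of fast movements only
            if (mDeltas[i] * mDeltas[i - 1] < 0 && std::fabs(mDeltas[i]) > 5.0)
                reversals++;
        }
        return reversals >= 3;  // hypothetical threshold
    }

    // Gradual change: between 0.2 and 3 degrees of rotation per frame.
    static bool isTilt(double dAngleDeg) {
        double a = std::fabs(dAngleDeg);
        return a > 0.2 && a < 3.0;
    }

private:
    std::size_t mWindow;
    std::deque<double> mDeltas;
};
```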
• 90. More Information • Mark Billinghurst – mark.billinghurst@hitlabnz.org • Gun Lee – gun.lee@hitlabnz.org • Websites – www.hitlabnz.org
  • 91. Building Tangible AR Interfaces with ARToolKit
• 92. Required Code Calculating Camera Position - Range to marker Loading Multiple Patterns/Models Interaction between objects - Proximity - Relative position/orientation Occlusion - Stencil buffering - Multi-marker tracking
• 94. Local vs. Global Interactions Local - Actions determined from single camera-to-marker transform - shaking, appearance, relative position, range Global - Actions determined from two relationships - marker-to-camera and world-to-camera coords - Marker transform determined in world coordinates • object tilt, absolute position, absolute rotation, hitting
• 95. Range-based Interaction Sample File: RangeTest.c

/* get the camera transformation */
arGetTransMat(&marker_info[k], marker_center, marker_width, marker_trans);

/* find the range */
Xpos = marker_trans[0][3];
Ypos = marker_trans[1][3];
Zpos = marker_trans[2][3];
range = sqrt(Xpos*Xpos + Ypos*Ypos + Zpos*Zpos);
• 96. Loading Multiple Patterns Sample File: LoadMulti.c Uses object.c to load Object Structure:

typedef struct {
  char   name[256];
  int    id;
  int    visible;
  double marker_coord[4][2];
  double trans[3][4];
  double marker_width;
  double marker_center[2];
} ObjectData_T;
• 97. Finding Multiple Transforms Create object list:

ObjectData_T *object;

Read in objects - in init():

read_ObjData( char *name, int *objectnum );

Find transform - in mainLoop():

for( i = 0; i < objectnum; i++ ) {
  /* Check patterns */
  /* Find transforms for each marker */
}
• 98. Drawing Multiple Objects Send the object list to the draw function:

draw( object, objectnum );

Draw each object individually:

for( i = 0; i < objectnum; i++ ) {
  if( object[i].visible == 0 ) continue;
  argConvGlpara(object[i].trans, gl_para);
  draw_object( object[i].id, gl_para );
}
• 99. Proximity Based Interaction Sample File – CollideTest.c Detect distance between markers: checkCollisions(object[0], object[1], DIST) If distance < collide distance, then change the model / perform interaction
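The body of checkCollisions is not shown in the slides; its core is just a distance test between the translation parts (column 3) of the two markers' ARToolKit-style trans[3][4] matrices, compared against the collide distance. A minimal sketch, with a hypothetical function name:

```cpp
#include <cassert>
#include <cmath>

// Sketch of a marker-marker proximity test in the spirit of CollideTest.c's
// checkCollisions (not the actual sample code): compare the distance between
// the translation parts of two ARToolKit-style trans[3][4] matrices
// against a threshold, in the same units as the tracker (mm).
bool markersCollide(const double transA[3][4], const double transB[3][4],
                    double collideDist) {
    double dx = transA[0][3] - transB[0][3];
    double dy = transA[1][3] - transB[1][3];
    double dz = transA[2][3] - transB[2][3];
    return std::sqrt(dx*dx + dy*dy + dz*dz) < collideDist;
}
```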
• 100. Multi-marker Tracking Sample File – multiTest.c Multiple markers to establish a single coordinate frame Reading in a configuration file Tracking from sets of markers Careful camera calibration
• 101. MultiMarker Configuration File Sample File - Data/multi/marker.dat Contains a list of all the patterns and their exact positions:

#the number of patterns to be recognized
6

#marker 1
Data/multi/patt.a                 <- pattern file
40.0                              <- pattern width
0.0 0.0                           <- coordinate origin
1.0000 0.0000 0.0000 -100.0000    <- pattern transform
0.0000 1.0000 0.0000   50.0000       relative to global origin
0.0000 0.0000 1.0000    0.0000
…
• 102. Camera Transform Calculation Include <AR/arMulti.h> Link to libARMulti.lib In mainLoop(), detect markers as usual:

arDetectMarkerLite(dataPtr, thresh, &marker_info, &marker_num);

Use the MultiMarker function:

if( (err = arMultiGetTransMat(marker_info, marker_num, config)) < 0 ) {
  argSwapBuffers();
  return;
}
  • 103. Paddle-based Interaction Tracking single marker relative to multi-marker set - paddle contains single marker p g
• 104. Paddle Interaction Code Sample File – PaddleDemo.c Get paddle marker location + draw paddle before drawing background model:

paddleGetTrans(paddleInfo, marker_info, marker_flag, marker_num, &cparam);

/* draw the paddle */
if( paddleInfo->active ) {
  draw_paddle( paddleInfo );
}

draw_paddle uses a stencil buffer to increase realism
• 105. Paddle Interaction Code II Sample File – paddleDrawDemo.c Finds the paddle position relative to the global coordinate frame:

setBlobTrans(Num, paddle_trans[3][4], base_trans[3][4])

Sample File – paddleTouch.c Finds the paddle position:

findPaddlePos(&curPadPos, paddleInfo->trans, config->trans);

Checks for collisions:

checkCollision(&curPaddlePos, myTarget[i].pos, 20.0)
• 106. General Tangible AR Library command_sub.c, command_sub.h Contains functions for recognizing a range of different paddle motions:

int check_shake( );
int check_punch( );
int check_incline( );
int check_pickup( );
int check_push( );

E.g. to check the angle between paddle and base:

check_incline(paddle->trans, base->trans, &ang)
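The geometric idea behind an incline check like check_incline can be sketched as the angle between the paddle's and the base's surface normals, i.e. the z-axes (third columns) of the rotation parts of the two trans[3][4] matrices. This is a sketch of the concept, not the actual command_sub.c implementation.

```cpp
#include <cassert>
#include <cmath>

// Sketch of the idea behind check_incline (not the command_sub.c code):
// the tilt angle between paddle and base is the angle between their z-axes,
// taken from the third columns of two ARToolKit-style trans[3][4] matrices.
// Assumes the rotation columns are unit length, as they are for a rigid pose.
double inclineAngle(const double paddle[3][4], const double base[3][4]) {
    double dot = 0.0;
    for (int i = 0; i < 3; i++) dot += paddle[i][2] * base[i][2];
    if (dot > 1.0) dot = 1.0;     // clamp against rounding error
    if (dot < -1.0) dot = -1.0;
    return std::acos(dot);        // angle in radians
}
```

A caller would compare the result against a threshold (e.g. fire a "tilt" event once the paddle leans more than some angle away from the base plane).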