A four-lecture course on how to build AR and VR experiences using Unity, the Google Cardboard VR SDK and Vuforia. Taught by Mark Billinghurst from May 10th - 13th, 2016 in Xi'an, China
Mark Billinghurst, University of South Australia (past Director, HIT Lab NZ)
1. BUILDING AR AND VR
EXPERIENCES
Mark Billinghurst
mark.billinghurst@unisa.edu.au
May 10th – 13th, 2016
Xi’an, China
2. Mark Billinghurst
▪ University of South Australia
▪ Past Director of HIT Lab NZ,
University of Canterbury
▪ PhD Univ. Washington
▪ Research on AR, mobile HCI,
Collaborative Interfaces
▪ More than 300 papers in AR, VR,
interface design
3. AR/VR Course
• Lectures
• 2:30 - 4:30 pm every day
• Lectures/hands-on
• Logistics
• Bring your own laptop if possible
• Use Android phone
• Share computer/phone
• Material
• All material available for download
4. What You Will Learn
• AR/VR fundamentals + history
• Basics of Unity Programming
• How to make Panorama VR Applications
• How to create VR Scenes
• How to add Interactivity to VR Applications
• Using the Vuforia AR tracking library
• Creating AR scenes
• Adding AR interactivity
• Design guidelines
• Research directions
5. Schedule
• Tuesday May 10th
• Introduction, Learning Unity, Building 360 VR scenes
• Wednesday May 11th
• Creating 3D scenes, adding interactivity, good design
• Thursday May 12th
• Introduction to AR, Vuforia basics, building AR scenes
• Field trip to ARA demo space
• Friday May 13th
• Adding interactivity, advanced AR tracking, research
8. A Brief History of Time
• Trend
• smaller, cheaper, more functions, more intimate
• Technology becomes invisible
• Intuitive to use
• Interface over internals
• Form more important than function
• Human centered design
9. A Brief History of Computing
• Trend
• smaller, cheaper, faster, more intimate, intelligent objects
• Computers need to become invisible
• hide the computer in the real world
• Ubiquitous / Tangible Computing
• put the user inside the computer
• Virtual Reality
10. Making Interfaces Invisible
Rekimoto, J. and Nagao, K. 1995. The world through the computer: computer augmented
interaction with real world environments. In Proceedings of the 8th Annual ACM Symposium on
User interface and Software Technology. UIST '95. ACM, New York, NY, 29-36.
15. Augmented Reality Definition
• Defining Characteristics [Azuma 97]
• Combines Real and Virtual Images
• Both can be seen at the same time
• Interactive in real-time
• The virtual content can be interacted with
• Registered in 3D
• Virtual objects appear fixed in space
Azuma, R. T. (1997). A survey of augmented reality. Presence, 6(4), 355-385.
19. Milgram’s Reality-Virtuality continuum
Mixed Reality spans the Reality-Virtuality (RV) Continuum:
Real Environment → Augmented Reality (AR) → Augmented Virtuality (AV) → Virtual Environment
"...anywhere between the extrema of the virtuality continuum."
P. Milgram and A. F. Kishino, Taxonomy of Mixed Reality Visual Displays
IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.
23. David Zeltzer’s AIP Cube
• Autonomy – User can react to events and stimuli.
• Interaction – User can interact with objects and environment.
• Presence – User feels immersed through sensory input and output channels.
[Figure: AIP cube with axes Autonomy, Interaction and Presence; VR sits at the far corner of all three]
Zeltzer, D. (1992). Autonomy, interaction, and presence. Presence: Teleoperators
& Virtual Environments, 1(1), 127-132.
24. Key Technologies
• Autonomy
• Head tracking, body input
• Intelligent systems
• Interaction
• User input devices, HCI
• Presence
• Graphics/audio/multisensory output
• Multisensory displays
• Visual, audio, haptic, olfactory, etc
38. Mobile Phone AR & VR
• Mobile Phone AR
• Mobile phone
• Live camera view
• Sensor input (GPS, compass)
• Mobile Phone VR
• Mobile phone
• Sensor input (compass)
• Additional VR viewer
39. VR2GO (2013)
• MxR Lab
• 3D print VR viewer for mobiles
• Open source hardware + software
• http://projects.ict.usc.edu/mxr/diy/vr2go/
47. Version 1.0 vs Version 2.0
• Version 1.0 – Android focused, magnetic switch, small phone
• Version 2.0 – Touch input, iOS/Android, fits many phones
50. Cardboard App
• 7 default experiences
• Earth: Fly on Google Earth
• Tour Guide: Visit sites with guides
• YouTube: Watch popular videos
• Exhibit: Examine cultural artifacts
• Photo Sphere: Immersive photos
• Street View: Drive along a street
• Windy Day: Interactive short story
61. Download and Install
• Go to unity3d.com/download
• Use Download Assistant – pick components you want
62. Getting Started
• First time running Unity you’ll be asked to create a project
• Specify project name and location
• Can pick asset packages (pre-made content)
65. Building Scenes
• Use GameObjects:
• Containers that hold different components
• Eg 3D model, texture, animation
• Use Inspector
• View and edit object properties and other settings
• Use Scene View
• Position objects, camera, lights, other GameObjects etc
• Scripting
• Adding interaction, user input, events, etc
66. GameObjects
• Every object in Scene is a GameObject
• GameObjects contain Components
• Eg Transform Component, Camera Component
67. Adding 3D Content
• Create 3D asset using modeling package, or download
• FBX, OBJ file formats for 3D models
• Add file to Assets folder in Project
• When project opened 3D model added to Project View
• Drag mesh from Project View into Hierarchy or Scene View
• Creates a game object
71. Making a Simple Scene
1. Create New Project
2. Create Game Object
3. Moving main camera position
4. Adding lights
5. Adding more objects
6. Adding physics
7. Changing object materials
8. Adding script behaviour
81. Example C# Script
GameObject Rotation

using UnityEngine;
using System.Collections;

public class spin : MonoBehaviour {

    // Use this for initialization
    void Start () {
    }

    // Update is called once per frame, so this rotates the
    // GameObject 10 degrees about the y axis every frame
    void Update () {
        this.gameObject.transform.Rotate(Vector3.up * 10);
    }
}
82. Scripting C# Unity 3D
• void Awake():
• Is called when the first scene is loaded and the game object is active
• void Start():
• Called on first frame update
• void FixedUpdate():
• Called before physics calculations are made
• void Update():
• Called every frame before rendering
• void LateUpdate():
• Once per frame after update finished
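The call order above can be illustrated with a minimal sketch (the class name LifecycleDemo is an assumption for illustration, not code from the course); attached to any active GameObject, it logs Awake and Start once, then the three per-frame methods repeat while the scene runs:

```csharp
using UnityEngine;

// Minimal sketch of the MonoBehaviour lifecycle described above.
// Console order: Awake, Start, then FixedUpdate / Update / LateUpdate
// repeating each frame (FixedUpdate may run 0..n times per frame,
// on the fixed physics timestep).
public class LifecycleDemo : MonoBehaviour {
    void Awake ()       { Debug.Log("Awake: scene loaded, object active"); }
    void Start ()       { Debug.Log("Start: first frame update"); }
    void FixedUpdate () { /* physics step: apply forces here */ }
    void Update ()      { /* per-frame logic, before rendering */ }
    void LateUpdate ()  { /* after all Updates: e.g. camera follow */ }
}
```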
94. AutoPano (Kolor)
• Stitches overlapping photos together into panoramas
• http://www.kolor.com/autopano/
95. Steps to Make Immersive Panorama
1. Create a new project
2. Load the Cardboard SDK
3. Load a panorama image asset
4. Create a Skymap
5. Add to VR scene
6. Deploy to mobile phone
Need
• Google Cardboard SDK Unity package
• Android SDK to install on Android phone
100. Add Image Asset to Project
• Assets -> Import Asset
• Select desired image
• Set Texture Type to
Cubemap
• Set mapping to Latitude-
Longitude (Cylindrical)
101. Create Skybox Material
• Assets -> Create -> Material
• Name material
• Set Shader to Skybox -> Cubemap
• Drag texture to cubemap
104. One Last Thing..
• CardboardMain -> Head -> Main Camera
• Set Clear Flags to Skybox
105. Test It Out
• Hit play, use alt/option key + mouse to look around
106. Deploy to Mobile (Android)
1. Plug phone into USB
• make sure the device is in USB debugging mode
2. Set correct build settings
3. Player settings
• Other settings
• Set Bundle Identifier -> com.Company.ProductName
• Resolution and Presentation
• Default Orientation -> Landscape Left
4. Build and run
107. Deploying to Phone
1. Plug phone into USB
2. Open Build Settings
3. Change Target platform to Android
4. Select Player Settings
5. Resolution and Presentation
• Default Orientation -> Landscape Left
6. Under Other Settings
• Edit Bundle Identifier – eg com.UniSA.cubeTest
• Minimum API level
7. Build and Run
• Select .apk file name
109. Making Immersive Movie
• Create movie texture
• Convert 360 video to .ogg or .mp4 file
• Add video texture as asset
• Make Sphere
• Equirectangular UV mapping
• Inward facing normals
• Move camera to centre of sphere
• Texture map video to sphere
• Easy Movie Texture ($65)
• Apply texture to 3D object
• For 3D 360 video
• Render two Spheres
• http://bernieroehl.com/360stereoinunity/
110. BUILDING AR AND VR
EXPERIENCES
Mark Billinghurst
mark.billinghurst@unisa.edu.au
May 10th – 13th
Xi’an
LECTURE 3: 3D SCENES
114. Key Steps
1. Creating a new project
2. Load Cardboard SDK
3. Replace camera with CardboardMain
4. Loading in 3D asset packages
5. Loading a SkyDome
6. Adding a plane floor
117. Load Asset + Add to Scene
• Assets -> Import Package -> Custom Package
• Look for MagicLamp.unitypackage (If not installed already)
• Drag MagicLamp_LOD0 to Hierarchy
• Position and rotate
125. Moving through space
• Move through looking
• Look at target to turn on/off moving
• Button/tapping screen
• Being in a vehicle (e.g. Roller Coaster)
126. Adding Movement
Goal: Move in the direction the user is looking when the
Cardboard button is pressed or the screen is touched
• Key Steps
1. Start with static screen
2. Create movement script
3. Add movement script to Camera head
4. Deploy to mobile
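The movement script in step 2 might look something like this minimal sketch (the class name GazeMove, the speed value and the input handling are assumptions for illustration, not the course's actual code); it is attached to the CardboardMain Head object in step 3:

```csharp
using UnityEngine;

// Hypothetical sketch of the movement script: moves the camera rig
// forward along the gaze direction while the screen is touched or
// the Cardboard button is pressed (both register as touches; a
// mouse click works for testing in the editor).
public class GazeMove : MonoBehaviour {

    public float speed = 2.0f;   // metres per second (assumed value)

    void Update () {
        if (Input.GetMouseButton(0) || Input.touchCount > 0) {
            Vector3 forward = Camera.main.transform.forward;
            forward.y = 0;   // optional: keep movement on the ground plane
            transform.position += forward.normalized * speed * Time.deltaTime;
        }
    }
}
```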
131. Steps
• Add physics ray caster
• Casts a ray from camera
• Add function to object to respond to gaze
• Eg particle effect
• Add event trigger to object
• Add event system to scene
• Add collider object to target object
132. Adding Physics Raycaster
• Select Main Camera
• CardboardMain -> Head -> Main Camera
• Add Physics Raycaster Component
• Add component -> Physics Raycaster
133. Add Gaze Function
• Select target object (Lamp)
• Add component -> script
• Add stareAtLamp() public function
134. Add Event Trigger
• Select Target Object (Lamp)
• Add component
• EventTrigger
• Add New Event Type -> PointerExit
• Add object to event
• Hit the ‘+’ button
• Drag Lamp object to the box under Runtime Only
• Select Function to run
• Select function list -> scroll to stareAtLamp
135. Adding Event System
• Need Event System for trigger to work
• Select Lamp object
• UI -> Event System
• Add gazeInputModule
• Add component -> Cardboard -> Gaze Input Module
136. Add Collider to Object
• Need to detect when target being looked at
• Select Lamp Object
• Add Sphere Collider
• Add component -> Sphere Collider (type in “Sphere”)
• Adjust position and radius of Sphere Collider if needed
137. Add Gaze Event
• Particles triggered when looking at lamp
• Add particle system
• Add Component -> Particle System
• Pick colour
• set Emission rate to 0
• Add code to stareAtLamp() function
• GetComponent<ParticleSystem>().Emit(10);
• Turns particle system on when looked at
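Putting slides 133 and 137 together, the script on the Lamp might look something like this sketch (the class name LampGaze is an assumption; stareAtLamp() and the Emit(10) call are from the slides). The public method is what the EventTrigger invokes:

```csharp
using UnityEngine;

// Hypothetical sketch of the script attached to the Lamp object.
// stareAtLamp() is wired to the EventTrigger (slide 134) and emits a
// burst of particles from the ParticleSystem on the same GameObject.
// With Emission rate set to 0, particles only appear on this call.
public class LampGaze : MonoBehaviour {

    public void stareAtLamp () {
        // Emit 10 particles when the gaze event fires
        GetComponent<ParticleSystem>().Emit(10);
    }
}
```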
142. Google Design Guidelines
• Google’s Guidelines for good VR experiences:
• Physiological Considerations
• Interactive Patterns
• Setup
• Controls
• Feedback
• Display Reticle
• From http://www.google.com/design/spec-vr/designing-for-google-cardboard/a-new-dimension.html
143. Physiological Considerations
• Factors to Consider
• Head tracking
• User control of movement
• Use constant velocity
• Grounding with fixed objects
• Brightness changes
144. Interactive Patterns - Setup
• Setup factors to consider:
• Entering and exiting
• Headset adaptation
• Full Screen mode
• API calls
• Indicating VR apps
146. Interactive Patterns - Feedback
• Use audio and haptic feedback
• Reduce visual overload
• Audio alerts
• 3D spatial sound
• Phone vibrations
147. Interactive Patterns - Display Reticle
• Easier for users to target objects with a display reticle
• Can display reticle only when near target object
• Highlight objects (e.g. with light source) that user can target
148. Cardboard Design Lab Application
• Use Cardboard Design Lab app to explore design ideas
149. BUILDING AR AND VR
EXPERIENCES
Mark Billinghurst
mark.billinghurst@unisa.edu.au
May 10th – 13th
Xi’an
LECT. 4: AUGMENTED REALITY
152. Augmented Reality Definition
• Defining Characteristics [Azuma 97]
• Combines Real and Virtual Images
• Both can be seen at the same time
• Interactive in real-time
• The virtual content can be interacted with
• Registered in 3D
• Virtual objects appear fixed in space
Azuma, R. T. (1997). A survey of augmented reality. Presence, 6(4), 355-385.
156. Milgram’s Reality-Virtuality continuum
Mixed Reality spans the Reality-Virtuality (RV) Continuum:
Real Environment → Augmented Reality (AR) → Augmented Virtuality (AV) → Virtual Environment
"...anywhere between the extrema of the virtuality continuum."
P. Milgram and A. F. Kishino, Taxonomy of Mixed Reality Visual Displays
IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.
157. Summary
• Augmented Reality has three key features
• Combines Real and Virtual Images
• Interactive in real-time
• Registered in 3D
• AR can be classified alongside other technologies
• Milgram’s Mixed Reality continuum
163. Objects Registered in 3D
• Registration
• Positioning virtual objects with respect to the real world
• Tracking
• Continually locating the user's viewpoint
• Position (x,y,z), Orientation (r,p,y)
170. Typical AR Experiences
• Web based AR
• Flash, HTML 5 based AR
• Marketing, education
• Outdoor Mobile AR
• GPS, compass tracking
• Viewing Points of Interest in real world
• Eg: Junaio, Layar, Wikitude
• Handheld AR
• Vision based tracking
• Marketing, gaming
• Location Based Experiences
• HMD, fixed screens
• Museums, point of sale, advertising
174. Demo: colAR
• Turn colouring books pages into AR scenes
• Markerless tracking, use your own colours..
• Try it yourself: http://www.colARapp.com/
175. What Makes a Good AR Experience?
• Compelling
• Engaging, ‘magic’ moments
• Intuitive, ease of use
• Uses existing skills
• Anchored in physical world
• Seamless combination of real and digital
178. What you will learn
• Introduction to Vuforia
• Platform and features
• How to install/set-up Vuforia
• Vuforia Basics
• Marker Tracking, Object tracking
• Deploying to Mobile Device
• Android, iOS
180. Vuforia Overview
• Platform for Mobile Computer Vision
• https://www.qualcomm.com/products/vuforia
• Released by Qualcomm in 2010, acquired by PTC 2015
• Used by over 200K developers, >20K applications
• Main Features:
• Recognition
• Image, text, object recognition
• Tracking
• Image, marker, scene, object
181. Vuforia Provides
• Android
• iOS
• Unity Extension
Device SDK
• Target Management System
• App Development Guide
• Vuforia Web Services
Tools & Services
• Dedicated technical support
engineers
• Thousands of posts
Support Forum
194. Unity Asset Structure
• Editor - Contains the scripts required to
interact with Target data in the Unity editor
• Plugins - Contains Java and native binaries
that integrate the Vuforia AR SDK with the
Unity Android or Unity iOS application
• Vuforia - Contains the prefabs and scripts
required to bring AR to your application
• Streaming Assets / QCAR - Contains the
Device Database configuration XML and
DAT files from the online Target Manager
196. Setting up a Vuforia Project
• Register as Developer
• Create a Project
• Obtain a License Key
• Load Vuforia package into Unity
• Add license key to AR Camera
• Add Tracking Targets
• Move ImageTarget into Scene
• Add sample object to ImageTarget
198. Download Vuforia Packages
• Go to download URL – log in
• https://developer.vuforia.com/downloads/sdk
• Download Current SDK for Unity
• vuforia-unity-5-5-9.unitypackage
• Download Core Features Sample
• vuforia-samples-core-unity-5-5-9.zip
200. Obtain a License Key
• Vuforia 5.5 apps utilize a license key that uniquely identifies
each app. License keys are created in the License Manager
• The steps to creating a new license key are:
• Choose a SDK
• Choose a licensing option based on your requirements
• Provide your Billing Information if you've chosen to use a paid license
• Obtain your license Key
202. Load Vuforia Package
• Open Unity
• Load package
• Assets -> Import Package -> Custom Package
• Load vuforia-unity-5-5-9 (or current version)
• Note:
• On Windows, Vuforia only works with the 32-bit version of Unity
• You may need to download the 32-bit Unity editor for it to work
203. Add License Key to Vuforia Project
• Open ARCamera Inspector in Vuforia
• Assets -> Vuforia -> Prefabs
• Move AR Camera to scene hierarchy (Delete Main Camera)
• Paste License Key
204. Adding Tracking Targets
• Create a target on the Target Manager
• https://developer.vuforia.com/targetmanager/
• OR - Use existing targets from other projects
205. Which Type of Database
• Device Database vs. Cloud Database?
• Device: local, Cloud: online
209. Loaded Image Target
• Rating indicates how well the target will track
• Download Dataset -> create unity package
• Eg StoneImage.unitypackage
210. Loading the Tracking Image
• Import tracking dataset package
• Assets -> Import Package -> Custom Package
• Drag ImageTarget prefab into Scene Hierarchy
• Select ImageTarget, pick Data Set then Image Target
• Set image width
• On AR Camera load target database and activate
• Database Load Behaviour
213. Add 3D Content
• As a test, create a simple Cube object
• GameObject > Create Other > Cube
• Add the cube as a child of the ImageTarget object by
dragging it onto the ImageTarget item.
• Move the cube until it is centered on the Image Target.
217. Building for Android
• Open Build Settings
• Change Target platform to Android
• Switch Platform
• Under Player Settings
• Edit Bundle Identifier – eg com.UniSA.cubeTest
• Minimum API level
• Build and Run
• Select .apk file name
223. Cardboard Resources
• Google Cardboard main page
• https://www.google.com/get/cardboard/
• Developer Website
• https://www.google.com/get/cardboard/developers/
• Building a VR app for Cardboard
• http://www.sitepoint.com/building-a-google-cardboard-vr-app-in-unity/
• Creating VR game for Cardboard
• http://danielborowski.com/posts/create-a-virtual-reality-game-for-google-cardboard/
• Moving in VR space
• http://www.instructables.com/id/Prototyping-Interactive-Environments-in-Virtual-Re/
225. Unity Resources
• Unity Main site
• http://www.unity3d.com/
• Holistic Development with Unity
• http://holistic3d.com
• Official Unity Tutorials
• http://unity3d.com/learn/tutorials
• Unity Coder Blog
• http://unitycoder.com