MAULANA AZAD NATIONAL INSTITUTE OF TECHNOLOGY
BHOPAL-462051
SESSION: 2014-2015
DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING

MINOR PROJECT REPORT
ON
MARKER BASED AUGMENTED REALITY APPLICATION

SUBMITTED TO: DR. JYOTI BHARTI
PROJECT COORDINATOR: DR. R. K. PATERIYA

SUBMITTED BY:
SHANKUL KHARE 121112207
AKASH RAI 121112136
NAVIN KUMAR YADAV 121112164
NIKHIL SAH SUDI 121112139
CERTIFICATE
This is to certify that Shankul Khare, Akash Rai, Navin Kumar Yadav, and Nikhil Sah Sudi, students of B.Tech. third year (Computer Science and Engineering), have successfully completed their project "MARKER BASED AUGMENTED REALITY APPLICATION" in partial fulfilment of their minor project in Computer Science and Engineering.
SUBMITTED TO: DR. JYOTI BHARTI
PROJECT COORDINATOR: DR. R. K. PATERIYA
DECLARATION
We hereby declare that the following report, presented as the Minor Project entitled "MARKER BASED AUGMENTED REALITY APPLICATION", is in partial fulfilment of the requirements of the third-year (sixth-semester) Minor Project in the field of Computer Science and Engineering. It is an authentic documentation of our own original work, carried out under the guidance of Dr. R. K. Pateriya. None of the matter contained therein has been copied or extracted from anywhere else. The work has been carried out at Maulana Azad National Institute of Technology, Bhopal. The following project and its report, in part or whole, have not been presented or submitted by us for any purpose in any other institute or organization. We hereby declare that the facts mentioned above are true to the best of our knowledge. In case of any unlikely discrepancy that may occur, we will be the ones to take responsibility.
SUBMITTED BY:
NAME                 SCHOLAR NO.    SIGNATURE
SHANKUL KHARE 121112207
AKASH RAI 121112136
NAVIN KUMAR YADAV 121112164
NIKHIL SAH SUDI 121112139
ACKNOWLEDGEMENT
We are deeply indebted to Dr. R. K. Pateriya (Department of Computer Science and Engineering) for providing us the opportunity to work on this intriguing project.

We would like to express our sincere gratitude to Dr. R. K. Pateriya for his continuous support, patience, motivation, enthusiasm and immense knowledge. His guidance helped us throughout our minor project. We could not have imagined having a better advisor, mentor and guide. Besides being our guide, he is also our project coordinator. We would also like to thank the other staff members for their encouragement, insightful comments, constructive criticism and hard questions during the project.
TABLE OF CONTENTS
1. Abstract
2. Chapter 1
   1.1 Introduction
       1.1.1 Marker Based
       1.1.2 Location Based
   1.2 Applications
       1.2.1 Navigation
       1.2.2 Sightseeing
       1.2.3 Military
       1.2.4 Medical
3. Chapter 2
   2.1 Marker Detection
   2.2 Object Creation
   2.3 Game Making
       2.3.1 Environment Initialisation
       2.3.2 Scripting of Dynamic Objects
4. Chapter 3
   3.1 Enemy Health
   3.2 Enemy Attack
   3.3 Enemy Movement
   3.4 Enemy Manager
   3.5 Enemy Manager 1
   3.6 Enemy Manager 2
   3.7 Game Over Manager
   3.8 Player Health
   3.9 Player Shooting
   3.10 Score Manager
5. Chapter 4
   4.1 Blender
       4.1.1 File Format
   4.2 Unity 3D
6. Screen Shots
7. System Specification
8. Conclusion and Future Work
9. Bibliography
List Of Figures
Fig 1  Promotion Of Events
Fig 2  Navigation
Fig 3  Recognising Historic Buildings And Displaying Their Facts
Fig 4  Military Using Head-Up Display
Fig 5  Virtual Surgery
Fig 6  Marker Detection and Object Superimposition
Fig 7  Object Creation
Fig 8  Game Making
Fig 9  Snapshot of MARKER
Figures of Chapter 5: Screen Shots
ABSTRACT
This project deals with an application of Augmented Reality: marker-based Augmented Reality on handheld devices. Most augmented reality applications rely on superimposing either 3D computer-generated imagery or some form of descriptive information over the real-time images obtained through a camera, webcam or phone. This requires a good understanding of image processing and computer vision techniques, mainly for tracking either the markers or the natural features on which this imagery is superimposed.

Augmented Reality applications generally use one of two approaches: marker-based and location-based.

Augmented Reality technology works mainly with the help of sensors, giving users an experience in which real and virtual content blend seamlessly. Markers attached to real objects enable the system (via a camera or webcam) to track the position and orientation of each object as it is moved. The system can then augment the captured image of the real environment with computer-generated graphics to present a variety of game or task-driven scenarios to the user.

We have developed a marker-based Augmented Reality game that is superimposed over a marker: when the marker is viewed through a mobile phone, an augmented world is displayed.
Chapter 1
A break from the ordinary.
See how furniture will look in your home before buying it. Interact with a toy without opening the box. Play a video game on your coffee table. Reinvent the way you interact with the world.
1.1 Introduction
Augmented reality (AR) is a live direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data.
AR Applications generally use one of two approaches: marker-based and location-based.
1.1.1 Marker-Based
Markers work by having software recognise a particular pattern, such as a barcode or symbol,
when a camera points at it, and overlaying a digital image at that point on the screen. If the
image is three-dimensional or animated, the effect is of a digital experience unfolding on the
surface upon which the pattern is printed.
1.1.2 Location-Based
Location-based applications use the ability of a particular device to record its position in the
world and then offer data that’s relevant to that location: finding your way around a city,
remembering where you parked the car, naming the mountains around you or the stars in the
sky.
Most augmented reality applications rely on superimposing either 3D-generated computer
imagery or some form of descriptive knowledge over the real-time images obtained through a
camera, webcam or phone. This requires a good understanding of image processing and
computer vision techniques, mainly for tracking either markers or the natural features on
which this imagery is superimposed.
Computer-generated imagery has to look realistic and be properly aligned with the real
environment in order to create an authentic impression. Most of the applications are designed
for the general public so a good understanding of intuitive user interfaces is also required to
provide a seamless experience.
Augmentation is conventionally in real-time and in semantic context with environmental
elements, such as sports scores on TV during a match. With the help of advanced AR
technology (e.g. adding computer vision and object recognition) the information about the
surrounding real world of the user becomes interactive and digitally manipulable. Artificial
information about the environment and its objects can be overlaid on the real world.
There are essentially three different kinds of augmented reality displays:
1. The head-mounted display (HMD) is worn on the head or attached to a helmet. This display can resemble goggles or glasses. In some instances, a screen covers a single eye.
2. The handheld device is a portable computer or mobile smartphone such as the iPhone.
3. Spatial display makes use of projected graphical displays onto fixed surfaces.
1.2 Applications
Applications for augmented reality are broad. The military uses augmented reality to assist men and women making repairs in the field. The gaming industry is moving games outside like in the old days... equipped with wearable headgear, of course. And then there is everything in between.
Fig 1 Promotion Of Events
1.2.1 Navigation
Navigation applications are possibly the most natural fit of augmented reality with our
everyday lives. Enhanced GPS systems are using augmented reality to make it easier to get
from point A to point B.
Fig 2 Navigation
1.2.2 Sightseeing
There are a number of applications for augmented reality in the sightseeing and tourism
industries. The ability to augment a live view of displays in a museum with facts and figures
is a natural use of the technology.
Out in the real world, sightseeing has been enhanced using augmented reality. Using a
smartphone equipped with a camera, tourists can walk through historic sites and see facts and
figures presented as an overlay on their live screen. These applications use GPS and image
recognition technology to look up data from an online database. In addition to information about a historic site, applications exist that look back in history and show how the location looked 10, 50 or even 100 years ago.
Fig 3 Recognising Historic Buildings And Displaying Their Facts
1.2.3 Military
The Heads-Up Display (HUD) is the typical example of augmented reality when it comes to
military applications of the technology. A transparent display is positioned directly in the fighter pilot's view. Data typically displayed to the pilot includes altitude, airspeed and the
horizon line in addition to other critical data. The term "heads-up" comes from the fact that
the pilot doesn't have to look down at the aircraft's instrumentation to get the data they need.
The Head-Mounted Display (HMD) is used by ground troops. Critical data such as enemy
location can be presented to the soldier within their line of sight. This technology is also used
for simulations for training purposes.
Fig 4 Military Using Head-Up Display To Augment Real-Time Information
1.2.4 Medical
There have been really interesting advances in the medical applications of augmented reality. Medical students use the technology to practice surgery in a controlled environment.
Visualizations aid in explaining complex medical conditions to patients. Augmented reality
can reduce the risk of an operation by giving the surgeon improved sensory perception. This
technology can be combined with MRI or X-ray systems and bring everything into a single
view for the surgeon.
Neurosurgery is at the forefront when it comes to surgical applications of augmented reality.
The ability to image the brain in 3D on top of the patient's actual anatomy is very powerful
for the surgeon. Since the brain is somewhat fixed compared to other parts of the body, the
registration of exact coordinates can be achieved. Concern still exists surrounding the
movement of tissue during surgery. This can affect the exact positioning required for
augmented reality to work.
Fig 5 Virtual Surgery
Chapter 2
Technical Overview
2.1 Marker Detection
The Vuforia plugin is used to detect the marker through the mobile camera: mobile devices have very limited RAM, so heavyweight technologies such as MATLAB-based computer vision become unusable, and the Unity3D plugin Vuforia is used instead. Vuforia lets an image be set as a marker, which can then be detected by the mobile camera at run time. So an image is set as the target/marker and incorporated into the project.
Fig 6 Marker Detection and Object Superimposition
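To make this concrete, the following is a minimal sketch of how a Unity script can react to the marker being found or lost. It is modelled on the DefaultTrackableEventHandler pattern shipped with the Vuforia Unity extension of this period; the class name and the show/hide behaviour are our own illustrative choices, not taken from the project source.

using UnityEngine;
using Vuforia;

// Attached to the ImageTarget GameObject; Vuforia notifies this handler
// whenever the tracking state of the marker changes.
public class MarkerEventHandler : MonoBehaviour, ITrackableEventHandler
{
    TrackableBehaviour trackable;

    void Start ()
    {
        trackable = GetComponent<TrackableBehaviour> ();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler (this);
    }

    public void OnTrackableStateChanged (TrackableBehaviour.Status previousStatus,
                                         TrackableBehaviour.Status newStatus)
    {
        bool found = (newStatus == TrackableBehaviour.Status.DETECTED ||
                      newStatus == TrackableBehaviour.Status.TRACKED);

        // Show the augmented content only while the marker is visible.
        foreach (Renderer r in GetComponentsInChildren<Renderer> (true))
            r.enabled = found;
    }
}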
2.2 Object Creation
Blender is used to make the static three-dimensional objects that form the basic environment of the game.
Fig 7 Object Creation
2.3 Game Making
The Unity3D game engine is used to make the game: the environment is set on the detected marker, with the centre of the marker taken as the origin. Unity3D is used to deal with all the technical aspects of the game.
Fig 8 Game Making
Fig 9 Snapshot of MARKER
2.3.1 Environment Initialisation
The environment has to be augmented onto the marker when it is detected, so the positions of all the static objects are set up relative to it, along with the general parameters of the environment.
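As an illustration, a simple initialisation script could parent the static props under the marker's transform so that they stay anchored to its centre, which acts as the origin. The class and field names below are hypothetical, not taken from the project source.

using UnityEngine;

// A minimal sketch: anchoring static environment objects to the marker.
public class EnvironmentInitialiser : MonoBehaviour
{
    public Transform marker;          // the ImageTarget's transform
    public GameObject[] staticProps;  // environment objects modelled in Blender
    public Vector3[] localOffsets;    // positions relative to the marker centre

    void Start ()
    {
        for (int i = 0; i < staticProps.Length; i++)
        {
            // Parenting under the marker keeps each object fixed relative
            // to it as the marker moves in the camera view.
            staticProps[i].transform.parent = marker;
            staticProps[i].transform.localPosition = localOffsets[i];
        }
    }
}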
2.3.2 Scripting Of Dynamic Objects
Dynamic objects are the objects created at run time; they are spawned from multiple locations at particular intervals of time. These objects are handled by the scripts included in the next chapter.
Chapter 3
Important Scripts
3.1 ENEMY HEALTH

using UnityEngine;
using UnityEngine.UI;

public class EnemyHealth : MonoBehaviour
{
    public int startingHealth = 100;
    public int currentHealth;
    public float sinkSpeed = 2.5f;
    public int scoreValue = 10;
    public AudioClip deathClip;

    Animator anim;
    AudioSource enemyAudio;
    ParticleSystem hitParticles;
    CapsuleCollider capsuleCollider;
    bool isDead;
    bool isSinking;

    void Awake ()
    {
        // Cache component references and initialise health.
        anim = GetComponent<Animator> ();
        enemyAudio = GetComponent<AudioSource> ();
        hitParticles = GetComponentInChildren<ParticleSystem> ();
        capsuleCollider = GetComponent<CapsuleCollider> ();
        currentHealth = startingHealth;
    }

    // (the remaining damage, death and sinking methods of this script
    // are omitted)
}
3.4 ENEMY MANAGER

using UnityEngine;

public class EnemyManager : MonoBehaviour
{
    public PlayerHealth playerHealth;
    public GameObject enemy;
    public float spawnTime = 3f;
    public Transform[] spawnPoints;

    void Start ()
    {
        // Call Spawn every spawnTime seconds, starting after spawnTime.
        InvokeRepeating ("Spawn", spawnTime, spawnTime);
    }

    void Spawn ()
    {
        // Stop spawning once the player is dead.
        if (playerHealth.currentHealth <= 0)
        {
            return;
        }

        // Pick a random spawn point and create the enemy there.
        int spawnPointIndex = Random.Range (0, spawnPoints.Length);
        Instantiate (enemy, spawnPoints[spawnPointIndex].position,
                     spawnPoints[spawnPointIndex].rotation);
    }
}
3.5 ENEMY MANAGER 1

using UnityEngine;

public class EnemyManager1 : MonoBehaviour
{
    public PlayerHealth playerHealth;
    public GameObject enemy;
    public float spawnTime = 3f;
    public Transform[] spawnPoints;

    void Start ()
    {
        InvokeRepeating ("Spawn", spawnTime, spawnTime);
    }

    void Spawn ()
    {
        if (playerHealth.currentHealth <= 0)
        {
            return;
        }

        int spawnPointIndex = Random.Range (0, spawnPoints.Length);

        // This enemy type only starts spawning once the score reaches 100.
        if (ScoreManager.score >= 100)
            Instantiate (enemy, spawnPoints[spawnPointIndex].position,
                         spawnPoints[spawnPointIndex].rotation);
    }
}
3.6 ENEMY MANAGER 2

using UnityEngine;

public class EnemyManager2 : MonoBehaviour
{
    public PlayerHealth playerHealth;
    public GameObject enemy;
    public float spawnTime = 3f;
    public Transform[] spawnPoints;

    void Start ()
    {
        InvokeRepeating ("Spawn", spawnTime, spawnTime);
    }

    void Spawn ()
    {
        if (playerHealth.currentHealth <= 0)
        {
            return;
        }

        int spawnPointIndex = Random.Range (0, spawnPoints.Length);

        // This enemy type only starts spawning once the score reaches 150.
        if (ScoreManager.score >= 150)
            Instantiate (enemy, spawnPoints[spawnPointIndex].position,
                         spawnPoints[spawnPointIndex].rotation);
    }
}
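The three manager scripts above differ only in the score threshold that gates spawning: EnemyManager spawns its enemy type from the start, EnemyManager1 only once the score reaches 100, and EnemyManager2 only from 150. This staggers the difficulty as the player's score grows; a single manager with a public, per-instance threshold field would arguably achieve the same effect with less duplicated code.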
3.7 GAMEOVER MANAGER

using UnityEngine;

public class GameOverManager : MonoBehaviour
{
    public PlayerHealth playerHealth;
    public float restartDelay;

    float restartTimer;
    Animator anim;

    void Awake ()
    {
        anim = GetComponent<Animator> ();
    }

    void Update ()
    {
        if (playerHealth.currentHealth <= 0)
        {
            // Play the game-over animation, then reload the level
            // after restartDelay seconds.
            anim.SetTrigger ("GameOver");
            restartTimer += Time.deltaTime;
            if (restartTimer >= restartDelay)
            {
                Application.LoadLevel (Application.loadedLevel);
            }
        }
    }
}
3.8 PLAYER HEALTH

using UnityEngine;
using UnityEngine.UI;
using System.Collections;

public class PlayerHealth : MonoBehaviour
{
    public int startingHealth = 5000;
    public int currentHealth;
    public Slider healthSlider;
    public PlayerShooting playerShooting;
    public AudioClip deathClip;

    AudioSource playerAudio;
    bool isDead;

    void Awake ()
    {
        playerAudio = GetComponent<AudioSource> ();
        currentHealth = startingHealth;
    }

    public void TakeDamage (int amount)
    {
        // Reduce health, update the UI slider and play a hurt sound.
        currentHealth -= amount;
        healthSlider.value = currentHealth;
        playerAudio.Play ();

        if (currentHealth <= 0 && !isDead)
        {
            // Assumed completion of the truncated listing: mark the player
            // dead, stop shooting and play the death clip.
            isDead = true;
            playerShooting.enabled = false;
            playerAudio.clip = deathClip;
            playerAudio.Play ();
        }
    }
}
Chapter 4
Software Used
4.1 Blender
Blender is a professional free and open-source 3D computer graphics software product used
for creating animated films, visual effects, art, 3D printed models, interactive 3D applications
and video games. Blender's features include 3D modelling, UV unwrapping, texturing, raster
graphics editing, rigging and skinning, fluid and smoke simulation, particle simulation, soft
body simulation, sculpting, animating, match moving, camera tracking, rendering, video
editing and compositing. Alongside the modelling features it also has an integrated game
engine.
4.1.1 File format
Blender features an internal file system that can pack multiple scenes into a single file (called
a ".blend" file).
All scenes, objects, materials, textures, sounds, images, post-production effects for an
entire animation can be stored in a single ".blend" file. Data loaded from external
sources, such as images and sounds, can also be stored externally and referenced
through either an absolute or relative pathname. Likewise, ".blend" files themselves
can also be used as libraries of Blender assets.
Interface configurations are retained in the ".blend" files.
A wide variety of import/export scripts that extend Blender capabilities (accessing the object
data via an internal API) make it possible to inter-operate with other 3D tools.
Blender organizes data as various kinds of "data blocks", such as Objects, Meshes, Lamps,
Scenes, Materials, Images and so on. An object in Blender consists of multiple data blocks.
4.2 Unity 3D
Unity3D is a powerful cross-platform 3D engine and a user friendly development
environment. Easy enough for the beginner and powerful enough for the expert; Unity should
interest anybody who wants to easily create 3D games and applications for mobile, desktop, the web, and consoles.
The Unity application is a complete 3D environment, suitable for laying out levels, creating
menus, doing animation, writing scripts, and organizing projects. The user interface is well
organized and the panels can be fully customized by dragging and dropping.
The Project panel is where all the assets within a project are stored. When assets are
imported, they will first appear here.
The hierarchy panel is where assets are organized in a scene. Assets from the Project panel
can be dragged into the Hierarchy panel to add them to the current scene.
The Inspector panel lets you inspect and adjust all the attributes of a selected asset: everything from its position and rotation to whether it is affected by gravity or able to cast a shadow.
The Scene panel is a 3D viewport where you can physically arrange assets by moving them
around in 3D space. You can navigate the viewport by panning, rotating, and zooming the
view.
System Specifications
Following are the hardware and software requirements for the proper functioning of the system.

Software Requirements:
Minimum API level: Android 2.3.1, or Windows 8 with the .NET 2.0 subset

Hardware Requirements:
750 MB of RAM
3.2-megapixel or higher camera

Environment Requirements:
Good lighting
Conclusion
i) Conclusion
Our project implements a single-player game based on real-time marker detection. The application uses Vuforia to detect the marker in real time, and the environment is augmented on top of it. The system handles full three-dimensional motion around the marker, with the augmented objects staying in their locations and those locations being dynamically adjusted as the camera moves.
Thus this game is an application of marker-based, real-time Augmented Reality.
ii) Future Work
Augmented reality is another step further into the digital age: we will soon see our environments change dynamically, whether through smartphones, glasses, car windshields or even windows, to display enhanced content and media right in front of us. This has amazing applications that can very well allow us to live our lives more productively, more safely and in a better-informed way.

Maybe in the future we will see our environments become augmented to display information based on our own interests, through built-in RFID tags, with augmentations implemented through holographic projections surrounding the environment, without the use of any enabling device. It would be incredible to no longer wonder where to eat, where to go or what to do; our environment would facilitate our interactions seamlessly. We would no longer be able to discern what is real and what is virtual; our world would become a convergence of digital and physical media.