PROJECT TANGO
CHAPTER 1
INTRODUCTION
3D models represent a 3D object using a collection of points in a given 3D space, connected by various entities such as curved surfaces, triangles, and lines. Being a collection of data which includes points and other information, 3D models can be created by hand, scanned, or generated algorithmically (procedural modeling). The "Project Tango" prototype is an Android smartphone-like device which tracks its own 3D motion and creates a 3D model of the environment around it.
Project Tango was first introduced by Google in early 2013, described as a Simultaneous Localisation and Mapping (SLAM) system capable of operating in real time on a phone. Google’s ATAP teamed up with a number of organizations to build Project Tango from this description.
The team at Google’s Advanced Technology and Projects group (ATAP) has been working with various universities and research labs to harvest ten years of research in robotics and computer vision and concentrate that technology into a very unique mobile phone. We are physical beings that live in a 3D world, yet mobile devices today assume that the physical world ends at the boundaries of the screen. Project Tango’s goal is to give mobile devices a human-scale understanding of space and motion. The project will help people interact with the environment in a fundamentally different way: using this technology, we can prototype in a couple of hours something that would previously have taken months or even years, because this technology was not readily available. Imagine having all this in a smartphone, and see how things would change.
This device runs Android and includes development APIs that provide alignment (orientation), position, and depth data to regular Android apps written in C/C++ or Java, as well as to the Unity game engine. These early algorithms, prototypes, and APIs are still in active development; these are experimental devices, intended only for the exploratory and adventurous, and not a final shipping product.
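To make this concrete, the following is a minimal sketch of how an app binds to these APIs, based on the Tango Java client library (com.google.atap.tangoservice) as published in Google's developer documentation; the class and key names are taken from that API, and error handling is omitted.

import android.content.Context;

import com.google.atap.tangoservice.Tango;
import com.google.atap.tangoservice.TangoConfig;

// Minimal sketch: bind to the Tango service and enable motion tracking.
public final class TangoStartup {
    public static Tango connectWithMotionTracking(Context context) {
        Tango tango = new Tango(context);
        // Start from the default configuration and switch on motion tracking.
        TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
        config.putBoolean(TangoConfig.KEY_BOOLEAN_MOTIONTRACKING, true);
        tango.connect(config); // the service now streams data to registered listeners
        return tango;
    }
}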
Project Tango is a prototype phone containing highly customized hardware and software designed to allow the phone to track its motion in full 3D in real time. The sensors make over a quarter million 3D measurements every single second, updating the position and
rotation of the phone and blending this data into a single 3D model of the environment. It tracks one's position as one moves around the world and also builds a map of it. It can scan a small section of your room and then generate a little game world in it. It is an open-source technology. ATAP has around 200 development kits, which have already been distributed among developers.
Google has produced two devices to demonstrate the Project Tango technology: the Peanut phone (no longer available) and the Yellowstone 7-inch tablet. More than 3,000 of these devices had been sold as of June 2015, chiefly to researchers and software developers interested in building applications for the platform. In the summer of 2015, Qualcomm and Intel both announced that they were developing Project Tango reference devices as models for device manufacturers who use their mobile chipsets.
At CES in January 2016, Google announced a partnership with Lenovo to release a consumer smartphone featuring Project Tango technology in the summer of 2016, noting a price point under $500 and a small form factor below 6.5 inches. At the same time, both companies announced an application incubator to get applications developed for the device at launch.
Fig(1): Google’s Project Tango Logo
Which companies are behind Project Tango?
A number of companies came together to develop Project Tango. All of them are listed in the credits of the Google Project Tango introduction video, “Say hello to Project Tango!”. Each company has had a different amount of involvement. The participating companies listed in that video are:
 Bosch
 BSquare
 ETH Zurich
 Flyby Media
 George Washington University
 HiDOF
 MMSolutions
 Movidius
 University of Minnesota
 NASA JPL
 Ologic
 OmniVision
 Open Source Robotics Foundation
 ParaCosm
 Sunny Optical Tech
 Speck Design
CHAPTER 2
OVERVIEW
WHAT IS PROJECT TANGO?
Tango allows a device to build up an accurate 3D model of its immediate surroundings, which Google says is useful for everything from AR gaming to navigating large shopping centres.
Fig(2): A view of Google’s Project Tango 3D Model Mapping
Google isn’t content with making software for phones that can merely capture 2D photos and videos. Nor does it just want to take stereoscopic 3D snaps. Instead, Project Tango is a bid to equip every mobile device with a powerful suite of software and sensors that can capture a complete 3D picture of the world around it, in real time. Why? So
you can map your house, furniture and all, simply by walking around it: no more measuring up before going shopping for a wardrobe. Or so you can avoid getting lost next time you go to the hospital; you’ll have instant access to a 3D plan of its labyrinthine corridors. Or so you can play augmented reality games. Or so that the visually impaired can receive extra help in getting around.
WHAT DOES THE PHONE LOOK LIKE?
There are two Tango prototypes so far: a 7-inch tablet and a 5-inch phone.
Fig(3): Prototype 1
It is a fairly standard 7-inch slate with a slight wedge at the back to accommodate the extra sensors. As far as we can tell, it has three cameras including the webcam. Inside it has one of Nvidia’s so-far-untested Tegra K1 mobile processors with a beefy 4GB of RAM and a 128GB SSD. Google is at pains to point out that it’s not a consumer device, but one is supposedly on the way. The depth-sensing array consists of an infrared projector, a 4MP rear camera and a front-facing fisheye lens with a 180-degree field of vision. Physically, it’s a standard slate shape but rather chunky compared to the class of 2014; more like something from about 2010.
Fig(4): Prototype 2
Prototype 2 is a 5-inch Android smartphone with the same Tango hardware as the tablet.
Fig(5): A simple overview of the components of a Tango phone
Google's Project Tango is a smartphone equipped with a variety of cameras and vision sensors that give it a whole new perspective on the world around it. The Tango smartphone can capture a wealth of data never before available to application developers, including depth, object tracking and instantaneous 3D mapping. And it is almost as powerful, and as big, as a typical smartphone.
Project Tango is different from other emerging 3D-sensing computer vision products, such as Microsoft HoloLens, in that it's designed to run on a standalone mobile phone or tablet and is chiefly concerned with determining the device's position and orientation within the environment.
The high-end Android tablet has a 7-inch HD display, 4GB of RAM, 128GB of internal SSD storage and an NVIDIA Tegra K1 graphics chip, the first in the US and second in the world to feature a desktop GPU architecture. It also has a distinctive design, with an array of cameras and sensors near the top and a couple of subtle grips on the sides. Movidius, the company that developed some of the technology used in Tango, has been working on computer vision technology for the past seven years; it developed the processing chips used in Project Tango, which Google paired with sensors and cameras to give the smartphone the same level of computer vision and tracking that formerly
required much larger equipment. The phone is equipped with a standard 4-megapixel camera paired with a special combination RGB/IR sensor and a lower-resolution image-tracking camera. These combinations of image sensors give the smartphone a similar perspective on the world, complete with 3D awareness and an awareness of depth. They supply information to Movidius’ custom Myriad 1 low-power computer-vision processor, which can then process the data and feed it to apps through a set of APIs. The phone also contains a motion-tracking camera which is used to keep track of all the motions made by the user.
CHAPTER 3
SMARTPHONE SPECIFICATION
Tango wants to deconstruct reality, taking a quarter million 3D measurements each second to create a real-time 3D model that describes the physical depth of its surroundings.
The smartphone's specs include a Snapdragon 800 quad-core CPU running at up to 2.3 GHz per core, 2GB or 4GB of memory, 64GB or 128GB of expandable internal storage, and a nine-axis accelerometer/gyroscope/compass. There are also Mini-USB, Micro-USB, and USB 3.0 ports.
In addition, Tango's specs include a rear-facing four-megapixel RGB/infrared camera, a 180-degree field-of-view fisheye rear-facing camera, a 120-degree field-of-view front-facing camera, and a 320 x 180 depth sensor, plus a vision processor with one teraflop of compute power. Project Tango uses a 3000 mAh battery.
CHAPTER 4
HARDWARE
Project Tango is basically a camera and sensor array that happens to run on an Android phone. The smartphone is equipped with a variety of cameras and vision sensors that give it a whole new perspective on the world around it. The Tango smartphone can capture a wealth of data never before available to application developers, including depth and
object tracking and instantaneous 3D mapping. And it is almost as powerful, and as big, as a typical smartphone. The front and back views of a Tango phone are shown below.
It looks much like other phones, but it carries the variety of cameras and sensors that make 3D modelling of the environment possible.
Fig (6) Tango Phone Front View
The device tracks its 3D motion and creates a 3D model of the environment around it using the array of cameras and sensors. The phone emits pulses of infrared light from the IR projector and records how they are reflected back, allowing it to build a detailed depth map of the surrounding space.
There are three cameras that together capture a 120-degree wide-angle field of view. A 3D camera captures the 3D structure of a scene. Most cameras are 2D, meaning they record a projection of the scene onto the camera's imaging plane; any depth information is lost. In contrast, a 3D camera also captures the depth dimension (in addition to the standard 2D data). A rear-facing four-megapixel RGB/infrared camera, a 180-degree field-of-view fisheye rear-facing camera, a 120-degree field-of-view front-facing camera, and a 320 x 180 depth sensor are the components at the rear of the phone that work together to give the 3D structure of the scene.
Fig (7) Tango Phone Back View
Google paired Project Tango's processing chips with sensors and cameras to give the smartphone the same level of computer vision and tracking that formerly required much larger equipment. The phone is equipped with a standard 4-megapixel camera paired with a special combination RGB/IR sensor and a lower-resolution image-tracking camera. These combinations of image sensors give the smartphone a similar perspective on the world, complete with 3D awareness and an awareness of depth. They supply information to Movidius’ custom Myriad 1 low-power computer-vision processor, which can then process the data and feed it to apps through a set of APIs. The phone also contains a motion-tracking camera which is used to keep track of all the motions made by the user.
As its main camera, Tango uses OmniVision’s OV4682; it is the eye of Project Tango’s mobile device. The OV4682 is a 4MP RGB-IR image sensor that captures high-resolution images and video as well as IR information, enabling depth analysis.
Fig(9): Front and rear camera    Fig(10): Fisheye camera
Fig(11): IR projector and integrated depth sensor
CHAPTER 5
TECHNOLOGY BEHIND TANGO
5.1 TANGO’S SENSOR
Tango’s sensing is built around the Myriad 1 vision processor platform developed by Movidius. The sensors allow the device to make “over a quarter million 3D measurements every second, updating its position and orientation in real time, combining that data into a single 3D model of the space around you.” Movidius, the company that developed some of the technology used in Tango, has been working on computer vision technology for the past seven years; it developed the processing chips used in Project Tango, which Google paired with sensors and cameras to give the smartphone the same level of computer vision and tracking that formerly required much larger equipment.
5.2 IMAGE SENSORS
The image sensors give the smartphone a similar perspective on the world, complete with 3D awareness and an awareness of depth; this information is supplied to Movidius’ custom Myriad 1 low-power computer-vision processor, which can then process the data and feed it to apps through a set of APIs. The motion-tracking camera keeps track of all the motions made by the user. There are three cameras that capture a 120-degree wide-angle field of view from the front, and an even wider 180-degree span from the back. The phone is equipped with a standard 4-megapixel camera paired with a special combination RGB/IR sensor and a lower-resolution image-tracking camera. Its depth-sensing array consists of an infrared projector, a 4MP rear camera and a front-facing fisheye lens with a 180-degree field of vision. The phone emits pulses of infrared light from the IR projector and records how they are reflected back, allowing it to build a detailed depth map of the surrounding space. The data
collected from the sensors and cameras is processed by the Myriad vision processor, which delivers the 3D structure of the view to apps.
CHAPTER 6
WORKING CONCEPT
Project Tango devices combine the camera, gyroscope and accelerometer to provide six-degree-of-freedom motion tracking, giving developers the ability to track the 3D motion of a device while simultaneously creating a map of the environment.
An IR projector provides infrared light that other (non-RGB) cameras can use to get a sense of an area in 3D space. The phone emits pulses of infrared light from the IR projector and records how they are reflected back, allowing it to build a detailed depth map of the surrounding space. There are three cameras that capture a 120-degree wide-angle field of view from the front, and an even wider 180-degree span from the back. A 4MP color camera sensor can also be used for snapping regular pictures. A 3D camera captures the 3D structure of a scene. Most cameras are 2D, meaning they record a projection of the scene onto the camera's imaging plane; any depth information is lost. In contrast, a 3D camera also captures the depth dimension (in addition to the standard 2D data).
As its main camera, Tango uses OmniVision’s OV4682; it is the eye of Project Tango’s mobile device. The OV4682 is a 4MP RGB-IR image sensor that captures high-resolution images and video as well as IR information, enabling depth analysis. The sensor features a 2-micron OmniBSI-2 pixel and records full-resolution 4MP images and video in a native 16:9 format at 90 frames per second (fps), with a quarter of the pixels dedicated to capturing IR. The 2-micron OmniBSI-2 pixel delivers excellent signal-to-noise ratio and IR sensitivity, and offers best-in-class low-light sensitivity. The OV4682’s unique architecture and pixel optimization bring not only the best IR performance but also best-in-class image quality. The 1/3-inch sensor can also record 1080p high-definition (HD) video at 120 fps with electronic image stabilization (EIS), or 720p HD at 180 fps. The OV7251 camera chip sensor is capable of
capturing VGA-resolution video at 100 fps using a global shutter. The OV4682 is a single RGB-infrared (IR) sensor that captures high-resolution images and video as well as IR information; its dual RGB and IR capabilities allow it to bring a host of additional features to mobile and machine vision applications, including gesture sensing, depth analysis, iris detection and eye tracking.
The other camera uses a fisheye lens enabling a 180° field of view, while its sensor balances resolution and frame rate to record black-and-white images for motion tracking. If the user moves the device left or right, it draws the path the device followed and shows it in the image on the right in real time. This gives the device motion-capture capabilities. The device also has a depth sensor.
Fig(12): The feed from the fisheye lens
Fig(13): Computer vision
The figure above illustrates depth sensing by displaying a distance heat map on top of what the camera sees, showing blue colors on distant objects and red colors on nearby objects. The device also takes the data from the image sensors, pairs it with the device's standard motion sensors and gyroscopes to map out paths of movement to within 1 percent accuracy, and then plots them onto an interactive 3D map. It uses sensor fusion, which combines sensory data, or data derived from sensory data, from disparate sources such that the resulting information is in some sense better than would be possible if these sources were used separately: more precise, more comprehensive, more reliable, or the result of an emerging view, such as stereoscopic vision.
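To illustrate the principle, here is a toy complementary filter, a textbook form of sensor fusion: gyroscope integration is smooth but drifts over time, the accelerometer's gravity-based tilt estimate is noisy but drift-free, and blending the two gives a better angle than either source alone. This is a generic illustration only, not Tango's actual fusion algorithm.

// Illustrative only: a toy complementary filter, blending gyroscope
// integration (smooth but drifting) with an accelerometer tilt estimate
// (noisy but drift-free). A generic textbook technique, not Tango's algorithm.
public final class ComplementaryFilter {
    private static final double ALPHA = 0.98; // weight given to the gyro path
    private double pitch; // estimated pitch angle, in radians

    // gyroRate: angular rate about the pitch axis (rad/s)
    // accPitch: pitch angle implied by the measured gravity vector (rad)
    // dt: time step since the last update (s)
    public double update(double gyroRate, double accPitch, double dt) {
        double gyroPitch = pitch + gyroRate * dt; // integrate the gyroscope
        pitch = ALPHA * gyroPitch + (1 - ALPHA) * accPitch; // blend both sources
        return pitch;
    }
}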
These combinations of image sensors give the smartphone a similar perspective on the world, complete with 3D awareness and an awareness of depth. They supply information to Movidius’ custom Myriad 1 low-power computer-vision processor, which can then process the data and feed it to apps through a set of APIs. The phone also contains a motion-tracking camera which is used to keep track of all the motions made by the user.
Mantis Vision, a developer of some of the world's most advanced 3D-enabling technologies, supplies the MV4D technology platform that is the core 3D engine behind Google's Project Tango. Mantis Vision provides the 3D sensing platform, consisting of flash projector hardware components and Mantis Vision's core MV4D technology, which includes structured-light-based depth-sensing algorithms that generate realistic, dense maps of the world. It aims to provide reliable estimates of the pose of a phone (its position and alignment) relative to its environment.
CHAPTER 7
PROJECT TANGO CONCEPTS
Project Tango is different from other emerging 3D-sensing computer vision products,
such as Microsoft HoloLens, in that it's designed to run on a standalone mobile phone or
tablet and is chiefly concerned with determining the device's position and orientation within
the environment.
The software works by integrating three types of functionality:
7.1 Motion Tracking
Motion tracking allows a device to understand its position and orientation using Project Tango's custom sensors, giving you real-time information about the 3D motion of a device. It uses visual features of the environment, in combination with accelerometer and gyroscope data, to closely track the device's movements in space. Project Tango's core functionality is measuring movement through space and understanding the area moved through. Google's APIs provide the position and orientation of the user's device in full six degrees of freedom, referred to as its pose.
Fig (14) Motion Tracking
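As a sketch of what the pose API looks like in practice (using names from the published Tango Java API; a hedged example, not a complete setup), an app registers a listener for the device frame relative to the start-of-service frame and receives a translation plus a rotation quaternion, i.e. the full six-degree-of-freedom pose:

import java.util.ArrayList;

import com.google.atap.tangoservice.Tango;
import com.google.atap.tangoservice.TangoCoordinateFramePair;
import com.google.atap.tangoservice.TangoEvent;
import com.google.atap.tangoservice.TangoPoseData;
import com.google.atap.tangoservice.TangoXyzIjData;

// Sketch: receive six-degree-of-freedom poses from an already connected Tango.
public final class PoseTracking {
    public static void listenForPoses(Tango tango) {
        // Ask for the device pose relative to where the service started.
        ArrayList<TangoCoordinateFramePair> framePairs = new ArrayList<>();
        framePairs.add(new TangoCoordinateFramePair(
                TangoPoseData.COORDINATE_FRAME_START_OF_SERVICE,
                TangoPoseData.COORDINATE_FRAME_DEVICE));
        tango.connectListener(framePairs, new Tango.OnTangoUpdateListener() {
            @Override
            public void onPoseAvailable(TangoPoseData pose) {
                // translation = {x, y, z} in meters; rotation = quaternion {x, y, z, w}
                System.out.printf("device at (%.2f, %.2f, %.2f)%n",
                        pose.translation[0], pose.translation[1], pose.translation[2]);
            }
            @Override public void onXyzIjAvailable(TangoXyzIjData xyzIj) { /* depth */ }
            @Override public void onFrameAvailable(int cameraId) { /* camera image */ }
            @Override public void onTangoEvent(TangoEvent event) { /* service events */ }
        });
    }
}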
7.2 Area Learning
Using area learning, a Project Tango device can remember the visual features of the area it is moving through and recognize when it sees those features again. These features can be saved in an Area Description File (ADF) for later use. Project Tango devices can use visual cues to help recognize the world around them, and with an ADF loaded they gain drift correction and improved motion tracking. Area learning is a way of storing environment data in a map that can be re-used later, shared with other Project Tango devices, and enhanced with metadata such as notes, instructions, or points of interest.
Fig (15) Area Learning
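A sketch of how an app opts into these features, using configuration keys and methods from the published Tango Java API; the choice of loading the most recently saved ADF is purely illustrative:

import java.util.ArrayList;

import com.google.atap.tangoservice.Tango;
import com.google.atap.tangoservice.TangoConfig;

// Sketch: enable area learning, or relocalize against a previously saved ADF.
public final class AreaLearningSetup {
    public static void connectWithAreaLearning(Tango tango) {
        TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
        config.putBoolean(TangoConfig.KEY_BOOLEAN_MOTIONTRACKING, true);
        // Option 1: learn a new area description while tracking.
        config.putBoolean(TangoConfig.KEY_BOOLEAN_LEARNINGMODE, true);
        // Option 2: load a saved ADF (here, simply the last one on the device).
        ArrayList<String> adfUuids = tango.listAreaDescriptions();
        if (!adfUuids.isEmpty()) {
            config.putString(TangoConfig.KEY_STRING_AREADESCRIPTION,
                    adfUuids.get(adfUuids.size() - 1));
        }
        tango.connect(config);
        // Later, tango.saveAreaDescription() stores what was learned and
        // returns the UUID of the new ADF.
    }
}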
7.3 Depth Perception
Project Tango devices are equipped with integrated 3D sensors that measure the
distance from a device to objects in the real world. This configuration gives good depth at a
distance while balancing power requirements for infrared illumination and depth processing.
The depth data allows an application to understand the distance of visible objects from the device. By combining depth perception with motion tracking, you can also measure distances between points in an area that aren't in the same frame.
Current Project Tango devices are designed to work best indoors at moderate distances (0.5 to 4 meters), so they may not be ideal for close-range object scanning. Because the technology relies on viewing infrared light using the device's camera, there are some situations where accurate depth perception is difficult: areas lit by light sources high in IR, such as sunlight or incandescent bulbs, and objects that do not reflect IR light, cannot be scanned well.
Fig (16) Depth Perception
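Depth arrives as a point cloud in the onXyzIjAvailable() callback shown earlier. The sketch below assumes the TangoXyzIjData layout from the published Java API, where points are packed as x, y, z triples in meters in a FloatBuffer, and averages the z component as a rough estimate of how far away the visible scene is:

import java.nio.FloatBuffer;

import com.google.atap.tangoservice.TangoXyzIjData;

// Sketch: mean depth of one point cloud delivered by onXyzIjAvailable().
public final class DepthStats {
    public static double averageDepthMeters(TangoXyzIjData xyzIj) {
        FloatBuffer points = xyzIj.xyz; // packed as x0, y0, z0, x1, y1, z1, ...
        double sum = 0;
        for (int i = 0; i < xyzIj.xyzCount; i++) {
            sum += points.get(3 * i + 2); // z: distance along the camera axis
        }
        return xyzIj.xyzCount > 0 ? sum / xyzIj.xyzCount : 0.0;
    }
}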
Together, these generate data about the device in "six degrees of freedom" (3 axes of
orientation plus 3 axes of motion) and detailed three-dimensional information about the
environment.
Applications on mobile devices use Project Tango's C and Java APIs to access this
data in real time. In addition, an API is also provided for integrating Project Tango with the
Unity game engine; this enables the rapid conversion or creation of games that allow the user
to interact and navigate in the game space by moving and rotating a Project Tango device in
real space. These APIs are documented on the Google developer website.
CHAPTER 8
DEVICES DEVELOPED SO FAR
As a platform for software developers and a model for device manufacturers, Google
has created two Project Tango devices to date.
The Yellowstone tablet
"Yellowstone" is a 7-inch tablet with full Project Tango functionality, released in June 2014 and sold as the Project Tango Tablet Development Kit. It features a 2.3 GHz quad-core Nvidia Tegra K1 processor, 128GB of flash memory, a 1920x1200-pixel touchscreen, a 4MP color camera, a fisheye-lens (motion-tracking) camera, integrated depth sensing, and 4G LTE connectivity. The device is sold through the official Project Tango website and the Google Play Store.
The Peanut phone
"Peanut" was the first production Project Tango device, released in the first quarter
of 2014. It was a small Android phone with a Qualcomm MSM8974 quad-core processor and
additional special hardware including a fisheye-lens camera (for motion tracking), "RGB-IR"
camera (for color images and infrared depth detection), and Movidius image-processing chips.
A high- performance accelerometer and gyroscope were added after testing several
competing models in the MARS lab at the University of Minnesota.
Several hundred Peanut devices were distributed to early-access partners including
university researchers in computer vision and robotics, as well as application developers and
technology. Google stopped supporting the Peanut device in September 2015, as by then the
Project Tango software stack had evolved beyond the versions of Android that run on the
device.
Testing by NASA
In May 2014, two Peanut phones were delivered to the International Space Station as part of a NASA project to develop autonomous robots that navigate in a variety of environments, including outer space. The soccer-ball-sized, 18-sided polyhedral SPHERES robots were developed at the NASA Ames Research Center, adjacent to the Google campus in Mountain View, California. Andres Martinez, SPHERES manager at NASA, said, "We are researching how effective Project Tango's vision-based navigation abilities are for performing localization and navigation of a mobile free flyer on the ISS."
CHAPTER 9
FUTURE SCOPE
Project Tango seeks to take the next step in this mapping evolution. Instead of depending on the infrastructure, expertise, and tools of others to provide maps of the world, Tango empowers users to build their own understanding, all with a phone. Imagine knowing your exact position to within inches. Imagine building 3D maps of the world in parallel with other users around you. Imagine being able to track not just the top-down location of a device, but also its full 3D position and alignment. The technology is ambitious, and the potential applications are powerful. The Tango device really enables augmented reality, which opens a
whole new frontier for playing games in the scenery around you. You can capture the room and then render a scene that includes the room but also adds characters and objects, so you can create games that operate in your natural environment. The applications go beyond gaming. Imagine if you could see what a room would look like decorated with different types of furniture and walls, creating a very realistic scene. This technology can be used to guide the visually impaired, giving them auditory cues about where they are going. It can even be used by soldiers to replicate a war zone and prepare for combat, or to live out one's own creative fantasies. The possibilities are endless for this amazing technology, and the future is looking very bright.
Things Project Tango can do:
DIRECTIONS:
Tango can provide directions inside a building or structure where current mapping solutions just don't work. Consider shoppers who just want to get in and out as quickly as possible: having an indoor map of the store in your hand could make shopping trips more efficient by leading you directly to the shelf you want.
EMERGENCY RESPONSE:
Tango could help emergency response workers such as firefighters find their way through buildings by projecting the blueprints onto the screen. It has the potential to provide valuable information in situations where knowing the exact layout of a room can be a matter of life or death.
AUGMENTED REALITY GAMING:
Tango could combine room mapping with augmented reality. Imagine competing against a friend for control over territories in your own home with your own miniature army. Mapping in-game textures onto your real walls through the smartphone would arguably produce the best game of Cops and Robbers in history.
CHAPTER 10
CONCLUSION
Project Tango enables apps to track a device's position and orientation within a detailed 3D environment, and to recognize known environments. This makes possible applications such as in-store navigation, visual measurement and mapping utilities, presentation and design tools, and a variety of immersive games.
At this moment, Tango is just a project, but it is developing quite rapidly, with early prototypes and development kits already distributed among many developers. It is up to the developers now to create more clever and innovative apps that take advantage of this technology. This is just the beginning, and there is a lot of work to do to fine-tune this amazing technology. Thus, if Project Tango works (and we have no reason to suspect it won't) it could prove every bit as revolutionary as Maps, Earth, or Android. It just might take a while for its true genius to become clear.
CHAPTER 11
REFERENCES
 Announcement on ATAP Google, 30 January 2015.
 "Future Phones Will Understand, See the World", 3 June 2015, retrieved 4 November 2015.
 "Slamdance: inside the weird virtual reality of Google's Project Tango", 29 May 2015.
 "Qualcomm Powers Next Generation Project Tango Development Platform", 29 May 2015.
 "IDF 2015: Intel teams with Google to bring RealSense to Project Tango", 18 August 2015.
 Google developer website: https://developers.google.com/project-tango/
 Product announcement on ATAP Google, 5 June 2014, retrieved 4 November 2015.
 Project Tango website: https://www.google.com/atap/project-tango/
Weitere Àhnliche Inhalte

Was ist angesagt?

google glass
google glassgoogle glass
google glass
Akash Senger
 
Project glass ieee document
Project glass ieee documentProject glass ieee document
Project glass ieee document
bhavyakishore
 

Was ist angesagt? (6)

Ppt on Google glass
Ppt on Google glassPpt on Google glass
Ppt on Google glass
 
Siggraph 2016 - Vulkan and nvidia : the essentials
Siggraph 2016 - Vulkan and nvidia : the essentialsSiggraph 2016 - Vulkan and nvidia : the essentials
Siggraph 2016 - Vulkan and nvidia : the essentials
 
google glass
google glassgoogle glass
google glass
 
Augmented reality technical presentation
 Augmented reality technical presentation Augmented reality technical presentation
Augmented reality technical presentation
 
Dissecting the Rendering of The Surge
Dissecting the Rendering of The SurgeDissecting the Rendering of The Surge
Dissecting the Rendering of The Surge
 
Project glass ieee document
Project glass ieee documentProject glass ieee document
Project glass ieee document
 

Ähnlich wie Google''s Project Tango

Google glass
Google glass Google glass
Google glass
Amith
 
CMPE- 280-Research_paper
CMPE- 280-Research_paperCMPE- 280-Research_paper
CMPE- 280-Research_paper
Sanjeedha Sanofer
 

Ähnlich wie Google''s Project Tango (20)

Tango
TangoTango
Tango
 
Google's project tango seminar ppt
Google's project tango seminar pptGoogle's project tango seminar ppt
Google's project tango seminar ppt
 
google tango technology ppt
google tango technology pptgoogle tango technology ppt
google tango technology ppt
 
Aijaz tango
Aijaz tangoAijaz tango
Aijaz tango
 
Google project tango - Giving mobile devices a human scale understanding of s...
Google project tango - Giving mobile devices a human scale understanding of s...Google project tango - Giving mobile devices a human scale understanding of s...
Google project tango - Giving mobile devices a human scale understanding of s...
 
Google Project Tango
Google Project TangoGoogle Project Tango
Google Project Tango
 
Presentation on Google Tango By Atharva Jawalkar
Presentation on Google Tango By Atharva Jawalkar Presentation on Google Tango By Atharva Jawalkar
Presentation on Google Tango By Atharva Jawalkar
 
Project tango
Project tangoProject tango
Project tango
 
Google glass
Google glassGoogle glass
Google glass
 
Project Tango
Project TangoProject Tango
Project Tango
 
Seminar report on google glass
Seminar report on google glassSeminar report on google glass
Seminar report on google glass
 
Seminar report on Google Glass, Blu-ray & Green IT
Seminar report on Google Glass, Blu-ray & Green ITSeminar report on Google Glass, Blu-ray & Green IT
Seminar report on Google Glass, Blu-ray & Green IT
 
Google glass
Google glassGoogle glass
Google glass
 
google tango technology Seminar report
google tango technology Seminar reportgoogle tango technology Seminar report
google tango technology Seminar report
 
Googleglassppt 140825105830-phpapp02
Googleglassppt 140825105830-phpapp02Googleglassppt 140825105830-phpapp02
Googleglassppt 140825105830-phpapp02
 
Google glass
Google glass Google glass
Google glass
 
CMPE- 280-Research_paper
CMPE- 280-Research_paperCMPE- 280-Research_paper
CMPE- 280-Research_paper
 
Google Glass ppt
Google Glass pptGoogle Glass ppt
Google Glass ppt
 
Mitchell Reifel (pmdtechnologies ag): pmd Time-of-Flight – the Swiss Army Kni...
Mitchell Reifel (pmdtechnologies ag): pmd Time-of-Flight – the Swiss Army Kni...Mitchell Reifel (pmdtechnologies ag): pmd Time-of-Flight – the Swiss Army Kni...
Mitchell Reifel (pmdtechnologies ag): pmd Time-of-Flight – the Swiss Army Kni...
 
Projectglassppt 130418102721-phpapp01
Projectglassppt 130418102721-phpapp01Projectglassppt 130418102721-phpapp01
Projectglassppt 130418102721-phpapp01
 

KĂŒrzlich hochgeladen

Vishram Singh - Textbook of Anatomy Upper Limb and Thorax.. Volume 1 (1).pdf
Vishram Singh - Textbook of Anatomy  Upper Limb and Thorax.. Volume 1 (1).pdfVishram Singh - Textbook of Anatomy  Upper Limb and Thorax.. Volume 1 (1).pdf
Vishram Singh - Textbook of Anatomy Upper Limb and Thorax.. Volume 1 (1).pdf
ssuserdda66b
 
Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...
Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...
Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...
ZurliaSoop
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
QucHHunhnh
 
Spellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please PractiseSpellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please Practise
AnaAcapella
 

KĂŒrzlich hochgeladen (20)

ICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptxICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptx
 
Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibit
 
Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)
 
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
 
How to Create and Manage Wizard in Odoo 17
How to Create and Manage Wizard in Odoo 17How to Create and Manage Wizard in Odoo 17
How to Create and Manage Wizard in Odoo 17
 
General Principles of Intellectual Property: Concepts of Intellectual Proper...
General Principles of Intellectual Property: Concepts of Intellectual  Proper...General Principles of Intellectual Property: Concepts of Intellectual  Proper...
General Principles of Intellectual Property: Concepts of Intellectual Proper...
 
FSB Advising Checklist - Orientation 2024
FSB Advising Checklist - Orientation 2024FSB Advising Checklist - Orientation 2024
FSB Advising Checklist - Orientation 2024
 
Vishram Singh - Textbook of Anatomy Upper Limb and Thorax.. Volume 1 (1).pdf
Vishram Singh - Textbook of Anatomy  Upper Limb and Thorax.. Volume 1 (1).pdfVishram Singh - Textbook of Anatomy  Upper Limb and Thorax.. Volume 1 (1).pdf
Vishram Singh - Textbook of Anatomy Upper Limb and Thorax.. Volume 1 (1).pdf
 
Making communications land - Are they received and understood as intended? we...
Making communications land - Are they received and understood as intended? we...Making communications land - Are they received and understood as intended? we...
Making communications land - Are they received and understood as intended? we...
 
Key note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfKey note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdf
 
Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...
Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...
Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
 
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptxHMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
 
ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701
 
SKILL OF INTRODUCING THE LESSON MICRO SKILLS.pptx
SKILL OF INTRODUCING THE LESSON MICRO SKILLS.pptxSKILL OF INTRODUCING THE LESSON MICRO SKILLS.pptx
SKILL OF INTRODUCING THE LESSON MICRO SKILLS.pptx
 
Spellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please PractiseSpellings Wk 3 English CAPS CARES Please Practise
Spellings Wk 3 English CAPS CARES Please Practise
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdf
 
Understanding Accommodations and Modifications
Understanding  Accommodations and ModificationsUnderstanding  Accommodations and Modifications
Understanding Accommodations and Modifications
 
Single or Multiple melodic lines structure
Single or Multiple melodic lines structureSingle or Multiple melodic lines structure
Single or Multiple melodic lines structure
 
Fostering Friendships - Enhancing Social Bonds in the Classroom
Fostering Friendships - Enhancing Social Bonds  in the ClassroomFostering Friendships - Enhancing Social Bonds  in the Classroom
Fostering Friendships - Enhancing Social Bonds in the Classroom
 

Google''s Project Tango

  • 1. PROJECT TANGO Department of CSE Page 1 CHAPTER 1 INTRODUCTION 3D models represent a 3D object using a collection of points in a given 3D space, connected by various entities such as curved surfaces, triangles, lines, etc.Being a collection of data which includes points and other information, 3D models can be created by hand, scanned (procedural modeling), or algorithmically. The "Project Tango" prototype is an Android smartphone-like device which tracks the 3D motion of particular device, and creates a 3D model of the environment around it. Project Tango was introduced by Google initially in early 2013,they described this as a Simultaneous Localisation and Mapping(SLAM) system capable of operating in real-time on a phone.Google’s ATAP teamed up with a number of organizations to create Project Tango from this description. The team at Google’s Advanced Technology and Projects Group (ATAP) has been working with various Universities and Research labs to harvest ten years of research in Robotics and Computer Vision to concentrate that technology into a very unique mobile phone. We are physical being that live in a 3D world yet the mobile devices today assume that the physical world ends the boundaries of the screen. Project Tango’s goal is to give mobile devices a human scale understanding of space and motion. This project will help people interact with the environment in a fundamentally different way and using this technology we can prototype in a couple of hours something that would take us months or even years before because we did not have this technology readily available. Imagine having all this in a smartphone and see how things would change. This device runs Android and includes development APIs to provide alignment, position or location, and depth data to regular Android apps written in C/C++, Java as well as the Unity Game Engine(UGE). These early algorithms, prototypes, and APIs are still in active development. So, these are experimental devices and are intended only for the exploratory and adventurous are not a final shipping product. Project Tango is a prototype phone containing highly customized hardware and software designed to allow the phone to track its motion in full 3D in real-time. The sensors make over a quarter million 3D measurements every single second updating the position and
  • 2. PROJECT TANGO Department of CSE Page 2 rotation of the phone, blending this data into a single 3D model of the environment. It tracks ones position as one goes around the world and also makes a map of that. It can scan a small section of your room and then are able to generate a little game world in it. It is an open source technology. ATAP has around 200 development kits which has already been distributed among the developers. Google has produced two devices to demonstrate the Project Tango technology: the Peanut phone(no longer available) and the Yellowstone 7-inch tablet. More than 3000 of these devices had been sold as of June 2015,chiefly to researchers and software developers interested in building applications for the platform.In the summer of 2015,Qualcomm and Intel both announced that they are developing Project Tango reference devices as models for device manufacturers who use their mobile chipsets. At CES, in January 2016, Google announced a partnership with Lenovo to release a consumer smartphone during the summer of 2016 to feature Project Tango technology marketed at consumers, noting a less than $500 price-point and a small form factor below 6.5 inches. At the same time, both companies also announced an application incubator to get applications developed to be on the device on launch. Fig(1): Google’s Project Tango Logo
  • 3. PROJECT TANGO Department of CSE Page 3 Which companies are behind Project Tango? A number of companies came together to develop Project Tango.All of these are listed in the credit of Google Project Tango introduction video called “Say hello to Project Tango!”. Each companies has had a different amount of involvement. The following are the list of participating companies listed in that video:  Bosch  BSquare  ETH Zurich  Flyby Media  George Washington University  HiDOF  MMSolutions  Movidius  University of Minnesota  NASA JPL  Ologic  OmniVision  Open Source Robotic Foundation  ParaCosm  Sunny Optical Tech  Speck Design
  • 4. PROJECT TANGO Department of CSE Page 4 CHAPTER 2 OVERVIEW WHAT IS PROJECTTANGO? Tango allows a device to build up an accurate 3D model of it’s immediate surroundings, which Google says which is useful for everything from AR gaming to navigating large shopping centres. Fig(2): A view of Google’s Project Tango 3D Model Mapping Google isn’t content with making softwares for phones that can merely capture 2D photos and videos. Nor does it just want to take stereoscopic 3D snaps. Instead Project Tango is bid to equip every mobile device with a powerful suite of software and sensors that can capture a complete 3D picture of the world around it, in real time. Why? So
  • 5. PROJECT TANGO Department of CSE Page 5 you can map your house,furniture, and all simply by walking around it. Bingo-no more measuring up before going shopping for a wardrobe. Or so you can avoid getting lost next time you go to the hospital – you’ll have instant access to a 3D plan of it’s labyrinthine corridors. Or so can play augmented reality games. Or so that can the visually impaired can receive extra help in getting around. WHAT DOES THE PHONE LOOK LIKE? There are two prototypes of Tango phone yet. A 7 inch tablet and another prototype of a 5 inch phone. Fig(3): Prototype 1 It is a fairly standard 7 inch slate with a slight wedge at the back to accommodate the extre sensors. As far as we can tell, it has three cameras including the webcam. Inside it has one Nvidia’s so-far-untested Tegra K1 mobile processors with a beefy 4GB of RAM and a 128GB SSD. Google is at pains to point out that it’s not a consumer device, but one is supposedly on the way. The depth-sensing array consists of an infrared projector, 4MB rear camera and front-facing fisheye view lens with 180- degree field of vision. Physically, it’s a standard phone shape but rather chunky compared to the class of 2014. More like something about 2010.
  • 6. PROJECT TANGO Department of CSE Page 6 Fig(4): Prototype 2 Prototype 2 is an android 5 inch smartphone with same tango hardware as that of the tablet.
  • 7. PROJECT TANGO Department of CSE Page 7 Fig(5): A simple Overview of Components of Tango Phone Google's Project Tango is a smartphone equipped with a variety of cameras and vision sensors that provides a whole new perspective on the world around it. The Tango smartphone can capture a wealth of data never before available to application developers, including depth and object-tracking and instantaneous 3D mapping. And it is almost as powerful and as big as a typical smartphone. Project Tango is different from other emerging 3D-sensing computer vision products, such as Microsoft Hololens, in that it's designed to run on a standalone mobile phone or tablet and is chiefly concerned with determining the device's position and orientation within the environment. The high-end Android tablet with 7-inch HD display, 4GB of RAM, 128GB of internal SSD storage and an NVIDIA Tegra K1 graphics chip (the first in the US and second in the world) that features desktop GPU architecture. It also has a distinctive design that consists of an array of cameras and sensors near the top and a couple of subtle grips on the sides. Movidius which is the company that developed some of the technology which has been used in Tango has been working on computer vision technology for the past seven years - it developed the processing chips used in Project Tango, which Google paired with sensors and cameras to give the smartphone the same level of computer vision and tracking that formerly
  • 8. PROJECT TANGO Department of CSE Page 8 required much larger equipment. The phone is equipped with a standard 4-megapixel camera paired with a special combination of RGB and IR sensor and a lower-resolution image- tracking camera. These combos of image sensors give the smartphone a similar perspective on the world, complete with 3-D awareness and a awareness of depth. They supply information to Movidius custom Myriad 1 low- power computer-vision processor, which can then process the data and feed it to apps through a set of APIs. The phone also contains a Motion Tracking camera which is used to keep track of all the motions made by the user. CHAPTER 3 SMARTPHONE SPECIFICATION Tango wants to deconstruct reality, taking a quarter million 3D measurements each second to create a real-time 3D model that describes the physical depth of its surroundings. The smartphone specs are
  • 9. PROJECT TANGO Department of CSE Page 9 The above specs include Snapdragon 800 quad core CPU running up to 2.3 GHz per core, 2GB or 4GB of memory, an expandable 64GB or 128 of internal storage, and a nine axis accelerometer/gyroscope/compass. There’s also a Mini-USB, a Micro-USB, and USB 3.0. In addition to above specs Tango’s specs also include: a rear-facing four megapixel RGB/infrared camera, a 180-degree field-of-view fisheye rear-facing camera, a 120-degree field- of-view front facing camera, and a 320 x 180 depth sensor – plus a vision processor with one teraflop of computer power. Project Tango uses a 3000 mAh battery. CHAPTER 4 HARDWARE Project Tango is basically a camera and sensor array that happens to run on an Android phone. The smartphone is equipped with a variety of cameras and vision sensors that provides a whole new perspective on the world around it. The Tango smartphone can capture a wealth of data never before available to application developers, including depth and object-
  • 10. PROJECT TANGO Department of CSE Page 10 tracking and instantaneous 3D mapping. And it is almost as powerful and as big as a typical smartphone. The Front View and Back View of a Tango Phone is shown below. It is same like some other phones but the phone is having variety of cameras and sensors that make the 3D modelling of the environment possible. Fig (6) Tango Phone Front View The device tracks the 3D motion and creates a 3D model of the environment around it by using the array of cameras and sensors. The phone emits pulses of infrared light from the IR projector and records how it is reflected back allowing it to build a detailed depth map of the surrounding space. There are three cameras that capture a 120-degree wide-angle field of view. 3D camera captures the 3D structure of a scene. Most cameras are 2D, meaning they are a projection of the scene onto the camera's imaging plane; any depth information is lost. In contrast, a 3D camera also captures the depth dimension (in addition to the standard 2D data).A rear-facing four megapixel RGB/infrared camera, a 180-degree field-of-view fisheye rear-facing camera, a 120-degree field- of-view front facing camera, and a 320 x 180 depth sensor are the components of the phone at the rear end that works together to give the 3D structure of the scene.
  • 11. PROJECT TANGO Department of CSE Page 11 Fig (7) Tango Phone Back View Project Tango, which Google paired with sensors and cameras to give the smartphone the same level of computer vision and tracking that formerly required much larger equipment. The phone is equipped with a standard 4-megapixel camera paired with a special combination of RGB and IR sensor and a lower-resolution image-tracking camera. These combos of image sensors give the smartphone a similar perspective on the world, complete with 3-D awareness and a awareness of depth. They supply information to Movidius custom Myriad 1 low-power computer-vision processor, which can then process the data and feed it to apps through a set of APIs. The phone also contains a Motion Tracking camera which is used to keep track of all the motions made by the user. The phone is equipped with a standard 4-megapixel camera paired with a special combination of RGB and IR sensor and a lower-resolution image-tracking camera..As the main camera, the Tango uses OmniVision’s OV4682. It is the eye of Project Tango’s mobile device. The OV4682 is a 4MP RGB IR image sensor that captures high-resolution images and video as well as IR information, enabling depth analysis.
  • 12. PROJECT TANGO Department of CSE Page 12 Fig(9):Front and rear camera Fig(10):Fisheye camera Fig(11):IR projector Integrated depth sensor
  • 13. PROJECT TANGO Department of CSE Page 13 CHAPTER 5 TECHNOLOGY BEHIND TANGO 5.1 TANGO’S SENSOR Myriad 1 vision processor platform developed by Movidius Company. The sensors allow the device to make "over a quarter million 3D measurements every second, updating its position and orientation in real time, combining that data into a single 3D model of the space around you. Movidius which is the company that developed some of the technology which has been used in Tango has been working on computer vision technology for the past seven years - it developed the processing chips used in Project Tango, which Google paired with sensors and cameras to give the smartphone the same level of computer vision and tracking that formerly required much larger equipment. 5.2 IMAGE SENSORS Image sensors give the smartphone a similar perspective on the world, complete with 3-D awareness and a awareness of depth which is then supplied information to Movidius custom Myriad 1 low-power computer-vision processor, which can then process the data and feed it to apps through a set of APIs. The Motion Tracking camera keeps track of all the motions made by the user. . There are three cameras that capture a 120-degree wide-angle field of view from the front. An even wider 180 degree span from the back.The phone is equipped with a standard 4-megapixel camera paired with a special combination of RGB and IR sensor and a lower-resolution image-tracking camera. Its depth-sensing array consists of an infrared projector, 4MP rear camera and front-facing fisheye view lens with 180-degree field of vision. The phone emits pulses of infrared light from the IR projector and records how it is reflected back allowing it to build a detailed depth map of the surrounding space. The data
  • 14. PROJECT TANGO Department of CSE Page 14 collected from sensors and camera is processed by the Myriad vision processor for delivering 3D structure of the view to the apps. CHAPTER 6 WORKING CONCEPT Project Tango devices combine the camera, gyroscope and accelerometer to estimate six degrees of freedom motion tracking, providing developers the ability to track 3D motion of a device while simultaneously creating a map of the environment. An IR projector provides infrared light that other (non-RGB) cameras can use to get a sense of an area in 3D space. The phone emits pulses of infrared light from the IR projector and records how it is reflected back allowing it to build a detailed depth map of the surrounding space. There are three cameras that capture a 120-degree wide-angle field of view from the front. An even wider 180 degree span from the back. A 4-MP color camera sensor can also be used for snapping regular pics. A 3D camera captures the 3D structure of a scene. Most cameras are 2D, meaning they are a projection of the scene onto the camera's imaging plane; any depth information is lost. In contrast, a 3D camera also captures the depth dimension (in addition to the standard 2D data). The main camera, the Tango uses OmniVision’s OV4682. It is the eye of Project Tango’s mobile device. The OV4682 is a 4MP RGB IR image sensor that captures high- resolution images and video as well as IR information, enabling depth analysis. The sensor features a 2um OmniBSI-2 pixel and records 4MP images and video in a 16:9 format at 90fps. The sensor's 2-micron OmniBSI-2 pixel delivers excellent signal-to-noise ratio and IR sensitivity, and offers best-in-class low-light sensitivity. The OV4682's unique architecture and pixel optimization bring not only the best IR performance but also best-in-class image quality. The OV4682 records full-resolution 4-megapixel video in a native 16:9 format at 90 frames per second (fps), with a quarter of the pixels dedicated to capturing IR. The 1/3- inch sensor can also record 1080p high definition (HD) video at 120 fps with electronic image stabilization (EIS), or 720p HD at 180 fps. The OV7251 Camera Chip sensor is capable of
  • 15. PROJECT TANGO Department of CSE Page 15 capturing VGA resolution video at 100fps using a global shutter. RGB infrared (IR) single sensor that captures high-resolution images and video as well as IR information. Its dual RGB and IR capabilities allow it to bring a host of additional features to mobile and machine vision applications, including gesture sensing, depth analysis, iris detection and eye tracking. The another camera is fisheye lens enables a 180Âș FOV, while the sensor balances resolution and frames per second to record black and white images for motion tracking. So if the users moves the devices left or right, it draws the path that the devices and that path followed is show in the image on the right in real-time. Thus through this we have a motion capture capabilities in our device. The device also has a depth sensor. Fig(12):The image represents the feed from the fish-eye lens Fig(13):Computer vision
  • 16. PROJECT TANGO Department of CSE Page 16 The figure above illustrates depth sensing by displaying a distance heat map on top of what the camera sees, showing blue colors on distant objects and red colors on close by objects. It also the data from the image sensors and paired with the device's standard motion sensors and gyroscopes to map out paths of movement down to 1 percent accuracy and then plot that onto an interactive 3D map. It uses the Sensor fusion technology which combines sensory data or data derived from sensory data from disparate sources such that the resulting information is in some sense better than would be possible when these sources were used separately. Thus it means a more precise, more comprehensive, or more reliable, or refer to the result of an emerging view, such as stereoscopic vision. These combos of image sensors give the smartphone a similar perspective on the world, complete with 3-D awareness and a awareness of depth. They supply information to Movidius custom Myriad 1 low-power computer-vision processor, which can then process the data and feed it to apps through a set of APIs. The phone also contains a Motion Tracking camera which is used to keep track of all the motions made by the user. Mantis Vision, a developer of some of the world's most advanced 3D enabling technologies research MV4D technology platform is the core 3D engine behind Google's Project Tango. Mantis Vision provides the 3D sensing platform, consisting of flash projector hardware components and Mantis Vision's core MV4D technology which includes structured light-based depth sensing algorithms which generates realistic, dense maps of the world. It focuses to provide reliable estimates of the pose of a phone i.e. position and alignment, relative to its environment, dense maps of the world. It focuses to provide reliable estimates of the pose of a phone (position and alignment), relative to its environment.
CHAPTER 7
PROJECT TANGO CONCEPTS

Project Tango differs from other emerging 3D-sensing computer vision products, such as Microsoft HoloLens, in that it is designed to run on a standalone mobile phone or tablet and is chiefly concerned with determining the device's position and orientation within the environment. The software works by integrating three types of functionality:

7.1 Motion Tracking

Motion tracking allows a device to understand its position and orientation using Project Tango's custom sensors, giving you real-time information about the device's 3D motion. Visual features of the environment, in combination with accelerometer and gyroscope data, are used to closely track the device's movements in space. Measuring movement through space and understanding the area moved through is Project Tango's core functionality. Google's APIs provide the position and orientation of the user's device in full six degrees of freedom, referred to as its pose.
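As a concrete illustration, the sketch below shows how an app might subscribe to pose updates through the Tango Java API. The class and constant names follow the published com.google.atap.tangoservice client library, but the exact signatures should be treated as assumptions, and the code only runs on a Tango device:

    // Sketch of receiving 6DoF pose updates through the Tango Java API.
    // Names follow the com.google.atap.tangoservice client library; treat
    // exact signatures as assumptions rather than a definitive implementation.
    import java.util.ArrayList;
    import com.google.atap.tangoservice.Tango;
    import com.google.atap.tangoservice.TangoConfig;
    import com.google.atap.tangoservice.TangoCoordinateFramePair;
    import com.google.atap.tangoservice.TangoPoseData;

    public class MotionTrackingSketch {
        void startTracking(android.content.Context context) {
            final Tango tango = new Tango(context);
            TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
            config.putBoolean(TangoConfig.KEY_BOOLEAN_MOTIONTRACKING, true);
            tango.connect(config);

            // Ask for the device pose relative to where the service started.
            ArrayList<TangoCoordinateFramePair> framePairs = new ArrayList<>();
            framePairs.add(new TangoCoordinateFramePair(
                    TangoPoseData.COORDINATE_FRAME_START_OF_SERVICE,
                    TangoPoseData.COORDINATE_FRAME_DEVICE));

            tango.connectListener(framePairs, new Tango.TangoUpdateCallback() {
                @Override
                public void onPoseAvailable(TangoPoseData pose) {
                    // pose.translation holds x/y/z in meters; pose.rotation is a
                    // quaternion (x, y, z, w): together, the six degrees of freedom.
                    double x = pose.translation[0];
                    double y = pose.translation[1];
                    double z = pose.translation[2];
                    android.util.Log.d("Tango", "pose: " + x + ", " + y + ", " + z);
                }
            });
        }
    }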
Fig (14): Motion Tracking

7.2 Area Learning

Using area learning, a Project Tango device can remember the visual features of the area it is moving through and recognize when it sees those features again. These features can be saved in an Area Description File (ADF) for later use. Project Tango devices use these visual cues to help recognize the world around them: with an ADF loaded, a device gains drift correction, also described as improved motion tracking. Area learning is thus a way of storing environment data in a map that can be re-used later, shared with other Project Tango devices, and enhanced with metadata such as notes, instructions, or points of interest.
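A minimal sketch of the area-learning flow through the Tango Java API follows: enable learning mode, let the user walk the space, then persist the result as an ADF. Names match the published client library, but exact signatures are assumptions:

    // Sketch of area learning with the Tango Java API: enable learning mode,
    // let the user walk the space, then save an Area Description File (ADF).
    // Names follow the client library; exact signatures are assumptions.
    import com.google.atap.tangoservice.Tango;
    import com.google.atap.tangoservice.TangoConfig;

    public class AreaLearningSketch {
        String learnAndSave(Tango tango) {
            TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
            config.putBoolean(TangoConfig.KEY_BOOLEAN_MOTIONTRACKING, true);
            config.putBoolean(TangoConfig.KEY_BOOLEAN_LEARNINGMODE, true); // record features
            tango.connect(config);

            // ... user walks around so the device can observe visual features ...

            // Persist what was learned. The returned UUID identifies the ADF and
            // can later be passed back via KEY_STRING_AREADESCRIPTION to reload
            // it, enabling the drift correction described above.
            String adfUuid = tango.saveAreaDescription();
            return adfUuid;
        }
    }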
Fig (15): Area Learning

7.3 Depth Perception

Project Tango devices are equipped with integrated 3D sensors that measure the distance from the device to objects in the real world. This configuration gives good depth at a distance while balancing the power requirements of infrared illumination and depth processing.
The depth data allows an application to understand the distance from visible objects to the device. Current devices are designed to work best indoors at moderate distances (0.5 to 4 meters), and may not be ideal for close-range object scanning. Because the technology relies on viewing infrared light with the device's camera, there are some situations where accurate depth perception is difficult: areas lit with light sources high in IR, such as sunlight or incandescent bulbs, or objects that do not reflect IR light, cannot be scanned well. By combining depth perception with motion tracking, an application can also measure the distance between points in an area that are not in the same frame.

Fig (16): Depth Perception
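As an illustration of the measurement use case, depth frames arrive as flat buffers of (x, y, z) triples in meters, so the distance between two picked points is plain Euclidean geometry. The buffer layout below mirrors Tango's point-cloud format; the point-picking step is application-specific and omitted, and the sample values are hypothetical:

    // Minimal sketch of measuring with depth data: each depth frame is a
    // point cloud of (x, y, z) triples in meters. Given two picked points
    // (possibly from different frames, after transforming them into a common
    // frame using the pose data), their separation is the Euclidean distance.
    public final class DepthMeasureSketch {
        /** Reads point i from a flat (x, y, z, x, y, z, ...) depth buffer. */
        static float[] pointAt(float[] cloud, int i) {
            return new float[] { cloud[3 * i], cloud[3 * i + 1], cloud[3 * i + 2] };
        }

        /** Euclidean distance in meters between two 3D points. */
        static double distanceMeters(float[] a, float[] b) {
            double dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
            return Math.sqrt(dx * dx + dy * dy + dz * dz);
        }

        public static void main(String[] args) {
            // Two hypothetical points, e.g. opposite edges of a table top.
            float[] cloud = { 0.10f, 0.00f, 1.20f,   -0.65f, 0.02f, 1.45f };
            System.out.printf("%.2f m%n",
                    distanceMeters(pointAt(cloud, 0), pointAt(cloud, 1))); // ~0.79 m
        }
    }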
Together, these functions generate data about the device in six degrees of freedom (three axes of orientation plus three axes of motion) and detailed three-dimensional information about the environment. Applications on mobile devices use Project Tango's C and Java APIs to access this data in real time. In addition, an API is provided for integrating Project Tango with the Unity game engine, enabling the rapid conversion or creation of games in which the user interacts with and navigates the game space by moving and rotating a Project Tango device in real space. These APIs are documented on the Google developer website.

CHAPTER 8
DEVICES DEVELOPED SO FAR

As a platform for software developers and a model for device manufacturers, Google has created two Project Tango devices to date.

The Yellowstone tablet

"Yellowstone" is a 7-inch tablet with full Project Tango functionality, released in June 2014 and sold as the Project Tango Tablet Development Kit. It features a 2.3 GHz quad-core Nvidia Tegra K1 processor, 128 GB flash memory, a 1920x1200-pixel touchscreen, a 4 MP color camera, a fisheye-lens (motion-tracking) camera, integrated depth sensing, and 4G LTE connectivity. The device is sold through the official Project Tango website and the Google Play Store.
The Peanut phone

"Peanut" was the first production Project Tango device, released in the first quarter of 2014. It was a small Android phone with a Qualcomm MSM8974 quad-core processor and additional special hardware, including a fisheye-lens camera (for motion tracking), an RGB-IR camera (for color images and infrared depth detection), and Movidius image-processing chips. A high-performance accelerometer and gyroscope were added after testing several competing models in the MARS lab at the University of Minnesota.

Several hundred Peanut devices were distributed to early-access partners, including university researchers in computer vision and robotics as well as application developers and technology companies. Google stopped supporting the Peanut device in September 2015, as by then the Project Tango software stack had evolved beyond the versions of Android that run on the device.

Testing by NASA

In May 2014, two Peanut phones were delivered to the International Space Station as part of a NASA project to develop autonomous robots that navigate in a variety of environments, including outer space. The soccer-ball-sized, 18-sided polyhedral SPHERES robots were developed at the NASA Ames Research Center, adjacent to the Google campus in Mountain View, California. Andres Martinez, SPHERES manager at NASA, said, "We are researching how effective Project Tango's vision-based navigation abilities are for performing localization and navigation of a mobile free flyer on ISS."
CHAPTER 9
FUTURE SCOPE

Project Tango seeks to take the next step in this mapping evolution. Instead of depending on the infrastructure, expertise, and tools of others to provide maps of the world, Tango empowers users to build their own understanding, all with a phone. Imagine knowing your exact position to within inches. Imagine building 3D maps of the world in parallel with other users around you. Imagine being able to track not just the top-down location of a device, but its full 3D position and alignment. The technology is ambitious, and the potential applications are powerful. The Tango device enables augmented reality, which opens a
whole frontier for playing games in the scenery around you. You can capture the room, then render a scene that includes the room but also adds characters and objects, creating games that operate in your natural environment. The applications go beyond gaming. Imagine seeing what a room would look like decorated with different furniture and walls, rendered as a very realistic scene. The technology could guide the visually impaired by giving them auditory cues about where they are going, be used by soldiers to replicate a war zone and prepare for combat, or even be used to live out one's own creative fantasies. The possibilities for this technology are endless, and its future looks very bright.

Things Project Tango can do:

DIRECTIONS: Providing directions inside a building or structure that current mapping solutions just don't cover. For shoppers who like to get in and out as quickly as possible, having an indoor map of the store in hand could make trips more efficient by leading you directly to the shelf you want.

EMERGENCY RESPONSE: Helping emergency response workers such as firefighters find their way through buildings by projecting the blueprints onto the screen. It has the potential to provide valuable information in situations where knowing the exact layout of a room can be a matter of life or death.

AUGMENTED REALITY GAMING: Combining room mapping with augmented reality. Imagine competing against a friend for control over territories in your own home with your own miniature army. Mapping in-game textures onto your real walls through the smartphone would arguably produce the best game of Cops and Robbers in history.
CHAPTER 10
CONCLUSION

Project Tango enables apps to track a device's position and orientation within a detailed 3D environment, and to recognize known environments. This makes possible applications such as in-store navigation, visual measurement and mapping utilities, presentation and design tools, and a variety of immersive games.
At this moment, Tango is just a project, but it is developing rapidly, with early prototypes and development kits already distributed among many developers. It is now up to developers to create clever and innovative apps that take advantage of this technology. This is just the beginning, and there is a lot of work to do to fine-tune this amazing technology. Thus, if Project Tango works as intended, and we have no reason to suspect it won't, it could prove every bit as revolutionary as Maps, Earth, or Android. It just might take a while for its true genius to become clear.

CHAPTER 11
REFERENCES

• Announcement on ATAP Google, 30 January 2015.
• "Future Phones Will Understand, See the World", 3 June 2015, retrieved 4 November 2015.
• "Slamdance: inside the weird virtual reality of Google's Project Tango", 29 May 2015.
• Qualcomm Powers Next Generation Project Tango Development Platform, 29 May 2015.
• IDF 2015: Intel teams with Google to bring RealSense to Project Tango, 18 August 2015.
• Google developer website: https://developers.google.com/project-tango/
• Product announcement on ATAP Google, 5 June 2014, retrieved 4 November 2015.
• Project Tango website: https://www.google.com/atap/project-tango/