JP INSTITUTE OF ENGINEERING & TECHNOLOGY,
MEERUT
TANGO TECHNOLOGY
Submitted in partial fulfillment of the requirements
for the award of the degree of
Bachelor of Technology
in
Computer Science & Engineering
Submitted by:
RUPESH KUMAR (1628210067)
Under The Guidance of
Mr. Varun Pundir
(Asst. Prof., CSE)
DR. A.P.J. ABDUL KALAM TECHNICAL UNIVERSITY,
LUCKNOW, UTTAR PRADESH
Session: 2019-20
TABLE OF CONTENTS
DECLARATION
CERTIFICATE
ACKNOWLEDGEMENTS
ABSTRACT
LIST OF FIGURES
LIST OF SYMBOLS
LIST OF ABBREVIATIONS
1. INTRODUCTION
2. OVERVIEW
3. SMARTPHONE SPECIFICATION
4. HARDWARE
5. TECHNOLOGY BEHIND TANGO
5.1 TANGO SENSOR
5.2 IMAGE SENSOR
6. WORKING CONCEPT
7. PROJECT TANGO CONCEPTS
7.1 MOTION TRACKING
7.2 AREA LEARNING
7.3 DEPTH PERCEPTION
8. DEVICES DEVELOPED SO FAR
8.1 THE YELLOWSTONE TABLET
8.2 THE PEANUT PHONE
8.3 TESTING BY NASA
8.4 INTEL REALSENSE SMARTPHONE
8.5 LENOVO PHAB 2 PRO
8.6 ASUS ZENFONE AR
9. FUTURE SCOPE
10. CONCLUSION
11. REFERENCES
DECLARATION
I hereby declare that this submission is my own work and that, to the best of my
knowledge and belief, it contains no material previously published or written by
another person, nor material which to a substantial extent has been accepted for
the award of any other degree or diploma of the university or other institute of
higher learning, except where due acknowledgment has been made in the text.
Signature: Rupesh Kumar
Name: RUPESH KUMAR
Roll No.: 1628210067
Date: 16/06/2020
CERTIFICATE
This is to certify that this Project Report entitled “TANGO TECHNOLOGY”,
which is submitted by Rupesh Kumar (1628210067) in partial fulfillment of the
requirements for the award of the degree of Bachelor of Technology in the
Department of Computer Science & Engineering of JP INSTITUTE OF ENGI-
NEERING & TECHNOLOGY, Meerut, affiliated to DR. A.P.J. ABDUL KALAM
TECHNICAL UNIVERSITY, Lucknow, is carried out by him under my supervision.
The matter embodied in this Project Work has not been submitted earlier for
award of any degree or diploma in any university/institution to the best of our
knowledge and belief.
(Mr. Varun Pundir) (Mr. Sreesh Gaur)
Project Guide Head (CSE)
Date: 16/06/2020
ACKNOWLEDGEMENT
It gives us a great sense of pleasure to present the report of the Technology
Seminar, undertaken during the B.Tech. final year. We owe a special debt of grati-
tude to Mr. Varun Pundir, Department of Computer Science & Engineering, JP
Institute of Engineering & Technology, Meerut, for his constant support and
guidance throughout the course of our work. His sincerity, thoroughness and
perseverance have been a constant source of inspiration for us. It is only because
of his cognizant efforts that our endeavors have seen the light of day.
We also take the opportunity to acknowledge the contribution of Mr.
Sheesh Gaur, HOD, Department of Computer Science & Engineering, JP Insti-
tute of Engineering & Technology, Meerut for his full support and assistance
during the development of the project.
We would also not like to miss the opportunity to acknowledge the contribution
of all faculty members of the department for their kind assistance and coopera-
tion during the development of our project.
Last but not least, we acknowledge our friends for their contribution to
the completion of the project.
Signature: Rupesh Kumar
Name : RUPESH KUMAR
Roll No.: 1628210067
Date : 16/06/2020
ABSTRACT
Tango (formerly named Project Tango, while in testing) was an augmented
reality computing platform, developed by the Advanced Technology and
Projects (ATAP) group, a skunkworks division of Google. It used computer vision to
enable mobile devices, such as smartphones and tablets, to detect their position relative
to the world around them without using GPS or other external signals. This allowed ap-
plication developers to create user experiences that include indoor navigation, 3D map-
ping, physical space measurement, environmental recognition, augmented reality, and
windows into a virtual world.
At CES in January 2016, Google announced a partnership with Lenovo to release a
consumer smartphone featuring Tango technology in the summer of 2016, noting a
price point below $500 and a form factor under 6.5 inches. At the same time, both
companies announced an application incubator so that applications would be ready
on the device at launch.
LIST OF FIGURES
Fig (1) Google’s Project Tango Logo
Fig (2) A view of Google’s Project Tango 3D Model Mapping
Fig (3) Prototype 1 (internal view of the phone)
Fig (4) Prototype 2 (external view of the phone)
Fig (5) A simple Overview of Components of Tango Phone
Fig (6) Tango Phone Front View
Fig (7) Tango Phone Back View
Fig (8) Tango phone Motherboard
Fig (9) Front and Rear Camera
Fig (10) Fisheye Camera
Fig (11) IR Projector Integrated Depth Sensor
Fig (12) The image represents the feed from the fish-eye lens
Fig (13) Computer Vision
Fig (14) Motion Tracking
Fig (15) Area Learning
Fig (16) Depth Perception
Fig (17) The Yellowstone tablet
Fig (18) Intel RealSense smartphone
Fig (19) Lenovo Phab 2 Pro
Fig (20) Asus Zenfone AR
Fig (21) Simple Model
LIST OF SYMBOLS
[x] Integer value of x.
≠ Not Equal
∈ Belongs to
€ Euro (currency)
_ Optical distance
_o Optical thickness or optical half thickness
LIST OF ABBREVIATIONS
SLAM Simultaneous Localization and Mapping
ATAP Advanced Technology and Projects Group
UGE Unity Game Engine
CES Consumer Electronics Show
RAM Random Access Memory
SSD Solid State Drive
API Application Program Interface
CPU Central Processing Unit
USB Universal Serial Bus
FOV Field Of View
ADF Area Description File
LTE Long-Term Evolution
VGA Video Graphics Array
GPS Global Positioning System
SDK Software Development Kit
1. INTRODUCTION
3D models represent a 3D object using a collection of points in a given 3D space, connected
by various entities such as curved surfaces, triangles, and lines. Being a collection of data
which includes points and other information, 3D models can be created by hand,
algorithmically (procedural modeling), or by scanning. The "Project Tango" prototype is an
Android smartphone-like device which tracks the full 3D motion of the device and creates a
3D model of the environment around it.
Google first introduced Project Tango in early 2013, describing it as a Simultaneous
Localization and Mapping (SLAM) system capable of operating in real time on a phone.
Google’s ATAP teamed up with a number of organizations to build Project Tango from
this description.
The team at Google’s Advanced Technology and Projects Group (ATAP) has been
working with various universities and research labs to harvest ten years of research in
robotics and computer vision, concentrating that technology into a very unique mobile
phone. We are physical beings that live in a 3D world, yet mobile devices today assume
that the physical world ends at the boundaries of the screen. Project Tango’s goal is to give
mobile devices a human-scale understanding of space and motion. This project will help
people interact with the environment in a fundamentally different way: with this technology
we can prototype in a couple of hours something that previously would have taken months or
even years, simply because the technology was not readily available. Imagine having all this
in a smartphone and see how things would change.
The first product to emerge from Google's ATAP skunkworks group [1], Project Tango was
developed by a team led by computer scientist Johnny Lee, a core contributor
to Microsoft's Kinect. In an interview in June 2015, Lee said, "We're developing
the hardware and software technologies to help everything and everyone understand precisely
where they are, anywhere.”
The device runs Android and includes development APIs that provide orientation, position,
and depth data to regular Android apps written in C/C++ and Java, as well as to the
Unity Game Engine (UGE). These early algorithms, prototypes, and APIs are still in active
development; the devices are experimental, intended only for the exploratory and
adventurous, and are not a final shipping product.
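As a minimal sketch of what attaching to these APIs looked like from an app's point of view, the following Java fragment binds to the Tango service and enables motion tracking and depth. Class and key names follow the Tango Java SDK as distributed to developers; exact signatures varied between SDK releases, so treat this as illustrative rather than definitive.

    import android.content.Context;
    import com.google.atap.tangoservice.Tango;
    import com.google.atap.tangoservice.TangoConfig;

    // Sketch: bind to the Tango service, then enable motion tracking and depth.
    public class TangoSetup {
        private Tango tango;

        public void connect(Context context) {
            // The Runnable fires once the service is bound and ready to configure.
            tango = new Tango(context, new Runnable() {
                @Override
                public void run() {
                    TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
                    config.putBoolean(TangoConfig.KEY_BOOLEAN_MOTIONTRACKING, true);
                    config.putBoolean(TangoConfig.KEY_BOOLEAN_DEPTH, true);
                    tango.connect(config);
                }
            });
        }

        public void disconnect() {
            if (tango != null) tango.disconnect();
        }
    }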
Project Tango technology gives a mobile device the ability to navigate the physical world
similar to how we do as humans. It brings a new kind of spatial perception to the Android
device platform by adding advanced computer vision, image processing, and special vision
sensors.
Project Tango is a prototype phone containing highly customized hardware and software
designed to allow the phone to track its motion in full 3D in real time. The sensors make
over a quarter million 3D measurements every single second, updating the position and
rotation of the phone and blending this data into a single 3D model of the environment. It
tracks one's position as one moves around the world and also builds a map of it. It can scan
a small section of a room and then generate a little game world in it. It is an open source
technology. ATAP has around 200 development kits which have already been distributed
among developers.
Google has produced two devices to demonstrate the Project Tango technology: the Peanut
phone (no longer available) and the Yellowstone 7-inch tablet. More than 3,000 of these
devices had been sold as of June 2015, chiefly to researchers and software developers
interested in building applications for the platform. In the summer of 2015, Qualcomm
and Intel both announced that they were developing Project Tango reference devices as
models for device manufacturers who use their mobile chipsets.
At CES in January 2016, Google announced a partnership with Lenovo to release a
consumer smartphone featuring Project Tango technology in the summer of 2016, noting a
price point below $500 and a form factor under 6.5 inches. At the same time, both companies
announced an application incubator so that applications would be ready on the device at
launch.
Fig (1) Google’s Project Tango Logo
Which companies are behind Project Tango?
A number of companies came together to develop Project Tango. All of them are listed in
the credits of the Google Project Tango introduction video, “Say hello to Project Tango!”,
though each company had a different amount of involvement. The participating companies
listed in that video are:
· Bosch
· BSquare
· CompalComm
· ETH Zürich
· Flyby Media
· George Washington University
· HiDOF
· MMSolutions
· Movidius
· University of Minnesota
· NASA JPL
· Ologic
· OmniVision
· Open Source Robotics Foundation
· ParaCosm
· Sunny Optical Technology
· Speck Design
2. OVERVIEW
Google's Project Tango is a smartphone equipped with a variety of cameras and vision
sensors that provide a whole new perspective on the world around it. The Tango
smartphone can capture a wealth of data never before available to application
developers, including depth and object tracking and instantaneous 3D mapping. And it
is almost as powerful and as big as a typical smartphone. It is also available as a
high-end Android tablet with a 7-inch HD display.
WHAT IS PROJECT TANGO?
Tango allows a device to build up an accurate 3D model of its immediate surroundings,
which Google says will be useful for everything from AR gaming to navigating large shopping
centres.
Fig (2) A view of Google’s Project Tango 3D Model Mapping
Google isn't content with making software for phones that can merely capture 2D photos
and videos. Nor does it just want to take stereoscopic 3D snaps. Instead, Project Tango is a
bid to equip every mobile device with a powerful suite of software and sensors that can
capture a complete 3D picture of the world around it, in real-time. Why? So you can map
your house, furniture and all, simply by walking around it. Bingo - no more measuring up
before going shopping for a new wardrobe. Or so you can avoid getting lost next time you
go to the hospital - you'll have instant access to a 3D plan of its labyrinthine corridors. Or
so you can easily find the 'unhealthy snacks' section in your local megamart. Or so you can
play amazing augmented reality games. Or so that the visually impaired can receive extra
help in getting around. In fact, as with most Google projects, the ways in which Tango could
prove useful are limited only by our imagination.
WHAT DOES THE PHONE LOOK LIKE?
There are two Tango prototypes so far: a 7-inch tablet and a 5-inch phone.
Fig (3) Prototype 1
It's a fairly standard 7-inch slate with a slight wedge at the back to accommodate the extra
sensors. As far as we can tell, it has three cameras including the webcam. Inside, it has one of
Nvidia's so-far-untested Tegra K1 mobile processors with a beefy 4GB of RAM and
a 128GB SSD. Google is at pains to point out that it's not a consumer device, but one is
supposedly on the way. The depth-sensing array consists of an infrared projector, a 4MP rear
camera and a front-facing fisheye lens with a 180-degree field of vision. Physically, it's a
standard phone shape but rather chunky compared to the class of 2014, more like
something from about 2010.
Fig (4) Prototype 2
Prototype 2 is a 5-inch Android smartphone with the same Tango hardware as the tablet.
Fig (5) A simple Overview of Components of Tango Phone
Project Tango is different from other emerging 3D-sensing computer vision products, such
as Microsoft HoloLens, in that it's designed to run on a standalone mobile phone or tablet
and is chiefly concerned with determining the device's position and orientation within the
environment.
The high-end Android tablet has a 7-inch HD display, 4GB of RAM, 128GB of internal
SSD storage and an NVIDIA Tegra K1 graphics chip (the first in the US and second in the
world) that features desktop GPU architecture. It also has a distinctive design that consists
of an array of cameras and sensors near the top and a couple of subtle grips on the sides.
Movidius, the company that developed some of the technology used in Tango, has been
working on computer vision technology for the past seven years; it developed the processing
chips used in Project Tango, which Google paired with sensors and cameras to give the
smartphone the same level of computer vision and tracking that formerly required much
larger equipment. The phone is equipped with a standard 4-megapixel camera paired with a
special combination of RGB and IR sensor and a lower-resolution image-tracking camera.
These combinations of image sensors give the smartphone a human-like perspective on the
world, complete with 3D awareness and an awareness of depth. They supply information to
Movidius' custom Myriad 1 low-power computer-vision processor, which can then process
the data and feed it to apps through a set of APIs. The phone also contains a motion-tracking
camera which is used to keep track of all the motions made by the user.
3. SMARTPHONE SPECIFICATION
Tango wants to deconstruct reality, taking a quarter million 3D measurements each
second to create a real-time 3D model that describes the physical depth of its
surroundings.
The smartphone's specs include a Snapdragon 800 quad-core CPU running at up to 2.3 GHz
per core, 2GB or 4GB of memory, 64GB or 128GB of expandable internal storage, and a
nine-axis accelerometer/gyroscope/compass. There's also a Mini-USB, a Micro-USB, and
USB 3.0.
In addition, Tango's specs include a rear-facing 4-megapixel RGB/infrared camera, a
180-degree field-of-view fisheye rear-facing camera, a 120-degree field-of-view front-facing
camera, and a 320 x 180 depth sensor, plus a vision processor with one teraflop of compute
power. Project Tango uses a 3,000 mAh battery.
4. HARDWARE
Project Tango is basically a camera and sensor array that happens to run on an Android
phone.
The smartphone is equipped with a variety of cameras and vision sensors that provide a
whole new perspective on the world around it, capturing a wealth of data never before
available to application developers, including depth and object tracking and instantaneous
3D mapping. The front view and back view of a Tango phone are shown below.
It looks much like other phones, but it carries the variety of cameras and sensors that
make 3D modelling of the environment possible.
Fig (6) Tango Phone Front View
The device tracks its 3D motion and creates a 3D model of the environment around it
by using the array of cameras and sensors. The phone emits pulses of infrared light from
the IR projector and records how they are reflected back, allowing it to build a detailed
depth map of the surrounding space.
There are three cameras that capture a 120-degree wide-angle field of view. A 3D camera
captures the 3D structure of a scene. Most cameras are 2D, meaning they are a projection of
the scene onto the camera's imaging plane; any depth information is lost. In contrast, a 3D
camera also captures the depth dimension (in addition to the standard 2D data). A rear-facing
four-megapixel RGB/infrared camera, a 180-degree field-of-view fisheye rear-facing camera,
a 120-degree field-of-view front-facing camera, and a 320 x 180 depth sensor are the
components at the rear of the phone that work together to give the 3D structure of the scene.
Fig (7) Tango Phone Back View
Google paired the Tango hardware with sensors and cameras to give the smartphone the
same level of computer vision and tracking that formerly required much larger equipment.
The phone is equipped with a standard 4-megapixel camera paired with a special combination
of RGB and IR sensor and a lower-resolution image-tracking camera. These combinations of
image sensors give the smartphone a human-like perspective on the world, complete with 3D
awareness and an awareness of depth. They supply information to Movidius' custom Myriad 1
low-power computer-vision processor, which can then process the data and feed it to apps
through a set of APIs. The phone also contains a motion-tracking camera which is used to
keep track of all the motions made by the user. The motherboard which contains all of
these components is shown below.
Fig (8) Tango phone Motherboard
• Elpida FA164A1PB 2 GB LPDDR3 RAM, layered above a Qualcomm
8974(Snapdragon 800) processor. (RED)
• Two Movidius Myriad 1 computer vision co-processors. (ORANGE)
• Two AMICA25L016 16 Mbit low voltage serial flash memory ICs. (YELLOW)
• InvenSense MPU-9150 9-axis gyroscope/accelerometer/compass MEMS motion
tracking device. (GREEN)
• Skyworks 77629 multimode multiband power amplifier module for quad-band
GSM/EDGE. (BLUE)
• PrimeSense PSX1200 Capri PS1200 3D sensor SoC. (VIOLET)
The figure above is the motherboard: red is the 2GB LPDDR3 RAM layered above the
Qualcomm Snapdragon 800 CPU, orange is the Movidius Myriad 1 computer image
processor, green is the 9-axis acceleration sensor/gyroscope/compass motion-tracking
device, yellow is the two AMIC A25L016 16 Mbit flash memory ICs, violet is the
PrimeSense PSX1200 Capri PS1200 3D sensor SoC, and blue is a Winbond W25Q16CV
16 Mbit SPI flash memory. Internally, the Myriad 2 consists of 12 128-bit vector processors
called Streaming Hybrid Architecture Vector Engines, or SHAVEs, which run at 60MHz. The
Myriad 2 chip gives five times the SHAVE performance of the Myriad 1, and its SIPP
engines are 15x to 25x more powerful than those of the 1st-generation chip.
The phone is equipped with a standard 4-megapixel camera paired with a special
combination of RGB and IR sensor and a lower-resolution image-tracking camera. As the
main camera, the Tango uses OmniVision's OV4682. It is the eye of Project Tango's mobile
device. The OV4682 is a 4MP RGB-IR image sensor that captures
high-resolution images and video as well as IR information, enabling depth analysis.
Fig (9) Front and Rear Camera
Fig (10) Fisheye Camera
Fig (11) IR Projector Integrated Depth Sensor
5. TECHNOLOGY BEHIND TANGO
5.1 TANGO SENSOR
The Tango sensor platform is built around the Myriad 1 vision processor developed by
Movidius. The sensors allow the device to make "over a quarter million 3D measurements
every second, updating its position and orientation in real time, combining that data into a
single 3D model of the space around you." Movidius, the company that developed some of
the technology used in Tango, has been working on computer vision technology for the past
seven years; it developed the processing chips used in Project Tango, which Google paired
with sensors and cameras to give the smartphone the same level of computer vision and
tracking that formerly required much larger equipment.
5.2 IMAGE SENSOR
Image sensors give the smartphone a human-like perspective on the world, complete
with 3D awareness and an awareness of depth. Their information is supplied to
Movidius' custom Myriad 1 low-power computer-vision processor, which can then
process the data and feed it to apps through a set of APIs. The Motion Tracking camera
keeps track of all the motions made by the user. There are three cameras that capture a
120-degree wide-angle field of view from the front, and an even wider 180-degree span
from the back. The phone is equipped with a standard 4-megapixel camera paired with
a special combination of RGB and IR sensor and a lower-resolution image-tracking
camera. Its depth-sensing array consists of an infrared projector, the 4MP rear camera
and a front-facing fisheye lens with a 180-degree field of vision. The phone emits pulses
of infrared light from the IR projector and records how they are reflected back, allowing
it to build a detailed depth map of the surrounding space. The data collected from the
sensors and cameras is processed by the Myriad vision processor to deliver the 3D
structure of the view to the apps.
6. WORKING CONCEPT
Project Tango devices combine the camera, gyroscope and accelerometer to estimate
six-degree-of-freedom motion tracking, giving developers the ability to track the 3D motion
of a device while simultaneously creating a map of the environment.
An IR projector provides infrared light that other (non-RGB) cameras can use to get a sense
of an area in 3D space. The phone emits pulses of infrared light from the IR projector
and records how they are reflected back, allowing it to build a detailed depth map of the
surrounding space. There are three cameras that capture a 120-degree wide-angle field of
view from the front, and an even wider 180-degree span from the back. A 4MP color
camera sensor can also be used for snapping regular pics. A 3D camera captures the 3D
structure of a scene. Most cameras are 2D, meaning they are a projection of the scene onto
the camera's imaging plane; any depth information is lost. In contrast, a 3D camera also
captures the depth dimension (in addition to the standard 2D data).
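The "projection onto the imaging plane" can be stated precisely with the standard pinhole camera model (a textbook formula, not anything Tango-specific):

    u = f · X / Z,    v = f · Y / Z

A 3D point (X, Y, Z) maps to pixel coordinates (u, v), so the depth Z is divided out and lost: every point along the same ray lands on the same pixel. That lost Z coordinate is exactly the information a depth sensor restores.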
As its main camera, the Tango uses OmniVision's OV4682. It is the eye of Project Tango's
mobile device. The OV4682 is a 4MP RGB-IR image sensor that captures high-resolution
images and video as well as IR information, enabling depth analysis. The sensor features a
2-micron OmniBSI-2 pixel and records 4MP images and video in a 16:9 format at 90fps.
The 2-micron OmniBSI-2 pixel delivers an excellent signal-to-noise ratio and IR sensitivity,
and offers best-in-class low-light sensitivity. The OV4682's unique architecture and pixel
optimization bring not only the best IR performance but also best-in-class image quality. It
records full-resolution 4-megapixel video in a native 16:9 format at 90 frames per second
(fps), with a quarter of the pixels dedicated to capturing IR. The 1/3-inch sensor can also
record 1080p high-definition (HD) video at 120 fps with electronic image stabilization (EIS),
or 720p HD at 180 fps. The OV7251 Camera Chip sensor is capable of capturing VGA-
resolution video at 100fps using a global shutter. As a single RGB-infrared (IR) sensor that
captures high-resolution images and video as well as IR information, the OV4682's dual
RGB and IR capabilities allow it to bring a host of additional features to mobile and machine
vision applications, including gesture sensing, depth analysis, iris detection and eye tracking.
The other camera's fisheye lens enables a 180° FOV, while the sensor balances resolution
and frame rate to record black-and-white images for motion tracking. If the user moves the
device left or right, it draws the path that the device followed, and that path is shown in the
image on the right in real time. This gives the device motion-capture capabilities. The
device also has a depth sensor.
Fig (12) The image represents the feed from the fish-eye lens
Fig (13) Computer Vision
The figure above illustrates depth sensing by displaying a distance heat map on top of what
the camera sees, showing blue colors on distant objects and red colors on close-by objects.
The device also takes the data from the image sensors, pairs it with the device's standard
motion sensors and gyroscopes to map out paths of movement down to 1 percent accuracy,
and then plots that onto an interactive 3D map. It uses sensor fusion technology, which
combines sensory data, or data derived from sensory data, from disparate sources such that
the resulting information is in some sense better than would be possible if these sources were
used separately: more precise, more comprehensive, or more reliable, or the result of an
emerging view, such as stereoscopic vision.
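Tango's actual fusion pipeline is proprietary, but the idea can be illustrated with a toy example. The Java sketch below (all names purely illustrative) shows a complementary filter, one of the simplest fusion schemes: it blends a gyroscope's fast-but-drifting angle estimate with an accelerometer's noisy-but-drift-free one, yielding an estimate better than either source alone.

    // Toy illustration of sensor fusion (not Tango's actual algorithm):
    // a complementary filter blending gyroscope and accelerometer
    // estimates of a single tilt angle.
    public class ComplementaryFilter {
        private double angle = 0.0; // fused tilt estimate, in radians
        private final double alpha; // weight on the gyro path, e.g. 0.98

        public ComplementaryFilter(double alpha) {
            this.alpha = alpha;
        }

        // gyroRate:   angular rate from the gyroscope, rad/s (fast, drifts)
        // accelAngle: tilt derived from gravity via the accelerometer,
        //             rad (noisy, but does not drift)
        // dt:         time since the last sample, seconds
        public double update(double gyroRate, double accelAngle, double dt) {
            // Integrate the gyro for short-term accuracy, then pull the
            // result toward the accelerometer's absolute reading to cancel
            // long-term drift.
            angle = alpha * (angle + gyroRate * dt) + (1 - alpha) * accelAngle;
            return angle;
        }
    }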
Mantis Vision, a developer of some of the world's most advanced 3D-enabling technologies,
contributes its MV4D technology platform as a core 3D engine behind Google's Project
Tango. Mantis Vision provides the 3D sensing platform, consisting of flash projector
hardware components and Mantis Vision's core MV4D technology, which includes
structured-light-based depth sensing algorithms that generate realistic, dense maps of the
world. It focuses on providing reliable estimates of the pose of a phone (its position and
alignment) relative to its environment.
7. PROJECT TANGO CONCEPTS
Project Tango is different from other emerging 3D-sensing computer vision products, such as
Microsoft HoloLens, in that it's designed to run on a standalone mobile phone or tablet and is
chiefly concerned with determining the device's position and orientation within the
environment.
The software works by integrating three types of functionality:
7.1 MOTION TRACKING
Motion tracking allows a device to understand position and orientation using
Project Tango's custom sensors. This gives you real-time information about the 3D
motion of a device. Motion-tracking: using visual features of the environment, in
combination with accelerometer and gyroscope data, to closely track the device's
movements in space. Project Tango’s core functionality is measuring movement
through space and understanding the area moved through. Google API’s provide the
position and orientation of the user’s device in full six degrees of freedom, referred to
as its pose.
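As a minimal sketch of how an application might subscribe to these pose updates through the Tango Java API (class names as recalled from the developer SDK; exact signatures varied between releases, and later versions added further callbacks):

    import java.util.ArrayList;
    import com.google.atap.tangoservice.Tango;
    import com.google.atap.tangoservice.OnTangoUpdateListener;
    import com.google.atap.tangoservice.TangoCoordinateFramePair;
    import com.google.atap.tangoservice.TangoEvent;
    import com.google.atap.tangoservice.TangoPoseData;
    import com.google.atap.tangoservice.TangoXyzIjData;

    // Sketch: receive device poses relative to where the service started.
    public class PoseTracker {
        public void listen(Tango tango) {
            ArrayList<TangoCoordinateFramePair> framePairs = new ArrayList<>();
            framePairs.add(new TangoCoordinateFramePair(
                    TangoPoseData.COORDINATE_FRAME_START_OF_SERVICE,
                    TangoPoseData.COORDINATE_FRAME_DEVICE));

            tango.connectListener(framePairs, new OnTangoUpdateListener() {
                @Override
                public void onPoseAvailable(TangoPoseData pose) {
                    // Translation in meters; rotation is a quaternion (x, y, z, w).
                    System.out.printf("pose: %.3f %.3f %.3f%n",
                            pose.translation[0],
                            pose.translation[1],
                            pose.translation[2]);
                }

                @Override public void onXyzIjAvailable(TangoXyzIjData xyzIj) { }
                @Override public void onFrameAvailable(int cameraId) { }
                @Override public void onTangoEvent(TangoEvent event) { }
            });
        }
    }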
Fig (14) Motion Tracking
7.2 AREA LEARNING
Using area learning, a Project Tango device can remember the visual features of the area
it is moving through and recognize when it sees those features again. These features can
be saved in an Area Description File (ADF) to use again later.
Project Tango devices can use visual cues to help recognize the world around them. They
can self-correct errors in motion tracking and relocalize in areas they've seen before. With
an ADF loaded, Project Tango devices gain a new capability: drift correction, also called
improved motion tracking.
Area learning is a way of storing environment data in a map that can be re-used later,
shared with other Project Tango devices, and enhanced with metadata such as notes,
instructions, or points of interest.
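A minimal sketch of loading a previously saved ADF through the Java API (this assumes at least one ADF has already been recorded on the device; key names as recalled from the SDK, so treat this as illustrative):

    import java.util.ArrayList;
    import com.google.atap.tangoservice.Tango;
    import com.google.atap.tangoservice.TangoConfig;

    // Sketch: re-load the most recently saved Area Description File so
    // the device can relocalize in an area it has seen before.
    public class AdfLoader {
        public TangoConfig configWithAdf(Tango tango) {
            TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
            config.putBoolean(TangoConfig.KEY_BOOLEAN_MOTIONTRACKING, true);

            // Each saved ADF is identified by a UUID string.
            ArrayList<String> uuids = tango.listAreaDescriptions();
            if (!uuids.isEmpty()) {
                String latest = uuids.get(uuids.size() - 1);
                config.putString(TangoConfig.KEY_STRING_AREADESCRIPTION, latest);
            }
            return config; // pass to tango.connect(config)
        }
    }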
Fig (15) Area Learning
7.3 DEPTH PERCEPTION
Project Tango devices are equipped with integrated 3D sensors that measure the distance from
a device to objects in the real world. This configuration gives good depth at a distance
while balancing power requirements for infrared illumination and depth processing.
The depth data allows an application to understand the distance of visible objects to the
device. By combining depth perception with motion tracking, you can also measure distances
between points in an area that aren't in the same frame.
Page 19
Current devices are designed to work best indoors at moderate distances (0.5 to 4 meters),
and may not be ideal for close-range object scanning. Because the technology relies on
viewing infrared light using the device's camera, there are some situations where accurate
depth perception is difficult: areas lit with light sources high in IR, such as sunlight or
incandescent bulbs, or objects that do not reflect IR light, cannot be scanned well.
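To make the depth-plus-motion combination concrete: each depth point arrives in the camera's frame at capture time, so measuring between points seen in different frames means first transforming both into a common (start-of-service) frame using the device pose at each capture. A hedged sketch follows, where the 4x4 pose matrices are assumed to have been built from TangoPoseData by a helper that is not shown:

    // Sketch: measure between two depth points captured in different frames
    // by moving both into the common start-of-service frame first.
    public class DepthMeasure {
        // Apply a 4x4 row-major rigid transform to a 3D point.
        static double[] transform(double[] m, double[] p) {
            return new double[] {
                m[0] * p[0] + m[1] * p[1] + m[2]  * p[2] + m[3],
                m[4] * p[0] + m[5] * p[1] + m[6]  * p[2] + m[7],
                m[8] * p[0] + m[9] * p[1] + m[10] * p[2] + m[11],
            };
        }

        // Euclidean distance between two points in the same frame.
        static double distance(double[] a, double[] b) {
            double dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
            return Math.sqrt(dx * dx + dy * dy + dz * dz);
        }

        // poseA/poseB: device-to-start-of-service transforms at each capture.
        static double measure(double[] poseA, double[] pointA,
                              double[] poseB, double[] pointB) {
            return distance(transform(poseA, pointA), transform(poseB, pointB));
        }
    }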
Fig (16) Depth Perception
Together, these generate data about the device in "six degrees of freedom"
(3 axes of orientation plus 3 axes of motion) and detailed three-dimensional information
about the environment.
Applications on mobile devices use Project Tango's C and Java APIs to access this data in
real time. In addition, an API is also provided for integrating Project Tango with the Unity
game engine; this enables the rapid conversion or creation of games that allow the user to
interact and navigate in the game space by moving and rotating a Project Tango device in
real space. These APIs are documented on the Google developer website.
8. DEVICES DEVELOPED SO FAR
As a platform for software developers and a model for device manufacturers, Google
has created two Project Tango devices to date.
8.1 The Yellowstone tablet
"Yellowstone" is a 7-inch tablet with full Project Tango functionality, released in June 2014,
and sold as the Project Tango Tablet Development Kit. It features a 2.3 GHz
quad-core Nvidia Tegra K1 processor, 128GB flash memory, 1920x1200-pixel
touchscreen, 4MP color camera, fisheye-lens (motion-tracking) camera, integrated depth
sensing, and 4G LTE connectivity. The device is sold through the official Project Tango
website [8] and the Google Play Store.
Fig (17) The Yellowstone tablet
8.2 The Peanut phone
"Peanut" was the first production Project Tango device, released in the first quarter of 2014. It was a
small Android phone with a Qualcomm MSM8974 quad-core processor and additional special
hardware including a fisheye-lens camera (for motion tracking), "RGB-IR" camera (for color
images and infrared depth detection), and Movidius image-processing chips. A high- performance
accelerometer and gyroscope were added after testing several competing models in the MARS lab
at the University of Minnesota.
Several hundred Peanut devices were distributed to early-access partners, including university
researchers in computer vision and robotics, as well as application developers and technology
companies. Google stopped supporting the Peanut device in September 2015, as by then the
Project Tango software stack had evolved beyond the versions of Android that run on the
device.
8.3 Testing by NASA
In May 2014, two Peanut phones were delivered to the International Space Station to be
part of a NASA project to develop autonomous robots that navigate in a variety of
environments, including outer space. The soccer-ball-sized, 18-sided polyhedral SPHERES
robots were developed at the NASA Ames Research Center, adjacent to the Google campus
in Mountain View, California. Andres Martinez, SPHERES manager at NASA, said, "We are
researching how effective Project Tango's vision-based navigation abilities are for performing
localization and navigation of a mobile free flyer on the ISS."
8.4 Intel RealSense smartphone
The Intel® RealSense™ Camera ZR300 was designed mainly for rear (world-facing)
usages. It includes a laser projector, which emits a structured pattern of infrared light,
and two IR imagers (left and right) which capture the scene. Depth and dimensional
characteristics of objects are calculated via the point shift between the left and right IR
images. Besides the depth camera for computing high-density depth (>10 million points per
second), the ZR300 also includes a wide field-of-view camera (VGA with >160-degree
FOV) with a high-precision accelerometer-gyroscope combination for motion and
feature tracking.
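The disparity-to-depth relationship behind this left-to-right point shift is standard stereo triangulation. As a hedged illustration (the ZR300's real calibration pipeline is more involved, and the numbers here are invented for the example):

    Z = (f × B) / d

where Z is the depth of the point, f the focal length in pixels, B the baseline between the two IR imagers, and d the disparity (point shift) in pixels. For instance, with f = 600 px, B = 0.05 m, and d = 15 px, the depth works out to Z = (600 × 0.05) / 15 = 2 m; halving the disparity would double the computed depth.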
On top of a powerful hardware platform with advanced silicon technologies such as
the Intel® Atom™ x7-Z8700 SoC and the Intel® RealSense™ Camera ZR300, the
device runs the Android operating system and supports the Google Project Tango SDK.
Google Project Tango brings three core technologies to mobile devices: motion
tracking, area learning, and depth perception. The motion tracking technology uses
the IMU to calculate the device's current position relative to where it started, and thus
can be used for indoor navigation without GPS. The area learning technology
applies a process called Simultaneous Localization and Mapping (SLAM) to learn an
area while tracking the device's current position within it, which can be used to perform
"drift correction" in long motion-tracking scenarios. The depth perception technology
uses the highly accurate ZR300 depth camera to generate a point cloud representing the
depth of objects. The Intel® RealSense™ technology potentially enables
developers to implement rich user experience features such as face tracking, scene
perception meshing, environment reconstruction, and depth-enabled photo features such
as photo depth blurring, photo layer segmentation, measurement, photo parallax and
refocusing.
Fig (18) Intel RealSense smartphone
8.5 Lenovo Phab 2 Pro
The Lenovo Phab 2 Pro was the first commercial smartphone with Tango technology.
The device was announced at the beginning of 2016, launched in August, and available
for purchase in the US in November. The Phab 2 Pro had a 6.4-inch screen,
a Snapdragon 652 processor, and 64 GB of internal storage, with a rear-facing
16-megapixel camera and an 8 MP front camera.
The hardware focus of the Phab 2 Pro is the camera and sensor system on the back of
the phone that enables Tango, encompassing a depth sensor and motion sensor in
addition to the 16-megapixel main camera. Working in tandem, the three sensors
combine with Google's Tango software API to offer functionality such as accurate
digital measurement of three-dimensional objects, to-scale realistic simulation of
furniture and other objects in a real-life space, and AR-augmented video games.
The Phab 2 Pro's design encompasses a glass display flanked by aluminum upper and
lower bezels, with the lower bezel housing three capacitive navigation buttons (the
Android standard of back, home, and recent apps, from left to right), and the upper
bezel housing a telephone loudspeaker for calls, an 8-megapixel front-facing camera,
and a light sensor and proximity sensor. The back and sides of the phone are aluminum,
with glass camera and sensor covers and two plastic antenna lines on the top and
bottom; the cameras and sensors protrude slightly from the back itself, which features a
slight curve similar to that found on the Nexus 6. The main camera is topmost on the
back, followed by the depth sensor, both of which are housed in a vertical pill-shaped
protrusion with a glass covering. Below the main camera and depth sensor housing is
the circular motion sensor, immediately below which is the fingerprint sensor. The
bottom of the phone houses a microUSB port for charging and data transfer, flanked by
identical microphone and speaker grilles. The top of the phone houses the 3.5mm
headphone jack. The left side of the phone (from the front) houses a two-button volume
rocker above a power button, while the right side houses the ejectable SIM/microSD
tray.
Fig (19) Lenovo Phab 2 Pro
8.6 Asus Zenfone AR
The Asus Zenfone AR, announced at CES 2017, was the second commercial smartphone
with Tango technology. It ran Tango AR and Daydream VR on a Snapdragon 821, with
6GB or 8GB of RAM and 128GB or 256GB of internal memory depending on the
configuration.
Fig (20) Asus Zenfone AR
9. FUTURE SCOPE
Project Tango seeks to take the next step in this mapping evolution. Instead of depending
on the infrastructure, expertise, and tools of others to provide maps of the world, Tango
empowers users to build their own understanding, all with a phone. Imagine knowing your
exact position to within inches. Imagine building 3D maps of the world in parallel with other
users around you. Imagine being able to track not just the top-down location of
a device, but also its full 3D position and alignment. The technology is ambitious, and the
potential applications are powerful. The Tango device enables augmented reality,
which opens a whole frontier for playing games in the scenery around you. You can capture
the room, then render a scene that includes the room but also adds characters and
objects, so that you can create games that operate in your natural environment. The
applications go beyond gaming, too. Imagine if you could see what a room
would look like when decorated with different types of furniture and walls,
creating a very realistic scene. The technology can be used to guide the visually impaired,
giving them auditory cues about where they are going. It can even be used by soldiers in
war to replicate the war zone and prepare for combat, or to live
out one's own creative fantasies. The possibilities are endless for this amazing
technology, and the future is looking very bright.
Things Project Tango can do
DIRECTIONS: Tango can provide directions inside a building or structure where current
mapping solutions just don't reach. Take shopping, for those who like to get in and out as
quickly as possible: having an indoor map of the store in your hand could make shopping
trips more efficient by leading you directly to the shelf you want.
EMERGENCY RESPONSE: To help emergency response workers such as firefighters
find their way through buildings by projecting the blueprints onto the screen.
It has the potential to provide valuable information in situations where knowing the
exact layout of a room can be a matter of life or death.
AUGMENTED REALITY GAMING: Tango could combine room-mapping with
augmented reality. Imagine competing against a friend for control over territories in your
own home with your own miniature army. Mapping in-game textures onto your real walls
through the smartphone would arguably produce the best game of Cops and Robbers in
history.
MODELLING OBJECTS:
A simple image showing object modelling using Project Tango appears below.
Fig (21) Simple Model
10. CONCLUSION
Project Tango enables apps to track a device's position and orientation within a
detailed 3D environment, and to recognize known environments. This makes
possible applications such as in-store navigation, visual measurement and mapping
utilities, presentation and design tools, and a variety of immersive games.
At this moment, Tango is just a project, but it is developing quite rapidly, with early
prototypes and development kits already distributed among many developers. It is
all up to the developers now to create more clever and innovative apps that take
advantage of this technology. This is just the beginning, and there is a lot of work to
do to fine-tune this amazing technology. Thus, if Project Tango works - and
we've no reason to suspect it won't - it could prove every bit as revolutionary
as Maps, Earth, or Android. It just might take a while for its true genius to become
clear.
11. REFERENCES
1. Announcement on the ATAP Google+ site, 30 May 2020.
2. "Future Phones Will Understand, See the World", 3 June 2020. Retrieved 4 November 2015.
3. "Slamdance: inside the weird virtual reality of Google's Project Tango", 29 May 2020.
4. Qualcomm Powers Next Generation Project Tango Development Platform, 29 May 2020.
5. IDF 2015: Intel teams with Google to bring RealSense to Project Tango, 18 January 2020.
6. Google developer website: https://developers.google.com/project-tango/
7. Product announcement on the ATAP Google+ page, 5 June 2020. Retrieved 4 February 2020.
8. Project Tango website: https://www.google.com/atap/project-tango/
Weitere ähnliche Inhalte

Was ist angesagt?

Google Glass
Google GlassGoogle Glass
Google Glass
junaid401
 

Was ist angesagt? (20)

Smart note taker
Smart note takerSmart note taker
Smart note taker
 
Google glass documentation
Google glass documentationGoogle glass documentation
Google glass documentation
 
Spatial computing - extending reality
Spatial computing - extending realitySpatial computing - extending reality
Spatial computing - extending reality
 
3D-TV-PPT
3D-TV-PPT3D-TV-PPT
3D-TV-PPT
 
EyeRing PowerPoint Presentation
EyeRing PowerPoint PresentationEyeRing PowerPoint Presentation
EyeRing PowerPoint Presentation
 
Sixth Sense Technology
Sixth Sense TechnologySixth Sense Technology
Sixth Sense Technology
 
Smart note taker
Smart note takerSmart note taker
Smart note taker
 
Future of Wireless Technology
Future of Wireless TechnologyFuture of Wireless Technology
Future of Wireless Technology
 
Smart Glass Technology by Kiran
Smart Glass Technology by KiranSmart Glass Technology by Kiran
Smart Glass Technology by Kiran
 
google glass
google glassgoogle glass
google glass
 
MICROSOFT HOLOLENS
MICROSOFT HOLOLENSMICROSOFT HOLOLENS
MICROSOFT HOLOLENS
 
Google Glass
Google GlassGoogle Glass
Google Glass
 
Touchless Touchscreen
Touchless TouchscreenTouchless Touchscreen
Touchless Touchscreen
 
Virtual keyboard seminar ppt
Virtual keyboard seminar pptVirtual keyboard seminar ppt
Virtual keyboard seminar ppt
 
HoloLens Introduction and Technical Specifications
HoloLens Introduction and Technical SpecificationsHoloLens Introduction and Technical Specifications
HoloLens Introduction and Technical Specifications
 
Screenless Display PPT Presentation
Screenless Display PPT PresentationScreenless Display PPT Presentation
Screenless Display PPT Presentation
 
Project tango
Project tangoProject tango
Project tango
 
Augmented reality technical presentation
 Augmented reality technical presentation Augmented reality technical presentation
Augmented reality technical presentation
 
Seminar on isphere
Seminar on isphereSeminar on isphere
Seminar on isphere
 
Augmented Reality
Augmented RealityAugmented Reality
Augmented Reality
 

Ähnlich wie google tango technology Seminar report

Location based services using augmented reality
Location based services using augmented realityLocation based services using augmented reality
Location based services using augmented reality
IAEME Publication
 
Augmented reality report
Augmented reality reportAugmented reality report
Augmented reality report
Satyendra Gupta
 
CG_report_merged (1).pdf
CG_report_merged (1).pdfCG_report_merged (1).pdf
CG_report_merged (1).pdf
rahul812082
 
AbstractThis work presents the design and implementation of an.docx
AbstractThis work presents the design and implementation of an.docxAbstractThis work presents the design and implementation of an.docx
AbstractThis work presents the design and implementation of an.docx
bartholomeocoombs
 
IEM ECE Electrovision 2013
IEM ECE Electrovision 2013IEM ECE Electrovision 2013
IEM ECE Electrovision 2013
agomoni16
 

Ähnlich wie google tango technology Seminar report (20)

IRJET- Deep Dive into Augmented Reality
IRJET-  	  Deep Dive into Augmented RealityIRJET-  	  Deep Dive into Augmented Reality
IRJET- Deep Dive into Augmented Reality
 
Augmented Reality Report by Singhan Ganguly
Augmented Reality Report by Singhan GangulyAugmented Reality Report by Singhan Ganguly
Augmented Reality Report by Singhan Ganguly
 
Location based services using augmented reality
Location based services using augmented realityLocation based services using augmented reality
Location based services using augmented reality
 
Google''s Project Tango
Google''s Project TangoGoogle''s Project Tango
Google''s Project Tango
 
Tango
TangoTango
Tango
 
CMPE- 280-Research_paper
CMPE- 280-Research_paperCMPE- 280-Research_paper
CMPE- 280-Research_paper
 
Aijaz tango
Aijaz tangoAijaz tango
Aijaz tango
 
augmented reality
augmented realityaugmented reality
augmented reality
 
finalgoogle
finalgooglefinalgoogle
finalgoogle
 
Augmented reality : Possibilities and Challenges - An IEEE talk at DA-IICT
Augmented reality : Possibilities and Challenges - An IEEE talk at DA-IICTAugmented reality : Possibilities and Challenges - An IEEE talk at DA-IICT
Augmented reality : Possibilities and Challenges - An IEEE talk at DA-IICT
 
Augmented reality report
Augmented reality reportAugmented reality report
Augmented reality report
 
IRJET-Augmented Reality based Platform to Share Virtual Worlds
IRJET-Augmented Reality based Platform to Share Virtual WorldsIRJET-Augmented Reality based Platform to Share Virtual Worlds
IRJET-Augmented Reality based Platform to Share Virtual Worlds
 
IRJET- Campus Navigation System Based on Mobile Augmented Reality
IRJET- Campus Navigation System Based on Mobile Augmented RealityIRJET- Campus Navigation System Based on Mobile Augmented Reality
IRJET- Campus Navigation System Based on Mobile Augmented Reality
 
Human Computer Interface Augmented Reality
Human Computer Interface Augmented RealityHuman Computer Interface Augmented Reality
Human Computer Interface Augmented Reality
 
Future with Sixth sense technology
Future with Sixth sense technologyFuture with Sixth sense technology
Future with Sixth sense technology
 
CG_report_merged (1).pdf
CG_report_merged (1).pdfCG_report_merged (1).pdf
CG_report_merged (1).pdf
 
AbstractThis work presents the design and implementation of an.docx
AbstractThis work presents the design and implementation of an.docxAbstractThis work presents the design and implementation of an.docx
AbstractThis work presents the design and implementation of an.docx
 
IEM ECE Electrovision 2013
IEM ECE Electrovision 2013IEM ECE Electrovision 2013
IEM ECE Electrovision 2013
 
Report
ReportReport
Report
 
IJCER (www.ijceronline.com) International Journal of computational Engineerin...
IJCER (www.ijceronline.com) International Journal of computational Engineerin...IJCER (www.ijceronline.com) International Journal of computational Engineerin...
IJCER (www.ijceronline.com) International Journal of computational Engineerin...
 

Kürzlich hochgeladen

Architecting Cloud Native Applications
Architecting Cloud Native ApplicationsArchitecting Cloud Native Applications
Architecting Cloud Native Applications
WSO2
 
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Victor Rentea
 

Kürzlich hochgeladen (20)

Architecting Cloud Native Applications
Architecting Cloud Native ApplicationsArchitecting Cloud Native Applications
Architecting Cloud Native Applications
 
Rising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdf
Rising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdfRising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdf
Rising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdf
 
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...
 
Platformless Horizons for Digital Adaptability
Platformless Horizons for Digital AdaptabilityPlatformless Horizons for Digital Adaptability
Platformless Horizons for Digital Adaptability
 
MS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectorsMS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectors
 
Understanding the FAA Part 107 License ..
Understanding the FAA Part 107 License ..Understanding the FAA Part 107 License ..
Understanding the FAA Part 107 License ..
 
TrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data DiscoveryTrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
 
Artificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : UncertaintyArtificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : Uncertainty
 
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
 
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
 
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ..."I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...
 
Exploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with MilvusExploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with Milvus
 
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemke
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemkeProductAnonymous-April2024-WinProductDiscovery-MelissaKlemke
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemke
 
Biography Of Angeliki Cooney | Senior Vice President Life Sciences | Albany, ...
Biography Of Angeliki Cooney | Senior Vice President Life Sciences | Albany, ...Biography Of Angeliki Cooney | Senior Vice President Life Sciences | Albany, ...
Biography Of Angeliki Cooney | Senior Vice President Life Sciences | Albany, ...
 
FWD Group - Insurer Innovation Award 2024
FWD Group - Insurer Innovation Award 2024FWD Group - Insurer Innovation Award 2024
FWD Group - Insurer Innovation Award 2024
 
Introduction to Multilingual Retrieval Augmented Generation (RAG)
Introduction to Multilingual Retrieval Augmented Generation (RAG)Introduction to Multilingual Retrieval Augmented Generation (RAG)
Introduction to Multilingual Retrieval Augmented Generation (RAG)
 
Corporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptxCorporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptx
 
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
 
EMPOWERMENT TECHNOLOGY GRADE 11 QUARTER 2 REVIEWER
EMPOWERMENT TECHNOLOGY GRADE 11 QUARTER 2 REVIEWEREMPOWERMENT TECHNOLOGY GRADE 11 QUARTER 2 REVIEWER
EMPOWERMENT TECHNOLOGY GRADE 11 QUARTER 2 REVIEWER
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected Worker
 

google tango technology Seminar report

  • 1. JP INSTITUTE OF ENGINEERING & TECHNOLOGY, MEERUT TANGO TECHNOLOGY Submitted in partial fulfillment of the requirements for the award of the degree of Bachelor of Technology in Computer Science & Engineering Submitted by: RUPESH KUMAR (1628210067) Under The Guidance of Mr. Varun Pundir (ASST PRO. CSE) DR.A.P.J ABDUL KALAM TECHNICAL UNIVERSITY, LUCKNOW, UTTAR PRADESH Session: 2019-20
  • 2. TABLE OF CONTENTS Page DECLARATION ..................................................... ii CERTIFICATE ...................................................... iii ACKNOWLEDGEMENTS ........................................................ iv ABSTRACT ..................................................... v LIST OF FIGURES. ..................................................... vi LIST OF SYMBOLS ........................................................ vii LIST OF ABBREVIATIONS ......................................................... viii 1. INTRODUCTION …………………………………… 1 2. OVERVIEW. …………………………………… 4 3. SMARTPHONE SPECIFICATIO …………………………………… 8 4. HARDWARE. ……………………………………. 9 5. TECHNOLOGY BEHIND TANGO ……………………………………. 14 5.1 TANGO SENSOR. ……………………………………. 14 5.2 IMAGE SENSOR. ……………………………………. 14 6. WORKING CONCEPT ……………………………………. 15 7. PROJECT TANGO CONCEPT. ……………………………………. 18 7.1. MOTION TRACKING. ……………………………………. 18 7.2. AREA LEARNING. ……………………………………. 19 7.3. DEPTH PRECEPTION. ……………………………………. 19 8. DEVICES DEVELOPED SO FAR …………………………………. 22 8.1 THE YELLOWSTONE TABLET …………………………………. 22 8.2 THE PEANUT PHONE …………………………………. 23 8.3. TESTING BY NASA ………………………………….. 23 8.4. INTEL REAL SENSE SMARTPHONE……………………………. ……. 23 8.5 LENOVO PHAB 2 PRO …………………………………… 24 8.6 ASUS ZENFONE AR ………………………………….. 26 9. FUTURE SCOPE. ……………………………………. 27 10. CONCLUSION …………………………. ………… 29 11.REFRENCES …………………………………….. 30 i
  • 3. DECLARATION I hereby declare that this submission is our own work and that, to the best of our knowledge and belief, it contains no material previously published or written by another person nor material which to a substantial extent has been accepted for the award of any other degree or diploma of the university or other institute of higher learning, except where due acknowledgment has been made in the text. Signature. Rupesh Kumar Name. RUPESH KUMAR Roll No. 1628210067 Date 16/06/2020 ii
  • 4. CERTIFICATE This is to certify that this Project Report entitled “TANGO TECHNOLOGY” which is submitted by Rupesh Kumar (1628210067) in the partial fulfillment, for the award of degree of Bachelor of Technology in Department of Computer Science & Engineering, of JP INSTITUTE OF ENGINEERING & TECH- NOLOGY, Meerut, affiliated to DR. A.P.J. ABDUL KALAM TECHNICAL UNIVERSITY, Lucknow; is carried out by him under my supervision. The matter embodied in this Project Work has not been submitted earlier for award of any degree or diploma in any university/institution to the best of our knowledge and belief. (Mr. Varun Pundir ) (Mr. Sreesh Gaur)33 Project Guide Head (CSE) Date: 16/06/2020 iii
We also take the opportunity to acknowledge the contribution of Mr. Sreesh Gaur, HOD, Department of Computer Science & Engineering, JP Institute of Engineering & Technology, Meerut, for his full support and assistance during the development of the project.

We would also not like to miss the opportunity to acknowledge the contribution of all faculty members of the department for their kind assistance and cooperation during the development of our project. Last but not the least, we acknowledge our friends for their contribution in the completion of the project.

Signature: Rupesh Kumar
Name: RUPESH KUMAR
Roll No.: 1628210067
Date: 16/06/2020
ABSTRACT

Tango (formerly named Project Tango while in testing) was an augmented reality computing platform, developed by the Advanced Technology and Projects (ATAP) group, a skunkworks division of Google. It used computer vision to enable mobile devices, such as smartphones and tablets, to detect their position relative to the world around them without using GPS or other external signals. This allowed application developers to create user experiences that include indoor navigation, 3D mapping, physical space measurement, environmental recognition, augmented reality, and windows into a virtual world.

At CES in January 2016, Google announced a partnership with Lenovo to release a consumer smartphone featuring Tango technology during the summer of 2016, noting a less-than-$500 price point and a small form factor below 6.5 inches. At the same time, both companies announced an application incubator to get applications developed for the device at launch.
LIST OF FIGURES

Fig (1) Google's Project Tango Logo
Fig (2) A view of Google's Project Tango 3D Model Mapping
Fig (3) Prototype 1 (internal view of the phone)
Fig (4) Prototype 2 (external view of the phone)
Fig (5) A simple Overview of Components of Tango Phone
Fig (6) Tango Phone Front View
Fig (7) Tango Phone Back View
Fig (8) Tango phone Motherboard
Fig (9) Front and Rear Camera
Fig (10) Fisheye Camera
Fig (11) IR Projector Integrated Depth Sensor
Fig (12) The feed from the fisheye lens
Fig (13) Computer Vision
Fig (14) Motion Tracking
Fig (15) Area Learning
Fig (16) Depth Perception
Fig (17) The Yellowstone tablet
Fig (18) Intel RealSense smartphone
Fig (19) Lenovo Phab 2 Pro
Fig (20) Asus Zenfone AR
Fig (21) Simple Model
LIST OF SYMBOLS

[x] Integer value of x
≠ Not equal
∈ Belongs to
€ Euro, a currency
_ Optical distance
_o Optical thickness or optical half thickness
LIST OF ABBREVIATIONS

SLAM Simultaneous Localization and Mapping
ATAP Advanced Technology and Projects Group
UGE Unity Game Engine
CES Consumer Electronics Show
RAM Random Access Memory
SSD Solid State Drive
API Application Program Interface
CPU Central Processing Unit
USB Universal Serial Bus
FOV Field Of View
ADF Area Description File
LTE Long-Term Evolution
VGA Video Graphics Array
GPS Global Positioning System
SDK Software Development Kit
1. INTRODUCTION

3D models represent a 3D object using a collection of points in a given 3D space, connected by various entities such as curved surfaces, triangles, and lines. Being a collection of data which includes points and other information, 3D models can be created by hand, scanned (procedural modeling), or generated algorithmically.

The Project Tango prototype is an Android smartphone-like device which tracks the 3D motion of the device and creates a 3D model of the environment around it. Google first described Project Tango in early 2013 as a Simultaneous Localization and Mapping (SLAM) system capable of operating in real time on a phone. Google's ATAP teamed up with a number of organizations to create Project Tango from this description. The team at Google's Advanced Technology and Projects Group (ATAP) has been working with various universities and research labs to harvest ten years of research in robotics and computer vision and concentrate that technology into a very unique mobile phone.

We are physical beings that live in a 3D world, yet mobile devices today assume that the physical world ends at the boundaries of the screen. Project Tango's goal is to give mobile devices a human-scale understanding of space and motion. This project will help people interact with the environment in a fundamentally different way: using this technology, something that would once have taken months or even years can be prototyped in a couple of hours, simply because the technology is now readily available. Imagine having all this in a smartphone and see how things would change.

The first product to emerge from Google's ATAP skunkworks group [1], Project Tango was developed by a team led by computer scientist Johnny Lee, a core contributor to Microsoft's Kinect. In an interview in June 2015, Lee said, "We're developing the hardware and software technologies to help everything and everyone understand precisely where they are, anywhere."

The device runs Android and includes development APIs that provide alignment, position, and depth data to regular Android apps written in C/C++ and Java, as well as to the Unity Game Engine (UGE). These early algorithms, prototypes, and APIs are still in active development, so these are experimental devices intended only for the exploratory and adventurous; they are not a final shipping product.

Project Tango technology gives a mobile device the ability to navigate the physical world similarly to how we do as humans. Project Tango brings a new kind of spatial perception to the Android device platform by adding advanced computer vision, image processing, and special vision sensors. Project Tango is a prototype phone containing highly customized hardware and software designed to allow the phone to track its motion in full 3D in real time. The sensors make
over a quarter million 3D measurements every single second, updating the position and rotation of the phone and blending this data into a single 3D model of the environment. It tracks one's position as one goes around the world and also makes a map of it. It can scan a small section of your room and then generate a little game world in it. It is an open-source technology. ATAP has around 200 development kits which have already been distributed among developers.

Google has produced two devices to demonstrate the Project Tango technology: the Peanut phone (no longer available) and the Yellowstone 7-inch tablet. More than 3,000 of these devices had been sold as of June 2015, chiefly to researchers and software developers interested in building applications for the platform. In the summer of 2015, Qualcomm and Intel both announced that they were developing Project Tango reference devices as models for device manufacturers who use their mobile chipsets.

At CES in January 2016, Google announced a partnership with Lenovo to release a consumer smartphone featuring Project Tango technology during the summer of 2016, noting a less-than-$500 price point and a small form factor below 6.5 inches. At the same time, both companies announced an application incubator to get applications developed for the device at launch.

Fig (1) Google's Project Tango Logo
Which companies are behind Project Tango?

A number of companies came together to develop Project Tango. All of these are listed in the credits of the Google Project Tango introduction video called "Say hello to Project Tango!". Each company has had a different amount of involvement. The participating companies listed in that video are:

· Bosch
· BSquare
· CompalComm
· ETH Zürich
· Flyby Media
· George Washington University
· HiDOF
· MMSolutions
· Movidius
· University of Minnesota
· NASA JPL
· Ologic
· OmniVision
· Open Source Robotics Foundation
· ParaCosm
· Sunny Optical Tech
· Speck Design
2. OVERVIEW

Google's Project Tango is a smartphone equipped with a variety of cameras and vision sensors that provide a whole new perspective on the world around it. The Tango smartphone can capture a wealth of data never before available to application developers, including depth, object tracking, and instantaneous 3D mapping. And it is almost as powerful as, and no bigger than, a typical smartphone. It is also available as a high-end Android tablet with a 7-inch HD display.

WHAT IS PROJECT TANGO?

Tango allows a device to build up an accurate 3D model of its immediate surroundings, which Google says will be useful for everything from AR gaming to navigating large shopping centres.

Fig (2) A view of Google's Project Tango 3D Model Mapping

Google isn't content with making software for phones that can merely capture 2D photos and videos. Nor does it just want to take stereoscopic 3D snaps. Instead, Project Tango is a bid to equip every mobile device with a powerful suite of software and sensors that can capture a complete 3D picture of the world around it, in real time.

Why? So you can map your house, furniture and all, simply by walking around it. Bingo - no more measuring up before going shopping for a new wardrobe. Or so you can avoid getting lost next time you go to the hospital - you'll have instant access to a 3D plan of its labyrinthine corridors. Or so you can easily find the 'unhealthy snacks' section in your local megamart. Or so you can play amazing augmented reality games. Or so that the visually impaired can receive extra help in
getting around. In fact, as with most Google projects, the ways in which Tango could prove useful are limited only by our imagination.

WHAT DOES THE PHONE LOOK LIKE?

There are two prototypes of the Tango device so far: a 7-inch tablet and a 5-inch phone.

Fig (3) Prototype 1

It's a fairly standard 7-inch slate with a slight wedge at the back to accommodate the extra sensors. As far as we can tell, it has three cameras including the webcam. Inside, it has one of Nvidia's so-far-untested Tegra K1 mobile processors with a beefy 4GB of RAM and a 128GB SSD. Google is at pains to point out that it's not a consumer device, but one is supposedly on the way.

The depth-sensing array consists of an infrared projector, a 4MP rear camera, and a front-facing fisheye lens with a 180-degree field of vision. Physically, it's a standard phone shape but rather chunky compared to the class of 2014 - more like something from about 2010.
Fig (4) Prototype 2

Prototype 2 is a 5-inch Android smartphone with the same Tango hardware as the tablet.

Fig (5) A simple Overview of Components of Tango Phone
Project Tango is different from other emerging 3D-sensing computer vision products, such as Microsoft HoloLens, in that it's designed to run on a standalone mobile phone or tablet and is chiefly concerned with determining the device's position and orientation within the environment.

The high-end Android tablet has a 7-inch HD display, 4GB of RAM, 128GB of internal SSD storage, and an NVIDIA Tegra K1 graphics chip (the first in the US and second in the world) that features desktop GPU architecture. It also has a distinctive design that consists of an array of cameras and sensors near the top and a couple of subtle grips on the sides.

Movidius, the company that developed some of the technology used in Tango, has been working on computer vision technology for the past seven years. It developed the processing chips used in Project Tango, which Google paired with sensors and cameras to give the smartphone the same level of computer vision and tracking that formerly required much larger equipment.

The phone is equipped with a standard 4-megapixel camera paired with a special combination RGB and IR sensor, plus a lower-resolution image-tracking camera. These combinations of image sensors give the smartphone a similar perspective on the world, complete with 3D awareness and an awareness of depth. They supply information to Movidius's custom Myriad 1 low-power computer-vision processor, which can then process the data and feed it to apps through a set of APIs. The phone also contains a motion-tracking camera which is used to keep track of all the motions made by the user.
3. SMARTPHONE SPECIFICATION

Tango wants to deconstruct reality, taking a quarter million 3D measurements each second to create a real-time 3D model that describes the physical depth of its surroundings. The smartphone's specifications include:

· a Snapdragon 800 quad-core CPU running at up to 2.3 GHz per core
· 2GB or 4GB of memory
· an expandable 64GB or 128GB of internal storage
· a nine-axis accelerometer/gyroscope/compass
· Mini-USB, Micro-USB, and USB 3.0 ports

In addition, Tango's specs include a rear-facing four-megapixel RGB/infrared camera, a 180-degree field-of-view fisheye rear-facing camera, a 120-degree field-of-view front-facing camera, and a 320 x 180 depth sensor - plus a vision processor with one teraflop of compute power. Project Tango uses a 3000 mAh battery.
4. HARDWARE

Project Tango is basically a camera and sensor array that happens to run on an Android phone. This variety of cameras and vision sensors gives the device a whole new perspective on the world around it, capturing depth, object-tracking, and instantaneous 3D mapping data in a package no bigger than a typical smartphone.

The front view and back view of a Tango phone are shown below. It looks much like other phones, but its array of cameras and sensors makes 3D modelling of the environment possible.

Fig (6) Tango Phone Front View

The device tracks 3D motion and creates a 3D model of the environment around it by using this array of cameras and sensors. The phone emits pulses of infrared light from the IR projector and records how the light is reflected back, allowing it to build a detailed depth map of the surrounding space.
There are three cameras that capture a 120-degree wide-angle field of view. A 3D camera captures the 3D structure of a scene. Most cameras are 2D, meaning they record a projection of the scene onto the camera's imaging plane; any depth information is lost. In contrast, a 3D camera also captures the depth dimension in addition to the standard 2D data. A rear-facing four-megapixel RGB/infrared camera, a 180-degree field-of-view fisheye rear-facing camera, a 120-degree field-of-view front-facing camera, and a 320 x 180 depth sensor are the components at the rear of the phone that work together to give the 3D structure of the scene.

Fig (7) Tango Phone Back View

Google paired these sensors and cameras with Movidius's processing chips to give the smartphone the same level of computer vision and tracking that formerly required much larger equipment. The phone is equipped with a standard 4-megapixel camera paired with a special combination RGB and IR sensor, plus a lower-resolution image-tracking camera. These image sensors supply information to Movidius's custom Myriad 1 low-power computer-vision processor, which processes the data and feeds it to apps through a set of APIs. The phone also contains a motion-tracking camera which keeps track of all the motions made by the user. The motherboard which contains all of these components is shown below.
Fig (8) Tango phone Motherboard

· Elpida FA164A1PB 2 GB LPDDR3 RAM, layered above a Qualcomm 8974 (Snapdragon 800) processor (RED)
· Two Movidius Myriad 1 computer vision co-processors (ORANGE)
· Two AMIC A25L016 16 Mbit low-voltage serial flash memory ICs (YELLOW)
· InvenSense MPU-9150 9-axis gyroscope/accelerometer/compass MEMS motion-tracking device (GREEN)
· Skyworks 77629 multimode multiband power amplifier module for quad-band GSM/EDGE (BLUE)
· PrimeSense PSX1200 Capri PS1200 3D sensor SoC (VIOLET)

In the figure above, the red region is the 2GB LPDDR3 RAM together with the Qualcomm Snapdragon 800 CPU; the orange parts are the Movidius Myriad 1 computer-image processors; the green part contains the 9-axis acceleration sensor/gyroscope/compass for motion tracking; the yellow parts are the two AMIC A25L016 16 Mbit flash memory ICs; the violet part is the PrimeSense PSX1200 Capri PS1200 3D sensor SoC; and the blue part is a Winbond W25Q16CV 16 Mbit SPI flash memory.

Internally, the Myriad 2 consists of 12 128-bit vector processors called Streaming Hybrid Architecture Vector Engines, or SHAVEs for short, which run at 600 MHz. The Myriad 2 chip gives five times the SHAVE performance of the Myriad 1, and its SIPP engines are 15x to 25x more powerful than those of the first-generation chip.

As its main camera, the Tango phone uses OmniVision's OV4682 - the eye of Project Tango's mobile device. The OV4682 is a 4MP RGB-IR image sensor that captures high-resolution images and video as well as IR information, enabling depth analysis.
Fig (9) Front and Rear Camera

Fig (10) Fisheye Camera
Fig (11) IR Projector Integrated Depth Sensor
5. TECHNOLOGY BEHIND TANGO

5.1 TANGO SENSOR

The Tango sensing platform is built around the Myriad 1 vision processor developed by Movidius. The sensors allow the device to make "over a quarter million 3D measurements every second, updating its position and orientation in real time, combining that data into a single 3D model of the space around you". Movidius has been working on computer vision technology for the past seven years; it developed the processing chips used in Project Tango, which Google paired with sensors and cameras to give the smartphone the same level of computer vision and tracking that formerly required much larger equipment.

5.2 IMAGE SENSOR

The image sensors give the smartphone a similar perspective on the world, complete with 3D awareness and an awareness of depth. This information is supplied to Movidius's custom Myriad 1 low-power computer-vision processor, which processes the data and feeds it to apps through a set of APIs. The motion-tracking camera keeps track of all the motions made by the user. Three cameras capture a 120-degree wide-angle field of view from the front, and an even wider 180-degree span from the back. The phone is equipped with a standard 4-megapixel camera paired with a special combination RGB and IR sensor, plus a lower-resolution image-tracking camera. Its depth-sensing array consists of an infrared projector, the 4MP rear camera, and a front-facing fisheye lens with a 180-degree field of vision. The phone emits pulses of infrared light from the IR projector and records how the light is reflected back, allowing it to build a detailed depth map of the surrounding space. The data collected from the sensors and cameras is processed by the Myriad vision processor, which delivers the 3D structure of the view to the apps.
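The headline measurement rate can be sanity-checked against the depth sensor's resolution. Assuming the full 320 x 180 depth grid is delivered at about 5 frames per second (the frame rate is an assumption here; only the aggregate figure is quoted):

320 x 180 = 57,600 depth points per frame
57,600 points per frame x 5 frames per second = 288,000 measurements per second

which is consistent with the claim of "over a quarter million" 3D measurements every second.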
6. WORKING CONCEPT

Project Tango devices combine the camera, gyroscope, and accelerometer to estimate six-degrees-of-freedom motion tracking, giving developers the ability to track the 3D motion of a device while simultaneously creating a map of the environment.

An IR projector provides infrared light that other (non-RGB) cameras can use to get a sense of an area in 3D space. The phone emits pulses of infrared light from the IR projector and records how the light is reflected back, allowing it to build a detailed depth map of the surrounding space. Three cameras capture a 120-degree wide-angle field of view from the front, and an even wider 180-degree span from the back. A 4MP colour camera sensor can also be used for snapping regular pictures. A 3D camera captures the 3D structure of a scene: most cameras are 2D, meaning they record a projection of the scene onto the camera's imaging plane and any depth information is lost, whereas a 3D camera also captures the depth dimension in addition to the standard 2D data.

As its main camera, Tango uses OmniVision's OV4682 - the eye of Project Tango's mobile device. The OV4682 is a 4MP RGB-IR image sensor that captures high-resolution images and video as well as IR information, enabling depth analysis. The sensor features a 2-micron OmniBSI-2 pixel and records 4MP images and video in a 16:9 format at 90fps. The 2-micron OmniBSI-2 pixel delivers an excellent signal-to-noise ratio and IR sensitivity, and offers best-in-class low-light sensitivity. The OV4682's unique architecture and pixel optimization bring not only the best IR performance but also best-in-class image quality. It records full-resolution 4-megapixel video in a native 16:9 format at 90 frames per second (fps), with a quarter of the pixels dedicated to capturing IR. The 1/3-inch sensor can also record 1080p high-definition (HD) video at 120 fps with electronic image stabilization (EIS), or 720p HD at 180 fps. As a single RGB-IR sensor, its dual RGB and IR capabilities allow it to bring a host of additional features to mobile and machine vision applications, including gesture sensing, depth analysis, iris detection, and eye tracking.

The OV7251 camera chip sensor is capable of capturing VGA-resolution video at 100fps using a global shutter. Another camera sits behind a fisheye lens that enables a 180° FOV, while the sensor balances resolution and frames per second to record black-and-white images for motion tracking. If the user moves the device left or right, the path the device follows is drawn in real time, as shown in the image on the right; this gives the device motion-capture capability. The device also has a depth sensor.
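The geometry behind this depth map can be illustrated with the textbook triangulation used by structured-light systems of this general kind: the projector and the IR-sensitive camera sit a fixed baseline apart, and the sideways shift (disparity) of a projected dot in the image encodes distance. The sketch below is generic Java, not Tango code, and its focal length and baseline are invented for illustration rather than taken from any Tango calibration.

// DepthFromDisparity.java -- textbook structured-light triangulation.
// Illustrative numbers only; not Tango's actual calibration data.
public class DepthFromDisparity {

    // depth Z = f * B / d, with focal length f in pixels,
    // baseline B in metres, and disparity d in pixels
    static double depthMetres(double focalPx, double baselineM, double disparityPx) {
        return focalPx * baselineM / disparityPx;
    }

    public static void main(String[] args) {
        double f = 240.0;   // assumed focal length of the depth camera, pixels
        double b = 0.075;   // assumed projector-to-camera baseline, 7.5 cm
        for (double d : new double[] {36.0, 18.0, 9.0, 4.5}) {
            System.out.printf("disparity %4.1f px -> depth %.2f m%n",
                              d, depthMetres(f, b, d));
        }
        // Halving the disparity doubles the depth, which is why depth
        // resolution falls off with distance and why a working range on
        // the order of 0.5-4 m (see section 7.3) is typical.
    }
}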
Fig (12) The image represents the feed from the fisheye lens

Fig (13) Computer Vision
The figure above illustrates depth sensing by displaying a distance heat map on top of what the camera sees, showing blue colours on distant objects and red colours on nearby objects.

The device also takes the data from the image sensors, paired with the device's standard motion sensors and gyroscopes, to map out paths of movement to within 1 percent accuracy and plot them onto an interactive 3D map. It uses sensor fusion, which combines sensory data, or data derived from sensory data, from disparate sources such that the resulting information is in some sense better than would be possible if these sources were used separately - more precise, more comprehensive, or more reliable, as in stereoscopic vision (a simplified single-axis illustration of this idea appears at the end of this section).

Mantis Vision, a developer of some of the world's most advanced 3D-enabling technologies, supplies the MV4D technology platform that is the core 3D engine behind Google's Project Tango. Mantis Vision provides the 3D sensing platform, consisting of flash projector hardware components and Mantis Vision's core MV4D technology, which includes structured-light-based depth-sensing algorithms that generate realistic, dense maps of the world. It focuses on providing reliable estimates of the pose of a phone (its position and alignment) relative to its environment.
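The single-axis sketch below gives a concrete, if greatly simplified, picture of that fusion principle: a gyroscope is smooth but drifts, an accelerometer's gravity-based tilt estimate is noisy but drift-free, and blending them gives a better angle than either alone. This is a generic complementary filter offered purely as an illustration; every constant is invented, and Tango's actual fusion of camera, gyroscope, and accelerometer data is far more sophisticated.

// ComplementaryFilter.java -- a generic single-axis sensor-fusion sketch.
public class ComplementaryFilter {
    private double angle = 0.0;                // fused tilt estimate, radians
    private static final double ALPHA = 0.98;  // arbitrary blend factor

    double update(double gyroRate, double accelAngle, double dt) {
        // Integrate the gyro for the fast, smooth part; pull gently toward
        // the accelerometer estimate to cancel the gyro's long-term drift.
        angle = ALPHA * (angle + gyroRate * dt) + (1 - ALPHA) * accelAngle;
        return angle;
    }

    public static void main(String[] args) {
        ComplementaryFilter filter = new ComplementaryFilter();
        double dt = 0.01;                            // 100 Hz sample rate
        for (int i = 0; i < 500; i++) {
            double gyroRate = 0.02 + 0.005;          // true rate plus constant bias
            double accelAngle = 0.02 * i * dt        // true angle ...
                              + 0.05 * Math.sin(i);  // ... plus noise
            filter.update(gyroRate, accelAngle, dt);
        }
        System.out.printf("fused angle after 5 s: %.3f rad (true value 0.100)%n",
                          filter.angle);
    }
}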
7. PROJECT TANGO CONCEPTS

Project Tango is different from other emerging 3D-sensing computer vision products, such as Microsoft HoloLens, in that it's designed to run on a standalone mobile phone or tablet and is chiefly concerned with determining the device's position and orientation within the environment. The software works by integrating three types of functionality:

7.1 MOTION TRACKING

Motion tracking allows a device to understand its position and orientation using Project Tango's custom sensors. This gives you real-time information about the 3D motion of the device. Motion tracking uses visual features of the environment, in combination with accelerometer and gyroscope data, to closely track the device's movements in space. Project Tango's core functionality is measuring movement through space and understanding the area moved through. Google's APIs provide the position and orientation of the user's device in full six degrees of freedom, referred to as its pose.

Fig (14) Motion Tracking
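A pose in this six-degrees-of-freedom sense is simply a translation plus an orientation. The small self-contained class below shows one conventional representation - a 3-vector and a unit quaternion - and how a pose maps a point from device coordinates into world coordinates. It is an illustration only, not the SDK's own TangoPoseData type.

// Pose.java -- a six-degrees-of-freedom pose: translation + orientation.
public class Pose {
    final double tx, ty, tz;        // translation, metres (3 axes of motion)
    final double qx, qy, qz, qw;    // unit quaternion (3 axes of orientation)

    Pose(double tx, double ty, double tz,
         double qx, double qy, double qz, double qw) {
        this.tx = tx; this.ty = ty; this.tz = tz;
        this.qx = qx; this.qy = qy; this.qz = qz; this.qw = qw;
    }

    // Map a point from the device frame to the world frame:
    // p_world = rotate(q, p_device) + t
    double[] mapPoint(double px, double py, double pz) {
        // expanded quaternion-vector rotation: v' = v + 2w(u x v) + 2 u x (u x v)
        double ux = qy * pz - qz * py;
        double uy = qz * px - qx * pz;
        double uz = qx * py - qy * px;
        double rx = px + 2 * (qw * ux + qy * uz - qz * uy);
        double ry = py + 2 * (qw * uy + qz * ux - qx * uz);
        double rz = pz + 2 * (qw * uz + qx * uy - qy * ux);
        return new double[] {rx + tx, ry + ty, rz + tz};
    }
}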
7.2 AREA LEARNING

Using area learning, a Project Tango device can remember the visual features of the area it is moving through and recognize when it sees those features again. These features can be saved in an Area Description File (ADF) to be used again later. Project Tango devices can use visual cues to help recognize the world around them; they can self-correct errors in motion tracking and relocalize in areas they've seen before. With an ADF loaded, Project Tango devices gain a new feature: drift correction, or improved motion tracking. Area learning is a way of storing environment data in a map that can be re-used later, shared with other Project Tango devices, and enhanced with metadata such as notes, instructions, or points of interest.

Fig (15) Area Learning
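The mechanism can be pictured as a lookup from remembered visual feature descriptors to the poses recorded when those features were first seen. The toy class below is purely hypothetical - a real ADF stores far richer data and uses far better matching - but it captures the relocalization idea: when the current view matches a remembered place, the stored position is returned so accumulated drift can be corrected.

// AreaMemory.java -- a toy, hypothetical stand-in for an ADF.
import java.util.ArrayList;
import java.util.List;

public class AreaMemory {
    static class Landmark {
        final double[] descriptor;  // visual feature vector for a place
        final double[] position;    // position recorded when first seen {x, y, z}
        Landmark(double[] d, double[] p) { descriptor = d; position = p; }
    }

    private final List<Landmark> map = new ArrayList<>();  // the "area description"

    // Area learning: remember what this place looks like and where it is.
    void learn(double[] descriptor, double[] position) {
        map.add(new Landmark(descriptor, position));
    }

    // Relocalization: if the current view matches a remembered place,
    // return the stored position so the tracker can correct its drift.
    double[] relocalize(double[] descriptor, double threshold) {
        for (Landmark lm : map) {
            double sq = 0;
            for (int i = 0; i < descriptor.length; i++) {
                double diff = descriptor[i] - lm.descriptor[i];
                sq += diff * diff;
            }
            if (Math.sqrt(sq) < threshold) return lm.position;
        }
        return null;  // unknown place: keep trusting motion tracking
    }
}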
7.3 DEPTH PERCEPTION

Project Tango devices are equipped with integrated 3D sensors that measure the distance from the device to objects in the real world. This configuration gives good depth at a distance while balancing power requirements for infrared illumination and depth processing. The depth data allows an application to understand the distance of visible objects from the device.

Current devices are designed to work best indoors at moderate distances (0.5 to 4 meters), and may not be ideal for close-range object scanning. Because the technology relies on viewing infrared light using the device's camera, there are some situations where accurate depth perception is difficult: areas lit with light sources high in IR, like sunlight or incandescent bulbs, or objects that do not reflect IR light, cannot be scanned well.

By combining depth perception with motion tracking, you can also measure distances between points in an area that aren't in the same frame.

Fig (16) Depth Perception

Together, these generate data about the device in "six degrees of freedom" (3 axes of orientation plus 3 axes of motion) and detailed three-dimensional information about the environment.
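Measuring between two points that never appeared in the same camera frame works because each depth measurement can be pushed through the pose the device had when its frame was captured, placing both points in one world frame. The sketch below reuses the illustrative Pose class from section 7.1; the poses and points are invented numbers, and the camera is assumed to look along +z in the device frame.

// CrossFrameMeasure.java -- distance between points seen in different frames.
public class CrossFrameMeasure {
    public static void main(String[] args) {
        // Pose of the device when frame A was captured (at the origin) ...
        Pose poseA = new Pose(0, 0, 0,  0, 0, 0, 1);
        // ... and when frame B was captured: the user walked 3 m along x and
        // turned 90 degrees about the vertical axis (quaternion {0, sin 45, 0, cos 45}).
        Pose poseB = new Pose(3, 0, 0,  0, Math.sqrt(0.5), 0, Math.sqrt(0.5));

        // One depth point from each frame, both 2 m straight ahead of the camera.
        double[] p1 = poseA.mapPoint(0.0, 0.0, 2.0);
        double[] p2 = poseB.mapPoint(0.0, 0.0, 2.0);

        double dx = p1[0] - p2[0], dy = p1[1] - p2[1], dz = p1[2] - p2[2];
        System.out.printf("distance between the two points: %.2f m%n",
                          Math.sqrt(dx * dx + dy * dy + dz * dz));  // about 5.39 m
    }
}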
Applications on mobile devices use Project Tango's C and Java APIs to access this data in real time. In addition, an API is provided for integrating Project Tango with the Unity game engine; this enables the rapid conversion or creation of games that allow the user to interact with and navigate the game space by moving and rotating a Project Tango device in real space. These APIs are documented on the Google developer website.
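To give a flavour of the Java side, the outline below shows how an app connected to the Tango service and subscribed to pose and depth updates. The class and callback names follow Google's published Tango sample code, but the exact signatures here should be treated as illustrative and checked against the SDK documentation; the snippet compiles only inside an Android project that includes the Tango client library.

// Schematic use of the Tango Java client from inside an Android app.
import java.util.ArrayList;
import com.google.atap.tangoservice.Tango;
import com.google.atap.tangoservice.TangoConfig;
import com.google.atap.tangoservice.TangoCoordinateFramePair;
import com.google.atap.tangoservice.TangoEvent;
import com.google.atap.tangoservice.TangoPoseData;
import com.google.atap.tangoservice.TangoXyzIjData;

public class TangoSketch {
    void startTango(android.content.Context context) {
        Tango tango = new Tango(context);

        // Enable motion tracking and depth in the session configuration.
        TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
        config.putBoolean(TangoConfig.KEY_BOOLEAN_MOTIONTRACKING, true);
        config.putBoolean(TangoConfig.KEY_BOOLEAN_DEPTH, true);
        tango.connect(config);

        // Ask for the device pose relative to where the service started.
        ArrayList<TangoCoordinateFramePair> framePairs = new ArrayList<>();
        framePairs.add(new TangoCoordinateFramePair(
                TangoPoseData.COORDINATE_FRAME_START_OF_SERVICE,
                TangoPoseData.COORDINATE_FRAME_DEVICE));

        tango.connectListener(framePairs, new Tango.OnTangoUpdateListener() {
            @Override
            public void onPoseAvailable(TangoPoseData pose) {
                // pose.translation = {x, y, z} in metres;
                // pose.rotation = orientation quaternion {x, y, z, w}
            }

            @Override
            public void onXyzIjAvailable(TangoXyzIjData xyzIj) {
                // the latest depth point cloud from the depth sensor
            }

            @Override
            public void onFrameAvailable(int cameraId) { }

            @Override
            public void onTangoEvent(TangoEvent event) { }
        });
    }
}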
8. DEVICES DEVELOPED SO FAR

As a platform for software developers and a model for device manufacturers, Google has created two Project Tango devices to date.

8.1 THE YELLOWSTONE TABLET

"Yellowstone", Google's 2014 Project Tango tablet, is a 7-inch tablet with full Project Tango functionality, released in June 2014 and sold as the Project Tango Tablet Development Kit. It features a 2.3 GHz quad-core Nvidia Tegra K1 processor, 128GB of flash memory, a 1920x1200-pixel touchscreen, a 4MP colour camera, a fisheye-lens (motion-tracking) camera, integrated depth sensing, and 4G LTE connectivity. The device is sold through the official Project Tango website [8] and the Google Play Store.

Fig (17) The Yellowstone tablet
8.2 THE PEANUT PHONE

"Peanut" was the first production Project Tango device, released in the first quarter of 2014. It was a small Android phone with a Qualcomm MSM8974 quad-core processor and additional special hardware, including a fisheye-lens camera (for motion tracking), an "RGB-IR" camera (for colour images and infrared depth detection), and Movidius image-processing chips. A high-performance accelerometer and gyroscope were added after testing several competing models in the MARS lab at the University of Minnesota. Several hundred Peanut devices were distributed to early-access partners, including university researchers in computer vision and robotics as well as application developers and technology companies. Google stopped supporting the Peanut device in September 2015, as by then the Project Tango software stack had evolved beyond the versions of Android that run on the device.

8.3 TESTING BY NASA

In May 2014, two Peanut phones were delivered to the International Space Station as part of a NASA project to develop autonomous robots that navigate in a variety of environments, including outer space. The soccer-ball-sized, 18-sided polyhedral SPHERES robots were developed at the NASA Ames Research Center, adjacent to the Google campus in Mountain View, California. Andres Martinez, SPHERES manager at NASA, said, "We are researching how effective Project Tango's vision-based navigation abilities are for performing localization and navigation of a mobile free flyer on ISS."

8.4 INTEL REALSENSE SMARTPHONE

The Intel® RealSense™ Camera ZR300 was designed mainly for rear (world-facing) usages. It includes a laser projector, which emits a structured pattern of infrared light, and two IR imagers (left and right) which capture the scene. Depth and dimensional characteristics of objects are calculated via the point shift between the left and right IR images. Besides the depth camera for computing high-density depth (more than 10 million points per second), the ZR300 also includes a wide-field-of-view camera (VGA, with a FOV of more than 160 degrees) with a high-precision accelerometer-gyroscope combination for motion and feature tracking.

On top of this powerful hardware platform, with advanced silicon technologies such as the Intel® Atom™ x7-Z8700 SoC and the Intel® RealSense™ Camera ZR300, the device runs the Android operating system and supports the Google Project Tango SDK. Google Project Tango brings three core technologies to mobile devices: motion tracking, area learning, and depth perception. The motion-tracking technology uses the IMU to calculate the device's current position relative to where it started, and thus can be used for indoor navigation without GPS. The area-learning technology applies a process called Simultaneous Localization and Mapping (SLAM) to learn an area while tracking the device's current position within it, which can be used to perform "drift correction" in long motion-tracking scenarios. The depth-perception technology uses the highly accurate ZR300 depth camera to generate a point cloud representing the depth information of objects. Intel® RealSense™ technology potentially enables developers to implement rich user-experience features such as face tracking, scene-perception meshing, environment reconstruction, and depth-enabled photo features such as photo depth blurring, photo layer segmentation, measurement, photo parallax, and refocusing.

Fig (18) Intel RealSense smartphone
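The step from a depth image to a point cloud is standard pinhole-camera mathematics, sketched below. The intrinsics (focal lengths and principal point) are invented for illustration; real values come from each camera's calibration.

// BackProject.java -- turning depth-image pixels into 3D points.
public class BackProject {
    static final double FX = 230.0, FY = 230.0;  // assumed focal lengths, pixels
    static final double CX = 160.0, CY = 90.0;   // assumed principal point (320 x 180 image)

    // Map pixel (u, v) with measured depth z (metres) to a 3D point.
    static double[] toPoint(int u, int v, double z) {
        double x = (u - CX) * z / FX;
        double y = (v - CY) * z / FY;
        return new double[] {x, y, z};
    }

    public static void main(String[] args) {
        double[] p = toPoint(200, 120, 2.0);  // a pixel right of and below centre
        System.out.printf("pixel (200, 120) at 2.0 m -> (%.2f, %.2f, %.2f) m%n",
                          p[0], p[1], p[2]);
        // Back-projecting every pixel of a VGA-class depth stream at a few
        // tens of frames per second yields point rates on the order of the
        // ">10 million points per second" quoted above for the ZR300.
    }
}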
8.5 LENOVO PHAB 2 PRO

The Lenovo Phab 2 Pro was the first commercial smartphone with Tango technology; the device was announced at the beginning of 2016, launched in August, and available for purchase in the US in November. The Phab 2 Pro has a 6.4-inch screen, a Snapdragon 652 processor, and 64 GB of internal storage, with a rear-facing 16-megapixel camera and an 8-megapixel front camera.

The hardware focus of the Phab 2 Pro is the camera and sensor system on the back of the phone that enables Tango, encompassing a depth sensor and a motion sensor in addition to the 16-megapixel main camera. Working in tandem, the three sensors combine with Google's Tango software API to offer functionality such as accurate digital measurement of three-dimensional objects, to-scale realistic simulation of furniture and other objects in a real-life space, and AR-augmented video games.

The Phab 2 Pro's design encompasses a glass display flanked by aluminum upper and lower bezels, with the lower bezel housing three capacitive navigation buttons (the Android standard of back, home, and recent apps, from left to right), and the upper bezel housing a telephone loudspeaker for calls, an 8-megapixel front-facing camera,
and a light sensor and proximity sensor. The back and sides of the phone are aluminum, with glass camera and sensor covers and two plastic antenna lines on the top and bottom; the cameras and sensors protrude slightly from the back itself, which features a slight curve similar to that found on the Nexus 6. The main camera is topmost on the back, followed by the depth sensor, both of which are housed in a vertical pill-shaped protrusion with a glass covering. Below the main camera and depth sensor housing is the circular motion sensor, immediately below which is the fingerprint sensor. The bottom of the phone houses a microUSB port for charging and data transfer, flanked by identical microphone and speaker grilles. The top of the phone houses the 3.5mm headphone jack. The left side of the phone (from the front) houses a two-button volume rocker above a power button, while the right side houses the ejectable SIM/microSD tray.

Fig (19) Lenovo Phab 2 Pro
8.6 ASUS ZENFONE AR

The Asus Zenfone AR, announced at CES 2017, [22] was the second commercial smartphone with Tango technology. It ran Tango AR and Daydream VR on a Snapdragon 821, with 6GB or 8GB of RAM and 128GB or 256GB of internal memory depending on the configuration.

Fig (20) Asus Zenfone AR
9. FUTURE SCOPE

Project Tango seeks to take the next step in this mapping evolution. Instead of depending on the infrastructure, expertise, and tools of others to provide maps of the world, Tango empowers users to build their own understanding, all with a phone. Imagine knowing your exact position to within inches. Imagine building 3D maps of the world in parallel with other users around you. Imagine being able to track not just the top-down location of a device, but also its full 3D position and alignment. The technology is ambitious, and the potential applications are powerful.

The Tango device really enables augmented reality, which opens a whole frontier for playing games in the scenery around you. You can capture the room and then render a scene that includes the room but also adds characters and objects, so that you can create games that operate in your natural environment. The applications even go beyond gaming. Imagine if you could see what a room would look like and decorate it with different types of furniture and walls to create a very realistic scene. This technology can be used to guide the visually impaired, giving them auditory cues about where they are going. It could even be used by soldiers to replicate a war zone and prepare for combat, or to live out one's own creative fantasies. The possibilities for this amazing technology are really endless, and the future is looking very bright.

Things Project Tango can do:

DIRECTIONS: For when you need directions inside a building or structure that current mapping solutions just don't provide.

SHOPPING: For shoppers who just like to get in and out as quickly as possible, having an indoor map of the store in hand could make shopping trips more efficient by leading you directly to the shelf you want.

EMERGENCY RESPONSE: To help emergency response workers such as firefighters find their way through buildings by projecting the blueprints onto the screen. It has the potential to provide valuable information in situations where knowing the exact layout of a room can be a matter of life or death.

AUGMENTED REALITY GAMING: It could combine the room mapping with augmented reality. Imagine competing against a friend for control over territories in your own home with your own miniature army. Mapping in-game textures onto your real walls through the smartphone would arguably produce the best game of Cops and Robbers in history.
MODELLING OBJECTS: A simple image showing object modelling using Project Tango.

Fig (21) Simple Model
10. CONCLUSION

Project Tango enables apps to track a device's position and orientation within a detailed 3D environment, and to recognize known environments. This makes possible applications such as in-store navigation, visual measurement and mapping utilities, presentation and design tools, and a variety of immersive games. At this moment, Tango is just a project, but it is developing quite rapidly, with early prototypes and development kits already distributed among many developers. It is now up to the developers to create more clever and innovative apps that take advantage of this technology. This is just the beginning, and there is a lot of work to do to fine-tune this amazing technology. Thus, if Project Tango works - and we've no reason to suspect it won't - it could prove every bit as revolutionary as Maps, Earth, or Android. It just might take a while for its true genius to become clear.
11. REFERENCES

1. Announcement on the ATAP Google+ site, 30 May 2020.
2. "Future Phones Will Understand, See the World", 3 June 2020. Retrieved 4 November 2015.
3. "Slamdance: inside the weird virtual reality of Google's Project Tango", 29 May 2020.
4. "Qualcomm Powers Next Generation Project Tango Development Platform", 29 May 2020.
5. "IDF 2015: Intel teams with Google to bring RealSense to Project Tango", 18 January 2020.
6. Google developer website: https://developers.google.com/project-tango/
7. Product announcement on the ATAP Google+ page, 5 June 2020. Retrieved 4 February 2020.
8. Project Tango website: https://www.google.com/atap/project-tango/