Soli is a gesture-sensing technology developed by Google that uses millimeter-wave radar to detect fine hand motions and gestures without physical contact or handheld devices. It enables touchless interaction with electronics and has applications in areas such as smart devices, VR/AR, IoT, gaming, and medicine. Soli works by emitting radio waves that are scattered by the hand; the time delay and signal changes in the returned waves are used to track hand position and motion. Its advantages include replacing physical buttons, wireless operation, and precision, though it also has limitations such as a small sensing range and potential security issues.
3. INTRODUCTION
• Soli is one of the projects of Google ATAP.
• The project is headed by Ivan Poupyrev.
• Soli was announced at Google I/O 2015.
• Soli is a new gesture-sensing technology for human-computer interaction.
• Soli uses millimetre-wave radar that can detect micro-motions of the human hand.
Google Project Soli
4. Why Soli?
Electronics and Communication, SCE 2016-17
• Input is an essential and critical component of interactive computer systems.
• As the screen size of electronic devices decreases, interacting with them becomes more difficult.
• Soli provides wireless and touchless interactions.
6. WORKING
• Radar is an object-detection system that uses radio waves to determine the range, angle, or velocity of objects.
• A modulated electromagnetic wave is emitted toward a moving target that scatters the transmitted radiation; a portion of the energy is redirected back toward the radar, where it is intercepted by the receiving antenna.
• The time delay, phase or frequency shift, and amplitude attenuation capture rich information about the target's properties, such as distance, velocity, size, shape, surface smoothness, material, and orientation.
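The bullet points above can be sketched numerically. The functions below compute range from round-trip time delay and radial velocity from Doppler shift; the example values (a 2 ns delay, a 500 Hz shift at a 60 GHz carrier) are illustrative assumptions, not Soli's actual parameters.

```python
# Minimal sketch of the radar quantities described above.
C = 3.0e8  # speed of light, m/s

def range_from_delay(delay_s: float) -> float:
    """Target range from round-trip time delay: R = c * t / 2."""
    return C * delay_s / 2.0

def velocity_from_doppler(doppler_hz: float, carrier_hz: float) -> float:
    """Radial velocity from Doppler shift: v = f_d * c / (2 * f_c)."""
    return doppler_hz * C / (2.0 * carrier_hz)

# A hand whose echo returns after 2 ns is 0.3 m away:
print(range_from_delay(2e-9))              # -> 0.3
# A 500 Hz Doppler shift at an assumed 60 GHz carrier:
print(velocity_from_doppler(500.0, 60e9))  # -> 1.25 (m/s)
```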
9. SOLI N27516
• The Soli N27516 chip incorporates the entire sensor and antenna array into an ultra-compact 8 mm × 10 mm silicon chip.
• It captures finger motions at a phenomenal rate of 10,000 frames per second.
• It can illuminate the hand with a broad 150-degree radar beam, with pulses repeated at very high frequency (1–10 kHz).
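As a rough sanity check on the figures above: assuming a 60 GHz carrier (the millimetre-wave band Soli is publicly described as operating in; the carrier frequency is an assumption, since this document does not state it), the wavelength and the largest radial speed measurable without Doppler aliasing at the quoted 1–10 kHz pulse rate can be sketched as:

```python
# Back-of-the-envelope sketch of what the pulse repetition frequencies
# above imply. The 60 GHz carrier is an assumption, not a chip spec
# taken from this document.
C = 3.0e8  # speed of light, m/s

def wavelength(carrier_hz: float) -> float:
    """Carrier wavelength: lambda = c / f_c."""
    return C / carrier_hz

def max_unambiguous_velocity(carrier_hz: float, prf_hz: float) -> float:
    """Largest radial speed measurable without Doppler aliasing:
    v_max = lambda * PRF / 4 (pulsed-Doppler sampling limit)."""
    return wavelength(carrier_hz) * prf_hz / 4.0

print(wavelength(60e9))                      # -> 0.005 (5 mm)
print(max_unambiguous_velocity(60e9, 10e3))  # -> 12.5 m/s at 10 kHz PRF
```

At the low end of the quoted range (1 kHz), the same formula gives 1.25 m/s, which suggests why high pulse rates matter for fast finger motions.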
10. VIRTUAL TOOL GESTURES
Example virtual tools: Virtual Button, Virtual Dial, Virtual Slider, Virtual Touch Screen.
Instead of static shapes, the key to Soli interaction is motion, range, and velocity: the sensor can accurately detect and track the components of complex motions produced by a user's hand moving and gesturing within the sensing field.
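To illustrate motion-based (rather than shape-based) sensing, here is a hypothetical sketch of how a "virtual button" press could be detected from a per-frame range estimate: the fingertip approaches the sensor and then retreats. The function and thresholds are invented for illustration and are not Google's actual recognition pipeline.

```python
# Hypothetical virtual-button detector over a range-vs-time trace
# (one range estimate per radar frame). Illustrative only.

def detect_button_press(ranges_m, press_depth_m=0.01):
    """Return True if the range trace dips at least press_depth_m
    below its starting value and then recovers (approach + release)."""
    if len(ranges_m) < 3:
        return False
    start = ranges_m[0]
    lowest = min(ranges_m)
    recovered = ranges_m[-1] > lowest + press_depth_m / 2.0
    return (start - lowest) >= press_depth_m and recovered

# A fingertip moving 2 cm toward the sensor and back:
trace = [0.10, 0.095, 0.085, 0.080, 0.085, 0.095, 0.10]
print(detect_button_press(trace))       # True
print(detect_button_press([0.10] * 7))  # False: no motion
```

A real pipeline would work on range-Doppler features and learned classifiers rather than a single threshold, but the sketch shows the motion-centric idea.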
12. ADVANTAGES
• Can replace many kinds of input buttons and sensing technologies.
• Devices can be operated wirelessly.
• Allows devices to be controlled with gestures.
• Very small in size.
• Low power consumption.
• Extremely fast and highly precise.
• Insensitive to light, noise, and atmospheric conditions.
13. DISADVANTAGES
• It has a small radar range.
• Recognizing multiple simultaneous gestures may not be possible.
• Gestures must be predefined.
• Expensive.
• Potential security threats.
14. FUTURE SCOPE
• Project Soli is looking for developers to evolve and test the technology and to build Soli applications.
• Presently, the Soli Development Kit is available only to a limited set of hardware and software developers.
15. CONCLUSION
One of the big problems with wearable devices right now is input: there is no simple way to control these devices. Gestures therefore give individuals a way to carry out functions on electronic devices such as smartwatches, smartphones, and laptops.