Google announced new hardware products including the Pixel 2 smartphone, Google Home Mini and Max smart speakers, Pixel Buds headphones, and Google Clips camera. The Pixel 2 enhances the Google Assistant with features like visual search with Lens and augmented reality. Voice Match on the Google Home allows for personalized Assistant responses based on individual voices. Pixel Buds seamlessly integrate the Assistant for touchless control and real-time language translation. Google Clips is a camera that uses machine learning to automatically take short video clips of people and pets.
2. Introduction: Hardware + Software + AI
Google announced a portfolio of new products this week. Back in June, we had
exclusive access to these products, and we’re excited to finally be able to talk about
them.
Google’s original brand purpose and business model was to organize the world’s information by delivering a best-in-class search and discovery experience. As the brand strengthened, Google expanded into providing best-in-class software like Gmail, Google Docs, and more. Today, following its acquisition of Motorola and, more recently, HTC, Google is expanding into hardware, a market long dominated by Apple and Samsung. Google’s biggest weakness is that it remains far behind Apple in processor and chip design capabilities. Where Google is light-years ahead of Apple, however, is in AI and machine learning.
This advantage explains Google’s approach to building hardware that delivers experiences around a Hardware + Software + AI framework. Whereas Apple develops hardware experiences around its processors and chips, Google is designing hardware around its massive advantage in AI. Google sees itself as an AI-first company, and any brand experience leveraging its ecosystem should be approached the same way.
In this review, we’ll take a deep dive into their AI-driven hardware devices – Pixel 2, Home devices, Pixel Buds, and Google Clips – and provide implications on how brand marketers can leverage them to drive experiences.
Tom Edwards
Chief Digital Officer, Epsilon Agency
3. Pixel 2
The Pixel 2 was the big expected announcement today as well as the star of the show. Every great new phone comes with a large screen, an incredible camera, and processing speeds that were unthinkable just a few years ago. These are table stakes. The differentiator is going to be the software and the role Google’s Assistant will play in our lives. On-device machine learning makes this possible, allowing for serverless data interaction and a more secure user experience.
Simply squeeze the sides of the Pixel 2 to invoke the Assistant, and jump into launching
apps, asking questions, making plans, and much more. As always, you are able to carry out
actions with the Assistant through voice or text input. Later this year, Google plans to roll out
Lens, its image recognition software that was teased earlier this year. Once the Pixel phones
receive this update, the Assistant will play an even greater role, taking visual information into account to inform its decision making.
ARCore, Google’s answer to Apple’s ARKit, makes the Pixel one of the first phones able to showcase its full capabilities, and we were shown a few examples of that today, including exclusive AR stickers for the Pixel.
Implications
The Pixel 2 puts more advanced hardware into the hands of consumers. With this device, consumers get a more personalized experience, a new way to search visually for information, and an improved AR experience. The device and its underlying software will continuously ingest contextual intelligence, personalizing your experience the more you use it. As a brand marketer, this further reinforces the need to create experiences tailored to these new interaction models.
Voice-based capabilities become increasingly important as the focus shifts onto Google’s Assistant. Additionally, the ability to enhance user creation by offering AR tools will become a key interaction point. Branded or contextually relevant stickers will offer benefits similar to lenses and filters, and brands should look to take advantage of ARCore’s capabilities.
4. Personalized Voice Experience
Google’s introduction of Voice Match is a major enabling factor for a pervasive digital assistant. The technology allows the Google Assistant to recognize individual voices, enabling multi-user support. What this means is that, based on contextual commands, the Assistant will be able to provide personalized answers depending on who is asking the question.
It does so by analyzing vocal characteristics such as pitch and tone to match an individual voice back to a Google account. This also lets Google offer a personalized mobile phone dialer, allowing users to make calls from their own number using their own contact list.
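Google hasn’t published how Voice Match actually works, but the core concept can be sketched as nearest-neighbor matching against enrolled voice profiles. In this toy illustration, each enrolled account is represented by a small feature vector (e.g., pitch and tone statistics); the account names, feature values, and threshold are all hypothetical.

```python
import math

def euclidean(a, b):
    """Distance between two voice feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_voice(sample, profiles, max_distance=20.0):
    """Return the account whose enrolled profile best matches the sample,
    or None if no profile is close enough."""
    best_account, best_dist = None, max_distance
    for account, profile in profiles.items():
        d = euclidean(sample, profile)
        if d < best_dist:
            best_account, best_dist = account, d
    return best_account

# Hypothetical enrolled profiles: [mean pitch (Hz), tone features...]
profiles = {
    "dad@gmail.com": [120.0, 0.80, 0.30],   # lower-pitched voice
    "mom@gmail.com": [210.0, 0.60, 0.50],   # higher-pitched voice
}

print(match_voice([125.0, 0.75, 0.32], profiles))  # → dad@gmail.com
```

The real system presumably uses learned embeddings rather than hand-picked features, but the end result is the same: an utterance resolves to one household member’s Google account, and the response is personalized accordingly.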
What this really enables is multiple users tapping into a truly personalized digital assistant. If Dad asks, “How long is it going to take me to get to work this morning?”, the Assistant will analyze the traffic routes to Dad’s work, not his wife’s or kids’. All of that information will be pulled and tracked through individual Google accounts, allowing the Assistant to move from one device to another – keeping track of daily schedules, preferences, and more.
Implications
The ability to have multiple users on one device will be a major selling point for Google
in its race against Amazon. It will also proliferate the idea of personalized daily
schedules which are facilitated by this pervasive assistant.
As a brand marketer, you should be asking how your brand can integrate into those schedules. You need to provide services and experiences that can be accessed from these assistants. Building relevant Google Actions will become more and more important as the Assistant integrates itself deeper into our daily lives.
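At its core, an Action is a webhook: the Assistant parses the user’s request into an intent plus parameters, and the brand’s endpoint returns a spoken response. The sketch below is a simplified illustration, not the official Actions SDK; the intent name, parameters, and store-hours data are hypothetical, and the response fields loosely follow the Dialogflow v1 fulfillment format.

```python
import json

# Hypothetical brand data a fulfillment webhook might serve.
STORE_HOURS = {"monday": "9am-6pm", "saturday": "10am-4pm"}

def handle_intent(request):
    """Turn a parsed intent + parameters into a spoken/displayed response."""
    result = request.get("result", {})
    intent = result.get("metadata", {}).get("intentName")
    params = result.get("parameters", {})
    if intent == "store.hours":
        day = params.get("day", "monday")
        speech = f"We're open {STORE_HOURS.get(day, '9am-6pm')} on {day}."
    else:
        speech = "Sorry, I can't help with that yet."
    return {"speech": speech, "displayText": speech}

# Simulated request after the Assistant has parsed the user's query.
request = {"result": {"metadata": {"intentName": "store.hours"},
                      "parameters": {"day": "saturday"}}}
print(json.dumps(handle_intent(request)))
```

The key difference from a mobile app is that the Assistant, not the user, decides when your endpoint gets called – which is exactly why relevance to daily schedules matters.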
5. Google Home Mini & Max
Google knew that it had to have an answer to Amazon’s massive success with the Echo Dot.
Today, we’re introduced to the Google Home Mini, a small circular device with a modern
mesh covering that comes in a few different colors. At a price point of just $49, Google hopes
to get the device, and its corresponding Assistant, into as many homes as possible.
We were also introduced to the Google Home Max, a Sonos-style speaker with the Google Assistant built in. The differentiator in the Max is that it’s truly a smart speaker in the sense that it tunes itself to its contextual surroundings. It will respond to any obstructions, play softer in the morning, or crank the volume if it senses the washing machine running.
Additionally, Google announced a few software upgrades rolling out across its family of Home devices. The most notable is Voice Match, discussed in the previous section. We also get hands-free calling from mobile devices and a feature called Broadcast, which turns all your Home devices into something of an intercom system.
Implications
Google is serious about getting its Assistant into your home. Based on initial reactions to
these releases, it looks like they may be able to take a piece out of Amazon’s market share,
making it even more important to think about how your brand can offer Assistant based
experiences, called Actions.
While Alexa requires an invocation term to launch a Skill, Google Actions can be invoked just by asking a related question. Additionally, Actions are very well integrated into the visual chatbot aspect of the Assistant. As a marketer, now is the time to take voice-based experiences seriously, to get ahead of the competition and reap first-mover advantages. A year from now, Assistant Actions and Alexa Skills will be as commonplace as the mobile app.
6. Pixel Buds
The Pixel Buds are Google’s answer to Apple’s AirPods, with a little more punch packed in. They offer instant pairing, a rechargeable carrying case, and touch controls on the earbuds.
However, the integration with Google Assistant is what differentiates these headphones.
Simply holding your finger on the right earbud will invoke the Assistant, and unlike Siri, you won’t have to wait for any beep or confirmation. Once finished, lift your finger and the
Assistant will begin its response. This interaction model seems to make for a more
conversational type of experience.
The coolest part of today’s demo revolved around the Google Translate feature these headphones enable. Paired with a Pixel 2, your Pixel Buds can facilitate a conversation with someone speaking a different language. Tell the Assistant which language you’d like to converse in, speak normally, and your phone will play a translation in that language. Once your partner responds, your Pixel Buds will play the English translation back to you. The demo seemed to work really well during the event.
Implications
AirPods and now Pixel Buds have indicated an industry wide shift towards integrating
virtual assistant access directly into headphones. We’re moving towards a world of ambient
computing, in which our computers will exist in the background and only be invoked when
they are truly needed.
These headphones are a step in that direction and will also increase the rate at which software proxies act on our behalf. It’s important for marketers to start thinking about how to market to these assistants, as they will increasingly make decisions for us, whether we’re aware of it happening or not.
7. Google Clips
Clips was one of the surprise products shown today. Essentially, it’s a small camera that can be placed or attached almost anywhere and relies on machine learning to analyze its surroundings and capture three-second clips.
The algorithm, called Moment IQ, learns the faces, individuals, and pets that are important to
you, based on the amount of time you’re with them. Additionally, it will recognize good
lighting and framing conditions for those instances. Once it recognizes a smile or action, it
will begin shooting – resulting in more spontaneous captures while allowing you to be in all
the shots.
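Google hasn’t published how Moment IQ scores a scene, but the behavior described above can be sketched as a simple scoring heuristic: weight each frame by how familiar its subjects are, then by expression and lighting, and start capturing once the score crosses a threshold. The familiarity values, weights, and threshold below are all hypothetical.

```python
# Hypothetical familiarity scores the camera might learn over time,
# based on how often each subject appears around the owner.
FAMILIARITY = {"kid": 0.9, "dog": 0.8, "stranger": 0.1}

def frame_score(faces, smiling, well_lit):
    """Score one frame: familiar subjects, smiles, and good lighting all help."""
    familiarity = max((FAMILIARITY.get(f, 0.0) for f in faces), default=0.0)
    expression_boost = 1.5 if smiling else 1.0
    lighting_factor = 1.2 if well_lit else 0.8
    return familiarity * expression_boost * lighting_factor

def should_capture(faces, smiling, well_lit, threshold=1.0):
    """Begin shooting a clip only when the frame score crosses the threshold."""
    return frame_score(faces, smiling, well_lit) >= threshold

print(should_capture(["kid"], smiling=True, well_lit=True))       # → True
print(should_capture(["stranger"], smiling=True, well_lit=True))  # → False
```

The real model runs entirely on the device, which is what makes the privacy claims in the next paragraph possible: scoring and capture decisions never require a round trip to the cloud.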
Google is keen to address the privacy aspect of this device, stressing that no data can leave the device or be backed up to a cloud until the user has had a chance to review or share it. In public, however, if the ML algorithm identifies what it deems a capture-worthy moment, there’s nothing to stop it from recording people who may not want to be recorded.
Implications
The device is clearly marketed towards parents and dog owners who want an easier way to capture life’s moments as they happen. Looking forward, 2018 will be a year in which computer vision dominates our user experiences. This camera will evolve to be contextually aware, and we wouldn’t be surprised if it acquires Assistant functionality as well.
As Google Lens is pushed out to more and more phones, expect that software to be integrated into other devices as well, with Clips among the first. For brand marketers, now is the time to start seriously thinking about how SEO will change in a world where voice and computer vision dominate search.