
5G and the Invisible Interface



We focus on Invisible Interfaces and their influence on digital experiences. With the advent of 5G creating the foundation for the increased adoption of ‘invisibility’ in our interaction with technology – we’ll discuss what this could mean for the UX and CX industry.



5G and the Invisible Interface

  1. WELCOME Grab a cuppa – we’ll be underway very soon…
  2. The Invisible Interface – and what it may mean for ‘Experience’ Design.
  3. Some Principles… • It is positively encouraged to disagree. • I hope to raise a few questions – the answers are unknown. • Covid19 implications run throughout. • First of a number of discussion sessions over the coming months: • Digital Doubles • New growth measures • Conscious Customers • Life Centred Design
  4. We’ll discuss… • The Advent of 5G • The World of ‘invisibility’ • Biometrics • AI • The world of ‘Neuro’ • A revision of HCI? • Ethics & Privacy • Testing Invisible Interfaces & the future
  5. The advent of 5G
  6. The path has been laid… • The Best Interface Is No Interface – Golden Krishna • Don’t Make Me Think – Steve Krug (maybe we need to think more) • Ruined By Design – Mike Monteiro • Hooked (I have a real issue with this and him…) – Nir Eyal • Hippo – Pete Trainor
  7. But is being redrawn… • The Age of Surveillance Capitalism – Shoshana Zuboff • The Perils of Perception – Bobby Duffy • The Death of the Gods – Carl Miller • Human Compatible – Stuart Russell
  8. 5G enables… • ‘True’ self-driving cars • ‘True’ remote healthcare & personalised medicine • ‘True’ super-sharp resolution for video conferences • ‘True’ frictionless interactions • Smart Cities • Deeply integrated AI • Life Centred Design • ‘Bigger’ Big Data (huge post-Covid19 growth)
  9. What does that mean? 1. Creates ‘always on’ interfaces. 2. Humans become a more direct ‘data source’. 3. Creates exhausts/trails that look much deeper into individuals. 4. Enables the creation of ‘on the fly’ personalisation and experiences to a level not seen before. So: what implications and opportunities does a true ‘always on’ capability offer in terms of experience design?
  10. “Digital technology has had a direct impact on the way humans now engage – at a much deeper level than we think. We have become a black and white, yes and no, right and wrong society, directly (and maybe subconsciously) influenced by the binary foundations of digital (platforms) – the 1s and 0s that provide the basis for the industry. We have ceased to be analogue; we have ceased to see the grey. The grey is where knowledge is found, ideas are created and culture grows. Singularity is already here – and we didn’t even need AI to achieve it.”
  11. The World of Invisibility
  12. Biometrics
  13. The ‘ideal’ use case “I use voice assistants for accessibility reasons. I am quadriplegic and they help me lead an independent life. They assist with tasks others take for granted like adjusting my heating and lights, window blinds, and TV and music. I have considered the privacy implications but for me the independence these assistants offer me far outweighs my privacy concerns.”
  14. The not so ‘ideal’ use case In Portland, Oregon, a woman discovered that her Echo had taken it upon itself to send recordings of private conversations to one of her husband’s employees. In a statement, Amazon said that the Echo must have misheard the wake word, misheard a request to send a message, misheard a name in its contacts list and then misheard a confirmation to send the message, all during a conversation about hardwood floors.
  15. “Alexa, sing a song” “Alexa, find quick dinner recipes” “Alexa, turn on Whisper mode” The days when you had to wake up your partner to ask Alexa to turn off the alarm are over. Whisper mode is a feature that allows you to whisper to Alexa, who in turn whispers when responding to you. “Alexa, how do I receive calls?” “Alexa, remind me to call Dad in 10 minutes” “Alexa, what’s 13 degrees Celsius in Fahrenheit?” “Alexa, play relaxing music” “Alexa, play Trivia Hero” “Alexa, announce breakfast is ready” Ask Alexa to make announcements on all compatible Echo devices in your household or the Alexa app. “Alexa, how can I create my own story?” “Alexa, give me a book recommendation” “Alexa, open Relaxing Piano” “Alexa, what's on my to-do list?” “Alexa, give me a limerick” “Alexa, play the song that goes, ‘I’ve got that sunshine in my pocket’” “Alexa, how do I make slime?”
  16. The experience of voice • Micro interactions (and experiences) as sources of data gathering, feeding into wider, more commercially attractive experiences. • A single touchpoint within a larger ‘journey’? • A single interaction for a specific need and service. • Requires increased allowances for error – a need to design additional error recognition into experiences (a minimal sketch of this idea follows below).
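The call for ‘additional error recognition’ can be made concrete. Below is a minimal Python sketch of tiered error allowances in a voice interaction, assuming a hypothetical recogniser result with a confidence score; the SpeechResult shape, thresholds and wording are illustrative assumptions, not any real voice platform’s API.

    # Hypothetical sketch: explicit error recognition in a voice interaction.
    # SpeechResult and the thresholds are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class SpeechResult:
        transcript: str    # what the recogniser thinks the user said
        confidence: float  # the recogniser's own certainty, 0.0-1.0

    CONFIRM_THRESHOLD = 0.75  # below this, confirm before acting
    REJECT_THRESHOLD = 0.40   # below this, admit failure and re-prompt

    def handle_utterance(result: SpeechResult) -> str:
        """Route a recognised utterance through three error-allowance tiers."""
        if result.confidence >= CONFIRM_THRESHOLD:
            return f"Acting on: {result.transcript}"
        if result.confidence >= REJECT_THRESHOLD:
            # Positive friction: confirm before anything irreversible happens.
            return f"Did you mean: '{result.transcript}'? (yes/no)"
        return "Sorry, I didn't catch that. Could you rephrase?"

    print(handle_utterance(SpeechResult("send message to Dave", 0.55)))
    # -> Did you mean: 'send message to Dave'? (yes/no)

The middle tier is the design point the Portland Echo story argues for: a chain of misheard steps would have been broken by a single explicit confirmation.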
  17. ‘Observational’ tech… • The changing foundations from a human/digital relationship perspective have been in place for some time now: • Security cameras. • Number plate recognition systems. • Fingerprint or facial access to our own devices. • Auto passport gates at airports.
  18. The face is a window into a person’s mind – Aristotle.
  19. Facial Recognition – a practical example of the challenges. On average, adults in urban cultures scowl when they are angry 30% of the time. Which means that some 70% of the time adults do not scowl when angry.
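To see why this base-rate problem undermines scowl-based anger detection, here is a back-of-envelope Bayes’ rule calculation in Python. Only the 30% scowl-when-angry figure comes from the slide above; the prevalence and scowl-while-calm numbers are purely illustrative assumptions.

    # Why a scowl is weak evidence of anger: a Bayes' rule back-of-envelope.
    # Only the 30% figure comes from the slide; the other two numbers are
    # illustrative assumptions, not measured values.
    p_angry = 0.10               # assumed share of moments a person is angry
    p_scowl_given_angry = 0.30   # from the slide
    p_scowl_given_calm = 0.10    # assumed: concentrating, bad jokes, etc.

    p_scowl = (p_scowl_given_angry * p_angry
               + p_scowl_given_calm * (1 - p_angry))
    p_angry_given_scowl = p_scowl_given_angry * p_angry / p_scowl

    print(f"P(angry | scowl) = {p_angry_given_scowl:.2f}")  # -> 0.25

Under these assumed numbers, a detected scowl means anger only one time in four – the kind of uncertainty an ‘invisible’ interface would be acting on.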
  20. Facial Recognition – an advanced example • Alipay (part of Alibaba) provides a service called ‘Smile to Pay’ – which enables users to do exactly what it sounds like. In an imaginary use case, could we therefore see 5G allowing us to withdraw money from ATMs simply by looking into a screen? How do we design, test, monitor and manage for that?
  21. Gesture, Haptics & Movement • Google Soli (foundation for the Pixel phone’s gesture controls) • BMW – ICE controllers • IoT sensors within care homes. • Haptic responses to movement to reassure. • Partially sighted interactions • ‘Embracelet’ – creating positive emotions remotely. A physical reaction from a ‘remote’ intent • Neuro gestures. • Intercepting the brain’s messages to hands and legs to predict movements
  22. Some Biometric discussion points • How can we accurately test for gesture? • What can be learnt from voice interactions in terms of experience design? How much control do the designers really have? How successful are true voice interactions at present? • In terms of facial recognition – how can this truly be understood for an individual? • How much more specific do ‘personas’ need to become if we take into account deeper visual behaviours? • How do we blend a mix of screen & invisible interactions into our research and design processes?
  23. ‘AI’
  24. Three (very broad) levels of ‘AI’ • Perception • Automating Judgement • Predicting (Social & Individual) Outcomes. Each of these has raised (ethical) concerns, uncertainty and confusion among the public, fed primarily by the digital & tech sector.
  25. Perception Genuine, rapid technological progress • Content identification (Shazam, reverse image search) • Face recognition* • Medical diagnosis from scans • Speech to text • Deepfakes* (* ethical concerns because of high accuracy)
  26. Automating Judgement Far from perfect, but improving • Spam detection • Detection of copyrighted material • Automated essay grading • Hate speech detection • Content recommendation. Ethical concerns in part because some error is inevitable.
  27. Predicting (social & individual) outcomes Fundamentally dubious • Predicting criminal recidivism • Predicting job performance • Predictive policing • Predicting terrorist risk • Predicting at-risk kids. Ethical concerns amplified by inaccuracy.
  28. How do we test AI? • Perception • Automating Judgement • Predicting (Social & Individual) Outcomes. Each of these has raised (ethical) concerns, uncertainty and confusion among the public, fed primarily by the digital & tech sector.
  29. The core questions to ask of AI algorithms 1. Is it any good when tried in new parts of the real world? 2. Would something simpler, and more transparent and robust, be just as good? 3. Could I explain how it works (in general) to anyone who is interested? 4. Could I explain to an individual how it reached its conclusion in their particular case? 5. Does it know when it is on shaky ground, and can it acknowledge uncertainty? 6. Do people use it appropriately, with the right level of scepticism? 7. Does it actually help in practice? (Question 5 is sketched in code below.)
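Question 5 – acknowledging uncertainty – is the one most amenable to a direct mechanism. Here is a minimal, illustrative Python sketch of a predictor that abstains rather than guess; the probability dictionary, threshold and labels are assumptions for the example, not a specific library’s API.

    # Question 5 in practice: a predictor that can say "I don't know".
    # The probability dictionary and threshold are illustrative assumptions.
    from typing import Optional

    def predict_or_abstain(probabilities: dict,
                           min_confidence: float = 0.8) -> Optional[str]:
        """Return the top label only when the model is confident enough;
        otherwise abstain (None) so a human can take over."""
        label, p = max(probabilities.items(), key=lambda kv: kv[1])
        return label if p >= min_confidence else None

    decision = predict_or_abstain({"approve": 0.55, "refer": 0.45})
    print(decision or "Abstained: route to a human reviewer")
    # -> Abstained: route to a human reviewer

An abstention path like this is also where the ‘positive friction’ raised on slide 31 could live: uncertain cases surface to a person instead of flowing invisibly.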
  30. A testing & evaluation framework. Currently, Phase 1 testing of “AI” algorithms is completed regularly; Phases 2–4 are not. We need to be aware of this and consider how we can control the environmental and ‘algorithmic’ growth.
     Phase 1 – Pharmaceuticals: Safety (initial testing on human subjects). Algorithms: Digital testing (performance on test cases).
     Phase 2 – Pharmaceuticals: Proof-of-concept (estimating efficacy and optimal use on selected subjects). Algorithms: Laboratory testing (comparison with humans, user testing).
     Phase 3 – Pharmaceuticals: Randomised controlled trials (comparison against existing treatment in a clinical setting). Algorithms: Field testing (controlled trials of impact).
     Phase 4 – Pharmaceuticals: Post-marketing surveillance (for long-term side-effects). Algorithms: Routine use (monitoring for problems).
  31. Some AI discussion points • Where should experience designers sit in the development of AI solutions? • Should we consider creating positive friction, when needed, with interactions that are led by AI? • Can we blend AI testing in with usability & user experience testing? • Is the ‘user’ actually more than just the ‘physical’ human – is it a blend of AI and human? As experience designers I think we have a deep responsibility to understand AI and the tests that any algorithm has undergone. This feeds into our research and designs.
  32. The world of ‘Neuro’
  33. Definition: Neuroscience – “The scientific study of the nervous system”
  34. Creating ‘Certainty’ 1. Improve the data set of an individual, and improve the tech and AI capabilities, to such an extent that you can develop a profile that predicts the next move of a human with absolute accuracy. 2. Simplify the human to make them more predictable.
  35. Neuro ‘Mania’ • Neuro Politics • Neuro Gastronomy • Neuro Ethics • Neuro Law • Neuro Architecture • Neuro Philosophy • Neuro Economics • Neuro Theology • Neuro Musicology
  36. Neuro Marketing… • Did not exist in 2002. • Is seen as a serious approach to ‘persuading/influencing’ human behaviour. • To get people to do what you want them to do. • But – can the findings of an fMRI scan or a FB wristband predict someone’s actual action? • “I wasn’t going to buy that, Amazon thought I wanted to…” • “I didn’t mean to think that – my brain did it without me knowing”
  37. Neuro Marketing discussion points • How do we design for intent and not action? • Should we? • How will our work change with the ‘advent’ of Neuro Marketing? Will it change at all? • Will deep personalisation become more and more engrained? • How will research develop? • What testing could accommodate such an ability to ‘look’ inside an individual’s brain?
  38. Neuro Marketing, Forecasting… & AI • What behaviours might ‘we’ expect to happen in the population? • What behaviours might ‘we’ expect people to show when presented with a specific argument, language or product? • Will we create more individual ‘false’ situations (remember FB) to create, develop and learn intent? • What do we as experience designers do with that? • Should we do anything?
  39. A re-balance of HCI?
  40. Who is in control and what’s the impact? Human or ‘Computer’? Do we design for an AI prediction or AI making a judgement? For intention or action? To pre-empt movements? How can we possibly accommodate the speed of this change and data flow? Can we integrate multiple invisible interfaces across a journey and accommodate significant uncertainty and larger margins for error?
  41. Who is in control and what’s the impact? • What ‘value’ will humans gain from interactions that are invisible and ‘powered’ by AI, neuro etc.? • As interfaces and the underlying tech (and decisions) become invisible, should we look to create experiences that are more emotionally fulfilling? • Should we therefore focus even more on creating real ‘delight/joy’ in our experience design?
  42. Influenced Behaviour • In almost all situations experience design and tech influence behaviour. This can be a positive (for the individual and society) – reducing vehicle speed, preventing shoplifting, staying at home etc. • It can also be a negative and restrict ‘true’ human behaviour – for example, I constantly delight when Amazon or Spotify recommends me products or music I dislike, because I’ve randomly clicked on various links in an attempt to confuse and game their tracking systems.
  43. Influenced Behaviour • Capturing, processing and modelling data created by influenced behaviour creates influenced solutions. • The skills of an experience designer become more refined and understood – data is more clearly seen as just one part of the deal (try telling customers that…) • Therefore creating a need for wider experience and service design skills: • Sociologists • Anthropologists • Psychologists • Philosophers… (I would say that…)
  44. Q – Will ‘we’ rail against the tech that supports invisibility or embrace it? • Will negatively influenced behaviour come to more prominence? • Forcing designers to create more and more ‘restrictive’ experiences? • Demanding that experience professionals increase skillsets & resources? • Is an appreciation by people of big data (with Covid19) here to stay? • In a watershed moment, is it being seen as a real leveller and enabler? • Is the workplace changing beyond recognition? If so, are invisible interfaces going to accelerate or hinder change?
  45. Privacy & Ethics
  46. We aimed for tech surpassing human capability. We created tech surpassing human vulnerability.
  47. But what is privacy today? • Big data is being seen by some as a saviour in the current climate. • Will this change society’s relationship with privacy? • How could this affect experience design? • Are we opening up to being ‘observed’ for the greater societal and human good?
  48. Privacy is a human right • We will keep all your personal data for as long as your account remains open. You can close your account at any time using your account settings. If you close your account, we will delete the personal data associated with your account. • If your account is inactive for a period of eighteen months, we reserve the right to close your account and delete your personal data.
  49. Our Privacy Challenge • Privacy by Design. • Is it our responsibility to build privacy into the research, solutions and experiences that we design? • How will experiences taking place across multiple interfaces protect privacy? Should we lead these discussions? • How do we build an ‘openness’ on privacy into our users’ awareness? Should we? Is it our job?
  50. Thinking about Ethics • Should we use findings from Neuro technology to create experiences? • Should we try and influence thoughts? • How do we create ethical experiences that have significant AI inputs – when the algorithm is out of our control? • Should we therefore ensure that we position ourselves at the centre even more? • Should we become the ‘ethical’ voice of the user? • Persuasion and influence will still be the battleground – but much more personalised than before. What is our stance/position?
  51. Q – Will Invisible Interfaces create more division? • Will screens be used by those that cannot afford ‘sentient’ hardware and applications? • Or will they be the preserve of the few – with everyone else being ‘forced’ to use sentient-led technology? • Will we see observational technology becoming the norm?
  52. Wilful blindness • …failing to see – or admit to ourselves or our colleagues – the issues and problems in plain sight… we prefer ignorance as we are afraid of questioning…
  53. Some questions for you • Can you see and communicate the unintended consequences of invisibility early enough to clients, and come up with solutions? • Do you know enough about the tech being used by clients, and the data ramifications of the use of that tech? • Do you see yourself as a UX’er, UI designer, or someone developing ‘solutions’ that will enhance people’s lives? • How do you feel when you are developing a solution you know is not for the benefit of the user (we don’t live in an ideal world or industry – do we?)
  54. Testing the future Some ideas and thoughts…
  55. Our (early) approach • With screen-based interactions we have developed ‘buffers’ – body language, facial expressions and the occasional sigh tend to offer glimpses into the true feelings of a user. • We’ve started developing tools to look at gait, speed of walking (and of speech), in addition to more pronounced body and facial clues, alongside true tone of voice and use of specific language trigger words and terminology.
  56. Our (early) approach • Using beacons and IoT sensors (and platform) to create a networked environment in which to distract/observe/develop relationships and understand behaviours. • Using wearables in context. Data and learnings can be extrapolated from understanding context in detail. • Investigating ethical neuro technology that can understand reaction but does not directly influence decisions.
  57. The future Experience Designer? • The platforms that we use, test, develop and enhance have been developed by technologists. • These platforms are ‘stable’… • Sociologists, anthropologists and psychologists are all moving to the forefront as technology becomes more embedded in everyday lives. • The increased use of these skills in experience design will allow technology to become an enabler rather than the lead. • Moving more naturally into Service Design & Blueprinting.
  58. The future Experience Designer? • Understands AI, its limitations and best use cases. • Has an ethical design mindset. • Understands the developing human-computer relationship (as the human is the interface) – so the ‘user’ actually becomes more than just the human.
  59. Upcoming Discussions • Digital Doubles – Mid April • Life Centred Design – May • Experience UX Blog – https://www.experienceux.co.uk/ux-blog/ • Remote workshops and training ongoing • The Need for Humane Tech – 16th April – www.firesmoke.co
  60. Thank you.

Editor's Notes

  • Instead, they did something else with their faces. People also scowled when they were not angry. “They scowl when they’re concentrating, they scowl when someone tells them a bad joke, they scowl when they have gas, they scowl for lots of reasons,”
  • Does AI remove interactions? Instead of going through a journey designed by ‘us’, do we go through a journey designed by ‘it’?

    Will it remove the need for us to interact and engage with those that we previously have? Will it remove interfaces?
  • We also have to look towards the impact and potential implications of neuroscience when we are considering invisible interfaces. The claims made by some in the digital industry as to the impact of neuro marketing are outlandish at best and misleading at worst. But, but, but – we need to take this into account when we are trying to understand the tension between a customer’s true need and our clients’ underlying beliefs.

    How can we show examples of true behaviour if the client is committed to using neuro approaches to influencing user behaviour?
