ANONYMOUS &
CHEAP
Experimenting with Unobtrusive
Methods of Measuring User
Experience and Engagement
for In-Gallery Interactives
Anonymous: Solution and Problem
Visitors: Low barrier to
entry
NO DATA
Naturalistic Observation (Lurking)
One task
One interpretive point
Usable ≠ Engaging
What these experiments were… and weren’t
WERE:
• UX engagement
• Stop-gap
• Trial and error/iterative
WEREN’T:
• Educational/interpretive goal
measurement
• Exhaustive or definitive
• High-tech
“Engagement is a category of user experience characterized
by attributes of challenge, positive affect, endurability,
aesthetic and sensory appeal, attention, feedback,
variety/novelty, interactivity, and perceived user control”
- O’Brien & Toms
Question 1: What is “engaged”?
Questions about experience
• What proportion of visitors are using
them?
• Do visitors understand function?
• Do the interactives support the context?
• What is the emotional response?
Are there things we can measure to benchmark future projects?
First Line of
Attack:
Idle Timers &
Custom Event
Analytics
Idle Timer as
User Session
Proxy
Mean: 3.7 pinch-to-zoom interactions per “user session”
Idle Timer as
User Session
Proxy
Mean: 0.3 interactions with interpretive content per “user session”
Blaschka Map: User Sessions and Map Pin Taps
Map Pin Tap Events User Sessions
Second Approach:
Video Observations
and Optimized Forms
The Setup
• Two cameras
• Two rooms
• Four digital interactives
Deferred
Observations
• Unobtrusive
• Non-real-time
Optimized
Forms
• Two tools
• Multiple iterations
• Lots of trial and error
Counting
Visitors and
Interactive
Usage
Results
42% Overall
64% Child/teen
32% Adult
47% Senior
Tiffany “Workshop”: Percentage of Visitors Who
Used At Least One Digital Interactive, by Age Category
Digging Deeper
• Individual observations of
behaviors and interactions
• Branching logic forms for data
collection optimized to visitor
behaviors
Mosaic Theater
How do you measure
engagement with a
passive experience?
Mosaic Theater: Passive Engagement
Mosaic Theater: Visitor Time Spent
30% less than 1 minute
13% 1-2 minutes
18% 2-4 minutes
17% 4-8 minutes
22% more than 8 minutes
Mosaic Theater: Interaction
Tiffany Workshop Interactives: the Payoff
Workshop Interactives: Content
Engagement
Workshop Interactives: Disengagement
• Satisfied with information and/or experience: user purposefully examined most or all of the
content and/or engaged in discussion with others about the information presented
• Interruption or distraction: attention interrupted by another person or activity
• Usability issue(s): repeated unsuccessful attempts at interaction or visible frustration
• Boredom or disinterest: user quickly perused or otherwise spent little time or attention on the
information or application
• Time constraint: summoned away before the user appeared ready to depart. This category was sometimes
difficult to distinguish from “interruption or distraction”
• Unclear: could not infer the reason for disengagement
Where Do We Go From Here?
XKCD
MW18 Presentation: Anonymous And Cheap: Experimenting With Unobtrusive Methods Of Measuring User Experience And Engagement For In-Gallery Interactives
Editor's Notes
  1. I wanted to share some info today about a few cheap and simple experiments we’ve done to try to get some -- hopefully meaningful -- data on how visitors are engaging with digital interactives that were NOT built to capture information on individual visitors.
  2. At CMoG, we build a lot of anonymous-use digital interactives for our exhibitions, particularly temporary exhibitions. For visitors, that means no apps to download, no devices to carry around, and no registration or sharing of personal information. The big problem is that those applications don’t provide meaningful data on usage, because there’s effectively no such thing as a user. There are some ways around this that don’t intrude on or require anything from the user – like tracking visitors with Wi-Fi localization – but they’re not great for the information I’m interested in: visitor engagement from a user experience perspective.
  3. We regularly do studies where someone watches where visitors go and where they linger. But as a UX person, that doesn’t tell me a lot about their digital experiences. Doing visitor surveys helps, but it’s obviously pretty intrusive. I tend to rely on observations to try to get at user experiences, but people don’t like being watched, and I’m not particularly fond of watching them. And my sense is that, when museum staff are around, some people avoid or walk away from digital interactives for fear of looking foolish.
  4. One of the other problems with observations – sometimes, there’s nowhere to be unobtrusive. This is the exhibition area in our Library – essentially a hallway, and usually pretty quiet. So it’s a very difficult place to lurk unobtrusively.
  5. A lot of our digital interactives are pretty narrowly focused and simple, especially for temporary exhibitions. They’re often not meant to be 10-minute experiences. Sometimes 30 seconds will do it. This interactive was part of our recent Tiffany’s Glass Mosaics exhibition. There was essentially one interaction – you move the view area around to see differences between three mosaics made from the same design drawing. Most of the information was visual, but there were also three highlighted areas with short bits of text. You could literally get all the information in a minute or less. Usability is very important, but just because visitors could figure out how to move the viewer around with a finger doesn’t mean they were interested in or consuming the information.
  6. So we were hoping for some better information. While we’re exploring more complicated solutions, we decided to look at a few not-so-super-sexy, low-tech, and cheap ways to try to get some information on how engaged visitors are with these interactives. There’s obviously some crossover here with educational goals. I don’t claim any expertise in that area, but this has been a good project for discussion with the education team.
  7. So the first question that arose when we started on this effort – what does engagement actually mean? What are we measuring? Obviously, engagement can mean different things with different applications or different perspectives, but this is a definition I came across that nicely combines different Human-Computer Interaction and Psychology concepts into a solid UX engagement starting place, including some things we could actually try to measure against.
  8. From there, it was time to start thinking about questions we could try to answer – whether there were “positive affects” or “sensory appeal” or “attention” that we could detect. One big challenge in temporary exhibitions is that time constraints usually mean that once these digital interactives are built, there’s a pretty low likelihood of updates or improvements. Unless something is broken, it’s on to the next thing. So a big question was: are there things we can measure with the digital interactives in those exhibitions that would be useful and applicable to future efforts?
  9. As a place to start, we tried some things with existing analytics tools. At first, we just set up the applications to track custom events in Google Analytics – you tap or pinch, and it tells Google that something happened and when. These kinds of applications don’t have the normal web or mobile application concepts like identifiable users, sessions, or page views. So to try to tease out more information, we tried using idle timers as a proxy for a user “session”. We often use these idle timers anyway to reset the application after no one has been using it. So when it’s been sitting idle for 45 seconds or so, it tells Google that that was the end of a user session. The next interaction after that starts a new session.
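The note above describes the idle-timer session proxy only in prose. Here is a minimal sketch of how such a tracker might work, assuming a browser-based interactive; `sendEvent` stands in for whatever analytics call the application actually makes (e.g. a Google Analytics custom event), the clock is injectable so the logic can run outside a browser, and all names are illustrative, not the code used in the exhibition.

```javascript
// Idle-timer "session" proxy: if more than `idleMs` passes between
// interactions, the previous run of activity is reported as one
// session and a new one begins.
function createSessionTracker({ idleMs = 45000, now = Date.now, sendEvent }) {
  let lastActivity = null; // timestamp of the most recent interaction
  let interactions = 0;    // interactions in the current proxy session

  return {
    // Call on every touch/pinch/tap the application handles.
    recordInteraction(name) {
      const t = now();
      // Idle gap exceeded: close out the previous session first.
      if (lastActivity !== null && t - lastActivity > idleMs && interactions > 0) {
        sendEvent('session_end', { interactions });
        interactions = 0;
      }
      lastActivity = t;
      interactions += 1;
      sendEvent(name, {}); // the per-interaction custom event
    },
    // Call from the kiosk's existing idle-reset logic.
    reset() {
      if (interactions > 0) sendEvent('session_end', { interactions });
      interactions = 0;
      lastActivity = null;
    },
  };
}
```

In use, the kiosk's existing attract-loop reset would call `reset()`, so the same timer that clears the screen also delimits the proxy sessions in the analytics data.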
  10. This is a touch-screen interactive originally built by Genetic Science Learning Center at the University of Utah that we adapted for use in an exhibition about microscopes. Basically, you did pinch gestures to zoom to higher magnifications. So: super-simple, single interaction. And using that idle timer technique, we could at least get a guess at how much a “user” was doing with the application – so in this case, how many times each person pinched to zoom in.
  11. There weren’t a lot of “user sessions” – this was over in that quieter library exhibition area, so that wasn’t a big surprise. But the people who did use it at least seem to have seen a decent portion of the content – with this idle timer method, we saw an average of about 3.7 pinch-to-zoom interactions per session. You could see all the content in 4 or 5 pinch gestures, so the people who DID use it seemed to be seeing most of the content.
  12. We used this same idle timer method in another touch display later in the main galleries – this was an interactive map of collections of Blaschka models of marine invertebrates, which are these incredibly accurate glass models from the 1800s of creatures like anemones and jellyfish that don’t hold up well as dead specimens in jars. They were used in universities and museums all over the world.
  13. This map data showed one or two thousand interactions with the map most days – so people were reasonably enticed to explore the map – but not so enticed by the actual interpretive content: only about 1 in 3 proxy user sessions involved touching one of the map pins or pin clusters that included information about Blaschka collections. The other two-thirds of the sessions just explored the map. This idle timer method is definitely not perfect – sessions are likely undercounted at busy times, when the idle timer doesn’t trigger, or when the interactive is used by groups, and there may be some overcounting when people are slow, distracted, or discussing things.
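The 1-in-3 figure above is the kind of summary that can be computed from an exported event log once the idle timer has delimited sessions. A hypothetical sketch, assuming rows of `{ session, event }` pairs; the `pin_tap` event name and row shape are assumptions, not the actual export format.

```javascript
// Given event rows tagged with a proxy-session id, compute how many
// sessions included at least one interpretive "pin_tap" event.
function summarizeSessions(rows) {
  const bySession = new Map(); // session id -> Set of event names seen
  for (const { session, event } of rows) {
    if (!bySession.has(session)) bySession.set(session, new Set());
    bySession.get(session).add(event);
  }
  const total = bySession.size;
  let withPinTap = 0;
  for (const events of bySession.values()) {
    if (events.has('pin_tap')) withPinTap += 1;
  }
  return { total, withPinTap, share: total ? withPinTap / total : 0 };
}
```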
  14. In any case, we can’t really get a full understanding of the various aspects of engagement just by measuring touches and swipes. So to take a stab at getting more robust and interesting user experience data, we tried a second approach: using footage from off-the-shelf, networked security cameras, combined with optimized data entry forms, to collect lots of observation data quickly. One of the hopes was to compare the data gathered with the idle timer technique against the security camera footage. Unfortunately, one of the not-so-surprising lessons learned from that exhibition: spotty Wi-Fi connections can make a mess of analytics data collection. So we’ll have to come back to that comparison another time.
  15. We tried this video-observation-and-form approach in the Tiffany’s Glass Mosaics exhibition. There were four touch-screen interactives in two of the rooms. One room was a darkened space we called the “Mosaic Theater”, which had large monitors for display and a kiosk-style touch display to select presentations and view some basic contextual information. The other room was presented like a mosaic workshop, and had three touch-screen interactives that let visitors closely examine a few of the mosaics and the glass fragments used in them. We mounted a security camera on the ceiling of each of those two rooms.
  16. The cameras are a good way to ensure more naturalistic observation – there’s no museum staff member or consultant peering over anyone's shoulder, at least not literally. We did of course have some internal conversations about privacy. Security cameras are part of our everyday lives now, so that helped ease some concerns, and we put a few policies in place to try to help protect visitor privacy. One of the big bonuses of this video method was that you could do your observations at any time – you just schedule the cameras to record, and you can spend your life in meetings as needed. Then you can go back and watch the footage and collect your data any time, or recruit other people to help.
  17. The real key to making this work efficiently was trying out a couple of different ways of using data collection forms for tracking visitor activities. I went through quite a few iterations of these – watching video, clicking buttons, seeing what I could realistically and efficiently track, then changing the form and trying again. Essentially, it was usability testing the tool for an audience of one or two people – mostly just me. This was actually one of the most time-consuming aspects of this approach, but hopefully it’s something we can use again in the future.
  18. One of the first trial-and-error lessons was that we couldn’t count visitors AND track their real interactions at the same time, or with the same tool. So for quick basic quantitative data, I created a web form connected to a Google spreadsheet to track the number of visitors entering the space and the number of visitors using the digital interactives – with some demographic best guesses. We could usually run the video at 2 or 3 times normal speed and we managed to track 700 people pretty efficiently. That said, this wouldn’t be great for tracking many hundreds of visitors in an hour. https://dm.cmog.org/tiffany/ux-analysis/test-visitor-track-form.html
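As a rough illustration of the counting tool described above (the real one was a web form connected to a Google spreadsheet, linked in the note), the tally logic behind such a form might look like the following; the category names and the API shape are assumptions for the sketch, not the actual form.

```javascript
// One click per visitor while scrubbing through footage: count entries
// to the space and interactive usage, broken down by (guessed) age category.
function createTally(categories) {
  const counts = Object.fromEntries(
    categories.map((c) => [c, { entered: 0, used: 0 }])
  );
  return {
    entered(category) { counts[category].entered += 1; },
    usedInteractive(category) { counts[category].used += 1; },
    // Proportion of visitors in a category who used at least one interactive.
    usageRate(category) {
      const { entered, used } = counts[category];
      return entered ? used / entered : 0;
    },
    counts,
  };
}
```

Even this trivial structure is enough to produce the by-age-category usage percentages shown on the slide, once the per-visitor clicks are accumulated.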
  19. We used this particular web form for the workshop area interactives – it didn’t work well for the Mosaic Theater space, for reasons I’ll talk about in a minute. So with this method, we could get a sense of the proportions of different demographic categories of visitors who used the digital interactives. We obviously had to make some guesses about demographics, which is a flaw if you want to dive very deeply into any of this data. It’s better for generalizations.
  20. One surprise from the data, for me anyway, was that seniors used the interactives more than adults. The seniors just took more time to check everything out. You can get these kinds of insights from regular, in-person observations. It was nice to have these naturalistic observations combined with a larger set of quantitative data, but this was still a bit more superficial than what we were hoping to get.
  21. So, to attempt to dig a little deeper, we turned to different tools and techniques. To get real information about actual engagement, you really have to watch individual users. But we were still hoping to find some efficiencies in data collection, and hoping for some ways to quantify things. To keep things simple, I just used Microsoft Forms to set up data entry forms that worked roughly in the order that visitors typically moved, and only showed what was needed – if the visitor didn’t use one of the digital interactives, the form skipped those data fields. https://forms.office.com/Pages/ResponsePage.aspx?id=5U3G_qrrqEWN1P8uiGfiqf_Qgq-jS_9Gjbi0fOfPEdFUODZMVTgxR0xERzNTMUhLRjg1MUtSTkxJOC4u
  22. Tracking engagement from a UX standpoint for the Mosaic Theater was challenging. With this space, the interactive part wasn’t really key to being engaged. Visitors could consume the interpretive content without ever touching it -- other people could choose which presentations to watch. And people sometimes came and went multiple times, or watched and listened to some of the presentations and then got absorbed in their phones. So this was a bit trickier for defining measurable indicators of engagement.
  23. But we gave it a shot. We tracked 100 visitors with the video footage for the mosaic theater. We gave up on demographics – it was just too hard to make those guesses in the darkened space. It WAS generally possible to tell which video presentations were being watched, so we were able to get information not just about the time they spent in there but how many presentations they sat through and which ones they left during or after.
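For the time-spent breakdown above, each visitor's dwell time from the footage has to be binned into the categories shown on the "Visitor Time Spent" slide. A minimal sketch, assuming times in seconds and half-open intervals at the boundaries (the boundary handling is an assumption):

```javascript
// Map a dwell time in seconds to the chart's time-spent buckets.
function dwellBucket(seconds) {
  const mins = seconds / 60;
  if (mins < 1) return 'less than 1 minute';
  if (mins < 2) return '1-2 minutes';
  if (mins < 4) return '2-4 minutes';
  if (mins < 8) return '4-8 minutes';
  return 'more than 8 minutes';
}
```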
  24. We did also track visitor usage of the control surface, and it did at least seem clear that people knew HOW to use it. One of the questions that I hoped to answer was about a back button, to interrupt whatever was playing and go back to the selection screen. We removed that early on in the exhibition over concerns about it being annoying to other people watching the presentations. So we wanted to know if it was problematic NOT to have that control. Given how passive the experience was meant to be, I was actually a bit surprised at the proportion of people who did use the controls. And while there were definitely a few people frustrated by the lack of a back button, it wasn’t really a big problem.
  25. Where this video and form data collection approach really paid off was in the workshop area, where there were three touch-screen interactives -- in close proximity, adequate light, and with clear camera views of the screens and visitors. We used forms optimized for this space and questions about these interactives, so quite often, we were able to track 2 or 3 people at a time.
  26. Along with demographic guesstimates, we could track the time spent interacting with each application and how much of the available content the visitor saw, and make some notes about behaviors and engagement with the context – like talking, and pointing at portions of a mosaic. Visitors spent the most time with the digital interactive that had the least interpretive content – the “Last Supper” interactive only had 3 highlighted interpretive locations, compared with 7 for the others. So on that simpler interactive, most of the users saw all the key points.
  27. We could even make some reasonably clear inferences about why they STOPPED interacting with the application. After a little trial and error, we came up with these reasons for disengagement – things we could usually see from the video observations. There’s obviously some subjectivity to this, so we made it possible to select multiple reasons on the form if there was some grey area.
  28. So from that data, that Last Supper digital interactive pretty clearly comes out ahead on visitors who seemed satisfied with their experience (in blue), which explains the longer times spent using it. It probably helped that it was on a larger display, but I think its simplicity and clarity played a big role.
  29. So now those exhibitions are all over, and all those digital interactives are packed away in Git repositories. We got a bunch of data on whether and how visitors engaged and disengaged with these particular digital experiences, and there wasn’t much iteration or testable change, because we were on to the next stack of projects and exhibitions. It was all meant to be cheap and easy experiments anyway, while we look into fancier things like automating video observations with computer vision. But for the immediate future, we’re hoping we can employ similar techniques again. We’d like to test how well the JavaScript idle timer sessions correlate with real visitor sessions from the video footage. We hope that some of this analysis can serve as a benchmark for comparing future digital interactives, and maybe we can apply these approaches to projects for permanent exhibitions, where we can actually iterate and re-test – in the dreamy future when we have lots of spare time on our hands.