12. Presentation Outline
• Thought Experiment
• Complex Systems and EID
• Multimodal Interfaces
• Layouts in Multimodal Interfaces
• Temporal Synchrony
13. Controlling Complex Systems
• Large, socio-technical, real-time,
dynamic systems
• Complex systems require special
interfaces because (Vicente &
Rasmussen, 1992):
– Complex systems require complex
controllers
– Physical systems are governed by
constraints
– Good controllers must possess a
model of the system
14. Ecological Interface Design
• Helps with understanding underlying system
constraints
• Recognize and respond to abnormal events
• Map system constraints and relationships onto
perceptual objects
• Allow operators to use skill-based behaviour
15. Perceptual Relationships
• “Visual ways of displaying
information that can reduce the need
for memory or mental calculation”
(Burns & Hajdukiewicz, 2004)
• Visual comparisons of orientation,
size, shape, location
– Visual characteristics that are
perceptually easy to evaluate
– Also provides grouping and layout
information (configural displays)
Burns (2000)
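A configural display is one common way to realize these perceptual relationships. As a minimal sketch (the function name, variables, and tolerance are illustrative, not from the slides), two process variables can be mapped to a rectangle's width and height, so that the emergent feature "is it a square?" replaces a mental comparison:

```python
def rectangle_display(flow_in: float, flow_out: float, max_flow: float = 100.0):
    """Map two variables onto a rectangle's width and height.
    The emergent feature -- whether the shape is a square -- shows
    mass balance (flow_in == flow_out) without mental arithmetic."""
    width = flow_in / max_flow
    height = flow_out / max_flow
    is_square = abs(width - height) < 0.01  # tolerance is arbitrary
    return {"width": width, "height": height, "balanced": is_square}
```

The operator evaluates orientation and shape perceptually instead of subtracting two numbers in working memory.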
18. Layouts in Multimodal Interfaces
• Different models (Sarter, 2006):
– Redundancy
– Supplementary
– Sensory modalities as separate channels of information
• Lack of research about how to lay out information in different
modalities, especially for abstract data
• Hypothesis: Layouts in multimodal interfaces will
depend on cross-modal relationships
19. Showing Cross-modal Relationships
• Different degrees of cross-modal relationships:
– Completely automatic – multisensory integration
– Based on a judgement – cross-modal matching
• Perceptual relationships in EID
• Can we group multi-modal interface information into
perceptual objects ?
• Are there processing advantages for grouping display
information into perceptual events?
20. Three principles of multi-sensory integration
(Meredith & Stein, 1993)
Spatial Rule
– Integration is more likely when the individual sensory
stimuli come from roughly the same location
Temporal Rule
– Integration is more likely when the individual sensory
stimuli start at roughly the same time
Principle of Inverse Effectiveness
– Integration is more likely when the individual sensory
stimuli are vague or weak.
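The three rules can be summarized in a toy scoring function: a hypothetical sketch (names, windows, and weighting are assumptions, not part of Meredith & Stein's model) in which stimuli closer in space and time, and weaker in intensity, score as more likely to integrate:

```python
from dataclasses import dataclass

@dataclass
class Stimulus:
    location: float   # position along one spatial axis (arbitrary units)
    onset_ms: float   # stimulus start time
    intensity: float  # 0.0 (weak) .. 1.0 (strong)

def integration_likelihood(a: Stimulus, b: Stimulus,
                           spatial_window: float = 10.0,
                           temporal_window_ms: float = 100.0) -> float:
    """Toy score combining the three rules: spatial rule, temporal rule,
    and inverse effectiveness (weaker stimuli integrate more readily)."""
    spatial = max(0.0, 1.0 - abs(a.location - b.location) / spatial_window)
    temporal = max(0.0, 1.0 - abs(a.onset_ms - b.onset_ms) / temporal_window_ms)
    inverse_eff = 1.0 - (a.intensity + b.intensity) / 2.0
    return spatial * temporal * (0.5 + 0.5 * inverse_eff)
```

Any monotone combination of the three factors would illustrate the same point; the product form simply makes each rule a necessary condition.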
21. Display Integration in Ecological Displays (Burns, 2000)
• Impact of spatial proximity (things
presented close together) and
temporal proximity (things presented
at nearly the same time) on
understanding ecological displays
• Recommendation of using higher
spatial proximity and temporal
proximity in adaptive problem solving
tasks
22. Possible Methods for Grouping Crossmodal Data
• Spatial location?
• Temporal occurrence?
23. Temporal Synchrony
• A type of crossmodal matching
– Are things happening at the
same time / for the same duration?
• Ability to detect temporal
synchrony developed at a very
young age (Lewkowicz, 2000)
• “Synchrony window” exists in
which things are perceived to be
synchronous
24. Why Temporal Synchrony?
• Not as constrained by physical location of displays.
– Higher mobility for operators
– Lower space footprint
• Can be more dynamic
• Different methods for invoking temporal synchrony:
– Onset
– Duration
– Rate
– Rhythm
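Using onset as the grouping cue, temporal synchrony can be sketched as clustering events whose start times fall within a "synchrony window." This is an illustrative sketch (the function, event format, and 200 ms window are assumptions for demonstration), not a model from the literature:

```python
def group_by_synchrony(events, window_ms=200.0):
    """Group (modality, onset_ms) events: onsets within `window_ms`
    of a group's first onset are treated as one crossmodal event."""
    groups = []
    for modality, onset in sorted(events, key=lambda e: e[1]):
        if groups and onset - groups[-1][0][1] <= window_ms:
            groups[-1].append((modality, onset))
        else:
            groups.append([(modality, onset)])
    return groups
```

For example, an earcon at 0 ms and a tactor pulse at 120 ms fall in one group, while events a second apart do not; duration, rate, or rhythm cues would need richer event descriptions than onset alone.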
25. Audition and Touch
• Most commonly used non-visual modalities for
information presentation
• Share many “amodal” characteristics
• Benefits:
– Ability to be perceived even when attention is
directed elsewhere
– More accurate representation of temporal
information (double flash illusion)
29. Proposed Methodology – Research Questions
• To what degree can we group or compare auditory (earcon/sonification)
and tactile (tactor/tactification) information using temporal synchrony
and how is this impacted by operator workload?
• Are there processing advantages for grouping display information using
these perceptual relationships?
[Diagram: … + Temporal Synchrony? = Perceptual Objects?]
30. Take Home Message
• Layouts in multi-modal interfaces may be
dependent on crossmodal perceptual relationships.
• EID can leverage these relationships in complex
systems
• Temporal synchrony is one method that may be
used to group information in multimodal interfaces