A Test-bed For Quality of Multimedia Experience Evaluation of Sensory Effects
1. QoMEX’09: A Test-bed For Quality of Multimedia Experience Evaluation of Sensory Effects
Christian Timmerer and Markus Waltl
Klagenfurt University (UNIKLU), Faculty of Technical Sciences (TEWI), Department of Information Technology (ITEC), Multimedia Communication (MMC)
http://research.timmerer.com http://blog.timmerer.com christian.timmerer@itec.uni-klu.ac.at
Co-authors: Markus Waltl, Christian Timmerer, and Hermann Hellwagner
Acknowledgment: This work is supported in part by the European Commission in the context of the InterMedia project. Further information is available at http://intermedia.miralab.unige.ch/.
Slides available at http://www.slideshare.net/christian.timmerer
2. Outline
- Introduction / Motivation
- Sensory Effect Description Language
- Test-Bed: Annotation Tool and Simulator
- Test Environment and Preliminary Results
- Conclusions
- Demo & Video
3. Introduction
- Universal Multimedia Access (UMA): anywhere, anytime, on any device + technically feasible; main focus on devices and network connectivity issues
- Universal Multimedia Experience (UME): take the user into account
- Multimedia adaptation and quality models/metrics: single modality (i.e., audio, image, or video only) or a simple combination of two modalities (i.e., audio and video)
- Triple user characterization model: sensorial (e.g., sharpness, brightness), perceptual (e.g., what/where is the content), emotional (e.g., feeling, sensation)
- Ambient Intelligence: additional light effects are highly appreciated for both audio and visual content
- Calls for a scientific framework to capture, measure, quantify, judge, and explain the user experience
References: B. de Ruyter, E. Aarts, “Ambient intelligence: visualizing the future”, Proceedings of the Working Conference on Advanced Visual Interfaces, New York, NY, USA, 2004, pp. 203–208. E. Aarts, B. de Ruyter, “New research perspectives on Ambient Intelligence”, Journal of Ambient Intelligence and Smart Environments, IOS Press, vol. 1, no. 1, 2009, pp. 5–14. F. Pereira, “A triple user characterization model for video adaptation and quality of experience evaluation”, Proc. of the 7th Workshop on Multimedia Signal Processing, Shanghai, China, October 2005, pp. 1–4.
4. Motivation
- Consumption of multimedia content may also stimulate senses other than vision or audition: olfaction, mechanoreception, equilibrioception, thermoception, …
- Annotation with metadata providing so-called sensory effects that steer appropriate devices capable of rendering these effects
- … giving her/him the sensation of being part of the particular media ➪ worthwhile, informative user experience
5. Sensory Effect Description Language (SEDL)
- XML Schema-based language for describing sensory effects: basic building blocks to describe, e.g., light, wind, fog, vibration, scent
- MPEG-V Part 3, Sensory Information
- Adopted MPEG-21 DIA tools for adding time information (synchronization)
- Actual effects are not part of SEDL but defined within the Sensory Effect Vocabulary (SEV)
  - Extensibility: additional effects can be added easily without affecting SEDL
  - Flexibility: each application domain may define its own sensory effects
- A description conforming to SEDL is called Sensory Effect Metadata (SEM)
  - May be associated with any kind of multimedia content (e.g., movies, music, Web sites, games)
  - Steers sensory devices like fans, vibration chairs, lamps, etc. via an appropriate mediation device
➪ Increase the experience of the user ➪ worthwhile, informative user experience
6. Sensory Effect Description Language (cont’d)
SEM              ::= [DescriptionMetadata] (Declarations | GroupOfEffects | Effect | ReferenceEffect)+
Declarations     ::= (GroupOfEffects | Effect | Parameter)+
GroupOfEffects   ::= timestamp EffectDefinition EffectDefinition (EffectDefinition)*
Effect           ::= timestamp EffectDefinition
EffectDefinition ::= [activate] [duration] [fade] [alt] [priority] [intensity] [position] [adaptability]
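As an illustration of the structure above, the following minimal Python sketch builds a SEM-like description with one group of effects and one standalone effect. The element and attribute names are illustrative only and do not reproduce the normative, namespaced MPEG-V Part 3 schema or its MPEG-21 DIA timing types.

    import xml.etree.ElementTree as ET

    # Root of the Sensory Effect Metadata description (illustrative element names).
    sem = ET.Element("SEM")

    # A GroupOfEffects carries a timestamp and at least two effect definitions.
    group = ET.SubElement(sem, "GroupOfEffects", {"timestamp": "0"})
    ET.SubElement(group, "Effect", {"type": "Light", "intensity": "0.8",
                                    "position": "front", "duration": "200"})
    ET.SubElement(group, "Effect", {"type": "Wind", "intensity": "0.4",
                                    "duration": "200"})

    # A single Effect with its own timestamp, faded out over 50 time units.
    ET.SubElement(sem, "Effect", {"timestamp": "300", "type": "Vibration",
                                  "intensity": "0.6", "fade": "50"})

    print(ET.tostring(sem, encoding="unicode"))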
7. Test-Bed: Annotation Tool and Simulator
- Annotation Tool: SEVino
- Simulator: SESim
8. Test Environment
- Based on the amBX (Ambient Experience) system + SDK: two fan devices, a wrist rumbler, two sound speakers, a subwoofer, two lights, and a wall washer
- Everything is controlled by SEM descriptions except the light effect ➪ automatic color calculation is deployed
  - Advantages: reduction of description size; speeds up the authoring stage
- Different automatic color calculation methods may lead to different user experiences (see the sketches below):
  (1) Average color in the RGB color space
  (2-4) Dominant color in the RGB, HSV, and HMMD color spaces
- Computational resources: (2-4) > (1) due to the management of color bins; moreover, amBX supports only RGB
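A minimal sketch of method (1), the average color in the RGB color space, is shown below. It assumes frames arrive as H x W x 3 uint8 numpy arrays (e.g., decoded with OpenCV); the function name is illustrative and not taken from SESim.

    import numpy as np

    def average_rgb(frame: np.ndarray) -> tuple:
        """Return the mean (R, G, B) of one frame given as an H x W x 3 uint8 array."""
        r, g, b = frame.reshape(-1, 3).mean(axis=0)
        return int(r), int(g), int(b)

    if __name__ == "__main__":
        # Synthetic 2x2 test frame: two red and two blue pixels -> purplish mean.
        frame = np.array([[[255, 0, 0], [255, 0, 0]],
                          [[0, 0, 255], [0, 0, 255]]], dtype=np.uint8)
        print(average_rgb(frame))  # (127, 0, 127)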
9. Preliminary Results
- Video 1: more or less constant color pattern
- Video 2: a lot of different colors which change very rapidly
- Color calculation is performed only on every pth frame (p=5) for efficiency reasons (see the sketch below)
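The every-pth-frame optimization could look like the following sketch: the expensive color calculation runs only on every pth frame and the last result is reused in between. Function names are illustrative; set_ambient_light() is a hypothetical device call, not part of the amBX SDK.

    def light_colors(frames, color_fn, p=5):
        """Yield one RGB tuple per frame, recomputing the color only every pth frame."""
        last = (0, 0, 0)                  # color used before the first sampled frame
        for i, frame in enumerate(frames):
            if i % p == 0:                # sample frame: recompute the color
                last = color_fn(frame)
            yield last                    # intermediate frames reuse the last value

    # Usage with the average_rgb() sketch from above:
    # for rgb in light_colors(decoded_frames, average_rgb, p=5):
    #     set_ambient_light(rgb)          # hypothetical device call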
10. Conclusions
- Test-bed for the QoMEX evaluation of sensory effects
  - SEVino: a video annotation tool for sensory effects
  - SESim: a corresponding simulation tool
  - A real-world test environment based on the amBX system and SDK
- Major findings
  - Average color for the automatic color calculation ➪ immediate reaction to color changes, appealing effects, low computational requirements, real-time applicable
  - RGB, HSV, and HMMD dominant color ➪ smoother reaction to color changes, higher computational requirements
- Future work
  - Optimization of the automatic color calculation (real-time)
  - Subjective tests (already started, stay tuned)
  - (Semi-)automatic extraction of sensory effect information
11. References
- M. Waltl, C. Timmerer, and H. Hellwagner, “A Test-Bed for Quality of Multimedia Experience Evaluation of Sensory Effects”, Proceedings of the First International Workshop on Quality of Multimedia Experience (QoMEX 2009), San Diego, USA, July 29-31, 2009.
- C. Timmerer, J. Gelissen, M. Waltl, and H. Hellwagner, “Interfacing with Virtual Worlds”, accepted for publication in the Proceedings of the 2009 NEM Summit, Saint-Malo, France, September 28-30, 2009.
- C. Timmerer, “MPEG-V: Media Context and Control”, 89th ISO/IEC JTC 1/SC 29/WG 11 (MPEG) Meeting, London, UK, June 2009. https://www-itec.uni-klu.ac.at/mmc/blog/2009/07/08/mpeg-v-media-context-and-control/
- MPEG-V: http://www.chiariglione.org/mpeg/working_documents.htm#MPEG-V
- MPEG-V reflector: http://lists.uni-klu.ac.at/mailman/listinfo/metaverse
Note: only every nth frame is used for the automatic color calculation for performance reasons. HSV and HMMD are used since these color spaces are closer to the human perception of color than RGB.
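A possible reading of the dominant-color approach in HSV is sketched below: each pixel's hue is quantized into a small number of bins, the most populated bin wins, and its mean color is converted back to RGB because amBX only accepts RGB values. The bin count and function names are illustrative; this is not the actual SESim algorithm.

    import colorsys
    from collections import defaultdict

    def dominant_hsv_color(pixels, hue_bins=12):
        """pixels: iterable of (r, g, b) tuples with components in 0..255."""
        buckets = defaultdict(list)              # hue bin -> list of (h, s, v) values
        for r, g, b in pixels:
            h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
            buckets[int(h * hue_bins) % hue_bins].append((h, s, v))
        dominant = max(buckets.values(), key=len)        # most populated hue bin
        h = sum(p[0] for p in dominant) / len(dominant)  # average color within the bin
        s = sum(p[1] for p in dominant) / len(dominant)
        v = sum(p[2] for p in dominant) / len(dominant)
        r, g, b = colorsys.hsv_to_rgb(h, s, v)           # back to RGB for amBX
        return int(r * 255), int(g * 255), int(b * 255)

    print(dominant_hsv_color([(250, 10, 10), (240, 20, 5), (0, 0, 255)]))  # reddish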
Note: Video 1 (A Chinese Ghost Story 1 - Taoist Monk Fight Scene, http://www.youtube.com/watch?v=TzBkL_1kCUc) has a length of 63 s, 25 fps, 624x336 pixels, and a 1058 kbit/s bitrate, with a more or less constant color pattern. Video 2 (Alien Quadrilogy (2003) Trailer, http://www.youtube.com/watch?v=gIWLwen1Rf8) has a length of 62 s, 25 fps, 640x464 pixels, and a 702 kbit/s bitrate, with a lot of different colors which change very rapidly.