
The human side of video streaming services

In this talk I explain why video quality control is a machine learning problem.
Human perception is a highly non-linear process, influenced by many more factors than we can measure on a video streaming service. In fact, despite the tremendous advances in video coding, we still don't understand the intricate relationship between a stream and its delivery systems. We have few clues as to how different network conditions actually affect the quality of a video service as perceived by the end user. Today, the predominant consumer of network capacity (video) transits through a 'video-repellent' network (the Internet), one that has no notion of data delivery deadlines. So what are we getting from modern video services? Can we manage the quality of user experience, instead of trying to monitor or control the quality of network services? In this talk I give a critical perspective on video quality, its measurement and optimization, ending with a controversial proposition.

Published in: Technology


  1. The human side of video streaming services. Prof. Antonio Liotta, Eindhoven University of Technology. http://bit.ly/autonomic_networks http://nl.linkedin.com/in/liotta https://twitter.com/#!/a_liotta www.slideshare.net/ucaclio http://bit.ly/press_articles
  2. Three questions about QoE • What's QoE? • How can we measure QoE? • Can we manage QoE?
  3. The video delivery chain • An open loop system • Over a best-effort network • Operated via over-provisioning. Can we monitor the perceived quality?
  4. QoE is about perceived quality & satisfaction. "A person's individual perceptions and responses that result from the use (or anticipated use) of a system." "An overall acceptability of a service, as perceived subjectively by the end user." "A measure of the overall end-to-end system performance." QoE is even more than this!
  5. Measuring QoE may seem straightforward. At which point does a video become unsatisfactory?
  6. Which factors affect QoE? Very few are actually measurable! http://erasmus-ip-multimedia2012.ing.unimo.it/index.php/lectures-videos (Select: DAY 5)
  7. We can measure some technical factors but cannot accurately correlate them with QoE perception
  8. We can measure some technical factors but cannot accurately correlate them with QoE perception
  9. Particularly difficult to correlate network conditions with video QoE perception. Technical factors: best-effort vs. QoS; latency, jitter, packet loss, queue size, … If it weren't for the 'edge tricks', packet networks wouldn't be able to stream audio/video. Example: MPEG-4 video, 1% packet loss, no edge buffering. A. Liotta, L. Druda, G. Exarchakos, V. Menkovski, Quality of Experience management for video streams: the case of Skype. In proc. of the 10th International Conference on Advances in Mobile Computing and Multimedia, Bali, Indonesia, 3-5 December 2012 (ACM).
  10. The non-technical factors are even harder to measure and correlate with QoE perception. The human visual system is non-linear.
  11. Expectations affect QoE perception. Non-technical factors: motivations, purpose, personal interest, previous experience, boredom, expectations.
  12. Same encoding, but the pedestrian video is perceived worse.
  13. How can we measure QoE?
  14. Two main options: Subjective QoE and Objective QoE. http://erasmus-ip-multimedia2012.ing.unimo.it/index.php/lectures-videos (Select: DAY 5)
  15. Subjective QoE. "The most reliable way of assessing the quality of a video as perceived by a particular human observer is to ask his [or her] opinion" (*). But can we really measure human perception through questions? • It's hard to formulate the right questions • Questionnaires are intrusive • How many subjects give sufficient confidence? (*) A. C. Bovik, The Essential Guide to Video Processing, Academic Press, 2009.
  16. An engineering approach to subjective QoE: single stimulus and double stimulus. ITU-T Rec. P.910 (*) provides guidelines for: • Standard viewing conditions • Criteria for the selection of observers • Test material preparation • Assessment procedures • Data analysis methods. (*) ITU-T Rec. P.910, Subjective video quality assessment methods for multimedia applications, 2008.
  17. Single stimulus • The subject watches an impaired video and rates its quality without making any comparison with the original unimpaired sequence • The grading scale is defined as Mean Opinion Score (MOS).
  18. Single stimulus: huge variability and bias. (Chart: STDEV [% scale] vs. MOS [% scale] for VQEG HD5, peaking at 17.94%.) Typically, the STDEV of MOS is 15-20% in the midrange and decreases at the edges. "Report on the Validation of Video Quality Models for High Definition Video Content" by the Video Quality Experts Group, Jun. 2010.
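The MOS figures above are panel averages. As a minimal sketch (the 1-5 Absolute Category Rating scale is standard practice; the ratings below are made up, not data from the talk), this is how a MOS and its spread are computed and mapped onto the 0-100% scale used in the VQEG plot:

```python
from statistics import mean, stdev

def mos(ratings):
    """Mean Opinion Score: the average of subjective ratings on a 1-5 scale."""
    return mean(ratings)

# Hypothetical ratings from 10 observers for one impaired clip.
ratings = [3, 4, 2, 4, 3, 5, 3, 2, 4, 3]

score = mos(ratings)       # 3.3 on the 1-5 scale
spread = stdev(ratings)    # sample standard deviation across the panel

# Express both on a 0-100% scale, as in the VQEG report.
mos_pct = (score - 1) / 4 * 100     # 57.5%
stdev_pct = spread / 4 * 100        # roughly 24% of the scale
```

With a spread this large relative to the scale, many observers are needed before the mean stabilises, which is part of the scalability problem discussed later.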
  19. Double stimulus: we are better at spotting differences.
  20. Double stimulus: better than single stimulus, but not good enough.
  21. A broad range of objective video quality metrics. Full-reference (FR) metrics compare the distorted video directly with its original sequence; the most reliable, but only applicable to off-line assessment. No-reference (NR) metrics assess the distorted video without any reference video; they measure image distortions (e.g. blockiness, blur, noise) and are used to assess the impact of transport errors, but are far less reliable than FR. Reduced-reference (RR) metrics evaluate the distorted video based on a series of features extracted from the reference video; used for QoE prediction and management, but less reliable than FR.
  22. PSNR: the most loved-hated metric. These two images have the same PSNR.
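A quick way to see why PSNR is loved-hated: it reduces quality to a pixel-wise mean squared error, with no notion of where, or how visibly, the error occurs. A minimal pure-Python sketch (the flat toy "images" here are invented, not the pictures on the slide):

```python
import math

def psnr(ref, dist, peak=255):
    """Peak Signal-to-Noise Ratio in dB between two equal-length pixel lists."""
    mse = sum((r - d) ** 2 for r, d in zip(ref, dist)) / len(ref)
    if mse == 0:
        return math.inf          # identical images
    return 10 * math.log10(peak ** 2 / mse)

# Two toy "images": a flat patch, and the same patch uniformly darkened by 16.
ref  = [128] * 64
dist = [112] * 64
db = psnr(ref, dist)             # about 24.05 dB
```

Any distortion with the same MSE, whether a barely visible uniform shift or a glaring localised artefact, yields exactly this PSNR; that is the metric's blind spot.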
  23. The problems with existing QoE assessment (both subjective and objective) • Not sufficiently accurate • Meant for off-line study • Not meant to correlate with QoS. Hard to close the loop, needed to manage video services!
  24. QoE management is a machine learning problem.
  25. Maximum Likelihood Difference Scaling (MLDS) maps responses to a psychometric function (the human perception curve). Deviation of responses is between 1 and 10%, depending on video type. V. Menkovski, G. Exarchakos, A. Liotta, The Value of Relative Quality in Video Delivery, Journal of Mobile Multimedia, Vol. 7(3), pp. 151-162 (Sept. 2011). http://bit.ly/JMM-2011
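A hedged sketch of the MLDS decision model (simplified; the sigma value and the 5-level perceptual scale below are invented for illustration): the observer compares two pairs of stimuli, and the model gives the probability of judging the first pair as more different, given candidate perceptual-scale values. Fitting MLDS then means choosing the scale values that maximise the likelihood of the observed responses.

```python
import math

def gauss_cdf(x, sigma=1.0):
    """Standard Gaussian CDF, the psychometric link commonly used in MLDS."""
    return 0.5 * (1 + math.erf(x / (sigma * math.sqrt(2))))

def p_first_pair(psi, a, b, c, d, sigma=0.2):
    """Probability the observer judges pair (a, b) as more different than
    pair (c, d), given candidate perceptual-scale values psi per stimulus."""
    delta = abs(psi[b] - psi[a]) - abs(psi[d] - psi[c])
    return gauss_cdf(delta, sigma)

# Hypothetical non-linear perceptual scale over 5 bitrate levels:
# big perceptual jumps at the low end, diminishing returns at the top.
psi = [0.0, 0.45, 0.7, 0.85, 0.95]

p_equal = p_first_pair(psi, 0, 1, 0, 1)   # identical pairs -> exactly 0.5
p_clear = p_first_pair(psi, 0, 1, 3, 4)   # 0.45 vs 0.10 difference -> near 1
```

The non-linearity of psi is exactly what produces the "zones" on the next slides: equal QoS deltas map to very unequal QoE deltas.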
  26. MLDS works because we are much better at scoring difference of differences. Which one of these two pairs has the bigger difference?
  27. We can score 'difference of differences' even with video (not just still pictures).
  28. MLDS provides a utility function to perform QoE management. ZONE 1: QoS deltas don't produce QoE deltas (364 Kbps vs. 512 Kbps).
  29. MLDS provides a utility function to perform QoE management. ZONE 2: strong non-linearity (64 Kbps vs. 256 Kbps).
  30. MLDS is more accurate than conventional QoE rating, but still unscalable • Must consider all combinations of samples • A full round of tests covering 10 levels of stimuli requires C(10, 4) = 210 quadruple tests • The test matrix explodes as we consider more parameters. Can we speed up the prediction-model learning process?
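The 210 figure comes from counting unordered quadruples of stimuli, since each MLDS trial compares two pairs drawn from the stimulus levels:

```python
from math import comb

levels = 10
quadruple_tests = comb(levels, 4)   # unordered quadruples of 10 stimuli: 210

# The combinatorial explosion the slide warns about: merely doubling the
# number of levels already takes a full round to 4845 trials.
doubled = comb(20, 4)
```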
  31. Active learning helps eliminate the redundant tests • After the first few tests we can start estimating the answers of the remaining tests • The estimation of the unanswered tests uses the characteristics of the psychometric curve to reduce the problem domain. (Test videos: River bed, Tractor, Blue sky.)
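One way to read the active-learning idea in code (a hypothetical sketch, not the paper's algorithm; the scale, sigma, margin and quadruples are all invented): use the current perceptual-scale estimate to predict the outcome of each remaining quadruple, answer the near-certain ones from the model, and only run the genuinely informative tests on human subjects.

```python
import math

def gauss_cdf(x, sigma=0.2):
    """Gaussian psychometric link with an assumed fixed sigma."""
    return 0.5 * (1 + math.erf(x / (sigma * math.sqrt(2))))

def predicted_p(psi, quad):
    """Model's predicted probability of choosing the first pair of the quadruple."""
    a, b, c, d = quad
    return gauss_cdf(abs(psi[b] - psi[a]) - abs(psi[d] - psi[c]))

def split_tests(psi, quads, margin=0.05):
    """Run only quadruples whose outcome the current scale estimate cannot
    predict confidently; the near-certain ones are answered by the model."""
    run, skip = [], []
    for q in quads:
        p = predicted_p(psi, q)
        (skip if p < margin or p > 1 - margin else run).append(q)
    return run, skip

psi = [0.0, 0.45, 0.7, 0.85, 0.95]                 # current scale estimate
quads = [(0, 1, 2, 3), (0, 1, 3, 4), (1, 2, 2, 3), (0, 4, 1, 2)]
run, skip = split_tests(psi, quads)                # 2 to run, 2 answered by model
```

As the estimate sharpens, more quadruples fall outside the uncertainty margin, so the number of human trials shrinks round after round.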
  32. Learning convergence varies for different videos, but always leads to improved scalability. V. Menkovski, A. Liotta, Adaptive Psychometric Scaling for Video Quality Assessment, Journal of Signal Processing: Image Communication (Elsevier, 2012). http://bit.ly/JSP-2012
  33. Closing the QoE control loop. (Diagram: QoS probes and actuators feed MLDS-based QoE/QoS prediction models, which drive QoE optimization.)
  34. But we'll also have to face NEW conditions! Example: 'sport over mobile phone'. (Diagram: the same control loop, with QoS probes, actuators, QoE/QoS prediction models and QoE optimization.)
  35. Reinforcement learning to realize 'trial & error' network loops. Example: 'sport over mobile phone'. (Diagram: QoS probes and actuators; machine learning replaces the fixed prediction models, with QoE measured or inferred.)
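The trial-and-error loop can be sketched as a tiny epsilon-greedy bandit (a hypothetical illustration, not the system from the paper; the bitrates and QoE reward values are invented): the network tries candidate bitrates, observes a noisy QoE reward, and gradually settles on the best action under the new conditions.

```python
import random

random.seed(7)

# Hypothetical mean QoE reward (0-1) the network would observe for each
# candidate streaming bitrate under the new, unseen conditions.
qoe_reward = {256: 0.40, 512: 0.75, 1024: 0.65}   # kbps -> mean QoE

q = {rate: 0.0 for rate in qoe_reward}            # learned value estimates
counts = {rate: 0 for rate in qoe_reward}
epsilon = 0.1                                     # exploration rate

for step in range(500):
    if random.random() < epsilon:                 # explore: try a random bitrate
        rate = random.choice(list(q))
    else:                                         # exploit: best known bitrate
        rate = max(q, key=q.get)
    reward = qoe_reward[rate] + random.uniform(-0.05, 0.05)  # noisy QoE probe
    counts[rate] += 1
    q[rate] += (reward - q[rate]) / counts[rate]  # incremental mean update

best = max(q, key=q.get)                          # the 512 kbps arm wins here
```

Constraining the search with the psychometric curve, as the next slide notes, shrinks this trial-and-error phase dramatically compared with a blind bandit like the one above.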
  36. Networks quickly learn how to deal with new conditions (the problem domain is constrained to the psychometric function). (Chart: accuracy, 50-100%, vs. the number of new 'trial & error' samples, comparing old and new conditions.) V. Menkovski, G. Exarchakos, A. Liotta, Online Learning for Quality of Experience Management. The annual machine learning conference of Belgium and The Netherlands, Leuven, Belgium, 2010. http://bit.ly/BENELEARN-2010
  37. Take-home messages • Existing QoE methods are annoying, expensive, inaccurate and ineffective, and can't be used to control video services • What is the 'right' question? We are good at spotting difference of differences; off-line machine learning can build e2e models of video services • Service management is a 'learning' problem: human perception is a moving target; ML works with incomplete information, extrapolates non-obvious patterns and handles the unknown via trial & error.
  38. Thank you! Check out my other webinars at www.slideshare.net/ucaclio. Want to author or edit a book? New Springer Series: Internet of Things – Technology, Communications and Computing. Get in touch! http://bit.ly/pervasive-networks liotta.antonio@gmail.com