This slide deck explains what "adaptive streaming" is. It begins by describing how content (media) is prepared to fit adaptive needs, then introduces yapi.js, the web player used by KKTV, a VOD service in Taiwan.
11. EFFICIENT WAY: STREAMING (PROGRESSIVE DOWNLOAD)
• Server holds the media
• Player gets part of the video and starts playing
• Player keeps requesting the rest of the video during playback
• Inflexible: only a single prepared source is served
12. PREPARE MEDIA
• Fit the adaptive need
• .mp4 / .ts files
• More storage needed, but much more flexible
• different bitrates + fragments + manifest file
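The "different bitrates + fragments + manifest" idea can be illustrated with an HLS master playlist; this is a hand-written sketch (the bitrates, resolutions, and URIs are made-up examples, not KKTV's actual packaging):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
360p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2000000,RESOLUTION=1280x720
720p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
1080p/playlist.m3u8
```

Each variant playlist in turn lists the individual .ts fragments, so the player can switch bitrate at fragment boundaries.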
24. CLIENT-SIDE DECRYPTION
• The client gets a license instead of a key
• A blackbox/sandbox obtains the key by processing the license
• That blackbox/sandbox provides the key for decryption and outputs directly to the display
• This solution is called DRM (Digital Rights Management)
25. DRM ON BROWSERS
• In browsers, that blackbox is called the CDM (Content Decryption Module)
• Each browser supports a different DRM
context: "a blackbox or sandbox that gets the key by processing the license"
27. INTRODUCING EME (Encrypted Media Extensions)
• Even though each browser supports its own DRM, W3C defines the EME spec in order to expose the same API everywhere
• A prefixed API was implemented in earlier versions of Chrome (smart TVs)
28. YAPI’S PROTECTION LOGIC
• implement protection on the client side through EME
• give the DRM server what it needs to retrieve a license
• deal with different browsers (and versions)
29. EME: PROTECTION LOGIC FLOW
1. CDM: on attempting to play encrypted media, yapi gets a ‘challenge’ from the CDM (via EME)
2. yapi: requests the DRM license server with the challenge, and gets a license back
3. yapi: provides the license (via EME) for the CDM to decrypt the content
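The three-step flow above can be sketched with the standard EME API. This is a generic illustration, not yapi’s actual code: the Widevine key system string, the codec string, and `licenseServerUrl` are assumptions, and a real player also handles other key systems and error cases.

```javascript
// Sketch of the EME protection flow: challenge from CDM -> license
// server -> license back to CDM. Assumes Widevine; licenseServerUrl
// is a hypothetical endpoint.
async function setupProtection(video, licenseServerUrl) {
  // Ask the browser for a CDM that supports this key system
  const access = await navigator.requestMediaKeySystemAccess('com.widevine.alpha', [{
    initDataTypes: ['cenc'],
    videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
  }]);
  const mediaKeys = await access.createMediaKeys();
  await video.setMediaKeys(mediaKeys);

  // Fired when the element encounters encrypted media
  video.addEventListener('encrypted', async (event) => {
    const session = mediaKeys.createSession();

    // The CDM emits its 'challenge' as a message event
    session.addEventListener('message', async (msg) => {
      // Request the DRM license server with the challenge...
      const res = await fetch(licenseServerUrl, { method: 'POST', body: msg.message });
      // ...and hand the license back to the CDM for decryption
      await session.update(await res.arrayBuffer());
    });

    await session.generateRequest(event.initDataType, event.initData);
  });
}
```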
31. MEDIA ELEMENT
var videoNode = document.createElement('video');
videoNode.src = VIDEO_URL;
With the HTML5 media element, you can play a single video source easily
33. MSE
“MSE (Media Source Extensions) extends HTMLMediaElement to allow JavaScript to generate media streams for playback. Allowing JavaScript to generate streams facilitates a variety of use cases like adaptive streaming and time shifting live streams.”
Media Source Extensions
34. MEDIASOURCE IS A ‘SOURCE’
var video = document.createElement('video');
var ms = new window.MediaSource();
video.src = window.URL.createObjectURL(ms);
Set the ‘src’ attribute of the video element to a URL that points to the MediaSource
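One detail worth making explicit: `addSourceBuffer` may only be called once the MediaSource has opened. A minimal sketch of the wiring (the codec string is an illustrative placeholder):

```javascript
// Attach a MediaSource as the video's 'src' and wait for it to open
// before adding source buffers.
function attachMediaSource(video) {
  const ms = new window.MediaSource();
  // The object URL is the "URL pointed to the media source"
  video.src = window.URL.createObjectURL(ms);

  // SourceBuffers can only be added after 'sourceopen' fires
  ms.addEventListener('sourceopen', () => {
    const sourceBuffer = ms.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
    // ...append fetched fragments to sourceBuffer here
  });
  return ms;
}
```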
35. SOURCE BUFFER
sourceBufferVideo = ms.addSourceBuffer(VIDEO_CODEC);
sourceBufferAudio = ms.addSourceBuffer(AUDIO_CODEC);
// get stream buffer via network
sourceBufferVideo.appendBuffer(buffer);
// the source buffer provides buffer info after the append completes
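The slide glosses over one point: `appendBuffer` is asynchronous, and a SourceBuffer throws if a second append starts while it is still updating. A common pattern is to wait for `updateend` between appends; a sketch (the segment URL is a made-up placeholder, not a yapi API):

```javascript
// Fetch one media fragment over the network and append it, resolving
// only when the SourceBuffer finishes updating; appendBuffer throws
// if called while a previous append is still in flight.
async function fetchAndAppend(sourceBuffer, segmentUrl) {
  const res = await fetch(segmentUrl);
  const buffer = await res.arrayBuffer();
  await new Promise((resolve) => {
    sourceBuffer.addEventListener('updateend', resolve, { once: true });
    sourceBuffer.appendBuffer(buffer);
  });
}
```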
36. BUFFER INFO
var buffered = sourceBuffer.buffered;
buffered.length; // how many discontinuous buffered time ranges
buffered.start(0); // start time of the first range
buffered.end(0); // end time of the first range
Get buffer information from the ‘buffered’ attribute
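Since `buffered` is a TimeRanges object with possibly discontinuous ranges, a small helper can sum them into total buffered seconds. This helper is my own illustration, not part of yapi:

```javascript
// Sum all (possibly discontinuous) buffered ranges into total seconds.
// Works on any TimeRanges-like object exposing length/start(i)/end(i).
function totalBufferedSeconds(buffered) {
  let total = 0;
  for (let i = 0; i < buffered.length; i++) {
    total += buffered.end(i) - buffered.start(i);
  }
  return total;
}
```

For example, with two ranges [0, 10] and [20, 25] it returns 15.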
37. MSE EXTENDS THE MEDIA ELEMENT
• MSE focuses on providing stream buffers to the media element
• playback behavior is still handled by the media element, e.g. play, pause, seek
41. BETWEEN MODULES
Option 2: a parent module
Problems:
• defining the ‘parent’
• what about multiple parent modules?
(diagram: ‘streaming’ and ‘adaptive’ modules, each a candidate ‘parent’?)
42. BETWEEN MODULES
Option 3: an event system
• modules are parallel
• semantic event names indicate the current ‘state’ of the player
(diagram: streaming, adaptive, protection/EME, and mediaElement/MSE modules connected through an eventBus)
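The eventBus idea can be sketched as a minimal publish/subscribe object; this is a generic illustration, not yapi’s actual implementation:

```javascript
// Minimal event bus: parallel modules publish and subscribe by
// semantic event name instead of holding references to each other.
function createEventBus() {
  const listeners = {};
  return {
    on(name, fn) {
      (listeners[name] = listeners[name] || []).push(fn);
    },
    emit(name, payload) {
      (listeners[name] || []).forEach((fn) => fn(payload));
    },
  };
}

// e.g. the adaptive module announces a state change:
//   bus.emit('bitratechanged', { bitrate: 2000000 });
// while the UI module just listens:
//   bus.on('bitratechanged', updateQualityBadge);
```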
56. USEFUL MEDIA EVENTS
loadstart: loading media begins; fires when the src attribute is set
loadedmetadata: the element has basic info about the playback, e.g. duration
timeupdate: the current playback time is ticking
seeking/seeked: a seek is in progress / has completed
ended: playback ends
play/playing: playback resumes from another state to playing
57. ADDITIONAL EVENTS
loadedmanifest: after the manifest is loaded/parsed
bitratechanged: when the bitrate changes
enableabr: when adaptive (ABR) activation changes
buffering: when playback is pending on buffer
cuechanged: a WebVTT subtitle cue changed (in and out)
58. yapi.load(MANIFEST_URL); // exposed API
yapi.addEventListener('playing', onPlaying);
function onPlaying() {
  // app logic
  // or UI reaction, e.g. showing playing-status UI
}
// onPlaying is invoked when playing
(diagram: app / yapi / ui)