WebRTC

Introduction to WebRTC

Dan Burnett
Chief Scientist, Tropo
Director of Standards, Voxeo

Alan Johnston
Distinguished Engineer
Avaya
  
WebRTC Tutorial Topics

• What is WebRTC?
• How to Use WebRTC
• WebRTC Peer-to-Peer Media
• WebRTC Protocols and IETF Standards
• WebRTC W3C API Overview
• Pseudo Code Walkthrough
• Practical bits

AdhearsionConf 2013
  
What is WebRTC?

WebRTC is “Voice & Video in the browser”

• Access to camera and microphone without a plugin
  – No proprietary plugin required!
• Audio/video direct from browser to browser
• Why does it matter?
  – Media can stay local
  – Mobile devices eventually dropping voice channel anyway
  – Games
  
The Browser RTC Function

[Diagram: a Web Browser running JavaScript/HTML/CSS talks to a Web Server / Signaling Server over HTTP or WebSockets (signaling); inside the browser, the RTC Function sits behind the RTC APIs alongside Other APIs, on top of Native OS Services, and speaks on-the-wire protocols (media or data) to its peer.]

• WebRTC adds new Real-Time Communication (RTC) Function built in to browsers
  – No download
  – No Flash or other plugins
• Contains
  – Audio and video codecs
  – Ability to negotiate peer-to-peer connections
  – Echo cancellation, packet loss concealment
• In Chrome & Firefox today, Internet Explorer sometime and Safari eventually
  
Benefits of WebRTC

For Developer
• Streamlined development – one platform
• Simple APIs – detailed knowledge of RTC protocols not needed
• NAT traversal only uses expensive relays when no other choice
• Advanced voice and video codecs without licensing

For User
• No download or install – easy to use
• All communication encrypted – private
• Reliable session establishment
  – “just works”
• Excellent voice and video quality
• Many more choices for real-time communication
  
WebRTC Support of Multiple Media

[Diagram: Browser M on Mobile sends microphone audio, application sharing video, and front/rear camera video; Browser L on Laptop sends webcam video and stereo audio.]

• Multiple sources of audio and video are assumed and supported
• All media, voice and video, and feedback messages are multiplexed over the same transport address
  
WebRTC Triangle

[Diagram: Browser L and Browser M each run the HTML5 application from the Web Server (Application); a Peer Connection (audio, video, and/or data) runs directly between the two browsers.]

• Both browsers running the same web application from web server
• Peer Connection established between them with the help of the web server
  
WebRTC Trapezoid

[Diagram: Browser M runs the HTML5 application from Web Server A (Application A); Browser T runs the HTML5 application from Web Server B (Application B); the servers signal to each other using SIP or Jingle, and a Peer Connection (audio and/or video) runs between the browsers.]

• Similar to SIP Trapezoid
• Web Servers communicate using SIP or Jingle or proprietary
• Could become important in the future.
  
WebRTC and SIP

[Diagram: Browser M, running the application from the Web Server, signals via SIP through a SIP Server to a SIP Client; a Peer Connection (audio and/or video) runs between Browser M and the SIP Client.]

• SIP (Session Initiation Protocol) is a signaling protocol used by service providers and enterprises for real-time communication
• Peer Connection appears as a standard RTP session, described by SDP
• SIP Endpoint must support RTCWEB media extensions
  
WebRTC and Jingle

[Diagram: Browser M, running the application from the Web Server, signals via Jingle through an XMPP Server to a Jingle Client; a Peer Connection (audio and/or video) runs between them.]

• Jingle is a signaling extension to XMPP (Extensible Messaging and Presence Protocol, aka Jabber)
• Peer Connection SDP can be mapped to Jingle
• Jingle Endpoint must support RTCWEB media extensions
  
WebRTC and PSTN

[Diagram: Browser M connects to the Web Server; a Peer Connection (audio) runs between Browser M and a PSTN Gateway, which connects to a Phone.]

• Peer Connection terminates on a PSTN Gateway
• Audio only
• Encryption ends at Gateway
  
WebRTC with SIP

[Diagram: Browser M and Browser T each run a JavaScript SIP UA loaded over HTTP (HTML5/CSS/JavaScript) from the Web Server; both signal with a SIP Proxy/Registrar Server over WebSocket (SIP); SRTP media flows directly between the browsers.]

• Browser runs a SIP User Agent by running JavaScript from Web Server
• SRTP media connection uses WebRTC APIs
• Details in [draft-ietf-sipcore-websocket], which defines SIP transport over WebSockets
  
WebRTC Signaling Approaches

• Signaling is required for exchange of candidate transport addresses, codec information, media keying information
• Many options – choice is up to web developer
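Since the signaling channel is left entirely to the web developer, one common choice is JSON messages over a WebSocket. The sketch below shows what such an envelope might look like; the `type`/`payload` field names are assumptions for illustration, not anything WebRTC specifies.

```javascript
// Hypothetical signaling envelope: WebRTC deliberately does not define
// this format, so the field names here are illustrative only.
function encodeSignal(type, payload) {
  return JSON.stringify({ type: type, payload: payload });
}

function decodeSignal(raw) {
  const msg = JSON.parse(raw);
  const known = ["offer", "answer", "candidate"];
  if (known.indexOf(msg.type) === -1) {
    throw new Error("unknown signal type: " + msg.type);
  }
  return msg;
}
```

The three message kinds mirror the three things the bullet above says must be exchanged: session descriptions (offer/answer) and ICE candidates.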
  
How to Use WebRTC
  
WebRTC usage in brief

[Flowchart: Obtain Local Media (loop: get more media) → all media added → Set Up Peer Connection → Peer Connection established → Attach Media or Data (loop: attach more media or data) → ready for call → Exchange Offer/Answer.]
  
WebRTC usage in brief

[Flowchart as before, highlighting Obtain Local Media.]

• getUserMedia()
  – Audio and/or video
  – Constraints
  – User permissions
• Browser must ask before allowing a page to access microphone or camera
• MediaStream
• MediaStreamTrack
  – Capabilities
  – States (settings)
  
WebRTC usage in brief

[Flowchart as before, highlighting Set Up Peer Connection.]

• RTCPeerConnection
  – Direct media
  – Between two peers
  – ICE processing
  – SDP processing
  – DTMF support
  – Data channels
  – Identity verification
  – Statistics reporting
  
WebRTC usage in brief

[Flowchart as before, highlighting Attach Media or Data.]

• addStream()
  – Doesn't change media state!
• removeStream()
  – Ditto!
• createDataChannel()
  – Depends on transport
  
WebRTC usage in brief

[Flowchart as before, highlighting Exchange Session Descriptions.]

• createOffer(), createAnswer()
• setLocalDescription(), setRemoteDescription()
• Applying SDP answer makes the magic happen
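The offer/answer exchange walks the peer connection through a small state machine. A sketch of the basic transitions that setLocalDescription()/setRemoteDescription() drive is below; the state names match RTCPeerConnection.signalingState, while the event names are shorthand invented for this sketch, not API identifiers.

```javascript
// Simplified JSEP signaling-state machine. State names follow
// RTCPeerConnection.signalingState; event names are this sketch's own.
const TRANSITIONS = {
  "stable": {
    "set-local-offer": "have-local-offer",
    "set-remote-offer": "have-remote-offer"
  },
  "have-local-offer": { "set-remote-answer": "stable" },
  "have-remote-offer": { "set-local-answer": "stable" }
};

function nextSignalingState(state, event) {
  const next = (TRANSITIONS[state] || {})[event];
  if (!next) throw new Error("illegal transition: " + state + " + " + event);
  return next;
}
```

Both sides end back in "stable" once the answer is applied, which is the point at which "the magic happens" and media can flow.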
  
WebRTC usage – a bit more detail

[Flowchart: Set Up Signaling Channel → Obtain Local Media (loop: get more media) → Set Up Peer Connection → Attach Media or Data (loop: attach more media or data) → Exchange Session Descriptions.]
  
SDP offer/answer

• Session Descriptions
  – Session Description Protocol created for use by SIP in setting up voice (and video) calls
  – Describes real-time media at low level of detail
    • Which IP addresses and ports to use
    • Which codecs to use
• Offer/answer model (JSEP)
  – One side sends an SDP offer listing what it wants to send and what it can receive
  – Other side replies with an SDP answer listing what it will receive and send
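To make the "low level of detail" concrete, here is a toy extractor that pulls the media ("m=") lines out of an SDP blob. The sample SDP is fabricated for illustration, and this is nowhere near a full RFC 4566 parser.

```javascript
// Toy extractor for SDP media sections; illustrative, not a full parser.
function mediaSections(sdp) {
  return sdp.split(/\r?\n/)
    .filter(function (line) { return line.indexOf("m=") === 0; })
    .map(function (line) {
      // m=<media> <port> <proto> <payload types...>
      const parts = line.slice(2).trim().split(" ");
      return {
        media: parts[0],
        port: Number(parts[1]),
        proto: parts[2],
        payloads: parts.slice(3)
      };
    });
}

const sampleSdp = [
  "v=0",
  "o=- 0 0 IN IP4 192.0.2.1",
  "m=audio 49170 RTP/SAVPF 111 0", // payload types, e.g. 0 = PCMU
  "m=video 49172 RTP/SAVPF 96"
].join("\r\n");
```

Each m= line carries exactly the two things the bullets above call out: a port (paired with the connection address elsewhere in the SDP) and a list of codec payload types.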
  
WebRTC Peer-to-Peer Media
  
Media Flows in WebRTC

[Diagram: a Web Server in the Internet; Browser M and Browser D behind a Home WiFi Router, Browser T behind another Router, and Browser L behind a Coffee Shop WiFi Router.]
  
Media without WebRTC

[Same network diagram as the previous slide.]
  
Peer-to-Peer Media with WebRTC

[Same network diagram, with media flowing peer-to-peer between the browsers.]
  
NAT Complicates Peer-to-Peer Media

Most browsers are behind NATs on the Internet, which complicates the establishment of peer-to-peer media sessions.

[Diagram: the same topology, now with a Router with NAT, Home WiFi with NAT, and Coffee Shop WiFi with NAT between the browsers and the Internet.]
  
What is a NAT?

• Network Address Translator (NAT)
• Used to map an inside address (usually a private IP address) to an outside address (usually a public IP address) at Layer 3
• Network Address and Port Translation (NAPT) also changes the transport port number (Layer 4)
  – These are often just called NATs as well
• One reason for NAT is the IP address shortage
  
NAT Example

[Diagram: a Home WiFi router with NAT; “outside” public IP address 203.0.113.4 on the Internet, “inside” private IP addresses 192.168.x.x, with Browser M at 192.168.0.5 and Browser T at 192.168.0.6.]
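The behavior in the example above can be sketched as a tiny translation table, reusing the slide's addresses (192.168.0.x inside, 203.0.113.4 outside). This is a toy NAPT model: real NATs also track mapping timeouts, directions, and filtering behavior.

```javascript
// Toy NAPT: maps each inside (ip, port) pair to a fresh outside port on
// the router's single public address. Port range is an assumption.
function makeNapt(publicIp) {
  const table = new Map();
  let nextPort = 40000;
  return {
    outbound: function (insideIp, insidePort) {
      const key = insideIp + ":" + insidePort;
      if (!table.has(key)) table.set(key, nextPort++);
      return { ip: publicIp, port: table.get(key) };
    }
  };
}

const nat = makeNapt("203.0.113.4");
```

Repeated packets from the same inside address and port reuse the same outside mapping, which is exactly what STUN exploits later to discover a usable public address.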
  
NATs and Applications

• NATs are compatible with client/server protocols such as web, email, etc.
• However, NATs generally block peer-to-peer communication
• Typical NAT traversal for VoIP and video services today uses a media relay whenever the client is behind a NAT
  – Often done with an SBC – Session Border Controller
  – This is a major expense and complication in existing VoIP and video systems
• WebRTC has a built-in NAT traversal strategy: Interactive Connectivity Establishment (ICE)
  
Peer-to-Peer Media Through NAT

ICE connectivity checks can often establish a direct peer-to-peer session between browsers behind different NATs.

[Same NAT network diagram.]
  
ICE Connectivity Checks

• Connectivity through NAT can be achieved using ICE connectivity checks
• Browsers exchange a list of candidates
  – Local: read from network interfaces
  – Reflexive: obtained using a STUN Server
  – Relayed: obtained from a TURN Server (media relay)
• Browsers attempt to send STUN packets to the candidate list received from other browser
• Checks performed by both sides at same time
• If one STUN packet gets through, a response is sent and this connection used for communication
  – TURN relay will be last resort (lowest priority)
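The "lowest priority" ordering of the candidate types is not ad hoc: ICE assigns each candidate a numeric priority. The formula below is from RFC 5245 (section 4.1.2.1), using the recommended type-preference values, so host candidates always outrank server-reflexive ones, which outrank relayed ones.

```javascript
// RFC 5245 candidate priority:
//   priority = 2^24 * type-pref + 2^8 * local-pref + (256 - component-id)
// Recommended type preferences: host 126, server-reflexive 100, relay 0.
const TYPE_PREF = { host: 126, srflx: 100, relay: 0 };

function candidatePriority(type, localPref, componentId) {
  return (1 << 24) * TYPE_PREF[type]
       + (1 << 8) * localPref
       + (256 - componentId);
}
```

Because the type preference occupies the highest-order bits, a TURN relay candidate can never beat a direct candidate of any kind, which is what makes the relay the last resort.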
  
P2P Media Can Stay Local to NAT

If both browsers are behind the same NAT, connectivity checks can often establish a connection that never leaves the NAT.

[Same NAT network diagram.]
  
ICE Servers

ICE uses STUN and TURN servers in the public Internet to help with NAT traversal.

[Diagram: a STUN Server at 198.51.100.9 and a TURN Server at 198.51.100.2 in the public Internet alongside the Web Server; the browsers sit behind their NATs, the Home WiFi NAT at 203.0.113.4 with Browser M at 192.168.0.5.]
  
Browser Queries STUN Server

Browser sends STUN test packet to STUN server to learn its public IP address (address of the NAT).

[Same diagram: Browser M at 192.168.0.5, behind the Home WiFi NAT at 203.0.113.4, queries the STUN Server at 198.51.100.9.]
  
TURN Server Can Relay Media

In some cases, connectivity checks fail, and a TURN Media Relay on the public Internet must be used.

[Same diagram: media between the browsers flows through the TURN Server acting as a media relay.]
  
WebRTC Protocols and IETF Standards
  
WebRTC: A Joint Standards Effort

• Internet Engineering Task Force (IETF) and World Wide Web Consortium (W3C) are working together on WebRTC
• IETF
  – Protocols – “bits on wire”
  – Main protocols are already RFCs, but many extensions in progress
  – RTCWEB (Real-Time Communications on the Web) Working Group is the main focus, but other WGs involved as well
  – http://www.ietf.org
• W3C
  – APIs – used by JavaScript code in HTML5
  – http://www.w3c.org
  
WebRTC Protocols

[Protocol stack diagram – Application layer: HTTP, WebSocket, SDP, ICE, STUN, TURN, SRTP; Transport layer: TLS, TCP, DTLS, UDP, SCTP; Network layer: IP.]

SIP is not shown as it is optional
  
IETF	
  RTCWEB	
  Documents	
  
Document)

Ref)

Overview'

“Overview:'Real'Time'Protocols'for'
Browser6based'Applications”'

draft6ietf6rtcweb6
overview'

Use'Cases'and'Requirements'

“Web'Real6Time'Communication'
Use6cases'and'Requirements”'

draft6ietf6rtcweb6
use6cases6and6
requirements'

RTP'Usage'

“Web'Real6Time'Communication'
(WebRTC):'Media'Transport'and'
Use'of'RTP”'

draft6ietf6rtcweb6
rtp6usage'

Security'Architecture'

“RTCWEB'Security'Architecture”'

draft6ietf6rtcweb6
security6arch'

Threat'Model'

“Security'Considerations'for'RTC6
Web”'

draft6ietf6rtcweb6
security'

Data'Channel'

“RTCWeb'Data'Channels”'

draft6ietf6rtcweb6
data6channel'

JSEP'

“JavaScript'Session'Establishment'
Protocol”'

draft6ietf6rtcweb6
jsep'

Audio'

“WebRTC'Audio'Codec'and'
Processing'Requirements”'

draft6ietf6rtcweb6
audio'

Quality'of'Service'

'

Title)

“DSCP'and'other'packet'markings'
for'RTCWeb'QoS”'

draft6ietf6rtcweb6
qos'

AdhearsionConf	
  2013	
  

40	
  
Codecs

• Mandatory to Implement (MTI) audio codecs are settled on Opus (RFC 6716) and G.711 (finally!)
• Video is not yet decided!
  
WebRTC W3C API Overview
  
Two primary API sections

• Handling local media
  – Media Capture and Streams (getUserMedia) specification
• Transmitting media
  – WebRTC (Peer Connection) specification
  
Local Media Handling

[Diagram: four sources in Browser M (Microphone Audio, Application Sharing Video, Front Camera Video, Rear Camera Video) are captured as MediaStreams; their tracks are recombined into three created MediaStreams: a Presentation Stream (“Audio” + “Presentation” tracks), a Presenter Stream (“Audio” + “Presenter” tracks), and a Demonstration Stream (“Audio” + “Demonstration” tracks).]

• In this example
  – Captured 4 local media streams
  – Created 3 media streams from them
  – Sent streams over Peer Connection
  
Local Media Handling

[Same sources/streams diagram.]

• Sources
  – Encoded together
  – Can't manipulate individually
  
Local Media Handling

[Same sources/streams diagram.]

• Tracks (MediaStreamTrack)
  – Tied to a source
  – Exist primarily as part of Streams; single media type
  – Globally unique ids; optionally browser-labeled
  
Local Media Handling

[Same sources/streams diagram.]

• Captured MediaStream
  – Returned from getUserMedia()
  – Permission check required to obtain
  
Local Media Handling

[Same sources/streams diagram.]

• MediaStream
  – All contained tracks are synchronized
  – Can be created, transmitted, etc.
  
Local Media Handling

• Settings
  – Current values of source properties (height, width, etc.)
  – Exposed on MediaStreamTrack
• Capabilities
  – Allowed values for source properties
  – Exposed on MediaStreamTrack
• Constraints
  – Requested ranges for track properties
  – Used in getUserMedia(), applyConstraints()
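One way to picture how the three notions fit together: a constraint asks for a range, and the browser checks that range against the capability (allowed range) of a source before applying it as a setting. The sketch below uses simplified `{min, max}` object shapes, which are an assumption for illustration rather than the spec's exact structures.

```javascript
// Does a source capability range overlap the requested constraint range?
// Object shapes are simplified for this sketch.
function satisfies(capability, constraint) {
  const min = constraint.min !== undefined ? constraint.min : -Infinity;
  const max = constraint.max !== undefined ? constraint.max : Infinity;
  return capability.max >= min && capability.min <= max;
}
```

A mandatory constraint whose range falls entirely outside the capability range cannot be satisfied, which is when getUserMedia() reports an error instead of returning a stream.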
  
Transmitting media

• Signaling channel
  – Non-standard
  – Must exist to set up Peer Connection
• Peer Connection
  – Links together two peers
  – Add/Remove Media Streams
    • addStream(), removeStream()
  – Handlers for ICE or media change
  – Data Channel support
  
Peer Connection

• "Links" together two peers
  – Via new RTCPeerConnection()
  – Generates Session Description offers/answers
    • createOffer(), createAnswer()
  – From SDP answers, initiates media
    • setLocalDescription(), setRemoteDescription()
  – Offers/answers MUST be relayed by application code!
  – ICE candidates can also be relayed and added by app
    • addIceCandidate()
  
Peer Connection

• Handlers for signaling, ICE or media change
  – onsignalingstatechange
  – onicecandidate, oniceconnectionstatechange
  – onaddstream, onremovestream
  – onnegotiationneeded
  – A few others
  
Peer Connection

• “Extra” APIs
  – Data
  – DTMF
  – Statistics
  – Identity
• Grouped separately in WebRTC spec
  – but really part of RTCPeerConnection definition
  – all are mandatory to implement
  
Data Channel API

• RTCDataChannel createDataChannel()
• Configurable with
  – ordered
  – maxRetransmits, maxRetransmitTime
  – negotiated
  – id
• Provides RTCDataChannel with
  – send()
  – onopen, onerror, onclose, onmessage
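Applications often split large payloads into bounded chunks before calling send(), since implementations of this era limited message sizes. The helper below is a sender-side sketch; the chunking scheme (and any size limit you pick, e.g. 16 KiB) is an application convention, not part of the API.

```javascript
// Split a string into chunks of at most maxLen characters, each of which
// would then be passed to RTCDataChannel.send(). Framing/reassembly on
// the receiving side is left out of this sketch.
function chunkMessage(text, maxLen) {
  if (maxLen <= 0) throw new Error("maxLen must be positive");
  const chunks = [];
  for (let i = 0; i < text.length; i += maxLen) {
    chunks.push(text.slice(i, i + maxLen));
  }
  return chunks;
}
```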
  
DTMF API

• RTCDTMFSender createDTMFSender()
  – Associates track input parameter with this RTCPeerConnection
• RTCDTMFSender provides
  – boolean canInsertDTMF()
  – insertDTMF()
  – ontonechange
  – (other stuff)
  
Statistics API

• getStats()
  – Callback returns statistics for given track
• Statistics available (local/remote) are:
  – Bytes/packets transmitted
  – Bytes/packets received
• May be useful for congestion-based adjustments
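A typical congestion-oriented use of those counters is to compare two snapshots and derive a loss fraction. The sketch below assumes snapshot objects with `packetsSent`/`packetsReceived` fields; the exact stats record names are illustrative, since the stats dictionary was still in flux at this time.

```javascript
// Fraction of packets lost between two stats snapshots, where one side
// reports packetsSent and the remote side reports packetsReceived.
// Field names are assumed for this sketch.
function lossRate(prev, curr) {
  const sent = curr.packetsSent - prev.packetsSent;
  const recv = curr.packetsReceived - prev.packetsReceived;
  if (sent <= 0) return 0;
  return (sent - recv) / sent;
}
```

An application polling getStats() every few seconds could feed this figure into its own quality logic, e.g. switching to a lower-rate codec configuration when loss climbs.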
  
Identity API

• setIdentityProvider(), getIdentityAssertion()
• Used to verify identity via third party, e.g., Facebook Connect
• Both methods are optional
• onidentity handler called after any verification attempt
• RTCPeerConnection.peerIdentity holds any verified identity assertion
  
Pseudo Code Walkthrough
  
Pseudo Code

• Close to real code, but . . .
• No HTML, no signaling channel, not asynchronous, and API is still in flux
• Don't expect this to work anywhere
  
Back to first diagram

[Diagram: Browser M on Mobile (microphone audio, application sharing video, front and rear camera video) and Browser L on Laptop (webcam video, stereo audio).]

• Mobile browser "calls" laptop browser
• Each sends media to the other
  
Mobile	
  browser	
  code	
  outline	
  
var signalingChannel =
createSignalingChannel();

var pc;
var configuration =
{"iceServers":[{"url":"stun:198.51.100.9"},
{"url":"turn:198.51.100.2",
"credential":"myPassword"}]};
var microphone, application, front, rear;
var presentation, presenter, demonstration;
var remote_av, stereo, mono;
var display, left, right;
function s(sdp) {} // stub success callback
function e(error) {}

//

stub error callback

var signalingChannel = createSignalingChannel();

getMedia();
createPC();
attachMedia();
call();

getMedia();
createPC();
attachMedia();
call();
function getMedia() {
// get local audio (microphone)
navigator.getUserMedia({"audio": true }, function (stream) {
microphone = stream;
}, e);
// get local video (application sharing)
///// This is outside the scope of this specification.
///// Assume that 'application' has been set to this stream.
//
constraint =
{"video": {"mandatory": {"videoFacingModeEnum": "front"}}};
navigator.getUserMedia(constraint, function (stream) {
front = stream;
}, e);
constraint =
{"video": {"mandatory": {"videoFacingModeEnum": "rear"}}};
navigator.getUserMedia(constraint, function (stream) {
rear = stream;
}, e);
}
function createPC() {
pc = new RTCPeerConnection(configuration);
pc.onicecandidate = function (evt) {
signalingChannel.send(
JSON.stringify({ "candidate": evt.candidate }));
};
pc.onaddstream =
function (evt) {handleIncomingStream(evt.stream);};

}

function attachMedia() {
presentation =
new MediaStream(

•  We	
  will	
  look	
  next	
  at	
  each	
  of	
  these	
  
•  .	
  .	
  .	
  except	
  for	
  creaFng	
  the	
  signaling	
  
channel	
  

[microphone.getAudioTracks()[0],
// Audio
application.getVideoTracks()[0]]); // Presentation
presenter =
new MediaStream(
[microphone.getAudioTracks()[0],
// Audio
front.getVideoTracks()[0]]);
// Presenter
demonstration =
new MediaStream(
[microphone.getAudioTracks()[0],
rear.getVideoTracks()[0]]);

// Audio
// Demonstration

pc.addStream(presentation);
pc.addStream(presenter);
pc.addStream(demonstration);

}

signalingChannel.send(
JSON.stringify({ "presentation": presentation.id,
"presenter": presenter.id,
"demonstration": demonstration.id
}));

function call() {
pc.createOffer(gotDescription, e);
function gotDescription(desc) {
pc.setLocalDescription(desc, s, e);
signalingChannel.send(JSON.stringify({ "sdp": desc }));
}
}
function handleIncomingStream(st) {
if (st.getVideoTracks().length == 1) {
av_stream = st;
show_av(av_stream);
} else if (st.getAudioTracks().length == 2) {
stereo = st;
} else {
mono = st;
}
}
function show_av(st) {
display.src = URL.createObjectURL(
new MediaStream(st.getVideoTracks()[0]));
left.src = URL.createObjectURL(
new MediaStream(st.getAudioTracks()[0]));
right.src = URL.createObjectURL(
new MediaStream(st.getAudioTracks()[1]));
}
signalingChannel.onmessage = function (msg) {
var signal = JSON.parse(msg.data);
if (signal.sdp) {
pc.setRemoteDescription(
new RTCSessionDescription(signal.sdp), s, e);
} else {
pc.addIceCandidate(
new RTCIceCandidate(signal.candidate));
}
};

AdhearsionConf	
  2013	
  

61	
  
Mobile browser produces . . .

[Diagram: the captured microphone, application sharing, front camera, and rear camera streams in Browser M are recombined into the Presentation, Presenter, and Demonstration MediaStreams.]

• At least 3 calls to getUserMedia()
• Three calls to new MediaStream()
• App sends stream ids, then streams
  
function getMedia() [1]
  
navigator.getUserMedia({"audio": true }, function (stream) {
microphone = stream;
}, e);
var pc;
var configuration =
{"iceServers":[{"url":"stun:198.51.100.9"},
{"url":"turn:198.51.100.2",
"credential":"myPassword"}]};
var microphone, application, front, rear;
var presentation, presenter, demonstration;
var remote_av, stereo, mono;
var display, left, right;
function s(sdp) {} // stub success callback

// get local video (application sharing)
///// This is outside the scope of this specification.
///// Assume that 'application' has been set to this stream.
//

function e(error) {}  // stub error callback

var signalingChannel = createSignalingChannel();
getMedia();
createPC();
attachMedia();
call();
function getMedia() {
// get local audio (microphone)
navigator.getUserMedia({"audio": true }, function (stream) {
microphone = stream;
}, e);
// get local video (application sharing)
///// This is outside the scope of this specification.
///// Assume that 'application' has been set to this stream.
//
constraint =
{"video": {"mandatory": {"videoFacingModeEnum": "front"}}};
navigator.getUserMedia(constraint, function (stream) {
front = stream;
}, e);
constraint =
{"video": {"mandatory": {"videoFacingModeEnum": "rear"}}};
navigator.getUserMedia(constraint, function (stream) {
rear = stream;
}, e);
}
function createPC() {
pc = new RTCPeerConnection(configuration);
pc.onicecandidate = function (evt) {
signalingChannel.send(
JSON.stringify({ "candidate": evt.candidate }));
};
pc.onaddstream =
function (evt) {handleIncomingStream(evt.stream);};

}

. . .

function attachMedia() {
presentation =
new MediaStream(

•  Get audio
•  (Get window video – out of scope)

    [microphone.getAudioTracks()[0],    // Audio
     application.getVideoTracks()[0]]); // Presentation
  presenter =
    new MediaStream(
      [microphone.getAudioTracks()[0],  // Audio
       front.getVideoTracks()[0]]);     // Presenter
  demonstration =
    new MediaStream(
      [microphone.getAudioTracks()[0],  // Audio
       rear.getVideoTracks()[0]]);      // Demonstration

  pc.addStream(presentation);
  pc.addStream(presenter);
  pc.addStream(demonstration);

  signalingChannel.send(
    JSON.stringify({ "presentation": presentation.id,
                     "presenter": presenter.id,
                     "demonstration": demonstration.id }));
}

function call() {
pc.createOffer(gotDescription, e);
function gotDescription(desc) {
pc.setLocalDescription(desc, s, e);
signalingChannel.send(JSON.stringify({ "sdp": desc }));
}
}
function handleIncomingStream(st) {
if (st.getVideoTracks().length == 1) {
av_stream = st;
show_av(av_stream);
} else if (st.getAudioTracks().length == 2) {
stereo = st;
} else {
mono = st;
}
}
function show_av(st) {
display.src = URL.createObjectURL(
new MediaStream(st.getVideoTracks()[0]));
left.src = URL.createObjectURL(
new MediaStream(st.getAudioTracks()[0]));
right.src = URL.createObjectURL(
new MediaStream(st.getAudioTracks()[1]));
}
signalingChannel.onmessage = function (msg) {
var signal = JSON.parse(msg.data);
if (signal.sdp) {
pc.setRemoteDescription(
new RTCSessionDescription(signal.sdp), s, e);
} else {
pc.addIceCandidate(
new RTCIceCandidate(signal.candidate));
}
};
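The same capture sequence can be written with an injected getUserMedia-style function, which makes the ordering testable outside a browser. This is a sketch under the assumption of a promise-returning capture function (the modern navigator.mediaDevices.getUserMedia shape), not the callback API shown in the deck:

```javascript
// Sketch: capture microphone, front camera, and rear camera in sequence.
// `getUserMedia` is injected so the logic can run without a browser.
async function captureAll(getUserMedia) {
  const microphone = await getUserMedia({ audio: true });
  const front = await getUserMedia({ video: { facingMode: "user" } });
  const rear = await getUserMedia({ video: { facingMode: "environment" } });
  return { microphone, front, rear };
}
```

In a page you would pass `navigator.mediaDevices.getUserMedia.bind(navigator.mediaDevices)`; note that per the constraint spec, "user" faces the user (front camera) and "environment" faces away (rear camera).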

  
function getMedia() [2]
. . .
constraint =
  {"video": {"mandatory": {"facingMode": "user"}}};
navigator.getUserMedia(constraint, function (stream) {
  front = stream;
}, e);


constraint =
  {"video": {"mandatory": {"facingMode": "environment"}}};
navigator.getUserMedia(constraint, function (stream) {
  rear = stream;
}, e);

•  Get front-facing camera
•  Get rear-facing camera

  
Mobile	
  browser	
  code	
  outline	
  
var signalingChannel =
createSignalingChannel();


getMedia();
createPC();
attachMedia();
call();

•  We will look next at each of these
•  . . . except for creating the signaling channel
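The deck leaves createSignalingChannel() unspecified. As a stand-in, here is a minimal in-memory channel pair with the same send/onmessage shape the pseudo code relies on; a real application would use WebSockets or HTTP, and the name `createSignalingPair` is mine, not from the deck:

```javascript
// Minimal in-memory signaling: whatever one end send()s shows up at the
// other end's onmessage handler, wrapped in an event-like { data } object.
function createSignalingPair() {
  const a = { onmessage: null };
  const b = { onmessage: null };
  a.send = (data) => { if (b.onmessage) b.onmessage({ data }); };
  b.send = (data) => { if (a.onmessage) a.onmessage({ data }); };
  return [a, b];
}
```

This is enough to run the rest of the pseudo code in a single page for testing; both "browsers" simply hold opposite ends of the pair.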
  


  
function createPC()
var configuration =
{"iceServers":[{"url":"stun:198.51.100.9"},
{"url":"turn:198.51.100.2",
"credential":"myPassword"}]};
pc = new RTCPeerConnection(configuration);


pc.onicecandidate = function (evt) {
signalingChannel.send(
JSON.stringify({ "candidate": evt.candidate }));
};
pc.onaddstream =
function (evt) {handleIncomingStream(evt.stream);};

•  Create RTCPeerConnection
•  Set handlers
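The onicecandidate handler set here sends candidates as JSON, and on the receiving side signalingChannel.onmessage discriminates SDP descriptions from ICE candidates by key. That dispatch is plain JSON handling and can be sketched as a pure function (`routeSignal` is a hypothetical helper name, not deck code):

```javascript
// Classify a signaling payload the same way the onmessage handler does:
// anything with an "sdp" key is a session description, anything with a
// "candidate" key is an ICE candidate.
function routeSignal(data) {
  const signal = JSON.parse(data);
  if (signal.sdp) return { kind: "sdp", value: signal.sdp };
  if (signal.candidate) return { kind: "candidate", value: signal.candidate };
  return { kind: "unknown", value: null };
}
```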
  


  
Mobile browser consumes . . .

[Diagram: Browser M receives three MediaStreams (the selected Audio & Video Stream with "Video", "Left", and "Right" tracks, a Stereo Stream with "Left" and "Right" tracks, and a Mono Stream with a "Mono" track) and routes the selected tracks to sinks: Display, Left Headphone, and Right Headphone.]

•  Receives three media streams
•  Chooses one
•  Sends tracks to output channels
  
  
function handleIncomingStream()
if (st.getVideoTracks().length == 1) {
av_stream = st;
show_av(av_stream);
} else if (st.getAudioTracks().length == 2) {
stereo = st;
} else {
mono = st;
}


•  If incoming stream has a video track, set to av_stream and display it
•  If it has two audio tracks, it must be stereo
•  Otherwise, it must be the mono stream
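The branch above depends only on track counts, so it can be captured as a pure function and exercised with mock streams outside a browser (`classifyStream` and `mockStream` are illustrative names, not deck code):

```javascript
// Same decision tree as handleIncomingStream(), minus the side effects:
// one video track -> A/V stream, two audio tracks -> stereo, else mono.
function classifyStream(st) {
  if (st.getVideoTracks().length === 1) return "av";
  if (st.getAudioTracks().length === 2) return "stereo";
  return "mono";
}

// Mock stream exposing just the two getters the classifier needs.
const mockStream = (audioCount, videoCount) => ({
  getAudioTracks: () => new Array(audioCount).fill("audio"),
  getVideoTracks: () => new Array(videoCount).fill("video"),
});
```

Note this classification is fragile by design; the deck's id-announcement over signaling is the more robust way to tell streams apart.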
  


  
function show_av(st)
display.srcObject =
new MediaStream(st.getVideoTracks()[0]);
left.srcObject =
new MediaStream(st.getAudioTracks()[0]);
right.srcObject =
new MediaStream(st.getAudioTracks()[1]);


•  Using the new srcObject property on the media element,
•  Set the new stream as its source
  


  
  
function attachMedia() [1]
presentation =
  new MediaStream(
    [microphone.getAudioTracks()[0],    // Audio
     application.getVideoTracks()[0]]); // Presentation
presenter =
  new MediaStream(
    [microphone.getAudioTracks()[0],    // Audio
     front.getVideoTracks()[0]]);       // Presenter
demonstration =
  new MediaStream(
    [microphone.getAudioTracks()[0],    // Audio
     rear.getVideoTracks()[0]]);        // Demonstration
. . .

•  Create 3 new streams, all with the same audio but different video
  
function attachMedia() [2]
pc.addStream(presentation);
pc.addStream(presenter);
pc.addStream(demonstration);

signalingChannel.send(
  JSON.stringify({ "presentation": presentation.id,
                   "presenter": presenter.id,
                   "demonstration": demonstration.id }));


•  Attach all 3 streams to the Peer Connection
•  Send stream ids to the peer (before the streams!)
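Because the ids arrive over signaling before the media does, the receiver can label each incoming stream by matching its id against the announced map. A sketch of that lookup (`labelStream` is a hypothetical helper, not in the deck):

```javascript
// Match an incoming stream against the id map sent over the signaling
// channel: { presentation: id, presenter: id, demonstration: id }.
function labelStream(idMessage, stream) {
  const ids = JSON.parse(idMessage);
  for (const name of Object.keys(ids)) {
    if (ids[name] === stream.id) return name;
  }
  return null; // unknown stream
}
```

This avoids guessing a stream's role from its track counts, which is what handleIncomingStream() has to do without the id map.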
  

  
Mobile	
  browser	
  code	
  outline	
  
var signalingChannel =
createSignalingChannel();

var pc;
var configuration =
{"iceServers":[{"url":"stun:198.51.100.9"},
{"url":"turn:198.51.100.2",
"credential":"myPassword"}]};
var microphone, application, front, rear;
var presentation, presenter, demonstration;
var remote_av, stereo, mono;
var display, left, right;
function s(sdp) {} // stub success callback
function e(error) {}

//

stub error callback

var signalingChannel = createSignalingChannel();

getMedia();
createPC();
attachMedia();
• We will look next at each of these
• . . . except for creating the signaling channel

Full caller-side code:

var pc;
var configuration =
    {"iceServers": [{"url": "stun:198.51.100.9"},
                    {"url": "turn:198.51.100.2",
                     "credential": "myPassword"}]};
var microphone, application, front, rear;
var presentation, presenter, demonstration;
var remote_av, stereo, mono;
var display, left, right;
function s(sdp) {}   // stub success callback
function e(error) {} // stub error callback

var signalingChannel = createSignalingChannel();

getMedia();
createPC();
attachMedia();
call();

function getMedia() {
    // get local audio (microphone)
    navigator.getUserMedia({"audio": true}, function (stream) {
        microphone = stream;
    }, e);
    // get local video (application sharing)
    ///// This is outside the scope of this specification.
    ///// Assume that 'application' has been set to this stream.
    constraint =
        {"video": {"mandatory": {"videoFacingModeEnum": "front"}}};
    navigator.getUserMedia(constraint, function (stream) {
        front = stream;
    }, e);
    constraint =
        {"video": {"mandatory": {"videoFacingModeEnum": "rear"}}};
    navigator.getUserMedia(constraint, function (stream) {
        rear = stream;
    }, e);
}

function createPC() {
    pc = new RTCPeerConnection(configuration);
    pc.onicecandidate = function (evt) {
        signalingChannel.send(
            JSON.stringify({"candidate": evt.candidate}));
    };
    pc.onaddstream =
        function (evt) {handleIncomingStream(evt.stream);};
}

function attachMedia() {
    presentation =
        new MediaStream(
            [microphone.getAudioTracks()[0],    // Audio
             application.getVideoTracks()[0]]); // Presentation
    presenter =
        new MediaStream(
            [microphone.getAudioTracks()[0],    // Audio
             front.getVideoTracks()[0]]);       // Presenter
    demonstration =
        new MediaStream(
            [microphone.getAudioTracks()[0],    // Audio
             rear.getVideoTracks()[0]]);        // Demonstration
    pc.addStream(presentation);
    pc.addStream(presenter);
    pc.addStream(demonstration);
    signalingChannel.send(
        JSON.stringify({"presentation": presentation.id,
                        "presenter": presenter.id,
                        "demonstration": demonstration.id}));
}

function call() {
    pc.createOffer(gotDescription, e);
    function gotDescription(desc) {
        pc.setLocalDescription(desc, s, e);
        signalingChannel.send(JSON.stringify({"sdp": desc}));
    }
}

function handleIncomingStream(st) {
    if (st.getVideoTracks().length == 1) {
        av_stream = st;
        show_av(av_stream);
    } else if (st.getAudioTracks().length == 2) {
        stereo = st;
    } else {
        mono = st;
    }
}

function show_av(st) {
    display.src = URL.createObjectURL(
        new MediaStream([st.getVideoTracks()[0]]));
    left.src = URL.createObjectURL(
        new MediaStream([st.getAudioTracks()[0]]));
    right.src = URL.createObjectURL(
        new MediaStream([st.getAudioTracks()[1]]));
}

signalingChannel.onmessage = function (msg) {
    var signal = JSON.parse(msg.data);
    if (signal.sdp) {
        pc.setRemoteDescription(
            new RTCSessionDescription(signal.sdp), s, e);
    } else {
        pc.addIceCandidate(
            new RTCIceCandidate(signal.candidate));
    }
};

function call()

pc.createOffer(gotDescription, e);
function gotDescription(desc) {
    pc.setLocalDescription(desc, s, e);
    signalingChannel.send(JSON.stringify({"sdp": desc}));
}

• Ask browser to create SDP offer
• Set offer as local description
• Send offer to peer

demonstration =
new MediaStream(
[microphone.getAudioTracks()[0],
rear.getVideoTracks()[0]]);

// Audio
// Demonstration

pc.addStream(presentation);
pc.addStream(presenter);
pc.addStream(demonstration);

}

signalingChannel.send(
JSON.stringify({ "presentation": presentation.id,
"presenter": presenter.id,
"demonstration": demonstration.id
}));

function call() {
pc.createOffer(gotDescription, e);
function gotDescription(desc) {
pc.setLocalDescription(desc, s, e);
signalingChannel.send(JSON.stringify({ "sdp": desc }));
}
}
function handleIncomingStream(st) {
if (st.getVideoTracks().length == 1) {
av_stream = st;
show_av(av_stream);
} else if (st.getAudioTracks().length == 2) {
stereo = st;
} else {
mono = st;
}
}
function show_av(st) {
display.src = URL.createObjectURL(
new MediaStream(st.getVideoTracks()[0]));
left.src = URL.createObjectURL(
new MediaStream(st.getAudioTracks()[0]));
right.src = URL.createObjectURL(
new MediaStream(st.getAudioTracks()[1]));
}
signalingChannel.onmessage = function (msg) {
var signal = JSON.parse(msg.data);
if (signal.sdp) {
pc.setRemoteDescription(
new RTCSessionDescription(signal.sdp), s, e);
} else {
pc.addIceCandidate(
new RTCIceCandidate(signal.candidate));
}
};

AdhearsionConf	
  2013	
  

74	
  
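The offer travels to the peer as a JSON envelope over the signaling channel. A minimal sketch of that envelope, factored into a pure helper so the wire format is easy to check (the helper name is mine, not from the slides):

```javascript
// Hypothetical helper: wrap a session description in the JSON envelope
// the example sends over the signaling channel.
function sdpMessage(desc) {
    return JSON.stringify({ sdp: desc });
}

// The receiving side recovers the description with JSON.parse:
var wire = sdpMessage({ type: "offer", sdp: "v=0\r\n" });
var parsed = JSON.parse(wire);
// parsed.sdp.type === "offer"
```

The same envelope shape is what the receiver's `signal.sdp` test keys on.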
How do we get the SDP answer?

signalingChannel.onmessage = function (msg) {
    var signal = JSON.parse(msg.data);
    if (signal.sdp) {
        pc.setRemoteDescription(
            new RTCSessionDescription(signal.sdp), s, e);
    } else {
        pc.addIceCandidate(
            new RTCIceCandidate(signal.candidate));
    }
};

• Signaling channel provides message
• If SDP, set as remote description
• If ICE candidate, tell the browser
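The caller's handler branches on the shape of the parsed message. That dispatch can be sketched as a pure function (name and return values are mine, for illustration):

```javascript
// Hypothetical dispatcher mirroring the onmessage branches: a message
// with an "sdp" key is a session description; otherwise it is treated
// as an ICE candidate.
function classifySignal(signal) {
    if (signal.sdp) {
        return "description"; // would go to pc.setRemoteDescription(...)
    }
    return "candidate";       // would go to pc.addIceCandidate(...)
}
```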
And now the laptop browser . . .

• Watch for the following
  – We set up media *after* receiving the offer
  – but the signaling channel still must exist first!
  – Also, need to save incoming stream ids
Signaling channel message is trigger

signalingChannel.onmessage = function (msg) {
    if (!pc) {
        prepareForIncomingCall();
    }
    var sgnl = JSON.parse(msg.data);
    . . .

• Set up PC and media if not already done

Full callee-side code:

var pc;
var configuration =
    {"iceServers": [{"url": "stun:198.51.100.9"},
                    {"url": "turn:198.51.100.2",
                     "credential": "myPassword"}]};
var webcam, left, right;
var av, stereo, mono;
var incoming;
var speaker, win1, win2, win3;
function s(sdp) {}   // stub success callback
function e(error) {} // stub error callback

var signalingChannel = createSignalingChannel();

function prepareForIncomingCall() {
    createPC();
    getMedia();
    attachMedia();
}

function createPC() {
    pc = new RTCPeerConnection(configuration);
    pc.onicecandidate = function (evt) {
        signalingChannel.send(
            JSON.stringify({"candidate": evt.candidate}));
    };
    pc.onaddstream =
        function (evt) {handleIncomingStream(evt.stream);};
}

function getMedia() {
    navigator.getUserMedia({"video": true}, function (stream) {
        webcam = stream;
    }, e);
    constraint =
        {"audio": {"mandatory": {"audioDirectionEnum": "left"}}};
    navigator.getUserMedia(constraint, function (stream) {
        left = stream;
    }, e);
    constraint =
        {"audio": {"mandatory": {"audioDirectionEnum": "right"}}};
    navigator.getUserMedia(constraint, function (stream) {
        right = stream;
    }, e);
}

function attachMedia() {
    av = new MediaStream(
        [webcam.getVideoTracks()[0],  // Video
         left.getAudioTracks()[0],    // Left audio
         right.getAudioTracks()[0]]); // Right audio
    stereo = new MediaStream(
        [left.getAudioTracks()[0],    // Left audio
         right.getAudioTracks()[0]]); // Right audio
    mono = left; // Treat the left audio as the mono stream
    pc.addStream(av);
    pc.addStream(stereo);
    pc.addStream(mono);
}

function answer() {
    pc.createAnswer(gotDescription, e);
    function gotDescription(desc) {
        pc.setLocalDescription(desc, s, e);
        signalingChannel.send(JSON.stringify({"sdp": desc}));
    }
}

function handleIncomingStream(st) {
    if (st.id === incoming.presentation) {
        speaker.src = URL.createObjectURL(
            new MediaStream([st.getAudioTracks()[0]]));
        win1.src = URL.createObjectURL(
            new MediaStream([st.getVideoTracks()[0]]));
    } else if (st.id === incoming.presenter) {
        win2.src = URL.createObjectURL(
            new MediaStream([st.getVideoTracks()[0]]));
    } else {
        win3.src = URL.createObjectURL(
            new MediaStream([st.getVideoTracks()[0]]));
    }
}

signalingChannel.onmessage = function (msg) {
    if (!pc) {
        prepareForIncomingCall();
    }
    var sgnl = JSON.parse(msg.data);
    if (sgnl.sdp) {
        pc.setRemoteDescription(
            new RTCSessionDescription(sgnl.sdp), s, e);
        answer();
    } else if (sgnl.candidate) {
        pc.addIceCandidate(new RTCIceCandidate(sgnl.candidate));
    } else {
        incoming = sgnl;
    }
};
Signaling channel message is trigger

signalingChannel.onmessage = function (msg) {
    . . .
    if (sgnl.sdp) {
        pc.setRemoteDescription(
            new RTCSessionDescription(sgnl.sdp), s, e);
        answer();
    } else if (sgnl.candidate) {
        pc.addIceCandidate(new RTCIceCandidate(sgnl.candidate));
    } else {
        incoming = sgnl;
    }
};

• If SDP, *also* answer
• But if neither SDP nor ICE candidate, must be set of incoming stream ids, so save
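The "save" branch works because the stream-id announcement is just a JSON object with none of the keys the other branches test for. A sketch of that round trip, with a hypothetical builder on the caller side (the helper name is mine):

```javascript
// Hypothetical builder for the caller's stream-id announcement.
function streamIdMessage(presentation, presenter, demonstration) {
    return JSON.stringify({
        presentation: presentation,
        presenter: presenter,
        demonstration: demonstration
    });
}

// The callee parses it; since it has no sdp or candidate key, the
// onmessage handler falls through and saves it as `incoming`.
var incoming = JSON.parse(streamIdMessage("s1", "s2", "s3"));
// incoming.presentation === "s1"
```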
function prepareForIncomingCall()

createPC();
getMedia();
attachMedia();

• No surprises here
• Media obtained is a little different
• But attached the same way
function answer()

pc.createAnswer(gotDescription, e);
function gotDescription(desc) {
    pc.setLocalDescription(desc, s, e);
    signalingChannel.send(JSON.stringify({"sdp": desc}));
}

• createAnswer() automatically uses value of remoteDescription when generating new SDP
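The ordering is the whole point here: the remote offer must already be set before createAnswer() runs, and the resulting answer then becomes the local description. A sketch of that sequence against a stub connection of my own invention (not a real RTCPeerConnection), so the call order can be checked:

```javascript
// Stub that records the order of the offer/answer calls.
function StubPC() {
    this.calls = [];
}
StubPC.prototype.setRemoteDescription = function (desc) {
    this.calls.push("setRemoteDescription");
};
StubPC.prototype.createAnswer = function (cb) {
    this.calls.push("createAnswer");
    cb({ type: "answer" }); // stub invokes the success callback directly
};
StubPC.prototype.setLocalDescription = function (desc) {
    this.calls.push("setLocalDescription");
};

// The answerer sequence from the slides, run against the stub:
var pc = new StubPC();
pc.setRemoteDescription({ type: "offer" }); // incoming offer first
pc.createAnswer(function (answer) {
    pc.setLocalDescription(answer);         // answer becomes local desc
});
// pc.calls: ["setRemoteDescription", "createAnswer", "setLocalDescription"]
```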
Laptop browser consumes . . .

[Diagram: three incoming MediaStreams in Browser L, with all video streams selected.
The Presentation Stream ("Audio" Track + "Presentation" Track) feeds the Speaker sink
and a Display; the Presenter Stream ("Audio" Track + "Presenter" Track) and the
Demonstration Stream ("Audio" Track + "Demonstration" Track) each feed a Display.]

• Three input streams
• All have same # of audio and video tracks
• Need stream ids to distinguish
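Because the three streams carry the same mix of tracks, only the saved ids can tell them apart. The routing decision in handleIncomingStream() can be sketched as a pure helper (the name is mine):

```javascript
// Hypothetical routing helper: pick a display window by comparing the
// stream id against the saved incoming ids, falling through to win3
// for the demonstration stream.
function sinkForStream(streamId, incoming) {
    if (streamId === incoming.presentation) return "win1";
    if (streamId === incoming.presenter) return "win2";
    return "win3";
}
```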
function handleIncomingStream()

if (st.id === incoming.presentation) {
    speaker.srcObject =
        new MediaStream([st.getAudioTracks()[0]]);
    win1.srcObject =
        new MediaStream([st.getVideoTracks()[0]]);
} else if (st.id === incoming.presenter) {
    win2.srcObject =
        new MediaStream([st.getVideoTracks()[0]]);
} else {
    win3.srcObject =
        new MediaStream([st.getVideoTracks()[0]]);
}

• Use ids to distinguish streams
• Extract one audio and all video tracks
• Assign to element sources
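Note the snippet assigns to srcObject, which is what browsers settled on; the URL.createObjectURL(stream) idiom seen elsewhere in the deck was later removed for MediaStreams. A tiny helper sketch (the helper name is mine):

```javascript
// Assign a stream directly to a media element. Setting srcObject is the
// modern replacement for the older URL.createObjectURL(stream) idiom.
function attachStream(element, stream) {
    element.srcObject = stream;
    return element;
}

// Works against any object with a writable srcObject property:
var fakeVideo = {};
attachStream(fakeVideo, { id: "demo-stream" });
// fakeVideo.srcObject.id === "demo-stream"
```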
Laptop browser produces . . .

[Diagram: sources and streams in Browser L. The WebCam source provides the "Video" Track;
the Left and Right Microphones provide the "Left" and "Right" Tracks. The captured
MediaStreams are combined into three created MediaStreams: an Audio & Video Stream, a
Stereo Stream, and a Mono Stream (the "Left" Track doubling as the "Mono" Track).]

• Three calls to getUserMedia()
• Three calls to new MediaStream()
• No stream ids needed
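The slides use the draft constraint syntax of the time (e.g. {"mandatory": {"videoFacingModeEnum": ...}}). In the API as it later standardized, the same request is made through navigator.mediaDevices.getUserMedia with a facingMode constraint and a promise. A hedged sketch of building those modern constraints (the helper is mine):

```javascript
// Build a constraints object for the modern mediaDevices API.
// "user" selects the front camera, "environment" the rear; this replaces
// the draft {"mandatory": {"videoFacingModeEnum": ...}} syntax.
function cameraConstraints(facing) {
    return { audio: false, video: { facingMode: facing } };
}

// Browser-only usage (not runnable outside a browser):
// navigator.mediaDevices.getUserMedia(cameraConstraints("user"))
//     .then(function (stream) { front = stream; });
```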
function getMedia() [1]

navigator.getUserMedia({"video": true}, function (stream) {
    webcam = stream;
}, e);
. . .

• Request webcam video
Resume Yannick Jolliet (v17 English)Yannick Jolliet
 
Zvi hecker
Zvi heckerZvi hecker
Zvi heckerjurox
 
Diccionario lengua de señas chile
Diccionario lengua de señas chileDiccionario lengua de señas chile
Diccionario lengua de señas chileMaka Abarca Varas
 
Bettencourt 2014 rota dar es salaam 28 april
Bettencourt 2014 rota dar es salaam 28 april  Bettencourt 2014 rota dar es salaam 28 april
Bettencourt 2014 rota dar es salaam 28 april Rotary International
 
El Forex Es Un Fraude Luis Gonzalez Espino
El Forex Es Un Fraude Luis Gonzalez EspinoEl Forex Es Un Fraude Luis Gonzalez Espino
El Forex Es Un Fraude Luis Gonzalez Espinofrogshole6
 
Local Marketing Platforms
Local Marketing PlatformsLocal Marketing Platforms
Local Marketing PlatformsArtoos
 
St john & st jmes ce primary school london
St john & st jmes ce primary school   londonSt john & st jmes ce primary school   london
St john & st jmes ce primary school londonJudith Moreno
 
1 estudio-cloud_computing_retos_y_oportunidades_vdef
1  estudio-cloud_computing_retos_y_oportunidades_vdef1  estudio-cloud_computing_retos_y_oportunidades_vdef
1 estudio-cloud_computing_retos_y_oportunidades_vdefOrlando Verdugo
 
Österreich VC/PE Situation 2015
Österreich VC/PE Situation 2015Österreich VC/PE Situation 2015
Österreich VC/PE Situation 2015Elfriede Sixt
 
Angular 2: Neuerungen und Migration
Angular 2: Neuerungen und MigrationAngular 2: Neuerungen und Migration
Angular 2: Neuerungen und MigrationManfred Steyer
 
Syllabus - Derecho Procesal Constitucional
Syllabus - Derecho Procesal ConstitucionalSyllabus - Derecho Procesal Constitucional
Syllabus - Derecho Procesal ConstitucionalJorge Baquerizo
 

Andere mochten auch (20)

Fiestas fase 7
Fiestas fase 7Fiestas fase 7
Fiestas fase 7
 
Diseño de experiencias turísticas - campañas de promoción y posicionamiento p...
Diseño de experiencias turísticas - campañas de promoción y posicionamiento p...Diseño de experiencias turísticas - campañas de promoción y posicionamiento p...
Diseño de experiencias turísticas - campañas de promoción y posicionamiento p...
 
NetBoxOne - NBO
NetBoxOne - NBONetBoxOne - NBO
NetBoxOne - NBO
 
'Where Friends and Business Meet'
'Where Friends and Business Meet''Where Friends and Business Meet'
'Where Friends and Business Meet'
 
Resume Yannick Jolliet (v17 English)
Resume Yannick Jolliet (v17 English)Resume Yannick Jolliet (v17 English)
Resume Yannick Jolliet (v17 English)
 
Zvi hecker
Zvi heckerZvi hecker
Zvi hecker
 
Diccionario lengua de señas chile
Diccionario lengua de señas chileDiccionario lengua de señas chile
Diccionario lengua de señas chile
 
Bettencourt 2014 rota dar es salaam 28 april
Bettencourt 2014 rota dar es salaam 28 april  Bettencourt 2014 rota dar es salaam 28 april
Bettencourt 2014 rota dar es salaam 28 april
 
El Forex Es Un Fraude Luis Gonzalez Espino
El Forex Es Un Fraude Luis Gonzalez EspinoEl Forex Es Un Fraude Luis Gonzalez Espino
El Forex Es Un Fraude Luis Gonzalez Espino
 
Local Marketing Platforms
Local Marketing PlatformsLocal Marketing Platforms
Local Marketing Platforms
 
Los años 50s.
Los años 50s.Los años 50s.
Los años 50s.
 
St john & st jmes ce primary school london
St john & st jmes ce primary school   londonSt john & st jmes ce primary school   london
St john & st jmes ce primary school london
 
Entrevista
EntrevistaEntrevista
Entrevista
 
1 estudio-cloud_computing_retos_y_oportunidades_vdef
1  estudio-cloud_computing_retos_y_oportunidades_vdef1  estudio-cloud_computing_retos_y_oportunidades_vdef
1 estudio-cloud_computing_retos_y_oportunidades_vdef
 
Österreich VC/PE Situation 2015
Österreich VC/PE Situation 2015Österreich VC/PE Situation 2015
Österreich VC/PE Situation 2015
 
Gimnasia ocular
Gimnasia ocularGimnasia ocular
Gimnasia ocular
 
Angular 2: Neuerungen und Migration
Angular 2: Neuerungen und MigrationAngular 2: Neuerungen und Migration
Angular 2: Neuerungen und Migration
 
NOESIS
NOESISNOESIS
NOESIS
 
El gin
El ginEl gin
El gin
 
Syllabus - Derecho Procesal Constitucional
Syllabus - Derecho Procesal ConstitucionalSyllabus - Derecho Procesal Constitucional
Syllabus - Derecho Procesal Constitucional
 

Ähnlich wie WebRTC Overview by Dan Burnett

TADS Developer Summit WebRTC Dan Burnett
TADS Developer Summit WebRTC Dan BurnettTADS Developer Summit WebRTC Dan Burnett
TADS Developer Summit WebRTC Dan BurnettAlan Quayle
 
Short introduction to WebRTC at the Amsterdam WebRTC Meetup, March 26, 2014
Short introduction to WebRTC at the Amsterdam WebRTC Meetup, March 26, 2014Short introduction to WebRTC at the Amsterdam WebRTC Meetup, March 26, 2014
Short introduction to WebRTC at the Amsterdam WebRTC Meetup, March 26, 2014Bart Uelen
 
Bridging_WebRTC_with_SIP_Alberto_WebRTCventures_Cluecon2023_NoVideo.pptx
Bridging_WebRTC_with_SIP_Alberto_WebRTCventures_Cluecon2023_NoVideo.pptxBridging_WebRTC_with_SIP_Alberto_WebRTCventures_Cluecon2023_NoVideo.pptx
Bridging_WebRTC_with_SIP_Alberto_WebRTCventures_Cluecon2023_NoVideo.pptxAlberto González Trastoy
 
Workshop web rtc customers and use cases
Workshop web rtc customers and use casesWorkshop web rtc customers and use cases
Workshop web rtc customers and use casesDouglas Tait
 
Getting Started with WebRTC
Getting Started with WebRTCGetting Started with WebRTC
Getting Started with WebRTCChad Hart
 
WebRTC Workshop - What is (and isn't WebRTC)
WebRTC Workshop - What is (and isn't WebRTC)WebRTC Workshop - What is (and isn't WebRTC)
WebRTC Workshop - What is (and isn't WebRTC)Oracle
 
Boost JBoss AS7 with HTML5 WebRTC for Real Time Communications
Boost JBoss AS7 with HTML5 WebRTC for Real Time CommunicationsBoost JBoss AS7 with HTML5 WebRTC for Real Time Communications
Boost JBoss AS7 with HTML5 WebRTC for Real Time Communicationstelestax
 
Architecting your WebRTC application for scalability, Arin Sime
Architecting your WebRTC application for scalability, Arin SimeArchitecting your WebRTC application for scalability, Arin Sime
Architecting your WebRTC application for scalability, Arin SimeAlan Quayle
 
WebRTC Standards Update (October 2014)
WebRTC Standards Update (October 2014)WebRTC Standards Update (October 2014)
WebRTC Standards Update (October 2014)Victor Pascual Ávila
 
WebRTC Standards & Implementation Q&A - WebRTC Standards Feature Complete 
No...
WebRTC Standards & Implementation Q&A - WebRTC Standards Feature Complete 
No...WebRTC Standards & Implementation Q&A - WebRTC Standards Feature Complete 
No...
WebRTC Standards & Implementation Q&A - WebRTC Standards Feature Complete 
No...Amir Zmora
 
Upperside WebRTC conference - WebRTC intro
Upperside WebRTC conference - WebRTC introUpperside WebRTC conference - WebRTC intro
Upperside WebRTC conference - WebRTC introVictor Pascual Ávila
 
WebRTC standards update - November 2014
WebRTC standards update - November 2014WebRTC standards update - November 2014
WebRTC standards update - November 2014Victor Pascual Ávila
 
Baby Steps: A WebRTC Tutorial
Baby Steps: A WebRTC TutorialBaby Steps: A WebRTC Tutorial
Baby Steps: A WebRTC TutorialTsahi Levent-levi
 

Ähnlich wie WebRTC Overview by Dan Burnett (20)

TADS Developer Summit WebRTC Dan Burnett
TADS Developer Summit WebRTC Dan BurnettTADS Developer Summit WebRTC Dan Burnett
TADS Developer Summit WebRTC Dan Burnett
 
WebRTC
WebRTCWebRTC
WebRTC
 
Short introduction to WebRTC at the Amsterdam WebRTC Meetup, March 26, 2014
Short introduction to WebRTC at the Amsterdam WebRTC Meetup, March 26, 2014Short introduction to WebRTC at the Amsterdam WebRTC Meetup, March 26, 2014
Short introduction to WebRTC at the Amsterdam WebRTC Meetup, March 26, 2014
 
Bridging_WebRTC_with_SIP_Alberto_WebRTCventures_Cluecon2023_NoVideo.pptx
Bridging_WebRTC_with_SIP_Alberto_WebRTCventures_Cluecon2023_NoVideo.pptxBridging_WebRTC_with_SIP_Alberto_WebRTCventures_Cluecon2023_NoVideo.pptx
Bridging_WebRTC_with_SIP_Alberto_WebRTCventures_Cluecon2023_NoVideo.pptx
 
WebRTC standards update (Jul 2014)
WebRTC standards update (Jul 2014)WebRTC standards update (Jul 2014)
WebRTC standards update (Jul 2014)
 
WebRTC Seminar Report
WebRTC  Seminar ReportWebRTC  Seminar Report
WebRTC Seminar Report
 
Workshop web rtc customers and use cases
Workshop web rtc customers and use casesWorkshop web rtc customers and use cases
Workshop web rtc customers and use cases
 
Getting Started with WebRTC
Getting Started with WebRTCGetting Started with WebRTC
Getting Started with WebRTC
 
WebRTC Workshop - What is (and isn't WebRTC)
WebRTC Workshop - What is (and isn't WebRTC)WebRTC Workshop - What is (and isn't WebRTC)
WebRTC Workshop - What is (and isn't WebRTC)
 
Webrtc and tokbox
Webrtc and tokboxWebrtc and tokbox
Webrtc and tokbox
 
Boost JBoss AS7 with HTML5 WebRTC for Real Time Communications
Boost JBoss AS7 with HTML5 WebRTC for Real Time CommunicationsBoost JBoss AS7 with HTML5 WebRTC for Real Time Communications
Boost JBoss AS7 with HTML5 WebRTC for Real Time Communications
 
Architecting your WebRTC application for scalability, Arin Sime
Architecting your WebRTC application for scalability, Arin SimeArchitecting your WebRTC application for scalability, Arin Sime
Architecting your WebRTC application for scalability, Arin Sime
 
WebRTC Standards Update (October 2014)
WebRTC Standards Update (October 2014)WebRTC Standards Update (October 2014)
WebRTC Standards Update (October 2014)
 
WebRTC Standards & Implementation Q&A - WebRTC Standards Feature Complete 
No...
WebRTC Standards & Implementation Q&A - WebRTC Standards Feature Complete 
No...WebRTC Standards & Implementation Q&A - WebRTC Standards Feature Complete 
No...
WebRTC Standards & Implementation Q&A - WebRTC Standards Feature Complete 
No...
 
Html5 RTC - 1
Html5 RTC  - 1Html5 RTC  - 1
Html5 RTC - 1
 
Workshop oracle
Workshop oracleWorkshop oracle
Workshop oracle
 
Upperside WebRTC conference - WebRTC intro
Upperside WebRTC conference - WebRTC introUpperside WebRTC conference - WebRTC intro
Upperside WebRTC conference - WebRTC intro
 
Web rtc 入門
Web rtc 入門Web rtc 入門
Web rtc 入門
 
WebRTC standards update - November 2014
WebRTC standards update - November 2014WebRTC standards update - November 2014
WebRTC standards update - November 2014
 
Baby Steps: A WebRTC Tutorial
Baby Steps: A WebRTC TutorialBaby Steps: A WebRTC Tutorial
Baby Steps: A WebRTC Tutorial
 

Mehr von Mojo Lingo

FreeSWITCH, FreeSWITCH Everywhere, and Not A Phone In Sight
FreeSWITCH, FreeSWITCH Everywhere, and Not A Phone In SightFreeSWITCH, FreeSWITCH Everywhere, and Not A Phone In Sight
FreeSWITCH, FreeSWITCH Everywhere, and Not A Phone In SightMojo Lingo
 
Using Asterisk to Create "Her"
Using Asterisk to Create "Her"Using Asterisk to Create "Her"
Using Asterisk to Create "Her"Mojo Lingo
 
Tipping the Scales: Measuring and Scaling Asterisk
Tipping the Scales: Measuring and Scaling AsteriskTipping the Scales: Measuring and Scaling Asterisk
Tipping the Scales: Measuring and Scaling AsteriskMojo Lingo
 
AdhearsionConf 2013 Keynote
AdhearsionConf 2013 KeynoteAdhearsionConf 2013 Keynote
AdhearsionConf 2013 KeynoteMojo Lingo
 
Speech-Enabling Web Apps
Speech-Enabling Web AppsSpeech-Enabling Web Apps
Speech-Enabling Web AppsMojo Lingo
 
WebRTC: What? How? Why? - ClueCon 2013
WebRTC: What? How? Why? - ClueCon 2013WebRTC: What? How? Why? - ClueCon 2013
WebRTC: What? How? Why? - ClueCon 2013Mojo Lingo
 
Infiltrando Telecoms Usando Ruby
Infiltrando Telecoms Usando RubyInfiltrando Telecoms Usando Ruby
Infiltrando Telecoms Usando RubyMojo Lingo
 
Enhancing FreePBX with Adhearsion
Enhancing FreePBX with AdhearsionEnhancing FreePBX with Adhearsion
Enhancing FreePBX with AdhearsionMojo Lingo
 
Connecting Adhearsion
Connecting AdhearsionConnecting Adhearsion
Connecting AdhearsionMojo Lingo
 
Testing Adhearsion Applications
Testing Adhearsion ApplicationsTesting Adhearsion Applications
Testing Adhearsion ApplicationsMojo Lingo
 
Testing Telephony: It's Not All Terrible
Testing Telephony: It's Not All TerribleTesting Telephony: It's Not All Terrible
Testing Telephony: It's Not All TerribleMojo Lingo
 
Rayo for XMPP Folks
Rayo for XMPP FolksRayo for XMPP Folks
Rayo for XMPP FolksMojo Lingo
 
Talking To Rails
Talking To RailsTalking To Rails
Talking To RailsMojo Lingo
 
Building Real Life Applications with Adhearsion
Building Real Life Applications with AdhearsionBuilding Real Life Applications with Adhearsion
Building Real Life Applications with AdhearsionMojo Lingo
 
Keeping It Realtime!
Keeping It Realtime!Keeping It Realtime!
Keeping It Realtime!Mojo Lingo
 
Integrating Voice Through Adhearsion
Integrating Voice Through AdhearsionIntegrating Voice Through Adhearsion
Integrating Voice Through AdhearsionMojo Lingo
 
Infiltrating Telecoms Using Ruby
Infiltrating Telecoms Using RubyInfiltrating Telecoms Using Ruby
Infiltrating Telecoms Using RubyMojo Lingo
 
Telephony Through Ruby Colored Lenses
Telephony Through Ruby Colored LensesTelephony Through Ruby Colored Lenses
Telephony Through Ruby Colored LensesMojo Lingo
 
Voice Applications for the Modern Open Source Hacker
Voice Applications for the Modern Open Source HackerVoice Applications for the Modern Open Source Hacker
Voice Applications for the Modern Open Source HackerMojo Lingo
 
Multidextrous Voice Application Framework
Multidextrous Voice Application FrameworkMultidextrous Voice Application Framework
Multidextrous Voice Application FrameworkMojo Lingo
 

Mehr von Mojo Lingo (20)

FreeSWITCH, FreeSWITCH Everywhere, and Not A Phone In Sight
FreeSWITCH, FreeSWITCH Everywhere, and Not A Phone In SightFreeSWITCH, FreeSWITCH Everywhere, and Not A Phone In Sight
FreeSWITCH, FreeSWITCH Everywhere, and Not A Phone In Sight
 
Using Asterisk to Create "Her"
Using Asterisk to Create "Her"Using Asterisk to Create "Her"
Using Asterisk to Create "Her"
 
Tipping the Scales: Measuring and Scaling Asterisk
Tipping the Scales: Measuring and Scaling AsteriskTipping the Scales: Measuring and Scaling Asterisk
Tipping the Scales: Measuring and Scaling Asterisk
 
AdhearsionConf 2013 Keynote
AdhearsionConf 2013 KeynoteAdhearsionConf 2013 Keynote
AdhearsionConf 2013 Keynote
 
Speech-Enabling Web Apps
Speech-Enabling Web AppsSpeech-Enabling Web Apps
Speech-Enabling Web Apps
 
WebRTC: What? How? Why? - ClueCon 2013
WebRTC: What? How? Why? - ClueCon 2013WebRTC: What? How? Why? - ClueCon 2013
WebRTC: What? How? Why? - ClueCon 2013
 
Infiltrando Telecoms Usando Ruby
Infiltrando Telecoms Usando RubyInfiltrando Telecoms Usando Ruby
Infiltrando Telecoms Usando Ruby
 
Enhancing FreePBX with Adhearsion
Enhancing FreePBX with AdhearsionEnhancing FreePBX with Adhearsion
Enhancing FreePBX with Adhearsion
 
Connecting Adhearsion
Connecting AdhearsionConnecting Adhearsion
Connecting Adhearsion
 
Testing Adhearsion Applications
Testing Adhearsion ApplicationsTesting Adhearsion Applications
Testing Adhearsion Applications
 
Testing Telephony: It's Not All Terrible
Testing Telephony: It's Not All TerribleTesting Telephony: It's Not All Terrible
Testing Telephony: It's Not All Terrible
 
Rayo for XMPP Folks
Rayo for XMPP FolksRayo for XMPP Folks
Rayo for XMPP Folks
 
Talking To Rails
Talking To RailsTalking To Rails
Talking To Rails
 
Building Real Life Applications with Adhearsion
Building Real Life Applications with AdhearsionBuilding Real Life Applications with Adhearsion
Building Real Life Applications with Adhearsion
 
Keeping It Realtime!
Keeping It Realtime!Keeping It Realtime!
Keeping It Realtime!
 
Integrating Voice Through Adhearsion
Integrating Voice Through AdhearsionIntegrating Voice Through Adhearsion
Integrating Voice Through Adhearsion
 
Infiltrating Telecoms Using Ruby
Infiltrating Telecoms Using RubyInfiltrating Telecoms Using Ruby
Infiltrating Telecoms Using Ruby
 
Telephony Through Ruby Colored Lenses
Telephony Through Ruby Colored LensesTelephony Through Ruby Colored Lenses
Telephony Through Ruby Colored Lenses
 
Voice Applications for the Modern Open Source Hacker
Voice Applications for the Modern Open Source HackerVoice Applications for the Modern Open Source Hacker
Voice Applications for the Modern Open Source Hacker
 
Multidextrous Voice Application Framework
Multidextrous Voice Application FrameworkMultidextrous Voice Application Framework
Multidextrous Voice Application Framework
 

Kürzlich hochgeladen

Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationSlibray Presentation
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 3652toLead Limited
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfAlex Barbosa Coqueiro
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):comworks
 
How to write a Business Continuity Plan
How to write a Business Continuity PlanHow to write a Business Continuity Plan
How to write a Business Continuity PlanDatabarracks
 
From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .Alan Dix
 
DevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenDevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenHervé Boutemy
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Scott Keck-Warren
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxNavinnSomaal
 
H2O.ai CEO/Founder: Sri Ambati Keynote at Wells Fargo Day
H2O.ai CEO/Founder: Sri Ambati Keynote at Wells Fargo DayH2O.ai CEO/Founder: Sri Ambati Keynote at Wells Fargo Day
H2O.ai CEO/Founder: Sri Ambati Keynote at Wells Fargo DaySri Ambati
 
Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Commit University
 
DSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine TuningDSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine TuningLars Bell
 
Hyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdf
Hyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdfHyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdf
Hyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdfPrecisely
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupFlorian Wilhelm
 
TrustArc Webinar - How to Build Consumer Trust Through Data Privacy
TrustArc Webinar - How to Build Consumer Trust Through Data PrivacyTrustArc Webinar - How to Build Consumer Trust Through Data Privacy
TrustArc Webinar - How to Build Consumer Trust Through Data PrivacyTrustArc
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubKalema Edgar
 
Dev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebDev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebUiPathCommunity
 
TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024Lonnie McRorey
 
Story boards and shot lists for my a level piece
Story boards and shot lists for my a level pieceStory boards and shot lists for my a level piece
Story boards and shot lists for my a level piececharlottematthew16
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brandgvaughan
 

Kürzlich hochgeladen (20)

Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck Presentation
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdf
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):
 
How to write a Business Continuity Plan
How to write a Business Continuity PlanHow to write a Business Continuity Plan
How to write a Business Continuity Plan
 
From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .
 
DevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenDevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache Maven
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptx
 
H2O.ai CEO/Founder: Sri Ambati Keynote at Wells Fargo Day
H2O.ai CEO/Founder: Sri Ambati Keynote at Wells Fargo DayH2O.ai CEO/Founder: Sri Ambati Keynote at Wells Fargo Day
H2O.ai CEO/Founder: Sri Ambati Keynote at Wells Fargo Day
 
Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!
 
DSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine TuningDSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine Tuning
 
Hyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdf
Hyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdfHyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdf
Hyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdf
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project Setup
 
TrustArc Webinar - How to Build Consumer Trust Through Data Privacy
TrustArc Webinar - How to Build Consumer Trust Through Data PrivacyTrustArc Webinar - How to Build Consumer Trust Through Data Privacy
TrustArc Webinar - How to Build Consumer Trust Through Data Privacy
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding Club
 
Dev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebDev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio Web
 
TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024
 
Story boards and shot lists for my a level piece
Story boards and shot lists for my a level pieceStory boards and shot lists for my a level piece
Story boards and shot lists for my a level piece
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brand
 

WebRTC Overview by Dan Burnett

  • 6. Benefits  of  WebRTC   For  Developer     For  User   •  Streamlined  development  –   one  placorm   •  Simple  APIs  –  detailed   knowledge  of  RTC  protocols   not  needed   •  NAT  traversal  only  uses   expensive  relays  when  no   other  choice   •  Advanced  voice  and  video   codecs  without  licensing     •  No  download  or  install  –   easy  to  use   •  All  communicaton   encrypted  –  private   •  Reliable  session   establishment     –  “just  works”   •  Excellent  voice  and  video   quality     •  Many  more  choices  for  real-­‐ Fme  communicaFon   AdhearsionConf  2013   6  
  • 7. WebRTC  Support  of  MulFple  Media   Microphone  Audio   ApplicaFon  Sharing  Video   Front  Camera  Video   Rear  Camera  Video   WebCam  Video   Stereo  Audio   Browser  L         on  Laptop   Browser  M   on  Mobile   •  MulFple  sources  of  audio  and  video  are  assumed   and  supported   •  All  media,  voice  and  video,  and  feedback  messages   are  mulFplexed  over  the  same  transport  address   AdhearsionConf  2013   7  
  • 8. WebRTC  Triangle   Web  Server   (ApplicaFon)   Peer  ConnecFon  (Audio,  Video,  and/or  Data)   Browser  L   Browser  M   (Running  HTML5  ApplicaFon     from  Web  Server)   (Running  HTML5  ApplicaFon     from  Web  Server)   •  Both  browsers  running  the  same  web  applicaFon  from  web   server   •  Peer  ConnecFon  established  between  them  with  the  help  of   the  web  server   AdhearsionConf  2013   8  
  • 9. WebRTC  Trapezoid   Web  Server  A   (ApplicaFon  A)   Browser  M   SIP     or  Jingle   Web  Server  B   (ApplicaFon  B)   Peer  ConnecFon  (Audio  and/or  Video)   Browser  T   (Running  HTML5  ApplicaFon     from  Web  Server  B)   (Running  HTML5  ApplicaFon     from  Web  Server  A)     •  Similar  to  SIP  Trapezoid     •  Web  Servers  communicate  using  SIP  or  Jingle  or  proprietary   •  Could  become  important  in  the  future.   AdhearsionConf  2013   9  
  • 10. WebRTC  and  SIP   Web  Server     SIP   SIP  Server   SIP   Browser  M   Peer  ConnecFon  (Audio  and/or  Video)   SIP  Client     •  SIP  (Session  IniFaFon  Protocol)  is  a  signaling  protocol  used  by  service   providers  and  enterprises  for  real-­‐Fme  communcaFon   •  Peer  ConnecFon  appears  as  a  standard  RTP  session,  described  by  SDP   •  SIP  Endpoint  must  support  RTCWEB  media  extensions       AdhearsionConf  2013   10  
  • 11. WebRTC  and  Jingle   Web  Server   Jingle   XMPP  Server   Jingle   Peer  ConnecFon  (Audio  and/or  Video)   Browser  M   Jingle  Client   •  Jingle  is  a  signaling  extension  to  XMPP  (Extensible  Messaging  and   Presence  Protocol,  aka  Jabber)   •  Peer  ConnecFon  SDP  can  be  mapped  to  Jingle   •  Jingle  Endpoint  must  support  RTCWEB  Media  extensions   AdhearsionConf  2013   11  
  • 12. WebRTC  and  PSTN   Web  Server   Peer  ConnecFon  (Audio)   PSTN  Gateway   Browser  M   Phone   •  Peer  ConnecFon  terminates  on  a  PSTN  Gateway   •  Audio  Only   •  EncrypFon  ends  at  Gateway   AdhearsionConf  2013   12  
  • 13. WebRTC  with  SIP   Web  Server     SIP  Proxy/Registrar  Server     WebSocket  (SIP)   HTTP     (HTML5/CSS/ JavaScript)   Browser  M   (running  JavaScript  SIP  UA)   HTTP     WebSocket   (HTML5/CSS/ (SIP)   JavaScript)   SRTP  Media   Browser  T   (running  JavaScript  SIP  UA)     •  Browser  runs  a  SIP  User  Agent  by  running  JavaScript  from  Web  Server     •  SRTP  media  connecFon  uses  WebRTC  APIs   •  Details  in  [dram-­‐iec-­‐sipcore-­‐websocket]  that  defines  SIP  transport  over   AdhearsionConf  2013   13   WebSockets  
  • 14. WebRTC Signaling Approaches   •  Signaling is required for exchange of candidate transport addresses, codec information, and media keying information   •  Many options – the choice is up to the web developer
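Since the signaling channel is left to the developer, a common choice is a WebSocket carrying JSON envelopes. The sketch below is illustrative only: the `wss://signaling.example.com` URL and the envelope field names (`type`, `payload`) are assumptions, not part of any standard.

```javascript
// Wrap a signaling payload in a typed JSON envelope (app-defined shape).
function encodeSignal(type, payload) {
  return JSON.stringify({ type: type, payload: payload });
}

// Parse an incoming envelope; return null for malformed messages.
function decodeSignal(raw) {
  try {
    const msg = JSON.parse(raw);
    return (msg && msg.type) ? msg : null;
  } catch (e) {
    return null;
  }
}

// Browser-only (not executed here): open the channel and route messages.
function connectSignaling(url, onSignal) {
  const ws = new WebSocket(url); // e.g. "wss://signaling.example.com" (assumed)
  ws.onmessage = function (evt) {
    const msg = decodeSignal(evt.data);
    if (msg) onSignal(msg);
  };
  return ws;
}
```

Any transport works (WebSocket, XHR polling, even SIP or XMPP); the only requirement is that offers, answers, and ICE candidates get to the other peer.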
  • 15. How  to  Use  WebRTC  
  • 16. WebRTC usage in brief   Obtain Local Media   Get more media   All media added   Set Up Peer Connection   Peer Connection established   Attach Media or Data   Attach more media or data   Ready for call   Exchange Offer/Answer
  • 17. WebRTC usage in brief   Obtain Local Media   Get more media   •  getUserMedia() – Audio and/or video – Constraints – User permissions   •  Browser must ask before allowing a page to access the microphone or camera   •  MediaStream   •  MediaStreamTrack – Capabilities – States (settings)
  • 18. WebRTC usage in brief   Set Up Peer Connection   •  RTCPeerConnection – Direct media – Between two peers – ICE processing – SDP processing – DTMF support – Data channels – Identity verification – Statistics reporting
  • 19. WebRTC usage in brief   Attach Media or Data   •  addStream() – Doesn't change media state!   •  removeStream() – Ditto!   •  createDataChannel() – Depends on transport
  • 20. WebRTC usage in brief   Exchange Session Descriptions   •  createOffer(), createAnswer()   •  setLocalDescription(), setRemoteDescription()   •  Applying the SDP answer makes the magic happen
  • 21. WebRTC usage – a bit more detail   Set Up Signaling Channel   Obtain Local Media   Get more media   Set Up Peer Connection   Attach Media or Data   Exchange Session Descriptions   Attach more media or data
  • 22. SDP offer/answer   •  Session Descriptions – Session Description Protocol, created for use by SIP in setting up voice (and video) calls – Describes real-time media at a low level of detail   •  Which IP addresses and ports to use   •  Which codecs to use   •  Offer/answer model (JSEP) – One side sends an SDP offer listing what it wants to send and what it can receive – The other side replies with an SDP answer listing what it will receive and send
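To make that "low level of detail" concrete, here is a small helper that pulls the media types and `a=rtpmap` codec names out of an SDP blob. The sample SDP is a trimmed illustration, not a complete WebRTC offer.

```javascript
// Extract { mediaType: [codec names] } from an SDP string.
function listCodecs(sdp) {
  const result = {};
  let current = null;
  for (const line of sdp.split(/\r?\n/)) {
    if (line.startsWith("m=")) {
      current = line.split(" ")[0].slice(2); // "audio" or "video"
      result[current] = [];
    } else if (current && line.startsWith("a=rtpmap:")) {
      // "a=rtpmap:111 opus/48000/2" -> "opus"
      result[current].push(line.split(" ")[1].split("/")[0]);
    }
  }
  return result;
}

// Trimmed sample offer: one audio m-line offering Opus and G.711 (PCMU).
const sampleSdp = [
  "v=0",
  "m=audio 49170 RTP/SAVPF 111 0",
  "a=rtpmap:111 opus/48000/2",
  "a=rtpmap:0 PCMU/8000"
].join("\r\n");

console.log(listCodecs(sampleSdp)); // { audio: ["opus", "PCMU"] }
```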
  • 24. Media  Flows  in  WebRTC   Web  Server       Internet   Home  WiFi   Router   Router           Browser  M     Browser  D   Browser  T     Coffee  Shop   WiFi  Router       Browser  L     AdhearsionConf  2013   24  
  • 25. Media  without  WebRTC   Web  Server       Internet   Home  WiFi   Router   Router           Browser  M     Browser  D   Browser  T     Coffee  Shop   WiFi  Router       Browser  L     AdhearsionConf  2013   25  
  • 26. Peer-­‐to-­‐Peer  Media  with  WebRTC   Web  Server       Internet   Home  WiFi   Router   Router           Browser  M     Browser  D   Browser  T     Coffee  Shop   WiFi  Router       Browser  L     AdhearsionConf  2013   26  
  • 27. NAT  Complicates  Peer-­‐to-­‐Peer  Media   Web  Server   Most  browsers  are  behind  NATs   on  the  Internet,  which   complicates  the  establishment   of  peer-­‐to-­‐peer  media  sessions.         Internet   Router  with   NAT   Home  WiFi     with  NAT           Browser  M     Browser  D   Browser  T     Coffee  Shop   WiFi  with   NAT       Browser  L     AdhearsionConf  2013   27  
  • 28. What is a NAT?   •  Network Address Translator (NAT)   •  Used to map an inside address (usually a private IP address) to an outside address (usually a public IP address) at Layer 3   •  Network Address and Port Translation (NAPT) also changes the transport port number (Layer 4) – These are often just called NATs as well   •  One reason for NAT is the IP address shortage
  • 29. NAT  Example   Internet   “Outside”      Public  IP  Address       203.0.113.4 “Inside”      Private  IP  Addresses       192.168.x.x Home  WiFi     with  NAT     Browser  M   192.168.0.5     Browser  T   192.168.0.6       AdhearsionConf  2013   29  
  • 30. NATs and Applications   •  NATs are compatible with client/server protocols such as web, email, etc.   •  However, NATs generally block peer-to-peer communication   •  Typical NAT traversal for VoIP and video services today uses a media relay whenever the client is behind a NAT – Often done with an SBC – Session Border Controller – This is a major expense and complication in existing VoIP and video systems   •  WebRTC has a built-in NAT traversal strategy: Interactive Connectivity Establishment (ICE)
  • 31. Peer-to-Peer Media Through NAT   Web Server   ICE connectivity checks can often establish a direct peer-to-peer session between browsers behind different NATs   Internet   Router with NAT   Home WiFi with NAT   Browser M   Browser D   Browser T   Coffee Shop WiFi with NAT   Browser L
  • 32. ICE Connectivity Checks   •  Connectivity through NAT can be achieved using ICE connectivity checks   •  Browsers exchange a list of candidates – Local: read from network interfaces – Reflexive: obtained using a STUN Server – Relayed: obtained from a TURN Server (media relay)   •  Browsers attempt to send STUN packets to the candidate list received from the other browser   •  Checks performed by both sides at the same time   •  If one STUN packet gets through, a response is sent and this connection is used for communication – TURN relay will be the last resort (lowest priority)
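The candidate list the browsers exchange is just text: each `candidate:` line carries a type (`host`, `srflx`, `relay`) and a numeric priority, with relayed candidates given low priority so they are tried last. A small parser, using hypothetical candidate lines as input (the foundation, address, and priority values below are made up for illustration):

```javascript
// Parse the priority and type out of an ICE candidate attribute line.
// Format: candidate:<foundation> <component> <transport> <priority> <ip> <port> typ <type> ...
function parseCandidate(line) {
  const parts = line.trim().split(/\s+/);
  return {
    priority: parseInt(parts[3], 10),
    type: parts[parts.indexOf("typ") + 1]
  };
}

// Hypothetical candidates: a local host candidate and a TURN relay candidate.
const host = parseCandidate(
  "candidate:1 1 udp 2130706431 192.168.0.5 54321 typ host");
const relay = parseCandidate(
  "candidate:2 1 udp 16777215 198.51.100.2 3478 typ relay");

// Higher priority is preferred, so the relay is the last resort.
console.log(host.priority > relay.priority); // true
console.log(relay.type); // "relay"
```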
  • 33. P2P Media Can Stay Local to NAT   If both browsers are behind the same NAT, connectivity checks can often establish a connection that never leaves the NAT.   Web Server   Internet   Router with NAT   Home WiFi with NAT   Browser M   Browser D   Browser T   Coffee Shop WiFi with NAT   Browser L
  • 34. ICE Servers   Web Server   STUN Server 198.51.100.9   TURN Server 198.51.100.2   ICE uses STUN and TURN servers in the public Internet to help with NAT traversal.   Internet   Home WiFi with NAT 203.0.113.4   Router with NAT   Browser M 192.168.0.5   Browser D   Browser T   Coffee Shop WiFi with NAT   Browser L
  • 35. Browser Queries STUN Server   Web Server   STUN Server 198.51.100.9   TURN Server 198.51.100.2   Browser sends a STUN test packet to the STUN server to learn its public IP address (the address of the NAT).   Internet   Home WiFi with NAT 203.0.113.4   Router with NAT   Browser M 192.168.0.5   Browser D   Browser T   Coffee Shop WiFi with NAT   Browser L
  • 36. TURN Server Can Relay Media   Web Server   STUN Server   TURN Server as a Media Relay   In some cases, connectivity checks fail, and a TURN Media Relay on the public Internet must be used.   Internet   Router with NAT   Home WiFi with NAT   Browser M   Browser D   Browser T   Coffee Shop WiFi with NAT   Browser L
  • 37. WebRTC  Protocols  and  IETF   Standards  
  • 38. WebRTC: A Joint Standards Effort   •  Internet Engineering Task Force (IETF) and World Wide Web Consortium (W3C) are working together on WebRTC   •  IETF – Protocols – "bits on wire" – Main protocols are already RFCs, but many extensions in progress – RTCWEB (Real-Time Communications on the Web) Working Group is the main focus, but other WGs involved as well – http://www.ietf.org   •  W3C – APIs – used by JavaScript code in HTML5 – http://www.w3.org
  • 39. WebRTC Protocols   Application Layer: HTTP, WebSocket, ICE, STUN, TURN, SDP, SRTP   Transport Layer: TLS, DTLS, TCP, UDP, SCTP   Network Layer: IP   SIP is not shown as it is optional
  • 40. IETF RTCWEB Documents (Document – Title – Reference)   Overview – "Overview: Real Time Protocols for Browser-based Applications" – draft-ietf-rtcweb-overview   Use Cases and Requirements – "Web Real-Time Communication Use-cases and Requirements" – draft-ietf-rtcweb-use-cases-and-requirements   RTP Usage – "Web Real-Time Communication (WebRTC): Media Transport and Use of RTP" – draft-ietf-rtcweb-rtp-usage   Security Architecture – "RTCWEB Security Architecture" – draft-ietf-rtcweb-security-arch   Threat Model – "Security Considerations for RTC-Web" – draft-ietf-rtcweb-security   Data Channel – "RTCWeb Data Channels" – draft-ietf-rtcweb-data-channel   JSEP – "JavaScript Session Establishment Protocol" – draft-ietf-rtcweb-jsep   Audio – "WebRTC Audio Codec and Processing Requirements" – draft-ietf-rtcweb-audio   Quality of Service – "DSCP and other packet markings for RTCWeb QoS" – draft-ietf-rtcweb-qos
  • 41. Codecs   RFC 6716   •  Mandatory to Implement (MTI) audio codecs are settled on Opus and G.711 (finally!)   •  Video is not yet decided!
  • 42. WebRTC  W3C  API  Overview  
  • 43. Two primary API sections   •  Handling local media – Media Capture and Streams (getUserMedia) specification   •  Transmitting media – WebRTC (Peer Connection) specification
  • 44. Local Media Handling   Sources: Microphone Audio, Application Sharing Video, Front Camera Video, Rear Camera Video   Captured MediaStreams: Audio, Presentation Video, Presenter Video, Demonstration Video   Created MediaStreams: Presentation Stream ("Audio" Track, "Presentation" Track), Presenter Stream ("Audio" Track, "Presenter" Track), Demonstration Stream ("Audio" Track, "Demonstration" Track)   Browser M   •  In this example – Captured 4 local media streams – Created 3 media streams from them – Sent streams over Peer Connection
  • 45. Local Media Handling   (same sources/streams diagram)   •  Sources – Encoded together – Can't be manipulated individually
  • 46. Local Media Handling   (same sources/streams diagram)   •  Tracks (MediaStreamTrack) – Tied to a source – Exist primarily as part of Streams; single media type – Globally unique ids; optionally browser-labeled
  • 47. Local Media Handling   (same sources/streams diagram)   •  Captured MediaStream – Returned from getUserMedia() – Permission check required to obtain
  • 48. Local Media Handling   (same sources/streams diagram)   •  MediaStream – All contained tracks are synchronized – Can be created, transmitted, etc.
  • 49. Local Media Handling   •  Settings – Current values of source properties (height, width, etc.) – Exposed on MediaStreamTrack   •  Capabilities – Allowed values for source properties – Exposed on MediaStreamTrack   •  Constraints – Requested ranges for track properties – Used in getUserMedia(), applyConstraints()
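A sketch of how constraints are used in code. The `mandatory`/`minWidth` constraint shape below follows the draft-era API these slides describe; exact field names varied between spec revisions, so treat them as assumptions rather than the final standard.

```javascript
// Build a video constraint object (field names follow the era's draft API).
function videoConstraint(width, height) {
  return { video: { mandatory: { minWidth: width, minHeight: height } } };
}

// Browser-only (not executed here): request media matching the constraint,
// then hand the resulting stream to the application.
function captureVideo(onStream, onError) {
  navigator.getUserMedia(
    videoConstraint(640, 480),
    function (stream) { onStream(stream); },
    onError);
}

console.log(videoConstraint(640, 480).video.mandatory.minWidth); // 640
```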
  • 50. Transmitting media   •  Signaling channel – Non-standard – Must exist to set up the Peer Connection   •  Peer Connection – Links together two peers – Add/Remove Media Streams   •  addStream(), removeStream() – Handlers for ICE or media change – Data Channel support
  • 51. Peer Connection   •  "Links" together two peers – Via new RTCPeerConnection() – Generates Session Description offers/answers   •  createOffer(), createAnswer() – From SDP answers, initiates media   •  setLocalDescription(), setRemoteDescription() – Offers/answers MUST be relayed by application code! – ICE candidates can also be relayed and added by the app   •  addIceCandidate()
  • 52. Peer Connection   •  Handlers for signaling, ICE, or media change – onsignalingstatechange – onicecandidate, oniceconnectionstatechange – onaddstream, onremovestream – onnegotiationneeded – A few others
  • 53. Peer Connection   •  "Extra" APIs – Data – DTMF – Statistics – Identity   •  Grouped separately in the WebRTC spec – but really part of the RTCPeerConnection definition – all are mandatory to implement
  • 54. Data Channel API   •  RTCDataChannel: createDataChannel()   •  Configurable with – ordered – maxRetransmits, maxRetransmitTime – negotiated – id   •  Provides RTCDataChannel with – send() – onopen, onerror, onclose, onmessage
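The options above map onto createDataChannel()'s second argument. The setup function below is browser-only and not executed here; the channel label "chat" and the JSON framing helpers are an application-level convention, not part of the API.

```javascript
// App-level framing: wrap typed messages as JSON for the data channel.
function frame(kind, body) {
  return JSON.stringify({ kind: kind, body: body });
}
function unframe(raw) {
  return JSON.parse(raw);
}

// Browser-only sketch: open a reliable, ordered channel on a peer connection.
function openChat(pc, onMessage) {
  const channel = pc.createDataChannel("chat", { ordered: true });
  channel.onopen = function () { channel.send(frame("hello", {})); };
  channel.onmessage = function (evt) { onMessage(unframe(evt.data)); };
  return channel;
}

console.log(unframe(frame("hello", { n: 1 })).kind); // "hello"
```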
  • 55. DTMF API   •  RTCDTMFSender: createDTMFSender() – Associates the track input parameter with this RTCPeerConnection   •  RTCDTMFSender provides – boolean canInsertDTMF() – insertDTMF() – ontonechange – (other stuff)
  • 56. Statistics API   •  getStats() – Callback returns statistics for a given track   •  Statistics available (local/remote) are: – Bytes/packets transmitted – Bytes/packets received   •  May be useful for congestion-based adjustments
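Congestion-based adjustment from getStats() boils down to differencing two byte-count samples over an interval. Real stats reports are keyed dictionaries delivered asynchronously, so the flat numbers below are a simplifying assumption for illustration.

```javascript
// Compute bitrate (bits/second) from two byte-count samples taken
// elapsedMs milliseconds apart, as reported by getStats().
function bitrate(bytesThen, bytesNow, elapsedMs) {
  if (elapsedMs <= 0) return 0; // guard against a zero-length interval
  return ((bytesNow - bytesThen) * 8) / (elapsedMs / 1000);
}

// Example: 125,000 bytes sent over one second -> 1,000,000 bits/s.
console.log(bitrate(0, 125000, 1000)); // 1000000
```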
  • 57. Identity API   •  setIdentityProvider(), getIdentityAssertion()   •  Used to verify identity via a third party, e.g., Facebook Connect   •  Both methods are optional   •  onidentity handler called after any verification attempt   •  RTCPeerConnection.peerIdentity holds any verified identity assertion
  • 59. Pseudo Code   •  Close to real code, but . . .   •  No HTML, no signaling channel, not asynchronous, and the API is still in flux   •  Don't expect this to work anywhere
  • 60. Back to first diagram   Microphone Audio   Application Sharing Video   Front Camera Video   Rear Camera Video   WebCam Video   Stereo Audio   Browser L on Laptop   Browser M on Mobile   •  Mobile browser "calls" laptop browser   •  Each sends media to the other
  • 61. Mobile  browser  code  outline   var signalingChannel = createSignalingChannel(); var pc; var configuration = {"iceServers":[{"url":"stun:198.51.100.9"}, {"url":"turn:198.51.100.2", "credential":"myPassword"}]}; var microphone, application, front, rear; var presentation, presenter, demonstration; var remote_av, stereo, mono; var display, left, right; function s(sdp) {} // stub success callback function e(error) {} // stub error callback var signalingChannel = createSignalingChannel(); getMedia(); createPC(); attachMedia(); call(); getMedia(); createPC(); attachMedia(); call(); function getMedia() { // get local audio (microphone) navigator.getUserMedia({"audio": true }, function (stream) { microphone = stream; }, e); // get local video (application sharing) ///// This is outside the scope of this specification. ///// Assume that 'application' has been set to this stream. // constraint = {"video": {"mandatory": {"videoFacingModeEnum": "front"}}}; navigator.getUserMedia(constraint, function (stream) { front = stream; }, e); constraint = {"video": {"mandatory": {"videoFacingModeEnum": "rear"}}}; navigator.getUserMedia(constraint, function (stream) { rear = stream; }, e); } function createPC() { pc = new RTCPeerConnection(configuration); pc.onicecandidate = function (evt) { signalingChannel.send( JSON.stringify({ "candidate": evt.candidate })); }; pc.onaddstream = function (evt) {handleIncomingStream(evt.stream);}; } function attachMedia() { presentation = new MediaStream( •  We  will  look  next  at  each  of  these   •  .  .  .  
except  for  creaFng  the  signaling   channel   [microphone.getAudioTracks()[0], // Audio application.getVideoTracks()[0]]); // Presentation presenter = new MediaStream( [microphone.getAudioTracks()[0], // Audio front.getVideoTracks()[0]]); // Presenter demonstration = new MediaStream( [microphone.getAudioTracks()[0], rear.getVideoTracks()[0]]); // Audio // Demonstration pc.addStream(presentation); pc.addStream(presenter); pc.addStream(demonstration); } signalingChannel.send( JSON.stringify({ "presentation": presentation.id, "presenter": presenter.id, "demonstration": demonstration.id })); function call() { pc.createOffer(gotDescription, e); function gotDescription(desc) { pc.setLocalDescription(desc, s, e); signalingChannel.send(JSON.stringify({ "sdp": desc })); } } function handleIncomingStream(st) { if (st.getVideoTracks().length == 1) { av_stream = st; show_av(av_stream); } else if (st.getAudioTracks().length == 2) { stereo = st; } else { mono = st; } } function show_av(st) { display.src = URL.createObjectURL( new MediaStream(st.getVideoTracks()[0])); left.src = URL.createObjectURL( new MediaStream(st.getAudioTracks()[0])); right.src = URL.createObjectURL( new MediaStream(st.getAudioTracks()[1])); } signalingChannel.onmessage = function (msg) { var signal = JSON.parse(msg.data); if (signal.sdp) { pc.setRemoteDescription( new RTCSessionDescription(signal.sdp), s, e); } else { pc.addIceCandidate( new RTCIceCandidate(signal.candidate)); } }; AdhearsionConf  2013   61  
  • 62. Mobile browser produces . . .   Sources: Microphone Audio, Application Sharing Video, Front Camera Video, Rear Camera Video   Captured MediaStreams: Audio, Presentation Video, Presenter Video, Demonstration Video   Created MediaStreams: Presentation Stream ("Audio" Track, "Presentation" Track), Presenter Stream ("Audio" Track, "Presenter" Track), Demonstration Stream ("Audio" Track, "Demonstration" Track)   Browser M   •  At least 3 calls to getUserMedia()   •  Three calls to new MediaStream()   •  App sends stream ids, then streams
  • 63. funcFon  getMedia()  [1]   navigator.getUserMedia({"audio": true }, function (stream) { microphone = stream; }, e); var pc; var configuration = {"iceServers":[{"url":"stun:198.51.100.9"}, {"url":"turn:198.51.100.2", "credential":"myPassword"}]}; var microphone, application, front, rear; var presentation, presenter, demonstration; var remote_av, stereo, mono; var display, left, right; function s(sdp) {} // stub success callback // get local video (application sharing) ///// This is outside the scope of this specification. ///// Assume that 'application' has been set to this stream. // function e(error) {} // stub error callback var signalingChannel = createSignalingChannel(); getMedia(); createPC(); attachMedia(); call(); function getMedia() { // get local audio (microphone) navigator.getUserMedia({"audio": true }, function (stream) { microphone = stream; }, e); // get local video (application sharing) ///// This is outside the scope of this specification. ///// Assume that 'application' has been set to this stream. // constraint = {"video": {"mandatory": {"videoFacingModeEnum": "front"}}}; navigator.getUserMedia(constraint, function (stream) { front = stream; }, e); constraint = {"video": {"mandatory": {"videoFacingModeEnum": "rear"}}}; navigator.getUserMedia(constraint, function (stream) { rear = stream; }, e); } function createPC() { pc = new RTCPeerConnection(configuration); pc.onicecandidate = function (evt) { signalingChannel.send( JSON.stringify({ "candidate": evt.candidate })); }; pc.onaddstream = function (evt) {handleIncomingStream(evt.stream);}; } . . . 
function attachMedia() { presentation = new MediaStream( •  Get  audio   •  (Get  window  video  –  out  of  scope)   [microphone.getAudioTracks()[0], // Audio application.getVideoTracks()[0]]); // Presentation presenter = new MediaStream( [microphone.getAudioTracks()[0], // Audio front.getVideoTracks()[0]]); // Presenter demonstration = new MediaStream( [microphone.getAudioTracks()[0], rear.getVideoTracks()[0]]); // Audio // Demonstration pc.addStream(presentation); pc.addStream(presenter); pc.addStream(demonstration); } signalingChannel.send( JSON.stringify({ "presentation": presentation.id, "presenter": presenter.id, "demonstration": demonstration.id })); function call() { pc.createOffer(gotDescription, e); function gotDescription(desc) { pc.setLocalDescription(desc, s, e); signalingChannel.send(JSON.stringify({ "sdp": desc })); } } function handleIncomingStream(st) { if (st.getVideoTracks().length == 1) { av_stream = st; show_av(av_stream); } else if (st.getAudioTracks().length == 2) { stereo = st; } else { mono = st; } } function show_av(st) { display.src = URL.createObjectURL( new MediaStream(st.getVideoTracks()[0])); left.src = URL.createObjectURL( new MediaStream(st.getAudioTracks()[0])); right.src = URL.createObjectURL( new MediaStream(st.getAudioTracks()[1])); } signalingChannel.onmessage = function (msg) { var signal = JSON.parse(msg.data); if (signal.sdp) { pc.setRemoteDescription( new RTCSessionDescription(signal.sdp), s, e); } else { pc.addIceCandidate( new RTCIceCandidate(signal.candidate)); } }; AdhearsionConf  2013   63  
  • 64. funcFon  getMedia()  [2]   . . . constraint = {"video": {"mandatory": {"facingMode": "environment"}}}; navigator.getUserMedia(constraint, function (stream) { front = stream; }, e); var pc; var configuration = {"iceServers":[{"url":"stun:198.51.100.9"}, {"url":"turn:198.51.100.2", "credential":"myPassword"}]}; var microphone, application, front, rear; var presentation, presenter, demonstration; var remote_av, stereo, mono; var display, left, right; function s(sdp) {} // stub success callback function e(error) {} // stub error callback var signalingChannel = createSignalingChannel(); getMedia(); createPC(); attachMedia(); call(); function getMedia() { // get local audio (microphone) navigator.getUserMedia({"audio": true }, function (stream) { microphone = stream; }, e); // get local video (application sharing) ///// This is outside the scope of this specification. ///// Assume that 'application' has been set to this stream. // constraint = {"video": {"mandatory": {"facingMode": "user"}}}; navigator.getUserMedia(constraint, function (stream) { rear = stream; }, e); •  Get  front-­‐facing  camera   •  Get  rear-­‐facing  camera   constraint = {"video": {"mandatory": {"videoFacingModeEnum": "front"}}}; navigator.getUserMedia(constraint, function (stream) { front = stream; }, e); constraint = {"video": {"mandatory": {"videoFacingModeEnum": "rear"}}}; navigator.getUserMedia(constraint, function (stream) { rear = stream; }, e); } function createPC() { pc = new RTCPeerConnection(configuration); pc.onicecandidate = function (evt) { signalingChannel.send( JSON.stringify({ "candidate": evt.candidate })); }; pc.onaddstream = function (evt) {handleIncomingStream(evt.stream);}; } function attachMedia() { presentation = new MediaStream( [microphone.getAudioTracks()[0], // Audio application.getVideoTracks()[0]]); // Presentation presenter = new MediaStream( [microphone.getAudioTracks()[0], // Audio front.getVideoTracks()[0]]); // Presenter demonstration = new MediaStream( 
[microphone.getAudioTracks()[0], rear.getVideoTracks()[0]]); // Audio // Demonstration pc.addStream(presentation); pc.addStream(presenter); pc.addStream(demonstration); } signalingChannel.send( JSON.stringify({ "presentation": presentation.id, "presenter": presenter.id, "demonstration": demonstration.id })); function call() { pc.createOffer(gotDescription, e); function gotDescription(desc) { pc.setLocalDescription(desc, s, e); signalingChannel.send(JSON.stringify({ "sdp": desc })); } } function handleIncomingStream(st) { if (st.getVideoTracks().length == 1) { av_stream = st; show_av(av_stream); } else if (st.getAudioTracks().length == 2) { stereo = st; } else { mono = st; } } function show_av(st) { display.src = URL.createObjectURL( new MediaStream(st.getVideoTracks()[0])); left.src = URL.createObjectURL( new MediaStream(st.getAudioTracks()[0])); right.src = URL.createObjectURL( new MediaStream(st.getAudioTracks()[1])); } signalingChannel.onmessage = function (msg) { var signal = JSON.parse(msg.data); if (signal.sdp) { pc.setRemoteDescription( new RTCSessionDescription(signal.sdp), s, e); } else { pc.addIceCandidate( new RTCIceCandidate(signal.candidate)); } }; AdhearsionConf  2013   64  
  • 65. Mobile  browser  code  outline   var signalingChannel = createSignalingChannel(); var pc; var configuration = {"iceServers":[{"url":"stun:198.51.100.9"}, {"url":"turn:198.51.100.2", "credential":"myPassword"}]}; var microphone, application, front, rear; var presentation, presenter, demonstration; var remote_av, stereo, mono; var display, left, right; function s(sdp) {} // stub success callback function e(error) {} // stub error callback var signalingChannel = createSignalingChannel(); getMedia(); createPC(); attachMedia(); call(); getMedia(); createPC(); attachMedia(); call(); function getMedia() { // get local audio (microphone) navigator.getUserMedia({"audio": true }, function (stream) { microphone = stream; }, e); // get local video (application sharing) ///// This is outside the scope of this specification. ///// Assume that 'application' has been set to this stream. // constraint = {"video": {"mandatory": {"videoFacingModeEnum": "front"}}}; navigator.getUserMedia(constraint, function (stream) { front = stream; }, e); constraint = {"video": {"mandatory": {"videoFacingModeEnum": "rear"}}}; navigator.getUserMedia(constraint, function (stream) { rear = stream; }, e); } function createPC() { pc = new RTCPeerConnection(configuration); pc.onicecandidate = function (evt) { signalingChannel.send( JSON.stringify({ "candidate": evt.candidate })); }; pc.onaddstream = function (evt) {handleIncomingStream(evt.stream);}; } function attachMedia() { presentation = new MediaStream( •  We  will  look  next  at  each  of  these   •  .  .  .  
except  for  creaFng  the  signaling   channel   [microphone.getAudioTracks()[0], // Audio application.getVideoTracks()[0]]); // Presentation presenter = new MediaStream( [microphone.getAudioTracks()[0], // Audio front.getVideoTracks()[0]]); // Presenter demonstration = new MediaStream( [microphone.getAudioTracks()[0], rear.getVideoTracks()[0]]); // Audio // Demonstration pc.addStream(presentation); pc.addStream(presenter); pc.addStream(demonstration); } signalingChannel.send( JSON.stringify({ "presentation": presentation.id, "presenter": presenter.id, "demonstration": demonstration.id })); function call() { pc.createOffer(gotDescription, e); function gotDescription(desc) { pc.setLocalDescription(desc, s, e); signalingChannel.send(JSON.stringify({ "sdp": desc })); } } function handleIncomingStream(st) { if (st.getVideoTracks().length == 1) { av_stream = st; show_av(av_stream); } else if (st.getAudioTracks().length == 2) { stereo = st; } else { mono = st; } } function show_av(st) { display.src = URL.createObjectURL( new MediaStream(st.getVideoTracks()[0])); left.src = URL.createObjectURL( new MediaStream(st.getAudioTracks()[0])); right.src = URL.createObjectURL( new MediaStream(st.getAudioTracks()[1])); } signalingChannel.onmessage = function (msg) { var signal = JSON.parse(msg.data); if (signal.sdp) { pc.setRemoteDescription( new RTCSessionDescription(signal.sdp), s, e); } else { pc.addIceCandidate( new RTCIceCandidate(signal.candidate)); } }; AdhearsionConf  2013   65  
  • 66. function createPC()

  function createPC() {
    pc = new RTCPeerConnection(configuration);

    pc.onicecandidate = function (evt) {
      signalingChannel.send(
        JSON.stringify({ "candidate": evt.candidate }));
    };

    pc.onaddstream =
      function (evt) { handleIncomingStream(evt.stream); };
  }

  •  Create RTCPeerConnection
  •  Set handlers

  Background: the complete mobile-browser code for the walkthrough:

  var pc;
  var configuration =
    {"iceServers":[{"url":"stun:198.51.100.9"},
                   {"url":"turn:198.51.100.2",
                    "credential":"myPassword"}]};
  var microphone, application, front, rear;
  var presentation, presenter, demonstration;
  var remote_av, stereo, mono;
  var display, left, right;

  function s(sdp) {}   // stub success callback
  function e(error) {} // stub error callback

  var signalingChannel = createSignalingChannel();

  getMedia();
  createPC();
  attachMedia();
  call();

  function getMedia() {
    // get local audio (microphone)
    navigator.getUserMedia({"audio": true},
      function (stream) { microphone = stream; }, e);

    // get local video (application sharing)
    ///// This is outside the scope of this specification.
    ///// Assume that 'application' has been set to this stream.

    constraint =
      {"video": {"mandatory": {"videoFacingModeEnum": "front"}}};
    navigator.getUserMedia(constraint,
      function (stream) { front = stream; }, e);

    constraint =
      {"video": {"mandatory": {"videoFacingModeEnum": "rear"}}};
    navigator.getUserMedia(constraint,
      function (stream) { rear = stream; }, e);
  }

  function createPC() {
    pc = new RTCPeerConnection(configuration);
    pc.onicecandidate = function (evt) {
      signalingChannel.send(
        JSON.stringify({ "candidate": evt.candidate }));
    };
    pc.onaddstream =
      function (evt) { handleIncomingStream(evt.stream); };
  }

  function attachMedia() {
    presentation = new MediaStream(
      [microphone.getAudioTracks()[0],    // Audio
       application.getVideoTracks()[0]]); // Presentation
    presenter = new MediaStream(
      [microphone.getAudioTracks()[0],    // Audio
       front.getVideoTracks()[0]]);       // Presenter
    demonstration = new MediaStream(
      [microphone.getAudioTracks()[0],    // Audio
       rear.getVideoTracks()[0]]);        // Demonstration

    pc.addStream(presentation);
    pc.addStream(presenter);
    pc.addStream(demonstration);

    signalingChannel.send(JSON.stringify(
      {"presentation": presentation.id,
       "presenter": presenter.id,
       "demonstration": demonstration.id}));
  }

  function call() {
    pc.createOffer(gotDescription, e);
    function gotDescription(desc) {
      pc.setLocalDescription(desc, s, e);
      signalingChannel.send(JSON.stringify({ "sdp": desc }));
    }
  }

  function handleIncomingStream(st) {
    if (st.getVideoTracks().length == 1) {
      av_stream = st;
      show_av(av_stream);
    } else if (st.getAudioTracks().length == 2) {
      stereo = st;
    } else {
      mono = st;
    }
  }

  function show_av(st) {
    display.src = URL.createObjectURL(
      new MediaStream([st.getVideoTracks()[0]]));
    left.src = URL.createObjectURL(
      new MediaStream([st.getAudioTracks()[0]]));
    right.src = URL.createObjectURL(
      new MediaStream([st.getAudioTracks()[1]]));
  }

  signalingChannel.onmessage = function (msg) {
    var signal = JSON.parse(msg.data);
    if (signal.sdp) {
      pc.setRemoteDescription(
        new RTCSessionDescription(signal.sdp), s, e);
    } else {
      pc.addIceCandidate(
        new RTCIceCandidate(signal.candidate));
    }
  };

  AdhearsionConf 2013   66
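  The onicecandidate handler above just serializes each gathered candidate into a JSON signaling message. A minimal sketch of that payload, runnable outside a browser — `candidateMessage()` is a helper name of our own, not part of the slides' code, and the candidate fields are illustrative:

```javascript
// Build the signaling payload the onicecandidate handler sends.
// candidateMessage() is a hypothetical helper, not from the slides.
function candidateMessage(evt) {
  return JSON.stringify({ "candidate": evt.candidate });
}

// In the browser this would be wired up as:
//   pc.onicecandidate = function (evt) {
//     signalingChannel.send(candidateMessage(evt));
//   };

// Illustrative candidate shape (documentation IP, not a real gathering result)
var example = candidateMessage({
  candidate: {
    sdpMid: "audio",
    candidate: "candidate:1 1 UDP 2130706431 192.0.2.1 54321 typ host"
  }
});
```

  Because the whole candidate object rides inside the JSON envelope, the receiver can hand it straight to `new RTCIceCandidate(...)` after parsing.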
  • 67. Mobile browser consumes . . .

  [Diagram: the three received MediaStreams (Audio & Video, Stereo, Mono)
  and their tracks routed to sinks — the "Video" track to the display, the
  "Left" and "Right" tracks to the left and right headphones, with the
  Audio & Video stream selected in Browser M.]

  •  Receives three media streams
  •  Chooses one
  •  Sends tracks to output channels

  AdhearsionConf 2013   67
  • 68. function handleIncomingStream()

  function handleIncomingStream(st) {
    if (st.getVideoTracks().length == 1) {
      av_stream = st;
      show_av(av_stream);
    } else if (st.getAudioTracks().length == 2) {
      stereo = st;
    } else {
      mono = st;
    }
  }

  •  If the incoming stream has a video track, set it to av_stream and display it
  •  If it has two audio tracks, it must be stereo
  •  Otherwise, it must be the mono stream

  AdhearsionConf 2013   68
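  The branch in handleIncomingStream() can be exercised outside a browser with mock streams. Here `classifyStream` and `mockStream` are hypothetical helper names of our own; the mocks only mimic the two track-count getters the logic needs, since real MediaStreams exist only in a browser:

```javascript
// The same track-count test as handleIncomingStream(), as a pure function.
function classifyStream(st) {
  if (st.getVideoTracks().length == 1) {
    return "av";       // only one incoming stream carries video
  } else if (st.getAudioTracks().length == 2) {
    return "stereo";   // two audio tracks: must be the stereo stream
  } else {
    return "mono";     // whatever is left is the mono stream
  }
}

// Minimal stand-in for a MediaStream: just the two getters used above.
function mockStream(audioTracks, videoTracks) {
  return {
    getAudioTracks: function () { return new Array(audioTracks); },
    getVideoTracks: function () { return new Array(videoTracks); }
  };
}
```

  Note the logic relies entirely on the sender's stream shapes being distinct; if two streams had the same track counts, the receiver would need the announced stream ids instead (as the laptop side does later).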
  • 69. function show_av(st)

  display.srcObject = new MediaStream([st.getVideoTracks()[0]]);
  left.srcObject = new MediaStream([st.getAudioTracks()[0]]);
  right.srcObject = new MediaStream([st.getAudioTracks()[1]]);

  •  Using the new srcObject property on the media element,
  •  Set the new stream as source

  AdhearsionConf 2013   69
  • 70. Mobile browser code outline

  var signalingChannel = createSignalingChannel();

  getMedia();
  createPC();
  attachMedia();
  call();

  •  We will look next at each of these
  •  . . . except for creating the signaling channel

  AdhearsionConf 2013   70
  • 71. function attachMedia() [1]

  presentation = new MediaStream(
    [microphone.getAudioTracks()[0],    // Audio
     application.getVideoTracks()[0]]); // Presentation
  presenter = new MediaStream(
    [microphone.getAudioTracks()[0],    // Audio
     front.getVideoTracks()[0]]);       // Presenter
  demonstration = new MediaStream(
    [microphone.getAudioTracks()[0],    // Audio
     rear.getVideoTracks()[0]]);        // Demonstration
  . . .

  •  Create 3 new streams, all with the same audio but different video

  AdhearsionConf 2013   71
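  The "same audio, different video" arrangement can be sketched with mock objects — `MediaStreamMock` and the track objects below are stand-ins of our own (a real MediaStream constructor accepts a track array the same way, but only in a browser). The point is that all three streams reference the one microphone track:

```javascript
// Minimal mock: a "stream" is just its track array.
function MediaStreamMock(tracks) { this.tracks = tracks; }

// Stand-ins for the tracks getUserMedia would have produced.
var micTrack   = { kind: "audio", label: "microphone" };
var appTrack   = { kind: "video", label: "application sharing" };
var frontTrack = { kind: "video", label: "front camera" };
var rearTrack  = { kind: "video", label: "rear camera" };

// Three streams, one shared audio track, three different video tracks.
var presentation  = new MediaStreamMock([micTrack, appTrack]);
var presenter     = new MediaStreamMock([micTrack, frontTrack]);
var demonstration = new MediaStreamMock([micTrack, rearTrack]);
```

  Sharing one track across streams costs nothing extra on the wire: a track is sent once and the stream groupings are just labels over it.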
  • 72. function attachMedia() [2]

  pc.addStream(presentation);
  pc.addStream(presenter);
  pc.addStream(demonstration);

  signalingChannel.send(JSON.stringify(
    {"presentation": presentation.id,
     "presenter": presenter.id,
     "demonstration": demonstration.id}));

  •  Attach all 3 streams to the Peer Connection
  •  Send stream ids to the peer (before the streams!)

  AdhearsionConf 2013   72
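  The id announcement sent over the signaling channel is plain JSON keyed by role. A runnable sketch — `streamIdAnnouncement()` is our own helper name, and the `id` values below are placeholders for the ids a real MediaStream carries:

```javascript
// Build the stream-id announcement the mobile side sends before the
// streams arrive, so the peer can label each stream by role.
function streamIdAnnouncement(presentation, presenter, demonstration) {
  return JSON.stringify({
    "presentation": presentation.id,
    "presenter": presenter.id,
    "demonstration": demonstration.id
  });
}

// Placeholder ids; real ones come from MediaStream.id.
var announcement = streamIdAnnouncement(
  { id: "stream-1" }, { id: "stream-2" }, { id: "stream-3" });
```

  Sending the ids first matters because onaddstream can fire before any application-level description of the streams arrives; with the ids saved, the receiver can match `stream.id` on arrival.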
  • 73. Mobile browser code outline

  var signalingChannel = createSignalingChannel();

  getMedia();
  createPC();
  attachMedia();
  call();

  •  We will look next at each of these
  •  . . . except for creating the signaling channel

  AdhearsionConf 2013   73
  • 74. function call()

  pc.createOffer(gotDescription, e);

  function gotDescription(desc) {
    pc.setLocalDescription(desc, s, e);
    signalingChannel.send(JSON.stringify({ "sdp": desc }));
  }

  •  Ask browser to create SDP offer
  •  Set offer as local description
  •  Send offer to peer

  AdhearsionConf 2013   74
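  The create-offer / set-local / send chain can be run outside a browser against a fake peer connection. Everything below marked as a mock is our own: the fake `pc` object and its canned SDP string stand in for a real RTCPeerConnection, which generates the offer itself:

```javascript
// Offer flow run against a fake peer connection, so the callback chain
// from call() can execute outside a browser.
var sent = [];
var signalingChannel = { send: function (m) { sent.push(m); } };

var pc = {
  localDescription: null,
  // Mock createOffer: immediately succeeds with a canned description.
  createOffer: function (success, failure) {
    success({ type: "offer", sdp: "v=0\r\n..." });  // canned, not real SDP
  },
  // Mock setLocalDescription: stores the description, then succeeds.
  setLocalDescription: function (desc, success, failure) {
    this.localDescription = desc;
    success(desc);
  }
};

function s(sdp) {}                 // stub success callback
function e(error) { throw error; } // fail loudly in the sketch

function call() {
  pc.createOffer(gotDescription, e);
  function gotDescription(desc) {
    pc.setLocalDescription(desc, s, e);
    signalingChannel.send(JSON.stringify({ "sdp": desc }));
  }
}

call();
```

  The same callback shape carries over to the real API of this era; the only difference is that the browser fills in the SDP and may call the failure callback.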
  • 75. How do we get the SDP answer?

  signalingChannel.onmessage = function (msg) {
    var signal = JSON.parse(msg.data);
    if (signal.sdp) {
      pc.setRemoteDescription(
        new RTCSessionDescription(signal.sdp), s, e);
    } else {
      pc.addIceCandidate(
        new RTCIceCandidate(signal.candidate));
    }
  };

  •  Signaling channel provides message
  •  If SDP, set as remote description
  •  If ICE candidate, tell the browser

  AdhearsionConf 2013   75
  • 76. And now the laptop browser . . .

  •  Watch for the following
     –  We set up media *after* receiving the offer
     –  but the signaling channel still must exist first!
     –  Also, need to save incoming stream ids

  AdhearsionConf 2013   76
  • 77. Signaling channel message is trigger

  signalingChannel.onmessage = function (msg) {
    if (!pc) {
      prepareForIncomingCall();
    }
    var sgnl = JSON.parse(msg.data);
    . . .
  };

  •  Set up PC and media if not already done

  Background: the complete laptop-browser code for the walkthrough:

  var pc;
  var configuration =
    {"iceServers":[{"url":"stun:198.51.100.9"},
                   {"url":"turn:198.51.100.2",
                    "credential":"myPassword"}]};
  var webcam, left, right;
  var av, stereo, mono;
  var incoming;
  var speaker, win1, win2, win3;

  function s(sdp) {}   // stub success callback
  function e(error) {} // stub error callback

  var signalingChannel = createSignalingChannel();

  function prepareForIncomingCall() {
    createPC();
    getMedia();
    attachMedia();
  }

  function createPC() {
    pc = new RTCPeerConnection(configuration);
    pc.onicecandidate = function (evt) {
      signalingChannel.send(
        JSON.stringify({ "candidate": evt.candidate }));
    };
    pc.onaddstream =
      function (evt) { handleIncomingStream(evt.stream); };
  }

  function getMedia() {
    navigator.getUserMedia({"video": true},
      function (stream) { webcam = stream; }, e);

    constraint =
      {"audio": {"mandatory": {"audioDirectionEnum": "left"}}};
    navigator.getUserMedia(constraint,
      function (stream) { left = stream; }, e);

    constraint =
      {"audio": {"mandatory": {"audioDirectionEnum": "right"}}};
    navigator.getUserMedia(constraint,
      function (stream) { right = stream; }, e);
  }

  function attachMedia() {
    av = new MediaStream(
      [webcam.getVideoTracks()[0],  // Video
       left.getAudioTracks()[0],    // Left audio
       right.getAudioTracks()[0]]); // Right audio
    stereo = new MediaStream(
      [left.getAudioTracks()[0],    // Left audio
       right.getAudioTracks()[0]]); // Right audio
    mono = left; // Treat the left audio as the mono stream

    pc.addStream(av);
    pc.addStream(stereo);
    pc.addStream(mono);
  }

  function answer() {
    pc.createAnswer(gotDescription, e);
    function gotDescription(desc) {
      pc.setLocalDescription(desc, s, e);
      signalingChannel.send(JSON.stringify({ "sdp": desc }));
    }
  }

  function handleIncomingStream(st) {
    if (st.id === incoming.presentation) {
      speaker.src = URL.createObjectURL(
        new MediaStream([st.getAudioTracks()[0]]));
      win1.src = URL.createObjectURL(
        new MediaStream([st.getVideoTracks()[0]]));
    } else if (st.id === incoming.presenter) {
      win2.src = URL.createObjectURL(
        new MediaStream([st.getVideoTracks()[0]]));
    } else {
      win3.src = URL.createObjectURL(
        new MediaStream([st.getVideoTracks()[0]]));
    }
  }

  signalingChannel.onmessage = function (msg) {
    if (!pc) { prepareForIncomingCall(); }
    var sgnl = JSON.parse(msg.data);
    if (sgnl.sdp) {
      pc.setRemoteDescription(
        new RTCSessionDescription(sgnl.sdp), s, e);
      answer();
    } else if (sgnl.candidate) {
      pc.addIceCandidate(new RTCIceCandidate(sgnl.candidate));
    } else {
      incoming = sgnl;
    }
  };

  AdhearsionConf 2013   77
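  The `if (!pc)` guard is a lazy-setup pattern: the first signaling message triggers peer-connection and media setup exactly once. A sketch of just that guard — the counter and the stubbed `prepareForIncomingCall` body are ours, for illustration only:

```javascript
// Lazy-setup guard: setup runs on the first message and never again.
var pc = null;
var prepareCalls = 0;  // instrumentation for the sketch

function prepareForIncomingCall() {
  prepareCalls += 1;
  pc = {};  // stand-in for creating the real RTCPeerConnection
}

function onSignalingMessage(msg) {
  if (!pc) { prepareForIncomingCall(); }
  return JSON.parse(msg.data);
}

// Two messages arrive; setup must happen only on the first.
onSignalingMessage({ data: "{\"candidate\": null}" });
onSignalingMessage({ data: "{\"candidate\": null}" });
```

  This matters because the offer, the candidates, and the stream-id announcement all arrive on the same channel in an unpredictable order relative to one another, and any of them may be first.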
  • 78. Signaling channel message is trigger

  signalingChannel.onmessage = function (msg) {
    . . .
    if (sgnl.sdp) {
      pc.setRemoteDescription(
        new RTCSessionDescription(sgnl.sdp), s, e);
      answer();
    } else if (sgnl.candidate) {
      pc.addIceCandidate(new RTCIceCandidate(sgnl.candidate));
    } else {
      incoming = sgnl;
    }
  };

  •  If SDP, *also* answer
  •  But if neither SDP nor ICE candidate, it must be the set of incoming
     stream ids, so save it

  AdhearsionConf 2013   78
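  The laptop handler's three-way branch can be isolated as a pure function and tested with plain objects. `classifySignal()` is a hypothetical helper of our own; the real handler acts on each branch instead of returning a label:

```javascript
// The three-way dispatch from the laptop's onmessage handler.
function classifySignal(sgnl) {
  if (sgnl.sdp) {
    return "sdp";          // set remote description, then answer()
  } else if (sgnl.candidate) {
    return "candidate";    // hand to pc.addIceCandidate()
  } else {
    return "stream-ids";   // save as 'incoming' for later labeling
  }
}
```

  The fallback branch works only because the signaling protocol here carries exactly three message kinds; a production protocol would usually tag each message with an explicit type field instead of classifying by elimination.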
  • 79. FuncFon  prepareForIncomingCall()   createPC(); var pc; var configuration = {"iceServers":[{"url":"stun:198.51.100.9"}, {"url":"turn:198.51.100.2", "credential":"myPassword"}]}; var webcam, left, right; var av, stereo, mono; getMedia(); var incoming; var speaker, win1, win2, win3; function s(sdp) {} // stub success callback function e(error) {} // stub error callback var signalingChannel = createSignalingChannel(); function prepareForIncomingCall() { createPC(); getMedia(); } attachMedia(); function createPC() { pc = new RTCPeerConnection(configuration); attachMedia(); pc.onicecandidate = function (evt) { signalingChannel.send( JSON.stringify({ "candidate": evt.candidate })); }; pc.onaddstream = function (evt) {handleIncomingStream(evt.stream);}; } function getMedia() { navigator.getUserMedia({"video": true }, function (stream) { webcam = stream; }, e); constraint = {"audio": {"mandatory": {"audioDirectionEnum": "left"}}}; navigator.getUserMedia(constraint, function (stream) { left = stream; }, e); constraint = {"audio": {"mandatory": {"audioDirectionEnum": "right"}}}; navigator.getUserMedia(constraint, function (stream) { right = stream; }, e); } function attachMedia() { av = new MediaStream( [webcam.getVideoTracks()[0], left.getAudioTracks()[0], right.getAudioTracks()[0]]); stereo = new MediaStream( [left.getAudioTracks()[0], right.getAudioTracks()[0]]); mono = left; •  No  suprises  here   •  Media  obtained  is  a  liUle  different   •  But  aUached  the  same  way   AdhearsionConf  2013   // Video // Left audio // Right audio // Left audio // Right audio // Treat the left audio as the mono stream pc.addStream(av); pc.addStream(stereo); pc.addStream(mono); } function answer() { pc.createAnswer(gotDescription, e); function gotDescription(desc) { pc.setLocalDescription(desc, s, e); signalingChannel.send(JSON.stringify({ "sdp": desc })); } } function handleIncomingStream(st) { if (st.id === incoming.presentation) { speaker.src = URL.createObjectURL( new 
• 80. Function answer()

    function answer() {
        pc.createAnswer(gotDescription, e);
        function gotDescription(desc) {
            pc.setLocalDescription(desc, s, e);
            signalingChannel.send(JSON.stringify({ "sdp": desc }));
        }
    }

•  createAnswer() automatically uses value of remoteDescription when generating new SDP
AdhearsionConf 2013 80
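The point that createAnswer() automatically uses the stored remoteDescription implies an ordering constraint: setRemoteDescription() must have stored the peer's offer before createAnswer() runs, which is why answer() is invoked from the sdp branch of the onmessage handler. A toy model of that dependency (FakePC is a stand-in for illustration, not the real RTCPeerConnection; a real browser generates the answer SDP from the offer's media lines):

```javascript
// Stand-in peer connection that models only the offer/answer ordering rule.
function FakePC() { this.remoteDescription = null; }

FakePC.prototype.setRemoteDescription = function (desc, success, failure) {
  this.remoteDescription = desc; // store the peer's offer
  success();
};

FakePC.prototype.createAnswer = function (success, failure) {
  if (!this.remoteDescription) {
    // Real implementations reject similarly (an InvalidStateError).
    failure(new Error("no remote offer to answer"));
    return;
  }
  success({ type: "answer", sdp: "v=0 ..." }); // placeholder SDP
};

var pc = new FakePC();
var result = null;
// Calling createAnswer() before the offer is stored must fail...
pc.createAnswer(function () { result = "ok"; },
                function () { result = "error"; });
var early = result;
// ...and succeed once setRemoteDescription() has run.
pc.setRemoteDescription({ type: "offer", sdp: "v=0" },
                        function () {}, function () {});
pc.createAnswer(function (desc) { result = desc.type; },
                function () { result = "error"; });
```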
• 81. Laptop browser consumes . . .

[Diagram: Browser L receives three incoming MediaStreams: the Presentation stream ("Audio" and "Presentation" tracks, routed to the speaker and a display), the Presenter stream ("Audio" and "Presenter" tracks, routed to a display), and the Demonstration stream ("Audio" and "Demonstration" tracks, routed to a display). Tracks, MediaStreams, Sinks. All video streams selected.]

•  Three input streams
•  All have same # of audio and video tracks
•  Need stream ids to distinguish

AdhearsionConf 2013 81
• 82. Function handleIncomingStream()

    function handleIncomingStream(st) {
        if (st.id === incoming.presentation) {
            speaker.srcObject = new MediaStream([st.getAudioTracks()[0]]);
            win1.srcObject = new MediaStream([st.getVideoTracks()[0]]);
        } else if (st.id === incoming.presenter) {
            win2.srcObject = new MediaStream([st.getVideoTracks()[0]]);
        } else {
            win3.srcObject = new MediaStream([st.getVideoTracks()[0]]);
        }
    }

•  Use ids to distinguish streams
•  Extract one audio and all video tracks
•  Assign to element sources

AdhearsionConf 2013 82
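The id comparisons in handleIncomingStream() reduce to a small dispatch table: the presentation stream feeds the speaker and the main window, the presenter stream feeds a second window, and anything else is treated as the demonstration. A sketch of that dispatch as a pure function, so the routing can be exercised without browser media objects (sinkFor and the sink names are illustrative):

```javascript
// Map an incoming stream id to the sinks it should feed, mirroring the
// slide's if/else chain. "incoming" is the id map saved from signaling.
function sinkFor(streamId, incoming) {
  if (streamId === incoming.presentation) {
    return ["speaker", "win1"]; // audio to the speaker, video to main window
  }
  if (streamId === incoming.presenter) {
    return ["win2"];
  }
  return ["win3"]; // fallback: the demonstration stream
}

var ids = { presentation: "s-slides", presenter: "s-talker" };
var a = sinkFor("s-slides", ids);
var b = sinkFor("s-talker", ids);
var c = sinkFor("s-anything-else", ids);
```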
• 83. Laptop browser produces . . .

[Diagram: Browser L captures from three sources: webcam ("Video" track), left microphone ("Left" track), and right microphone ("Right" track), then composes three outgoing MediaStreams: Audio & Video ("Video", "Left", and "Right" tracks), Stereo ("Left" and "Right" tracks), and Mono ("Mono" track). Sources, Captured MediaStreams, Tracks, Created MediaStreams.]

•  Three calls to getUserMedia()
•  Three calls to new MediaStream()
•  No stream ids needed

AdhearsionConf 2013 83
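The regrouping on this slide (three captures, three constructed streams, with the left audio track shared among all three) can be sketched with plain objects standing in for MediaStreamTracks; in a browser, the makeStream() helper below would simply be new MediaStream([...]):

```javascript
// Plain objects stand in for captured MediaStreamTracks.
var videoTrack = { kind: "video", label: "webcam" };
var leftTrack  = { kind: "audio", label: "left" };
var rightTrack = { kind: "audio", label: "right" };

// Illustrative helper; in a browser this is `new MediaStream(tracks)`.
function makeStream(tracks) { return { tracks: tracks.slice() }; }

var av     = makeStream([videoTrack, leftTrack, rightTrack]); // full A/V
var stereo = makeStream([leftTrack, rightTrack]);             // both channels
var mono   = makeStream([leftTrack]); // left audio doubles as the mono stream
```

Note that the same leftTrack appears in all three streams: MediaStreams group tracks without copying them, so one capture can feed several outgoing streams.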
• 84. Function getMedia() [1]

    navigator.getUserMedia({"video": true},
        function (stream) { webcam = stream; }, e);

•  Request webcam video
AdhearsionConf 2013 84
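The 2013 callback form of getUserMedia() shown here takes a constraints object, a success callback that receives the captured MediaStream, and an error callback. A sketch of that wiring against a stand-in capture function (fakeGetUserMedia is illustrative only; the real call is asynchronous and prompts the user for permission):

```javascript
// Stand-in for navigator.getUserMedia; synchronous for illustration only.
function fakeGetUserMedia(constraints, success, failure) {
  if (constraints.video || constraints.audio) {
    success({ id: "local-stream", constraints: constraints });
  } else {
    failure(new Error("no media kind requested"));
  }
}

var webcam = null;
function e(error) {} // error-callback stub, as on the slides

// Same shape as the slide's call: constraints, success callback, error callback.
fakeGetUserMedia({ "video": true },
    function (stream) { webcam = stream; }, // save the captured stream
    e);

var failed = false;
fakeGetUserMedia({}, function () {}, function () { failed = true; });
```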