#WWDC17
© 2017 Apple Inc. All rights reserved. Redistribution or public display not permitted without written permission from Apple.
David Saracino, AirPlay Engineer
Introducing AirPlay 2
Unlocking multi-room audio
Session 509
Media
AirPlay 2
Audio
• Wireless audio
• Multi-room playback (NEW)
• Enhanced buffering (NEW)
• Multi-device control (NEW)
AirPlay 2
Supported platforms
iOS, tvOS, macOS
AirPlay 2
Supported speakers
HomePod, Apple TV*, and third-party speakers
* Requires Apple TV (4th generation)
AirPlay 2 Adoption
• Identify as long-form audio
• Add an AirPlay picker
• Integrate with MediaPlayer framework
• Adopt an AirPlay 2 playback API
Identify as Long-Form Audio NEW
Long-form Audio Routing (iOS) NEW
Music and phone call coexistence
• Long-form Audio (AirPlay 2): session arbitration, routed to the long-form audio route
• System Audio: session arbitration and mixing, routed to the system audio route
Long-form Audio Routing (iOS and tvOS)
• Long-form Audio (AirPlay 2): session arbitration for applications that use the long-form audio route
• System Audio: session arbitration and mixing for applications that use the system audio route
// Long-form Audio Routing (iOS and tvOS), code example
let mySession = AVAudioSession.sharedInstance()
do {
    try mySession.setCategory(AVAudioSessionCategoryPlayback,
                              mode: AVAudioSessionModeDefault,
                              routeSharingPolicy: .longForm)
} catch {
    // handle errors
}
Long-form Audio Routing (macOS) NEW
• Long-form Audio (AirPlay 2): arbitration for applications that use the long-form audio route
• Default Device: mixing for applications that use the system audio route
// Long-form Audio Routing (macOS), code example
let mySession = AVAudioSession.sharedInstance()
do {
    try mySession.setRouteSharingPolicy(.longForm)
} catch {
    // handle errors
}
Add an AirPlay Picker NEW
Adopt
• AVRoutePickerView
• AVRouteDetector
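A minimal sketch of adopting the two classes above inside a view controller; the frame size and the show/hide logic are illustrative assumptions, not from the session.

```swift
import AVKit

// Hypothetical snippet, e.g. inside a view controller's viewDidLoad:
// AVRoutePickerView presents the system AirPlay picker when tapped.
let routePicker = AVRoutePickerView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))
view.addSubview(routePicker)

// AVRouteDetector reports whether alternate routes are available,
// so the app can hide the picker when there is nothing to pick.
let routeDetector = AVRouteDetector()
routeDetector.isRouteDetectionEnabled = true
routePicker.isHidden = !routeDetector.multipleRoutesDetected
```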
Integrate with Media Player Framework
Handle remote media commands
• MPRemoteCommandCenter
Display current track info
• MPNowPlayingInfoCenter
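A brief sketch of both integration points named above; the handler bodies and the track metadata values are placeholders.

```swift
import MediaPlayer

// Respond to remote commands (lock screen, Control Center, AirPlay 2 speakers)
let commandCenter = MPRemoteCommandCenter.shared()
commandCenter.playCommand.addTarget { _ in
    // resume playback in the app's player
    return .success
}
commandCenter.pauseCommand.addTarget { _ in
    // pause playback
    return .success
}

// Publish current track info for system UIs to display
MPNowPlayingInfoCenter.default().nowPlayingInfo = [
    MPMediaItemPropertyTitle: "Track Title",
    MPMediaItemPropertyArtist: "Artist",
    MPMediaItemPropertyPlaybackDuration: 240.0,
    MPNowPlayingInfoPropertyElapsedPlaybackTime: 0.0
]
```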
AirPlay 2 Playback APIs
Audio Buffering
Existing AirPlay
Real-time audio stream
Speaker adds a small buffer before output
Works well for streaming to a single speaker
Enhanced Audio Buffering
AirPlay 2
Large audio buffering capacity on speakers
Faster-than-real-time streaming to speakers
Benefits
• Adds robustness
• More responsive playback
NEW
Enhanced Audio Buffering NEW
Adoption
Supported with specific playback APIs:
• AVPlayer / AVQueuePlayer
• AVSampleBufferAudioRenderer
- AVSampleBufferRenderSynchronizer
Enhanced Audio Buffering NEW
AVSampleBufferAudioRenderer / AVSampleBufferRenderSynchronizer
Your app has additional responsibilities:
• Sourcing and parsing the content
• Providing raw audio buffers for rendering
Enhanced Audio Buffering
AVSampleBufferAudioRenderer / AVSampleBufferRenderSynchronizer
• The AudioRenderer signals the client app when it wants more data
• The client app enqueues audio data to the AudioRenderer
• SetRate 1 on the Synchronizer starts playback
• The requested data amount varies by audio route
Audio Buffer Levels
AVSampleBufferAudioRenderer
• Local: seconds
• Bluetooth: seconds
• AirPlay: seconds
• AirPlay 2: minutes
Seek
AVSampleBufferAudioRenderer
Manually changing the playhead location:
• Stop the synchronizer and flush the renderer
• Resume enqueueing audio data from the new playhead position within the source
// Seek - AVSampleBufferAudioRenderer
func seek(toMediaTime mediaTime: CMTime) {
    renderSynchronizer.setRate(0.0, time: kCMTimeZero)
    audioRenderer.stopRequestingMediaData()
    audioRenderer.flush()

    myPrepareSampleGenerationForMediaTime(mediaTime)
    audioRenderer.requestMediaDataWhenReady(on: mySerializationQueue) {
        // ...
    }

    renderSynchronizer.setRate(1.0, time: mediaTime)
}
Play Queues
Timelines
• Each queue item has its own timeline (each item starts at 0)
• The renderer plays all items on a single continuous timeline
Play Queues
Editing during playback
• The renderer holds data enqueued ahead of the playhead, up to the last enqueued sample
• After an item is removed from the queue, the renderer may still contain that item's data past the playhead
Flushing from a Source Time
1. Stop enqueueing audio data
2. flush(fromSourceTime:)
3. Wait for the callback
Play Queues
Replacing incorrect renderer data
• Flush from a source time after the playhead, then enqueue the replacement item's data
Flushing from a Source Time
Gotchas
Flush may fail!
• Source time is too close to the playhead
Wait for the callback!
// FlushFromSourceTime - AVSampleBufferAudioRenderer
func performFlush(fromSourceTime sourceTime: CMTime) {
    audioRenderer.stopRequestingMediaData()
    // App-specific logic to ensure no more media data is enqueued
    audioRenderer.flush(fromSourceTime: sourceTime) { (flushSucceeded) in
        if flushSucceeded {
            self.myPrepareSampleGenerationForMediaTime(sourceTime)
            self.audioRenderer.requestMediaDataWhenReady(on: self.mySerializationQueue) { /* ... */ }
        } else {
            // Flush and interrupt playback
        }
    }
}
Audio Format Support
Supported Audio Formats
AVSampleBufferAudioRenderer
All platform-supported audio formats:
• e.g. LPCM, AAC, MP3, or ALAC
• e.g. 44.1 kHz or 48 kHz
• Various bit depths
Mixed formats may be enqueued in one renderer, e.g. AAC @ 44.1 kHz, then MP3 @ 48 kHz, then 16-bit ALAC @ 48 kHz
Video Synchronization
AVSampleBufferDisplayLayer NEW
• The client app enqueues audio data to the AudioRenderer and video data to the DisplayLayer
• Both are attached to the same Synchronizer
• SetRate 1 on the Synchronizer starts synchronized audio/video playback
Akshatha Nagesh, AudioEngine-eer
Béla Balázs, Audio Artisan
Torrey Holbrook Walker, Audio/MIDI Black Ops
What's New in Audio
Session 501
Media
• AVAudioEngine
• AVAudioSession
• watchOS
• AUAudioUnit
• Other Enhancements
• Inter-Device Audio Mode (IDAM)
AVAudioEngine
Recap
Powerful, feature-rich Objective-C / Swift API set
Simplifies working with real-time audio
Supports
• Playback, recording, processing, mixing
• 3D spatialization
AVAudioEngine in Practice WWDC14
What’s New in Core Audio WWDC15
What’s New
AVAudioEngine
• Manual rendering
• Auto shutdown
AVAudioPlayerNode
• Completion callbacks
NEW
Sample Engine Setup
Karaoke
• InputNode (with a NodeTapBlock for analysis) feeds an EffectNode (EQ) into a MixerNode
• PlayerNode (backing track) and PlayerNode (sound effects) also feed the MixerNode
• The MixerNode feeds the OutputNode
In manual rendering mode, the same graph renders without being connected to an audio device (NEW)
Manual Rendering NEW
Engine is not connected to any audio device
Renders in response to requests from the client
Modes:
• Offline
• Realtime
Offline Manual Rendering NEW
Engine and nodes operate under no deadlines or realtime constraints
A node may choose to:
• Use a more expensive signal processing algorithm
• Block on the render thread for more data if needed
- For example, a player node may wait until its worker thread reads the data from disk
Offline Manual Rendering NEW
Example: PlayerNode -> EffectNode -> OutputNode, rendering from a source file to a destination file
Offline Manual Rendering NEW
Applications:
• Post-processing of audio files, for example applying reverb or other effects
• Mixing of audio files
• Offline audio processing using CPU-intensive (higher quality) algorithms
• Tuning, debugging, or testing the engine setup
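These applications share one render loop: schedule the source, then repeatedly pull rendered audio from the engine. A minimal sketch, assuming a readable "input.caf" and abbreviating error handling (file names and the 4096 frame count are illustrative):

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)

let sourceFile = try AVAudioFile(forReading: URL(fileURLWithPath: "input.caf"))
engine.connect(player, to: engine.mainMixerNode, format: sourceFile.processingFormat)

// Switch to offline manual rendering before starting the engine
try engine.enableManualRenderingMode(.offline,
                                     format: sourceFile.processingFormat,
                                     maximumFrameCount: 4096)
try engine.start()
player.scheduleFile(sourceFile, at: nil, completionHandler: nil)
player.play()

let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                              frameCapacity: engine.manualRenderingMaximumFrameCount)!
let outputFile = try AVAudioFile(forWriting: URL(fileURLWithPath: "output.caf"),
                                 settings: sourceFile.fileFormat.settings)

// Pull audio from the engine as fast as the CPU allows and write it out
while engine.manualRenderingSampleTime < sourceFile.length {
    let framesLeft = AVAudioFrameCount(sourceFile.length - engine.manualRenderingSampleTime)
    let status = try engine.renderOffline(min(framesLeft, buffer.frameCapacity), to: buffer)
    if status == .success {
        try outputFile.write(from: buffer)
    }
}
engine.stop()
```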
Realtime Manual Rendering NEW
The engine and nodes:
• Operate under realtime constraints
• Do not make any blocking calls (e.g. blocking on a mutex or calling libdispatch) on the render thread
- A node may drop the data if it is not ready to be rendered in time
Realtime Manual Rendering NEW
Applications:
• Processing audio in an AUAudioUnit's internalRenderBlock
• Processing audio data in a movie/video during streaming/playback
Realtime Manual Rendering NEW
Example: InputNode -> EffectNode -> OutputNode, taking audio from an input movie stream and writing audio into an output movie stream (e.g. during playback to a TV)
// Realtime Manual Rendering, code example
do {
    let engine = AVAudioEngine() // by default the engine will render to/from the audio device
    // make connections, e.g. inputNode -> effectNode -> outputNode

    // switch to manual rendering mode
    engine.stop()
    try engine.enableManualRenderingMode(.realtime, format: outputPCMFormat,
                                         maximumFrameCount: frameCount) // e.g. 1024 @ 48 kHz = 21.33 ms

    let renderBlock = engine.manualRenderingBlock // cache the render block
    // set the block to provide input data to the engine
    engine.inputNode.setManualRenderingInputPCMFormat(inputPCMFormat) {
        (inputFrameCount) -> UnsafePointer<AudioBufferList>? in
        guard haveData else { return nil }
        // fill and return the input audio buffer list
        return inputBufferList
    }

    // create output buffer, cache the buffer list
    let buffer = AVAudioPCMBuffer(pcmFormat: outputPCMFormat,
                                  frameCapacity: engine.manualRenderingMaximumFrameCount)!
    buffer.frameLength = buffer.frameCapacity
    let outputBufferList = buffer.mutableAudioBufferList

    try engine.start()
} catch {
    // handle errors
}
// to render from the realtime context (C/C++)
OSStatus outputError = noErr;
const auto status = renderBlock(framesToRender, outputBufferList, &outputError);
switch (status) {
    case AVAudioEngineManualRenderingStatusSuccess:
        handleProcessedOutput(outputBufferList); // data rendered successfully
        break;
    case AVAudioEngineManualRenderingStatusInsufficientDataFromInputNode:
        handleProcessedOutput(outputBufferList); // input node did not provide data,
                                                 // but other sources may have rendered
        break;
    // ...
    default:
        break;
}
Manual Rendering NEW
Render calls
Offline
• Can use either the ObjC/Swift render method or the block-based render call
Realtime
• Must use the block-based render call
What’s New
AVAudioEngine
• Manual rendering
• Auto shutdown
AVAudioPlayerNode
• Completion callbacks
NEW
Auto Shutdown NEW
isAutoShutdownEnabled
• Hardware is stopped if running idle for a certain duration, started dynamically when needed
• Safety net for conserving power
• Enforced behavior on watchOS, optional on other platforms
What’s New
AVAudioEngine
• Manual rendering
• Auto shutdown
AVAudioPlayerNode
• Completion callbacks
NEW
Completion Callbacks NEW
Existing buffer/file completion handlers are called when the data has been consumed
New completion handler and callback types:
AVAudioPlayerNodeCompletionCallbackType
• .dataConsumed
• .dataRendered
• .dataPlayedBack
Completion Callbacks NEW
.dataPlayedBack
• Buffer/file has finished playing
• Applicable only when the engine is rendering to an audio device
• Accounts for both (small) downstream signal processing latency and (possibly significant) audio playback device latency
player.scheduleFile(file, at: nil, completionCallbackType: .dataPlayedBack) {
    (callbackType) in
    // file has finished playing from the listener's perspective
    // notify to stop the engine and update UI
}
Completion Callbacks NEW
.dataConsumed
• Data has been consumed, same as the existing completion handlers
• The buffer can be recycled; more data can be scheduled
.dataRendered
• Data has been output by the player
• Useful in manual rendering mode
• Does not account for any downstream signal processing latency
AVAudioEngine
Summary
• Manual rendering
• Auto shutdown
AVAudioPlayerNode
• Completion callbacks
Deprecation coming soon (2018)
• AUGraph
AVAudioSession
AirPlay 2 Support NEW
AirPlay 2: new technology on iOS, tvOS, and macOS
• Multi-room audio with AirPlay 2 capable devices
Long-form audio applications
• Content: music, podcasts, etc.
• Separate, shared audio route to AirPlay 2 devices
• New AVAudioSession API for an application to identify itself as long-form
Introducing AirPlay 2, Session 509, Thursday 4:10 PM
AUAudioUnit
AU MIDI Output NEW
MIDIOutputNames, MIDIOutputEventBlock
• An AU can emit MIDI output synchronized with its audio output
• The host sets a block on the AU to be called every render cycle
• The host can record/edit both the MIDI performance and the audio output from the AU
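A hedged sketch of the host side, assuming `audioUnit` is an already-instantiated AUAudioUnit; `recordMIDI` is a hypothetical app function that stores the bytes.

```swift
import AudioToolbox

// Called on the render thread each cycle the AU emits MIDI;
// keep the work here minimal and realtime-safe.
audioUnit.midiOutputEventBlock = { (eventSampleTime, cable, length, midiBytes) -> OSStatus in
    let bytes = Array(UnsafeBufferPointer(start: midiBytes, count: length))
    recordMIDI(bytes, at: eventSampleTime, cable: cable) // hypothetical recorder
    return noErr
}
```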
AU View Configuration
Host applications decide how to display UI for AUs
Current limitations:
• No standard view sizes defined
• AUs are expected to adapt to any view size chosen by the host
AU Preferred View Configuration NEW
supportedViewConfigurations(availableViewConfigurations)
• The host passes an array of all possible view configurations to the Audio Unit extension
• The extension returns an IndexSet of the supported view configurations
• The host then selects one: select(viewConfiguration)
// AU Preferred View Configuration
// Code example - host application
var smallConfigActive: Bool = false // true if the small view is the currently active one

@IBAction func toggleViewModes(_ sender: AnyObject?) {
    guard let audioUnit = self.engine.audioUnit else { return }
    let largeConfig = AUAudioUnitViewConfiguration(width: 600, height: 400,
                                                   hostHasController: false)
    let smallConfig = AUAudioUnitViewConfiguration(width: 300, height: 200,
                                                   hostHasController: true)
    let supportedIndices = audioUnit.supportedViewConfigurations([smallConfig, largeConfig])
    if supportedIndices.count == 2 {
        audioUnit.select(self.smallConfigActive ? largeConfig : smallConfig)
        self.smallConfigActive = !self.smallConfigActive
    }
}
// AU Preferred View Configuration
// Code example - AU extension
override public func supportedViewConfigurations(_ availableViewConfigurations:
    [AUAudioUnitViewConfiguration]) -> IndexSet {
    let result = NSMutableIndexSet()
    for (index, config) in availableViewConfigurations.enumerated() {
        // check if the config (width, height, hostHasController) is supported
        // a config of 0x0 (default full size) must always be supported
        if isConfigurationSupported(config) {
            result.add(index)
        }
    }
    return result as IndexSet
}
// AU Preferred View Configuration
// Code example - AU extension
override public func select(_ viewConfiguration: AUAudioUnitViewConfiguration) {
    // configuration selected by the host, used by the view controller to re-arrange its view
    self.currentViewConfiguration = viewConfiguration
    self.viewController?.selectViewConfig(self.currentViewConfiguration)
}
Other Enhancements
Audio Formats NEW
FLAC (Free Lossless Audio Codec)
• Codec, file, and streaming support
• Content distribution, streaming applications
Opus
• Codec support
• File I/O using the .caf container
• VoIP applications
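With codec and file support in place, writing FLAC can go through the usual AVAudioFile path. A hedged sketch; the file name and settings values are illustrative:

```swift
import AVFoundation

// Ask AVAudioFile for a FLAC-encoded output file
let settings: [String: Any] = [
    AVFormatIDKey: kAudioFormatFLAC,
    AVSampleRateKey: 48_000,
    AVNumberOfChannelsKey: 2
]
let flacFile = try AVAudioFile(forWriting: URL(fileURLWithPath: "out.flac"),
                               settings: settings)
// PCM buffers written via flacFile.write(from:) are encoded as FLAC
```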
Spatial Audio Formats NEW
B-Format
• Audio stream is regular PCM
• File container: .caf
• B-format channels: W, X, Y, Z
• First-order ambisonics
kAudioChannelLayoutTag_Ambisonic_B_Format
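The layout tag above is the only API named on the slide; a hedged sketch of declaring a matching PCM format (sample rate and sample format are illustrative):

```swift
import AVFoundation

// Four-channel first-order ambisonics (W, X, Y, Z), regular PCM
let layout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_Ambisonic_B_Format)!
let bFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                            sampleRate: 48_000,
                            interleaved: false,
                            channelLayout: layout)
```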
Inter-Device Audio Mode (IDAM)
Inter-Device Audio Mode (IDAM)
• Record audio digitally via a Lightning-to-USB cable
• USB 2.0 audio class-compliant implementation
• Available since OS X El Capitan and iOS 9
Inter-Device Audio + MIDI NEW
• Send and receive MIDI via the Lightning-to-USB cable
• Class-compliant USB MIDI implementation
• Requires iOS 11 and macOS El Capitan or later
• Auto-enabled in the IDAM configuration
Inter-Device Audio + MIDI NEW
• Device can charge and sync in the IDAM configuration
• Photo import and tethering are temporarily disabled
• Audio device aggregation is OK
• Use your iOS devices as MIDI controllers, destinations, or both
Summary
• AVAudioEngine: manual rendering
• AVAudioSession: AirPlay 2 support
• watchOS: recording
• AUAudioUnit: preferred view size, MIDI output
• Other enhancements: audio formats (FLAC, Opus, HOA)
• Inter-Device Audio and MIDI
KKBOX WWDC17 Airplay 2 - Dolphin

  • 1. #WWDC17 © 2017 Apple Inc. All rights reserved. Redistribution or public display not permitted without written permission from Apple. David Saracino, AirPlay Engineer • Introducing AirPlay 2 • Unlocking multi-room audio • Session 509 Media
  • 4. AirPlay 2 Audio Wireless audio Multi-room playback Enhanced buffering NEW
  • 5. AirPlay 2 Audio Wireless audio Multi-room playback Enhanced buffering Multi-device control NEW
  • 7. AirPlay 2 Supported speakers HomePod Apple TV* 3rd Party * Requires Apple TV (4th generation)
  • 9. AirPlay 2 Adoption Identify as long-form audio
  • 10. AirPlay 2 Adoption Identify as long-form audio Add an AirPlay picker
  • 11. AirPlay 2 Adoption Identify as long-form audio Add an AirPlay picker Integrate with MediaPlayer framework
  • 12. AirPlay 2 Adoption Identify as long-form audio Add an AirPlay picker Integrate with MediaPlayer framework Adopt an AirPlay 2 playback API
  • 14. Long-form Audio Routing (iOS) Music and phone call coexistence Long-form Audio (AirPlay 2) System Audio NEW
  • 15. Long-form Audio Routing (iOS) Music and phone call coexistence Long-form Audio (AirPlay 2) System Audio Long-form audio route Session Arbitration NEW
  • 16. Long-form Audio Routing (iOS) Music and phone call coexistence Long-form Audio (AirPlay 2) System Audio Long-form audio route Session Arbitration NEW
  • 17. Long-form Audio Routing (iOS) Music and phone call coexistence Long-form Audio (AirPlay 2) System Audio Long-form audio route Session Arbitration System audio route Session Arbitration and Mixing NEW
  • 18. NEW Long-form Audio Routing (iOS and tvOS) Long-form Audio (AirPlay 2) Session Arbitration Applications that use
 the long-form audio route System Audio Session Arbitration and Mixing Applications that use
 the system audio route
  • 19. //Long-form Audio Routing (iOS and tvOS), code example let mySession = AVAudioSession.sharedInstance() do { try mySession.setCategory(AVAudioSessionCategoryPlayback, mode: AVAudioSessionModeDefault, routeSharingPolicy: .longForm) } catch { // handle errors } NEW
  • 20. //Long-form Audio Routing (iOS and tvOS), code example let mySession = AVAudioSession.sharedInstance() do { try mySession.setCategory(AVAudioSessionCategoryPlayback, mode: AVAudioSessionModeDefault, routeSharingPolicy: .longForm) } catch { // handle errors } NEW
  • 21. //Long-form Audio Routing (iOS and tvOS), code example let mySession = AVAudioSession.sharedInstance() do { try mySession.setCategory(AVAudioSessionCategoryPlayback, mode: AVAudioSessionModeDefault, routeSharingPolicy: .longForm) } catch { // handle errors } NEW
  • 22. Long-form Audio Routing (macOS) Long-form Audio (AirPlay 2) Arbitration Applications that use
 the long-form audio route NEW Default Device Mixing Applications that use
 the system audio route
  • 23. //Long-form Audio Routing (macOS), code example let mySession = AVAudioSession.sharedInstance() do { try mySession.setRouteSharingPolicy(.longForm) } catch { // handle errors } NEW
  • 24. //Long-form Audio Routing (macOS), code example let mySession = AVAudioSession.sharedInstance() do { try mySession.setRouteSharingPolicy(.longForm) } catch { // handle errors } NEW
  • 25. Add an AirPlay Picker
  • 26. Add an AirPlay Picker Adopt
  • 27. Add an AirPlay Picker Adopt • AVRoutePickerView NEW
  • 28. Add an AirPlay Picker Adopt • AVRoutePickerView • AVRouteDetector NEW
  • 29. Integrate with Media Player Framework
  • 30. Handle remote media commands • MPRemoteCommandCenter Integrate with Media Player Framework
  • 31. Handle remote media commands • MPRemoteCommandCenter Display current track info • MPNowPlayingInfoCenter Integrate with Media Player Framework
  • 33. Audio Buffering Existing AirPlay Real-time audio stream Speaker adds a small buffer before output Works fine for streaming to single speaker
  • 34. Enhanced Audio Buffering AirPlay 2 Large audio buffering capacity on speakers Faster-than-real-time streaming to speakers Benefits • Adds robustness • More responsive playback NEW
  • 35. Enhanced Audio Buffering Adoption Supported with specific playback APIs
  • 36. Enhanced Audio Buffering Adoption Supported with specific playback APIs • AVPlayer / AVQueuePlayer
  • 37. Enhanced Audio Buffering Adoption Supported with specific playback APIs • AVPlayer / AVQueuePlayer • AVSampleBufferAudioRenderer - AVSampleBufferRenderSynchronizer NEW
• 38. Enhanced Audio Buffering - AVSampleBufferAudioRenderer / AVSampleBufferRenderSynchronizer NEW
Your app has additional responsibilities:
• Sourcing and parsing the content
• Providing raw audio buffers for rendering
• 41. Enhanced Audio Buffering - AVSampleBufferAudioRenderer / AVSampleBufferRenderSynchronizer
[Diagram: the AudioRenderer signals the client app when it wants more data; the client app enqueues Audio Data to the renderer, calls setRate(1) on the Synchronizer to start playback, and keeps enqueuing further Audio Data as requested]
• 46. Audio Buffer Levels - AVSampleBufferAudioRenderer
Requested data amount varies by audio route:
Local - seconds; Bluetooth - seconds; AirPlay - seconds; AirPlay 2 - minutes
• 47. Seek - AVSampleBufferAudioRenderer: manually changing the playhead location
• 52. // Seek - AVSampleBufferAudioRenderer
func seek(toMediaTime mediaTime: CMTime) {
    renderSynchronizer.setRate(0.0, time: kCMTimeZero)
    audioRenderer.stopRequestingMediaData()
    audioRenderer.flush()

    myPrepareSampleGenerationForMediaTime(mediaTime)
    audioRenderer.requestMediaDataWhenReady(on: mySerializationQueue) {
        // ...
    }

    renderSynchronizer.setRate(1.0, time: mediaTime)
}
• 58. Play Queues - Timelines
[Diagram: Item 1, Item 2, and Item 3 each have their own timeline starting at 0; as the queue is enqueued into the renderer, they map onto a single continuous timeline]
• 59. Play Queues - Editing during playback
[Diagram: the queue holds Item 1, Item 2, Item 3, and the renderer is enqueued ahead of the playhead up to the last enqueued sample. If Item 2 is removed and Item 4 is inserted while playing, the renderer still holds Item 2's data beyond the playhead]
• 62. Flushing from a Source Time
1. Stop enqueueing audio data
2. flush(fromSourceTime:)
3. Wait for the callback
• 63. Play Queues - Replacing incorrect renderer data
[Diagram: flushing from a source time discards the renderer's data after that time; the last enqueued sample falls back toward the playhead, and the corrected queue (Item 1, Item 3, Item 4) is enqueued from there]
• 65. Flushing from a Source Time - Gotchas
Flush may fail!
• Source time is too close to the playhead
Wait for the callback!
• 69. // FlushFromSourceTime - AVSampleBufferAudioRenderer
func performFlush(fromSourceTime sourceTime: CMTime) {
    audioRenderer.stopRequestingMediaData()
    // App-specific logic to ensure no more media data is enqueued

    audioRenderer.flush(fromSourceTime: sourceTime) { (flushSucceeded) in
        if flushSucceeded {
            self.myPrepareSampleGenerationForMediaTime(sourceTime)
            audioRenderer.requestMediaDataWhenReady(on: mySerializationQueue) { /* ... */ }
        } else {
            // Flush and interrupt playback
        }
    }
}
• 74. Supported Audio Formats - AVSampleBufferAudioRenderer
All platform-supported audio formats
• e.g. LPCM, AAC, MP3, or ALAC
• e.g. 44.1 kHz or 48 kHz
• various bit depths
Mixed formats may be enqueued, e.g. AAC @ 44.1 kHz, MP3 @ 48 kHz, 16-bit ALAC @ 48 kHz
• 80. Video Synchronization - AVSampleBufferDisplayLayer NEW
[Diagram: the client app enqueues Audio Data to the AudioRenderer and Video Data to the DisplayLayer; both are attached to the Synchronizer, and setRate(1) starts synchronized playback]
  • 82. #WWDC17 © 2017 Apple Inc. All rights reserved. Redistribution or public display not permitted without written permission from Apple. Akshatha Nagesh, AudioEngine-eer Béla Balázs, Audio Artisan Torrey Holbrook Walker, Audio/MIDI Black Ops • What’s New in Audio • Session 501 Media
• 85. AVAudioEngine - Recap
Powerful, feature-rich Objective-C / Swift API set
Simplifies realtime audio, easier to use
Supports:
• Playback, recording, processing, mixing
• 3D spatialization
See: AVAudioEngine in Practice (WWDC14), What's New in Core Audio (WWDC15)
  • 86. What’s New AVAudioEngine • Manual rendering • Auto shutdown AVAudioPlayerNode • Completion callbacks NEW
• 89. Sample Engine Setup - Manual rendering NEW
[Diagram: the Application drives an engine graph of InputNode, EffectNode, and two PlayerNodes feeding a MixerNode into the OutputNode, with a NodeTapBlock (Analyze) tapping the signal]
• 90. Manual Rendering NEW
Engine is not connected to any audio device
Renders in response to requests from the client
Modes:
• Offline
• Realtime
• 91. Offline Manual Rendering NEW
Engine and nodes operate under no deadlines or realtime constraints
A node may choose to:
• Use a more expensive signal processing algorithm
• Block on the render thread for more data if needed - for example, a player node may wait until its worker thread reads the data from disk
• 92. Offline Manual Rendering - Example NEW
[Diagram: in an offline context, the application reads a source file through PlayerNode -> EffectNode -> OutputNode and writes a destination file]
• 93. Offline Manual Rendering - Applications NEW
Post-processing of audio files, e.g. applying reverb or other effects
Mixing of audio files
Offline audio processing using CPU-intensive (higher quality) algorithms
Tuning, debugging, or testing the engine setup
• 94. Realtime Manual Rendering NEW
The engine and nodes:
• Operate under realtime constraints
• Do not make any blocking calls (blocking on a mutex, calling libdispatch, etc.) on the render thread
- A node may drop the data if it is not ready to be rendered in time
  • 95. Processing audio in an AUAudioUnit’s internalRenderBlock Processing audio data in a movie/video during streaming/playback Realtime Manual Rendering Applications NEW
• 96. Realtime Manual Rendering - Example NEW
[Diagram: in a realtime context, the application pulls audio from an input movie stream through InputNode -> EffectNode -> OutputNode and into an output movie stream, rendered to a TV]
• 97. // Realtime Manual Rendering, code example NEW
do {
    let engine = AVAudioEngine()
    // by default engine will render to/from the audio device
    // make connections, e.g. inputNode -> effectNode -> outputNode

    // switch to manual rendering mode
    engine.stop()
    try engine.enableManualRenderingMode(.realtime, format: outputPCMFormat,
                                         maximumFrameCount: frameCount) // e.g. 1024 @ 48 kHz = 21.33 ms

    let renderBlock = engine.manualRenderingBlock // cache the render block
• 101. // set the block to provide input data to engine
    engine.inputNode.setManualRenderingInputPCMFormat(inputPCMFormat) {
        (inputFrameCount) -> UnsafePointer<AudioBufferList>? in
        guard haveData else { return nil }
        // fill and return the input audio buffer list
        return inputBufferList
    }

    // create output buffer, cache the buffer list
    let buffer = AVAudioPCMBuffer(pcmFormat: outputPCMFormat,
                                  frameCapacity: engine.manualRenderingMaximumFrameCount)!
    buffer.frameLength = buffer.frameCapacity
    let outputBufferList = buffer.mutableAudioBufferList

    try engine.start()
} catch {
    // handle errors
}
NEW
• 105. // to render from realtime context
OSStatus outputError = noErr;
const auto status = renderBlock(framesToRender, outputBufferList, &outputError);
switch (status) {
    case AVAudioEngineManualRenderingStatusSuccess:
        handleProcessedOutput(outputBufferList); // data rendered successfully
        break;
    case AVAudioEngineManualRenderingStatusInsufficientDataFromInputNode:
        handleProcessedOutput(outputBufferList); // input node did not provide data,
                                                 // but other sources may have rendered
        break;
    ...
    default:
        break;
}
NEW
• 107. Manual Rendering - Render calls NEW
Offline
• Can use either the ObjC/Swift render method or the block-based render call
Realtime
• Must use the block-based render call
• 109. Auto Shutdown - isAutoShutdownEnabled NEW
Hardware is stopped if running idle for a certain duration, started dynamically when needed
Safety net for conserving power
Enforced behavior on watchOS, optional on other platforms
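A minimal sketch of opting in on the platforms where auto shutdown is not enforced, using the isAutoShutdownEnabled property named above:

```swift
import AVFoundation

// Opt in to auto shutdown (always on for watchOS): the engine
// stops the audio hardware when idle and restarts it on demand.
let engine = AVAudioEngine()
engine.isAutoShutdownEnabled = true
```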
• 111. Completion Callbacks NEW
Existing buffer/file completion handlers are called when the data has been consumed
New completion handler and callback types: AVAudioPlayerNodeCompletionCallbackType
• .dataConsumed
• .dataRendered
• .dataPlayedBack
• 113. Completion Callbacks - .dataPlayedBack NEW
• Buffer/file has finished playing
• Applicable only when the engine is rendering to an audio device
• Accounts for both (small) downstream signal processing latency and (possibly significant) audio playback device latency

player.scheduleFile(file, at: nil, completionCallbackType: .dataPlayedBack) { (callbackType) in
    // file has finished playing from the listener's perspective
    // notify to stop the engine and update UI
}
• 114. Completion Callbacks NEW
.dataConsumed
• Data has been consumed, same as the existing completion handlers
• The buffer can be recycled, more data can be scheduled
.dataRendered
• Data has been output by the player
• Useful in manual rendering mode
• Does not account for any downstream signal processing latency
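A hedged sketch of scheduling with .dataConsumed so a buffer can be recycled as soon as the player is done with it; `player`, `buffer`, and `myBufferPool` are illustrative assumptions, while the scheduleBuffer(_:at:options:completionCallbackType:) signature is the AVAudioPlayerNode API described above:

```swift
import AVFoundation

// Schedule a buffer and recycle it once its data is consumed.
player.scheduleBuffer(buffer, at: nil, options: [],
                      completionCallbackType: .dataConsumed) { _ in
    // safe to refill/reuse `buffer` or schedule more data here
    myBufferPool.recycle(buffer)
}
```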
  • 115. AVAudioEngine Summary AVAudioEngine • Manual rendering • Auto shutdown AVAudioPlayerNode • Completion callbacks Deprecation coming soon (2018) • AUGraph NEW
• 117. AirPlay 2 Support NEW
AirPlay 2 - new technology on iOS, tvOS, and macOS
• Multi-room audio with AirPlay 2-capable devices
Long-form audio applications
• Content - music, podcasts, etc.
• Separate, shared audio route to AirPlay 2 devices
• New AVAudioSession API for an application to identify itself as long-form
Introducing AirPlay 2 - Session 509, Thursday 4:10 PM
• 119. AU MIDI Output NEW
AU can emit MIDI output synchronized with its audio output - MIDIOutputNames
Host sets a block on the AU to be called every render cycle - MIDIOutputEventBlock
Host can record/edit both the MIDI performance and the audio output from the AU
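A hedged sketch of a host installing the MIDI output block on an AUAudioUnit; `unit` and `myRecorder` are illustrative assumptions, while midiOutputEventBlock and its (sampleTime, cable, length, bytes) parameters are the API named above:

```swift
import AVFoundation

// Capture MIDI the AU emits each render cycle, timestamped so
// it can be recorded in sync with the rendered audio.
unit.midiOutputEventBlock = { (sampleTime, cable, length, bytes) -> OSStatus in
    myRecorder.append(midiBytes: bytes, count: length, at: sampleTime)
    return noErr
}
```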
  • 120. AU View Configuration Host applications decide how to display UI for AUs Current limitations • No standard view sizes defined • AUs are supposed to adapt to any view size chosen by host
• 121. AU Preferred View Configuration NEW
Handshake between host and Audio Unit extension:
1. Host passes an array of all possible view configurations - supportedViewConfigurations(availableViewConfigurations)
2. AU returns an IndexSet of the supported view configurations
3. Host applies the chosen view configuration - select(viewConfiguration)
• 124. // AU Preferred View Configuration - code example, host application NEW
var smallConfigActive: Bool = false // true if the small view is the currently active one

@IBAction func toggleViewModes(_ sender: AnyObject?) {
    guard let audioUnit = self.engine.audioUnit else { return }

    let largeConfig = AUAudioUnitViewConfiguration(width: 600, height: 400,
                                                   hostHasController: false)
    let smallConfig = AUAudioUnitViewConfiguration(width: 300, height: 200,
                                                   hostHasController: true)

    let supportedIndices = audioUnit.supportedViewConfigurations([smallConfig, largeConfig])
    if supportedIndices.count == 2 {
        audioUnit.select(self.smallConfigActive ? largeConfig : smallConfig)
        self.smallConfigActive = !self.smallConfigActive
    }
}
• 128. // AU Preferred View Configuration - code example, AU extension NEW
override public func supportedViewConfigurations(
    _ availableViewConfigurations: [AUAudioUnitViewConfiguration]) -> IndexSet {
    let result = NSMutableIndexSet()
    for (index, config) in availableViewConfigurations.enumerated() {
        // check if the config (width, height, hostHasController) is supported
        // a config of 0x0 (default full size) must always be supported
        if isConfigurationSupported(config) {
            result.add(index)
        }
    }
    return result as IndexSet
}
• 133. // AU Preferred View Configuration - code example, AU extension NEW
override public func select(_ viewConfiguration: AUAudioUnitViewConfiguration) {
    // configuration selected by host, used by view controller to re-arrange its view
    self.currentViewConfiguration = viewConfiguration
    self.viewController?.selectViewConfig(self.currentViewConfiguration)
}
• 137. Audio Formats NEW
FLAC (Free Lossless Audio Codec)
• Codec, file, and streaming support
• Content distribution, streaming applications
Opus
• Codec support
• File I/O using the .caf container
• VoIP applications
• 138. Spatial Audio Formats - B-Format NEW
Audio stream is regular PCM; file container is .caf
B-format channels: W, X, Y, Z
• 1st-order ambisonics
kAudioChannelLayoutTag_Ambisonic_B_Format
  • 140. Torrey Holbrook Walker, Audio/MIDI Black Ops • Inter-Device Audio Mode (IDAM)
• 141. Inter-Device Audio Mode
Record audio digitally via a Lightning-to-USB cable
USB 2.0 audio class-compliant implementation
Available since El Capitan and iOS 9
• 144. Inter-Device Audio + MIDI NEW
Send and receive MIDI via a Lightning-to-USB cable
Class-compliant USB MIDI implementation
Requires iOS 11 and macOS El Capitan or later
Auto-enabled in IDAM configuration
• 145. Inter-Device Audio + MIDI NEW
Device can charge and sync in IDAM configuration
Photo import and tethering are temporarily disabled
Audio device aggregation is OK
Use your iOS devices as MIDI controllers, destinations, or both
• 146. Summary
AVAudioEngine - manual rendering
AVAudioSession - AirPlay 2 support
watchOS - recording
AUAudioUnit - preferred view size, MIDI output
Other enhancements - audio formats (FLAC, Opus, HOA)
Inter-Device Audio and MIDI