2. What is the connection?
Get Lucky (2013)
I Feel Love (1977)
3. Highest level is best
• most of the time, if you need audio services, you will be using AVFoundation/AVFoundation.h or AudioToolbox/AudioServices.h
• an example using AudioToolbox - loading a sound from an mp3 file:

@property (assign, nonatomic) SystemSoundID noteE1;

NSURL *url = [[NSBundle mainBundle] URLForResource:@"guitarE1" withExtension:@"mp3"];
AudioServicesCreateSystemSoundID((__bridge CFURLRef)url, &_noteE1);

…then play the sound:

AudioServicesPlaySystemSound(self.noteE1);
• demo - quick look at the TrackTouch source and a simulator demo
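For comparison, the same sound could be played one level up with AVFoundation's AVAudioPlayer - a minimal sketch, assuming the same bundled guitarE1.mp3 and a view controller to hang the property on (the class name is a placeholder):

```objc
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()
// keep a strong reference - a local AVAudioPlayer would be
// deallocated before the sound finished playing
@property (strong, nonatomic) AVAudioPlayer *player;
@end

@implementation ViewController

- (void)playNote
{
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"guitarE1"
                                         withExtension:@"mp3"];
    NSError *error = nil;
    self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:url
                                                         error:&error];
    if (error) {
        NSLog(@"Failed to load sound: %@", error);
        return;
    }
    [self.player play];
}

@end
```

Unlike System Sound Services, AVAudioPlayer also gives you volume, looping (numberOfLoops) and playback position for free.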
4. ‘High level’ - another example:
• SpriteKit allows you to play music (or sound effects) as an action
• an example of a ‘play sound’ action:
SKAction *playMusic = [SKAction playSoundFileNamed:@"flight.caf" waitForCompletion:NO];
• demo - quick look at the Seed2 source for MyScene and a simulator demo
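A sketch of how that action might be run in a scene - assuming flight.caf is in the app bundle, with MyScene standing in for the Seed2 scene class:

```objc
#import <SpriteKit/SpriteKit.h>

@implementation MyScene

- (void)didMoveToView:(SKView *)view
{
    // one-shot sound effect: waitForCompletion:NO means the action
    // completes immediately, so it won't block a sequence
    SKAction *playMusic = [SKAction playSoundFileNamed:@"flight.caf"
                                     waitForCompletion:NO];
    [self runAction:playMusic];
}

@end
```

For looping background music you could instead wrap the action (with waitForCompletion:YES) in [SKAction repeatActionForever:], so each repeat waits for the file to finish.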
6. Core Audio - some things you need to dive deeper for
• Say… you wanted to build an application like Final Cut Pro, Soundtrack Pro, GarageBand, your own synth app for iOS or indeed the ‘Pano Composer’ :-)
• Core Audio is the engine underlying all audio services on iOS and OS X. Programming it directly has a cost - it’s complicated, and quite a lot of the ‘documentation’ is reading the header files and trying things out
• First, make sure that you really need to be at this level. For example, you could build a simple DJ mixer in AVFoundation
• However, if you wanted to build a real, full-featured DJ mixer with complex fades, effects and playing things backwards(!), you’d be at the Core Audio level
7. Core Audio is built on AudioUnit components
• Audio Units are arranged in the form of a graph (AUGraph), in a pipeline similar to the way Core Image works with filters
• You end up with a single output from an AUGraph that results in the final output sound stream
• Audio Units are based on a plugin architecture, with Apple supplying an extensive set of Audio Units that cover just about anything you could need. The standard Core Audio bundle is in ‘/System/Library/Components’ and there is a search path for components, so you can include Audio Component plugins
• You could build your own Audio Units and package them as plugins - for an overview see WWDC 2011 Video #411
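The graph idea above can be sketched with the C-level AUGraph API - a minimal, error-handling-free example that wires a multichannel mixer into the iOS RemoteIO output unit (in a real app you would check every OSStatus return):

```objc
#import <AudioToolbox/AudioToolbox.h>

static AUGraph MakeMixerGraph(void)
{
    AUGraph graph;
    NewAUGraph(&graph);

    // describe the nodes we want - on iOS the output unit is RemoteIO
    AudioComponentDescription outputDesc = {0};
    outputDesc.componentType         = kAudioUnitType_Output;
    outputDesc.componentSubType      = kAudioUnitSubType_RemoteIO;
    outputDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AudioComponentDescription mixerDesc = {0};
    mixerDesc.componentType         = kAudioUnitType_Mixer;
    mixerDesc.componentSubType      = kAudioUnitSubType_MultiChannelMixer;
    mixerDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AUNode outputNode, mixerNode;
    AUGraphAddNode(graph, &outputDesc, &outputNode);
    AUGraphAddNode(graph, &mixerDesc,  &mixerNode);

    // the pipeline: mixer output bus 0 -> output unit input bus 0
    AUGraphConnectNodeInput(graph, mixerNode, 0, outputNode, 0);

    // open (loads the components), initialize, and start rendering
    AUGraphOpen(graph);
    AUGraphInitialize(graph);
    AUGraphStart(graph);
    return graph;
}
```

Adding more sources is just more nodes connected to further mixer input buses - the single stream into the output node is the ‘final output sound stream’ mentioned above.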
9. Core Audio - a real example with some code
An iOS example to:
- load multiple instruments from a sound font file
- perform real-time pitch adjustment on a single instrument
- mix the output (play multiple instruments simultaneously)
We will call it the Synth Mixer Demo…
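The ‘load an instrument’ part of that demo is done with the AUSampler unit, following Technical Note TN2283 - a sketch, where "Guitars.sf2" and the preset number are placeholder values:

```objc
#import <AudioToolbox/AudioToolbox.h>

// Load one preset from a bundled sound font into an AUSampler unit.
// samplerUnit is assumed to be an already-initialized kAudioUnitSubType_Sampler
// unit, e.g. a node in the AUGraph feeding the mixer.
static OSStatus LoadSF2Preset(AudioUnit samplerUnit, UInt8 presetID)
{
    NSURL *sf2URL = [[NSBundle mainBundle] URLForResource:@"Guitars"
                                            withExtension:@"sf2"];
    AUSamplerInstrumentData instrumentData = {0};
    instrumentData.fileURL        = (__bridge CFURLRef)sf2URL;
    instrumentData.instrumentType = kInstrumentType_SF2Preset;
    instrumentData.bankMSB        = kAUSampler_DefaultMelodicBankMSB;
    instrumentData.bankLSB        = kAUSampler_DefaultBankLSB;
    instrumentData.presetID       = presetID;

    // hand the instrument description to the sampler
    return AudioUnitSetProperty(samplerUnit,
                                kAUSamplerProperty_LoadInstrument,
                                kAudioUnitScope_Global, 0,
                                &instrumentData, sizeof(instrumentData));
}
```

One AUSampler per instrument, each connected to its own mixer input bus, gives you the ‘play multiple instruments simultaneously’ part.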
15. Documentation & Resources
What’s missing from the Apple Documentation?
- The ‘Unified’ CoreAudio Overview Document :-)
Well, there is a Core Audio Overview document in the Developer Library, but it was last updated in 2008…
Documentation & Resources available:
Your number one place to go is the iOS Developer Library (now much better but not complete)
The videos:
WWDC 2013: What’s New In Core Audio for iOS - Video #602
WWDC 2011: AudioSession and MultiRoute Audio - Video #505
WWDC 2011: Audio Session Management For iOS - Video #413
WWDC 2011: Music in iOS and Lion - Video #411
The documents:
https://developer.apple.com/library/ios/navigation/#section=Frameworks&topic=AVFoundation
https://developer.apple.com/library/ios/navigation/#section=Frameworks&topic=AudioToolbox
https://developer.apple.com/library/ios/navigation/#section=Frameworks&topic=AudioUnit
https://developer.apple.com/library/ios/navigation/#section=Frameworks&topic=CoreAudio
Tech notes:
- iOS Developer Library - Technical Note TN2283 ‘AUSampler - Loading Instruments’
- Example of the level of detail available: Audio Queue - Looping Compressed Audio
The header files:
This is where you will find most (and sometimes all) of the detailed ‘documentation’ :-)