What's New in Core Audio


Transcription:

Media #WWDC14 What's New in Core Audio Session 501 Torrey Holbrook Walker Master of Ceremonies © 2014 Apple Inc. All rights reserved. Redistribution or public display not permitted without written permission from Apple.

What's New in Core Audio Overview Core MIDI enhancements Inter-App Audio UI views Enhanced AV Foundation audio Audio Unit Manager AVAudioSession Utility classes AVAudioEngine

What's New with Core MIDI Torrey Holbrook Walker Vice Chair, Itty Bitty MIDI Committee

Your Current MIDI Studio Mac USB MIDI Controller USB MIDI Interface Network Session Drum Machine Analog Synthesizer

Your New MIDI Studio Mac USB MIDI Interface Mac Bluetooth iPad Bluetooth Drum Machine Analog Synthesizer

MIDI over Bluetooth Send and receive MIDI over Bluetooth LE for OS X and iOS Connection is secure Appears as an ordinary MIDI device

Bluetooth Overview Two key roles Central Peripheral

Bluetooth Overview Peripheral advertises MIDI capabilities Central Peripheral

Bluetooth Overview Central scans and connects Central Peripheral

Bluetooth Overview Bluetooth MIDI I/O via characteristic Central Peripheral

OS X You're Ready Already Audio MIDI Setup New Bluetooth panel

OS X You're Ready Already Advertise Scan/connect Creates standard MIDI devices

iOS CoreAudioKit View Controllers New UI for Bluetooth MIDI Required to manage Bluetooth connections for your app Scan or Advertise Unused connections are terminated after a few minutes

Demo Bluetooth MIDI UI on Mac OS X + iOS

Final Thoughts Works with Mac, iPhone, and iPad with native Bluetooth LE support Low latency Bluetooth LE bandwidth >> 3,125 B/s Standardization in the works Add Bluetooth UI views on iOS

CoreAudioKit Framework Michael Hopkins Core Audio User Interface Wizard

What Is CoreAudioKit? New iOS framework Provides standardized user interface elements MIDI over Bluetooth LE Inter-App Audio views Easy to adopt Works on iPhone and iPad

MIDI over Bluetooth LE User interface elements Michael Hopkins Core Audio User Interface Wizard

MIDI over Bluetooth LE CABTMIDILocalPeripheralViewController Required to advertise an iOS device as a Bluetooth peripheral

Using BLE UI Classes #import <CoreAudioKit/CoreAudioKit.h> Configuring a local peripheral CABTMIDILocalPeripheralViewController *viewController = [[CABTMIDILocalPeripheralViewController alloc] init]; [self.navigationController pushViewController: viewController animated: YES];

MIDI over Bluetooth LE CABTMIDICentralViewController Required for discovering and connecting to Bluetooth peripherals

Using BLE UI Classes #import <CoreAudioKit/CoreAudioKit.h> Searching for and connecting to external devices CABTMIDICentralViewController *viewController = [[CABTMIDICentralViewController alloc] init]; [self.navigationController pushViewController: viewController animated: YES];

Inter-App Audio User interface elements

Inter-App Audio Review Streams audio between apps in realtime on iOS Node apps can be discovered by host apps See WWDC 2013 Session 206, What's New in Core Audio for iOS

Inter-App Audio Audio Host App Node App Audio I/O

Inter-App Audio Audio Host App MIDI Node App (Instrument) Audio I/O

Inter-App Audio User Interface Elements Overview Inter-App Audio Switcher Provides a simple way to switch between connected apps

Inter-App Audio User Interface Elements Overview Inter-App Audio Host Transport Displays host transport (play/pause, rewind, and record) controls

Demo Inter-App Audio user interface elements Michael Hopkins Core Audio UI Maestro

Inter-App Audio User Interface Elements Provides a consistent user experience Flexible sizing Minimal code required to adopt Subclass of UIView

CAInterAppAudioSwitcherView Creating from a nib file #import <CoreAudioKit/CoreAudioKit.h> IBOutlet CAInterAppAudioSwitcherView *switcherView; - (void)viewDidLoad { [super viewDidLoad]; switcherView.backgroundColor = [UIColor darkGrayColor]; [switcherView setOutputAudioUnit: outputAU]; }

CAInterAppAudioTransportView Creating programmatically CAInterAppAudioTransportView *transportView = [[CAInterAppAudioTransportView alloc] initWithFrame: CGRectMake(0, 0, kTransportViewWidth, kTransportViewHeight)]; transportView.rewindButtonColor = transportView.playButtonColor = transportView.pauseButtonColor = [UIColor whiteColor]; transportView.labelColor = [UIColor lightGrayColor]; transportView.backgroundColor = [UIColor darkGrayColor]; [transportView setOutputAudioUnit: outputAU]; [self addSubview: transportView];

Managing Audio Units Using AVAudioUnitComponentManager Michael Hopkins Audio Unit Wrangler

AVAudioUnitComponentManager Introduction Objective-C based API for Audio Unit host applications Querying methods for Audio Units Simple API for getting information about Audio Units Audio Unit tagging facilities Centralized Audio Unit cache

AVAudioUnitComponentManager API New classes in AV Foundation AVAudioUnitComponentManager Provides multiple search mechanisms - NSPredicates - Block-based - Backwards-compatibility mode using AudioComponentDescriptions AVAudioUnitComponent Provides information about individual Audio Units

Getting Information About an Audio Unit Finding all stereo effects using the AudioComponent API #include <AudioUnit/AudioUnit.h> AudioComponentDescription desc = {kAudioUnitType_Effect, 0, 0, 0, 0}; AudioComponent comp = NULL; while ((comp = AudioComponentFindNext(comp, &desc)) != NULL) { AudioUnit au; if (AudioComponentInstanceNew(comp, &au) != noErr) continue; AudioStreamBasicDescription inputDesc, outputDesc; UInt32 dataSize = sizeof(AudioStreamBasicDescription); OSStatus result1 = AudioUnitGetProperty(au, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &inputDesc, &dataSize); OSStatus result2 = AudioUnitGetProperty(au, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 0, &outputDesc, &dataSize); if (result1 != noErr || result2 != noErr) { AudioComponentInstanceDispose(au); continue; } if (inputDesc.mChannelsPerFrame == 2 && outputDesc.mChannelsPerFrame == 2) { // do something with the component to store it into a list } // then close the Audio Unit AudioComponentInstanceDispose(au); }

Getting Information About an Audio Unit Finding all stereo effects using the AVAudioUnitComponentManager API #import <AVFoundation/AVAudioUnitComponent.h> NSArray *components = [[AVAudioUnitComponentManager sharedAudioUnitComponentManager] componentsPassingTest: ^(AVAudioUnitComponent *comp, BOOL *stop) { return [[comp typeName] isEqualToString: AVAudioUnitTypeEffect] && [comp supportsNumberInputChannels: 2 outputChannels: 2]; }];

Tagging Audio Units Introduction Many users have lots of Audio Units Finding the right one can be difficult

AVAudioUnitComponentManager Tags Audio Units can have one or more tags System tags Defined by creator of Audio Unit (read-only) User tags Specified by current user (read/write) Each user can have a unique set of tags

AVAudioUnitComponentManager Tags continued A tag is a localized string Arbitrary Predefined (AudioComponent.h) - Type (equalizer, dynamics, distortion, etc.) - Usage (drums, guitar, vocal, etc.)

Demo Audio Unit tags in AU Lab Michael Hopkins Audio Unit Scientist

Finding Audio Units with a Specific Tag #import <AVFoundation/AVAudioUnitComponent.h> NSString *aString = @"My Favorite Tag"; NSPredicate *aPredicate = [NSPredicate predicateWithFormat: @"allTagNames CONTAINS[cd] %@", aString]; NSArray *components = [[AVAudioUnitComponentManager sharedAudioUnitComponentManager] componentsMatchingPredicate: aPredicate];

Tagging Facilities Getting user tags of an AVAudioUnitComponent NSArray *userTags = aComp.userTagNames; Setting user tags of an AVAudioUnitComponent aComp.userTagNames = @[@"Trippy", @"Favorites"]; Getting all tags of an AVAudioUnitComponent NSArray *allTags = aComp.allTagNames;

More Tagging Facilities Getting a localized list of standard system tags AVAudioUnitComponentManager *manager = [AVAudioUnitComponentManager sharedAudioUnitComponentManager]; NSArray *standardTags = manager.standardLocalizedTagNames; Getting a complete list of all available localized tags NSArray *allTags = manager.tagNames;

Adding Built-in Tags to an AudioUnit Add a tags section to the Info.plist of the AudioComponent bundle <key>tags</key> <array> <string>Effect</string> <string>Equalizer</string> <string>Custom Tag</string> </array> Localize Custom Tag by adding an AudioUnitTags.strings file in the bundle "Custom Tag" = "Localized Tag"; Do not localize Standard System Tags

Action Items Users Tag your Audio Units Host developers Adopt AVAudioUnitComponentManager API Surprise and delight your users Audio Unit developers Add system tags to your Audio Units

AVAudioSession Tips Eric Johnson Audio Traffic Controller

References Updated audio session programming guide https://developer.apple.com/library/ios/documentation/Audio/Conceptual/AudioSessionProgrammingGuide WWDC 2012 Session 505 Audio Session and Multiroute Audio in iOS

Managing Session Activation State App state vs. audio session activation state App state Not running Foreground inactive/foreground active Background Suspended Audio session activation state Active or inactive Interruptions

Session State Changes Your audio session Phone audio session

Session State Changes User launches App Your audio session Phone audio session

Session State Changes Activate session and begin playback Your audio session Phone audio session

Session State Changes Your audio session Phone audio session

Session State Changes Interrupted by phone call Your audio session Phone audio session

Session State Changes ring Your audio session Phone audio session

Session State Changes ring Your audio session Phone audio session

Session State Changes Your audio session Phone audio session

Session State Changes User ends call, interruption ends Your audio session Phone audio session

Session State Changes Reactivate session, resume playback Your audio session Phone audio session

Managing Session Activation State Needs vary by application type Games Media playback VoIP and chat apps Metering and monitoring apps Browser-like apps Navigation and fitness apps Music-making apps

Managing Session Activation State Activation Game apps Go active in the app delegate's applicationDidBecomeActive: method Handle interruptions - Begin Update internal state - End Go active and resume audio playback

Managing Session Activation State Activation Media playback apps Music, podcasts, streaming radio Go active when the user presses the play button Stay active, unless interrupted Handle interruptions - Begin Update UI and internal state - End Honor AVAudioSessionInterruptionOptionShouldResume
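The interruption handling described above can be sketched with AVAudioSession's notification API. This is a hedged sketch, not from the slides; playerPause() and playerResume() are hypothetical hooks for your playback engine:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: observe interruption begin/end and honor ShouldResume.
[[NSNotificationCenter defaultCenter]
    addObserverForName: AVAudioSessionInterruptionNotification
                object: [AVAudioSession sharedInstance]
                 queue: [NSOperationQueue mainQueue]
            usingBlock: ^(NSNotification *note) {
    NSUInteger type = [note.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    if (type == AVAudioSessionInterruptionTypeBegan) {
        // Begin: update UI and internal state
        playerPause();
    } else if (type == AVAudioSessionInterruptionTypeEnded) {
        NSUInteger options = [note.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];
        if (options & AVAudioSessionInterruptionOptionShouldResume) {
            // End: reactivate the session and resume playback
            [[AVAudioSession sharedInstance] setActive: YES error: nil];
            playerResume();
        }
    }
}];
```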

Managing Session Activation State Deactivation End ducking of other audio Navigation/Fitness Allow other audio to resume VoIP/Chat Short videos in browser views AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation
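A minimal deactivation sketch (assuming playback has just ended) that lets other apps resume their audio:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: deactivate the session and notify other apps they may resume.
NSError *error = nil;
BOOL ok = [[AVAudioSession sharedInstance]
      setActive: NO
    withOptions: AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation
          error: &error];
if (!ok) NSLog(@"deactivation failed: %@", error);
```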

Muting Secondary Audio Primary vs. secondary audio Game App Primary Audio Secondary Audio

Muting Secondary Audio secondaryAudioShouldBeSilencedHint Hint to foreground app for silencing secondary audio Check in app delegate's applicationDidBecomeActive: method AVAudioSessionSilenceSecondaryAudioHintNotification Delivered to foreground, active audio sessions Begin Mute secondary audio End Resume secondary audio
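The hint-plus-notification pattern might be wired up as follows. A sketch under assumptions: muteSoundtrack() and resumeSoundtrack() are hypothetical hooks for a game that keeps primary audio (sound effects) playing while silencing its secondary soundtrack:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: check the hint once when becoming active...
if ([AVAudioSession sharedInstance].secondaryAudioShouldBeSilencedHint) {
    muteSoundtrack();
}
// ...then track changes while in the foreground.
[[NSNotificationCenter defaultCenter]
    addObserverForName: AVAudioSessionSilenceSecondaryAudioHintNotification
                object: [AVAudioSession sharedInstance]
                 queue: [NSOperationQueue mainQueue]
            usingBlock: ^(NSNotification *note) {
    NSUInteger type = [note.userInfo[AVAudioSessionSilenceSecondaryAudioHintTypeKey] unsignedIntegerValue];
    if (type == AVAudioSessionSilenceSecondaryAudioHintTypeBegin) {
        muteSoundtrack();
    } else {
        resumeSoundtrack();
    }
}];
```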

Muting Secondary Audio Game App (Foreground), Music App (Background): when the background music app begins playing, the foreground game receives the Begin notification and silences its secondary audio; when the music app pauses, the game receives the End notification and resumes its secondary audio.

Muting Secondary Audio Updated best practices Do not change category based on isOtherAudioPlaying Use AVAudioSessionCategoryAmbient Use secondaryAudioShouldBeSilencedHint Use AVAudioSessionSilenceSecondaryAudioHintNotification

New AV Foundation Audio Classes Doug Wyatt Core Audio Syntactic Confectioner

New AV Foundation Audio Classes Introduction Tour of the classes Example

Introduction Core Audio/AudioToolbox Powerful but not always simple Historically C++ classes in Core Audio SDK New Objective-C classes in AVFoundation.framework Available on OS X 10.10 and iOS 8.0

Goals Wrap C structs in Objective-C Simpler to build Can be passed to low-level APIs AVAudioEngine Stay realtime-safe

Classes AVAudioFile uses AVAudioPCMBuffer; each has an AVAudioFormat, which has an AVAudioChannelLayout

AVAudioFormat

AVAudioFormat Describes data in an audio file or stream Wraps struct AudioStreamBasicDescription { Float64 mSampleRate; UInt32 mFormatID; UInt32 mFormatFlags; UInt32 mBytesPerPacket; UInt32 mFramesPerPacket; UInt32 mBytesPerFrame; UInt32 mChannelsPerFrame; UInt32 mBitsPerChannel; UInt32 mReserved; }; - (instancetype)initWithStreamDescription: (const AudioStreamBasicDescription *)asbd; @property (nonatomic, readonly) const AudioStreamBasicDescription *streamDescription;

Standard Formats Canonical Was float on OS X, 8.24 fixed-point on iOS Now deprecated Standard Non-interleaved 32-bit float On both platforms - (instancetype)initStandardFormatWithSampleRate: (double)sampleRate channels: (AVAudioChannelCount)channels; @property (nonatomic, readonly, getter=isStandard) BOOL standard;

Common Formats Formats often used in signal processing Always native-endian typedef NS_ENUM(NSUInteger, AVAudioCommonFormat) { AVAudioOtherFormat = 0, AVAudioPCMFormatFloat32 = 1, AVAudioPCMFormatFloat64 = 2, AVAudioPCMFormatInt16 = 3, AVAudioPCMFormatInt32 = 4 }; - (instancetype)initWithCommonFormat: (AVAudioCommonFormat)format sampleRate: (double)sampleRate channels: (AVAudioChannelCount)channels interleaved: (BOOL)interleaved; @property (nonatomic, readonly) AVAudioCommonFormat commonFormat;
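As a usage sketch (not from the slides), a standard format and an interleaved 16-bit common format would be created like this; the sample rate and channel count are illustrative:

```objc
#import <AVFoundation/AVFoundation.h>

// Standard: non-interleaved 32-bit float stereo
AVAudioFormat *standard =
    [[AVAudioFormat alloc] initStandardFormatWithSampleRate: 44100.0 channels: 2];
// standard.isStandard is YES; standard.commonFormat is AVAudioPCMFormatFloat32

// A common format: interleaved 16-bit integer stereo
AVAudioFormat *int16Format =
    [[AVAudioFormat alloc] initWithCommonFormat: AVAudioPCMFormatInt16
                                     sampleRate: 44100.0
                                       channels: 2
                                    interleaved: YES];
```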

AVAudioChannelLayout

AVAudioChannelLayout Describes ordering/roles of multiple channels AVAudioFormat has an AVAudioChannelLayout AVAudioFormat initializers require a layout when describing > 2 channels Wraps AudioChannelLayout (CoreAudioTypes.h)
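A sketch of building a format with more than two channels, which requires an explicit layout; the 5.1 layout tag is one of many defined in CoreAudioTypes.h, and the sample rate is illustrative:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: a 5.1 surround format needs an AVAudioChannelLayout
AVAudioChannelLayout *layout =
    [[AVAudioChannelLayout alloc] initWithLayoutTag: kAudioChannelLayoutTag_MPEG_5_1_A];
AVAudioFormat *surround =
    [[AVAudioFormat alloc] initStandardFormatWithSampleRate: 48000.0
                                              channelLayout: layout];
```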

AVAudioPCMBuffer

AVAudioPCMBuffer Memory for storing audio data in any PCM format Wraps struct AudioBuffer { UInt32 mNumberChannels; UInt32 mDataByteSize; void * mData; }; struct AudioBufferList { UInt32 mNumberBuffers; AudioBuffer mBuffers[1]; // variable length }; @property (nonatomic, readonly) const AudioBufferList *audioBufferList; @property (nonatomic, readonly) AudioBufferList *mutableAudioBufferList;

AVAudioPCMBuffer - (instancetype)initWithPCMFormat: (AVAudioFormat *)format frameCapacity: (AVAudioFrameCount)frameCapacity; Has an AVAudioFormat @property (nonatomic, readonly) AVAudioFormat *format; Has separate frame capacity and length @property (nonatomic, readonly) AVAudioFrameCount frameCapacity; @property (nonatomic) AVAudioFrameCount frameLength;

AVAudioPCMBuffer Sample access @property (nonatomic, readonly) float * const *floatChannelData; @property (nonatomic, readonly) int16_t * const *int16ChannelData; @property (nonatomic, readonly) int32_t * const *int32ChannelData; Realtime-safety Don't access the buffer's properties on the audio I/O thread Cache the AudioBufferList pointer or one of the above pointer arrays

AVAudioFile

AVAudioFile Read and write files of any supported format Transparently decodes while reading, encodes while writing Processing format Specified at initialization Either standard or common Similar to ExtAudioFile

AVAudioFile Initializers and Formats - (instancetype)initForReading:(NSURL *)fileURL error:(NSError **)outError; - (instancetype)initForWriting:(NSURL *)fileURL settings:(NSDictionary *)settings error:(NSError **)outError; @property (nonatomic, readonly) AVAudioFormat *fileFormat; @property (nonatomic, readonly) AVAudioFormat *processingFormat;

AVAudioFile I/O - (BOOL)readIntoBuffer:(AVAudioPCMBuffer *)buffer error:(NSError **)outError; - (BOOL)writeFromBuffer:(const AVAudioPCMBuffer *)buffer error:(NSError **)outError; @property (nonatomic) AVAudioFramePosition framePosition;

Example Reading an Audio File Open the file static BOOL AudioFileInfo(NSURL *fileURL) { NSError *error = nil; AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading: fileURL commonFormat: AVAudioPCMFormatFloat32 interleaved: NO error: &error];

Fetch and print basic info NSLog(@"File URL: %@\n", fileURL.absoluteString); NSLog(@"File format: %@\n", audioFile.fileFormat.description); NSLog(@"Processing format: %@\n", audioFile.processingFormat.description); AVAudioFramePosition fileLength = audioFile.length; NSLog(@"Length: %lld frames, %.3f seconds\n", (long long)fileLength, fileLength / audioFile.fileFormat.sampleRate);

Create a buffer to read into const AVAudioFrameCount kBufferFrameCapacity = 128 * 1024L; AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat: audioFile.processingFormat frameCapacity: kBufferFrameCapacity];

Read the file a buffer at a time to find the loudest sample float loudestSample = 0.0f; AVAudioFramePosition loudestSamplePosition = 0; while (audioFile.framePosition < fileLength) { AVAudioFramePosition readPosition = audioFile.framePosition; if (![audioFile readIntoBuffer: readBuffer error: &error]) { NSLog(@"failed to read audio file: %@", error); return NO; } if (readBuffer.frameLength == 0) break; // finished

For each channel, check each audio sample for (AVAudioChannelCount channelIndex = 0; channelIndex < readBuffer.format.channelCount; ++channelIndex) { float *channelData = readBuffer.floatChannelData[channelIndex]; for (AVAudioFrameCount frameIndex = 0; frameIndex < readBuffer.frameLength; ++frameIndex) { float sampleAbsLevel = fabs(channelData[frameIndex]); if (sampleAbsLevel > loudestSample) { loudestSample = sampleAbsLevel; loudestSamplePosition = readPosition + frameIndex; }

With AVAudioEngine AVAudioPlayerNode is an AVAudioNode and uses AVAudioFile and AVAudioPCMBuffer; nodes, files, and buffers each have an AVAudioFormat

AV Foundation Audio Summary Classes AVAudioFormat AVAudioChannelLayout AVAudioPCMBuffer AVAudioFile Use with Core Audio, Audio Toolbox and Audio Unit C APIs Via accessors AVAudioEngine

Summary MIDI over Bluetooth Inter-App Audio UI Views Enhanced AV Foundation audio Audio Unit Manager AVAudioSession Utility classes AVAudioEngine

More Information Filip Iliescu Graphics and Game Technologies Evangelist filiescu@apple.com Apple Developer Forums http://devforums.apple.com

Related Sessions AVAudioEngine in Practice Marina Tuesday 10:15AM

Labs Audio Lab Media Lab A Tuesday 11:30AM Audio Lab Media Lab B Wednesday 12:45PM