What's New in Audio. Media #WWDC17. Akshatha Nagesh, AudioEngine-eer Béla Balázs, Audio Artisan Torrey Holbrook Walker, Audio/MIDI Black Ops

Session Media #WWDC17 What's New in Audio 501 Akshatha Nagesh, AudioEngine-eer Béla Balázs, Audio Artisan Torrey Holbrook Walker, Audio/MIDI Black Ops © 2017 Apple Inc. All rights reserved. Redistribution or public display not permitted without written permission from Apple.

Audio Stack (diagram) Application → AVFoundation (AVAudioPlayer, AVAudioRecorder, AVAudioSession, AVAudioEngine, AVPlayer, AVCapture) → AudioToolbox (AUAudioUnit, Audio Codecs), Core MIDI → Audio HAL and Drivers. See Delivering an Exceptional Audio Experience, WWDC16.

AVAudioEngine AVAudioSession watchOS AUAudioUnit Other Enhancements Inter-Device Audio Mode (IDAM)

AVAudioEngine

Recap Powerful, feature-rich Objective-C / Swift API set Simplifies realtime audio, easier to use Supports playback, recording, processing, mixing, 3D spatialization AVAudioEngine in Practice WWDC14 What's New in Core Audio WWDC15

Sample Engine Setup Karaoke InputNode EffectNode (EQ) PlayerNode (Backing Track) MixerNode OutputNode NodeTapBlock (Analyze) PlayerNode (Sound Effects)

What's New AVAudioEngine Manual rendering Auto shutdown AVAudioPlayerNode Completion callbacks

Sample Engine Setup Manual rendering InputNode EffectNode PlayerNode MixerNode OutputNode NodeTapBlock (Analyze) PlayerNode Application

Manual Rendering The engine is not connected to any audio device; it renders in response to requests from the client Modes: Offline, Realtime

Offline Manual Rendering Engine and nodes operate under no deadlines or realtime constraints A node may choose to: Use a more expensive signal processing algorithm Block on the render thread for more data if needed - For example, a player node may wait until its worker thread reads the data from disk

Offline Manual Rendering Example Offline Context PlayerNode EffectNode OutputNode Source File Application Destination File

Offline Manual Rendering Applications Post-processing of audio files, for example, applying reverb, effects, etc. Mixing of audio files Offline audio processing using CPU-intensive (higher quality) algorithms Tuning, debugging, or testing the engine setup
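To make the offline flow concrete, here is a minimal sketch of the render loop (an illustration, not code from the slides): it plays a source file through a player node in offline mode and writes the rendered output to a destination file. sourceURL and destinationURL are assumed to be provided by the caller.

import AVFoundation

do {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    engine.attach(player)

    let sourceFile = try AVAudioFile(forReading: sourceURL) // sourceURL: your input file
    engine.connect(player, to: engine.mainMixerNode, format: sourceFile.processingFormat)

    // switch to offline manual rendering mode before starting the engine
    try engine.enableManualRenderingMode(.offline,
                                         format: sourceFile.processingFormat,
                                         maximumFrameCount: 4096)
    try engine.start()
    player.scheduleFile(sourceFile, at: nil)
    player.play()

    let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                                  frameCapacity: engine.manualRenderingMaximumFrameCount)!
    let outputFile = try AVAudioFile(forWriting: destinationURL, // destinationURL: your output file
                                     settings: sourceFile.fileFormat.settings)

    // pull audio from the engine until the whole source file has been rendered
    while engine.manualRenderingSampleTime < sourceFile.length {
        let remaining = sourceFile.length - engine.manualRenderingSampleTime
        let frames = min(buffer.frameCapacity, AVAudioFrameCount(remaining))
        if try engine.renderOffline(frames, to: buffer) == .success {
            try outputFile.write(from: buffer)
        }
    }
    player.stop()
    engine.stop()
} catch {
    // handle errors
}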

Demo AVAudioEngine - Offline Manual Rendering

Realtime Manual Rendering The engine and nodes: Operate under realtime constraints Do not make any blocking calls (e.g. blocking on a mutex, calling libdispatch) on the render thread - A node may drop the data if it is not ready to be rendered in time

Realtime Manual Rendering Applications Processing audio in an AUAudioUnit's internalRenderBlock Processing audio data in a movie/video during streaming/playback

Realtime Manual Rendering Example Realtime Context InputNode EffectNode OutputNode Audio from an Input Movie Stream Application Audio into an Output Movie Stream TV

Realtime Manual Rendering Code example Realtime Context InputNode EffectNode OutputNode Application

//Realtime Manual Rendering, code example
do {
    let engine = AVAudioEngine()
    // by default engine will render to/from the audio device
    // make connections, e.g. inputNode -> effectNode -> outputNode

    // switch to manual rendering mode
    engine.stop()
    try engine.enableManualRenderingMode(.realtime, format: outputPCMFormat,
                                         maximumFrameCount: frameCount) // e.g. 1024 @ 48 kHz = 21.33 ms

    let renderBlock = engine.manualRenderingBlock // cache the render block

    // set the block to provide input data to engine
    engine.inputNode.setManualRenderingInputPCMFormat(inputPCMFormat) { (inputFrameCount) -> UnsafePointer<AudioBufferList>? in
        guard haveData else { return nil }
        // fill and return the input audio buffer list
        return inputBufferList
    }

    // create output buffer, cache the buffer list
    let buffer = AVAudioPCMBuffer(pcmFormat: outputPCMFormat,
                                  frameCapacity: engine.manualRenderingMaximumFrameCount)!
    buffer.frameLength = buffer.frameCapacity
    let outputBufferList = buffer.mutableAudioBufferList

    try engine.start()
} catch {
    // handle errors
}

// to render from realtime context
OSStatus outputError = noErr;
const auto status = renderBlock(framesToRender, outputBufferList, &outputError);
switch (status) {
    case AVAudioEngineManualRenderingStatusSuccess:
        handleProcessedOutput(outputBufferList); // data rendered successfully
        break;
    case AVAudioEngineManualRenderingStatusInsufficientDataFromInputNode:
        handleProcessedOutput(outputBufferList); // input node did not provide data,
                                                 // but other sources may have rendered
        break;
    ...
    default:
        break;
}

Manual Rendering Render calls Offline: can use either the ObjC/Swift render method or the block-based render call Realtime: must use the block-based render call

What's New AVAudioEngine Manual rendering Auto shutdown AVAudioPlayerNode Completion callbacks

Auto Shutdown Hardware is stopped if running idle for a certain duration, started dynamically when needed Safety net for conserving power Enforced behavior on watchOS, optional on other platforms isAutoShutdownEnabled
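On the other platforms the behavior is opt-in; a minimal sketch (an illustration, not slide code):

let engine = AVAudioEngine()
engine.isAutoShutdownEnabled = true // opt in on iOS/macOS/tvOS; always enforced on watchOS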

What's New AVAudioEngine Manual rendering Auto shutdown AVAudioPlayerNode Completion callbacks

Completion Callbacks Existing buffer/file completion handlers called when the data has been consumed New completion handler and callback types: AVAudioPlayerNodeCompletionCallbackType - .dataConsumed, .dataRendered, .dataPlayedBack

Completion Callbacks .dataConsumed Data has been consumed, same as the existing completion handlers The buffer can be recycled, more data can be scheduled .dataRendered Data has been output by the player Useful in manual rendering mode Does not account for any downstream signal processing latency

Completion Callbacks .dataPlayedBack Buffer/file has finished playing Applicable only when the engine is rendering to an audio device Accounts for both (small) downstream signal processing latency and (possibly significant) audio playback device latency

player.scheduleFile(file, at: nil, completionCallbackType: .dataPlayedBack) { (callbackType) in
    // file has finished playing from the listener's perspective
    // notify to stop the engine and update UI
}

AVAudioEngine Summary AVAudioEngine Manual rendering Auto shutdown AVAudioPlayerNode Completion callbacks AUGraph Deprecation coming soon (2018)

AVAudioSession

AirPlay 2 Support AirPlay 2 - new technology on iOS, tvOS, and macOS Multi-room audio with AirPlay 2 capable devices Long-form audio applications Content - music, podcasts, etc. Separate, shared audio route to AirPlay 2 devices New AVAudioSession API for an application to identify itself as long-form Introducing AirPlay 2 Session 509 Thursday 4:10PM

Audio Routing (iOS) - Current Behavior Music and phone call Session Arbitration and Mixing AirPlay System Audio

Audio Routing (iOS) - Current Behavior Music and phone call Interrupted Session Arbitration and Mixing AirPlay System Audio

Long-form Audio Routing (iOS) Music and phone call coexistence Long-form audio route System audio route Session Arbitration Session Arbitration and Mixing Long-form Audio (AirPlay 2) System Audio

Long-form Audio Routing (iOS and tvOS) Applications that use the long-form audio route Applications that use the system audio route Session Arbitration Session Arbitration and Mixing Long-form Audio (AirPlay 2) System Audio

//Long-form Audio Routing (iOS and tvOS), code example
let mySession = AVAudioSession.sharedInstance()
do {
    try mySession.setCategory(AVAudioSessionCategoryPlayback,
                              mode: AVAudioSessionModeDefault,
                              routeSharingPolicy: .longForm)
} catch {
    // handle errors
}

Long-form Audio Routing (macOS) Applications that use the long-form audio route Applications that use the system audio route Arbitration Mixing Long-form Audio (AirPlay 2) Default Device

//Long-form Audio Routing (macOS), code example
let mySession = AVAudioSession.sharedInstance()
do {
    try mySession.setRouteSharingPolicy(.longForm)
} catch {
    // handle errors
}

Enhancements in watchOS

watchOS 4.0 Playback and recording Playback AVAudioPlayer (watchOS 3.1 SDK) Recording AVAudioInputNode (AVAudioEngine) AVAudioRecorder AVAudioSession recording permissions Formats supported AAC-LC, AAC-ELD, HE-AAC, HE-AACv2, MP3 (decoding only), Opus
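As an illustration (not from the slides), a minimal AVAudioRecorder sketch for watchOS might look like this; recordingURL is an assumed file URL you provide, and the recorder must be kept alive while recording.

import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
    try session.setActive(true)
} catch {
    // handle errors
}

session.requestRecordPermission { granted in
    guard granted else { return }
    let settings: [String: Any] = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC), // AAC-LC
        AVSampleRateKey: 44_100.0,
        AVNumberOfChannelsKey: 1
    ]
    // keep a strong reference to the recorder for the duration of the recording
    if let recorder = try? AVAudioRecorder(url: recordingURL, settings: settings) {
        recorder.record()
    }
}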

watchOS 4.0 Recording policies Recording can start only in foreground Recording allowed to continue in the background (red microphone icon displayed) Recording in background is CPU limited https://developer.apple.com/reference/healthkit/hkworkoutsession

AVAudioEngine AVAudioSession watchOS AUAudioUnit Other Enhancements Inter-Device Audio Mode (IDAM)

AUAudioUnit

AU View Configuration Host applications decide how to display UI for AUs Current limitations No standard view sizes defined AUs are supposed to adapt to any view size chosen by the host

AU Preferred View Configuration Host Audio Unit Extension Array of all possible view configurations IndexSet of supported view configurations supportedViewConfigurations(availableViewConfigurations) Chosen view configuration select(viewConfiguration)

AU Preferred View Configuration Code example - AU extension

//AU Preferred View Configuration Code example - AU extension
override public func supportedViewConfigurations(_ availableViewConfigurations: [AUAudioUnitViewConfiguration]) -> IndexSet {
    let result = NSMutableIndexSet()
    for (index, config) in availableViewConfigurations.enumerated() {
        // check if the config (width, height, hostHasController) is supported
        // a config of 0x0 (default full size) must always be supported
        if isConfigurationSupported(config) {
            result.add(index)
        }
    }
    return result as IndexSet
}

//AU Preferred View Configuration Code example - AU extension
override public func select(_ viewConfiguration: AUAudioUnitViewConfiguration) {
    // configuration selected by host, used by view controller to re-arrange its view
    self.currentViewConfiguration = viewConfiguration
    self.viewController?.selectViewConfig(self.currentViewConfiguration)
}

AU Preferred View Configuration Code example - host application

//AU Preferred View Configuration Code example - host application
var smallConfigActive: Bool = false // true if the small view is the currently active one

@IBAction func toggleViewModes(_ sender: AnyObject?) {
    guard let audioUnit = self.engine.audioUnit else { return }
    let largeConfig = AUAudioUnitViewConfiguration(width: 600, height: 400, hostHasController: false)
    let smallConfig = AUAudioUnitViewConfiguration(width: 300, height: 200, hostHasController: true)

    let supportedIndices = audioUnit.supportedViewConfigurations([smallConfig, largeConfig])
    if supportedIndices.count == 2 {
        audioUnit.select(self.smallConfigActive ? largeConfig : smallConfig)
        self.smallConfigActive = !self.smallConfigActive
    }
}

AU MIDI Output AU can emit MIDI output synchronized with its audio output Host sets a block on the AU to be called every render cycle Host can record/edit both MIDI performance and audio output from the AU MIDIOutputNames MIDIOutputEventBlock
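As a host-side illustration (an assumption, not slide code), installing a MIDI output block on a hosted AUAudioUnit could look like this; handleMIDI is a hypothetical routine of your own.

if !audioUnit.midiOutputNames.isEmpty {
    audioUnit.midiOutputEventBlock = { (sampleTime, cable, length, midiBytes) -> OSStatus in
        // called every render cycle; copy out 'length' bytes of MIDI
        // (a real host would avoid allocating on the render thread)
        let bytes = Array(UnsafeBufferPointer(start: midiBytes, count: length))
        handleMIDI(bytes, at: sampleTime) // forward/record, timestamped at sampleTime
        return noErr
    }
}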

Other Updates Entitlement AU extension host applications linked against the iOS 11 SDK and later will need the 'inter-app-audio' entitlement AU short name audioUnitShortName

Demo AUAudioUnit Béla Balázs, Audio Artisan

Other Enhancements

Audio Formats FLAC (Free Lossless Audio Codec) Codec, file, and streaming support Content distribution, streaming applications Opus Codec support File I/O using the .caf container VoIP applications
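For example, a minimal sketch (an illustration, not slide code) of writing FLAC through AVAudioFile; flacURL and pcmBuffer are assumed to be provided by the caller.

let settings: [String: Any] = [
    AVFormatIDKey: Int(kAudioFormatFLAC), // new in iOS 11 / macOS 10.13
    AVSampleRateKey: 48_000.0,
    AVNumberOfChannelsKey: 2
]
do {
    let file = try AVAudioFile(forWriting: flacURL, settings: settings)
    try file.write(from: pcmBuffer) // pcmBuffer: an AVAudioPCMBuffer of audio to encode
} catch {
    // handle errors
}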

Spatial Audio Formats B-Format Audio stream is regular PCM File container: .caf B-format: W,X,Y,Z 1st order ambisonics kAudioChannelLayoutTag_Ambisonic_B_Format

Spatial Audio Formats Higher Order Ambisonics N order ambisonics (N is 1..254) kAudioChannelLayoutTag_HOA_ACN_SN3D - SN3D normalized streams kAudioChannelLayoutTag_HOA_ACN_N3D - N3D normalized streams ACN (Ambisonic Channel Number) Channels kAudioChannelLabel_HOA_ACN_0..65024 AudioFormat support for converting Between B-format, ACN_SN3D, ACN_N3D From ambisonics to arbitrary speaker layout
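As an illustration of building these formats (an assumption: the HOA tags follow CoreAudio's convention of ORing the channel count into the layout tag):

// 1st-order B-format (4 channels: W, X, Y, Z)
let bFormatLayout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_Ambisonic_B_Format)!
let bFormat = AVAudioFormat(standardFormatWithSampleRate: 48_000, channelLayout: bFormatLayout)

// 3rd-order HOA, SN3D normalization: (3 + 1)^2 = 16 channels, ORed into the tag
let hoaLayout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_HOA_ACN_SN3D | 16)!
let hoaFormat = AVAudioFormat(standardFormatWithSampleRate: 48_000, channelLayout: hoaLayout)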

Spatial Mixer Head-Related Transfer Function (HRTF) AUSpatialMixer - kSpatializationAlgorithm_HRTFHQ AVAudioEnvironmentNode - AVAudio3DMixingRenderingAlgorithmHRTFHQ Features Better frequency response Better localization of sources in a 3D space
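A minimal AVAudioEngine sketch (an illustration, not slide code) that opts a spatialized player into the new high-quality algorithm:

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let environment = AVAudioEnvironmentNode()
engine.attach(player)
engine.attach(environment)

// spatialized sources should be mono
let monoFormat = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 1)!
engine.connect(player, to: environment, format: monoFormat)
engine.connect(environment, to: engine.mainMixerNode, format: nil)

player.renderingAlgorithm = .HRTFHQ                  // new high-quality HRTF
player.position = AVAudio3DPoint(x: 1, y: 0, z: -2)  // place the source in 3D space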

AVAudioEngine AVAudioSession watchOS AUAudioUnit Other Enhancements Inter-Device Audio Mode (IDAM)

Inter-Device Audio Mode (IDAM) Torrey Holbrook Walker, Audio/MIDI Black Ops

Inter-Device Audio Mode Record audio digitally via Lightning-to-USB cable USB 2.0 audio class-compliant implementation Available since OS X El Capitan and iOS 9

IDAM Inter-Device Audio Mode

IDAM Inter-Device Audio + MIDI

Inter-Device Audio + MIDI Send and receive MIDI via Lightning-to-USB cable Class-compliant USB MIDI implementation Requires iOS 11 and macOS El Capitan or later Auto-enabled in IDAM configuration

Inter-Device Audio + MIDI Device can charge and sync in IDAM configuration Photo import and tethering are temporarily disabled Audio device aggregation is OK Use your iOS devices as MIDI controllers, destinations, or both

Demo MIDI using IDAM Configuration

Summary AVAudioEngine - manual rendering AVAudioSession - AirPlay 2 support watchOS - recording AUAudioUnit - preferred view size, MIDI output Other enhancements - audio formats (FLAC, Opus, HOA) Inter-Device Audio and MIDI

More Information https://developer.apple.com/wwdc17/501

Related Sessions Introducing MusicKit Grand Ballroom B Tuesday 3:10PM What's New in watchOS Hall 2 Wednesday 9:00AM Introducing AirPlay 2 Executive Ballroom Thursday 4:10PM

Labs Audio Lab Technology Lab F Tue 4:10PM-6:00PM AirPlay Lab Technology Lab A Wed 11:00AM-1:00PM Audio Lab Technology Lab G Thu 1:00PM-3:00PM AirPlay Lab Technology Lab A Fri 9:00AM-11:00AM