Using the Camera with AV Foundation: Overview and best practices. Brad Ford, iPhone Engineering


What You'll Learn: Why and when you should use AV Foundation capture. The AV Foundation capture programming model. Performance considerations when working with the camera.

Sample Code for This Session (only on iPhone): Find My iCone, AVCam Demo, PinchyPreview, Wavy. Materials available at: http://developer.apple.com/wwdc/attendee/

Technology Framework: MediaPlayer, UIKit, AVFoundation, CoreAudio, CoreMedia, CoreAnimation

Using the Camera in iPhone OS 3.x: Simple programmatic access. UIImagePickerController API for high, medium, or low quality recording. Hideable camera controls UI. takePhoto API. Control overlay for start/stop recording. User touch to focus, like the Camera app.

Using the Camera in iOS 4: UIImagePickerController offers more control. Movie recording at high, medium, and low quality. High resolution still image capture. Access to the flash. Switching between front and back cameras. See also: Incorporating the Camera and Photo Library in your App, Presidio, Thursday 9:00 AM.

Why Use AV Foundation Capture? Full access to the camera: independent focus, exposure, and white balance controls; independent locking; independent points of interest for focus and exposure. Access to video frame data: accurate timestamps, per-frame metadata, configurable output format (e.g. 420v, BGRA), configurable max frame rate, configurable resolution.

Why Use AV Foundation Capture? Flexible output. Still image: YUV and RGB output, Exif metadata insertion. QuickTime movie: movie metadata insertion, orientation lock. Video preview in a CALayer: aspect fit, aspect fill, and stretch. Audio level metering. Access to audio sample data.

Capture Basics: Inputs and Outputs. Inputs: Video, Audio. Outputs.

Capture Basics: Inputs and Outputs. Video and audio inputs flow through AVCaptureInput objects into an AVCaptureSession, which feeds an AVCaptureVideoPreviewLayer and one or more AVCaptureOutput objects.

Common Capture Use Cases: Process YUV video frames from the camera. Control the camera, take photos, and record movies. Preview video from the camera to a Core Animation layer. Process PCM audio data from the microphone to draw a waveform.


Demo: Find My iCone. Processing frames using AVCaptureVideoDataOutput.


Find My iCone: a video AVCaptureDevice feeds an AVCaptureDeviceInput into an AVCaptureSession, which drives an AVCaptureVideoPreviewLayer and an AVCaptureVideoDataOutput that emits CMSampleBuffers.

Find My iCone (Code Snippet SP16). Create an AVCaptureSession:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;
Find a suitable AVCaptureDevice:
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
Create and add an AVCaptureDeviceInput:
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
[session addInput:input];

Find My iCone (Code Snippet SP16). Create and add an AVCaptureVideoDataOutput:
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];
Configure your output, and start the session:
dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
[session startRunning];
Implement your delegate callback:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // CFShow(sampleBuffer);
}

AVCaptureVideoDataOutput: What is a CMSampleBuffer? Defined in <CoreMedia/CMSampleBuffer.h>. A reference-counted Core Foundation object containing: Sample data.
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
// With the pixelBuffer, you can access the buffer's base address,
// row bytes, etc.
// See <CoreVideo/CVPixelBuffer.h>

AVCaptureVideoDataOutput: What is a CMSampleBuffer? Defined in <CoreMedia/CMSampleBuffer.h>. A reference-counted Core Foundation object containing: Timing information.
CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
CMTime decodeTime = CMSampleBufferGetDecodeTimeStamp(sampleBuffer);

AVCaptureVideoDataOutput: What is a CMSampleBuffer? Defined in <CoreMedia/CMSampleBuffer.h>. A reference-counted Core Foundation object containing: Format information.
CMFormatDescriptionRef desc = CMSampleBufferGetFormatDescription(sampleBuffer);
int32_t pixelType = CMVideoFormatDescriptionGetCodecType(desc);
CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(desc);

AVCaptureVideoDataOutput: What is a CMSampleBuffer? Defined in <CoreMedia/CMSampleBuffer.h>. A reference-counted Core Foundation object containing: Metadata about the sample data.
// Metadata are carried as attachments
CFDictionaryRef metadataDictionary = CMGetAttachment(sampleBuffer, CFSTR("MetadataDictionary"), NULL);
if (metadataDictionary)
    CFShow(metadataDictionary);

AVCaptureVideoDataOutput: What is a CMSampleBuffer? Defined in <CoreMedia/CMSampleBuffer.h>. A reference-counted Core Foundation object containing: sample data, timing information, format information, and metadata about the sample data. See Code Snippet SP20 and the CoreMedia framework documentation.

AVCaptureVideoDataOutput performance considerations: setSampleBufferDelegate:queue: requires a serial dispatch queue to ensure properly ordered buffer callbacks. Note: don't pass dispatch_get_current_queue(). By default, buffers are emitted in the camera's most efficient format. Set the videoSettings property to specify a custom output format, such as BGRA (see Code Snippet SP16). Hint: both CoreGraphics and OpenGL work well with BGRA.

AVCaptureVideoDataOutput performance considerations: Set the minFrameDuration property to cap the max frame rate. Configure the session to output the lowest practical resolution. Leave the alwaysDiscardsLateVideoFrames property set to YES (the default) for early, efficient dropping of late video frames. Your sample buffer delegate's callback must be FAST! The camera may stop delivering frames if you hold onto buffers for too long.
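Taken together, these settings can be sketched as follows; the queue name and the 15 fps cap are illustrative choices, not values from the session:

```objc
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];

// Request BGRA instead of the camera-native format (costs a conversion).
output.videoSettings = [NSDictionary dictionaryWithObject:
    [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
    forKey:(id)kCVPixelBufferPixelFormatTypeKey];

// Cap delivery at 15 fps (arbitrary; pick the lowest rate you can use).
output.minFrameDuration = CMTimeMake(1, 15);

// YES is the default; late frames are dropped early rather than queued.
output.alwaysDiscardsLateVideoFrames = YES;

// Callbacks must arrive on a serial queue, and never the current queue.
dispatch_queue_t queue = dispatch_queue_create("videoDataQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
```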

AVCaptureSession: the central hub of the AV Foundation capture APIs. The place for adding inputs and outputs. Starts the flow of data when you call -startRunning. Allows quality customization using -sessionPreset:
High: highest recording quality
Medium: suitable for WiFi sharing
Low: suitable for 3G sharing
640x480: VGA
1280x720: 720p HD
Photo: full photo resolution

AVCaptureVideoDataOutput supported resolutions and formats, by preset:
Preset    | iPhone 3G (2vuy, BGRA) | iPhone 3GS (420v, BGRA) | iPhone 4 Back (420v, BGRA) | iPhone 4 Front (420v, BGRA)
High      | 400x304 | 640x480 | 1280x720 | 640x480
Medium    | 400x304 | 480x360 | 480x360  | 480x360
Low       | 400x304 | 192x144 | 192x144  | 192x144
640x480   | N/A     | 640x480 | 640x480  | 640x480
1280x720  | N/A     | N/A     | 1280x720 | N/A
Photo     | N/A     | N/A     | N/A      | N/A

Common Capture Use Cases: Process YUV video frames from the camera. Control the camera, take photos, and record movies. Preview video from the camera to a Core Animation layer. Process PCM audio data from the microphone to draw a waveform.

Demo: AVCam. Controlling the camera, taking still images, and recording video.


AVCam Demo: video and audio AVCaptureDevices feed AVCaptureDeviceInputs into an AVCaptureSession, which drives an AVCaptureVideoPreviewLayer, an AVCaptureStillImageOutput, and an AVCaptureMovieFileOutput.

AVCam Demo: Camera Controls. Focus support. Focus modes: AVCaptureFocusModeLocked, AVCaptureFocusModeAutoFocus, AVCaptureFocusModeContinuousAutoFocus. Use isFocusModeSupported: to determine if the AVCaptureDevice supports a given mode. The adjustingFocus property is key-value observable.

AVCam Demo: Camera Controls. Focus support. focusPointOfInterest is a CGPoint from (0, 0) to (1, 1): top-left is (0, 0) and bottom-right is (1, 1) relative to the sensor's scan direction. The iPhone sensor is mounted portrait, but scans landscape.

AVCam Demo: Camera Controls. Focus support. Setting the focus mode or focus point of interest requires calling lockForConfiguration: on the AVCaptureDevice:
if ( [device isFocusModeSupported:AVCaptureFocusModeLocked] ) {
    if ( YES == [device lockForConfiguration:&error] ) {
        device.focusMode = AVCaptureFocusModeLocked;
        [device unlockForConfiguration];
    }
}
Avoid holding lockForConfiguration: unnecessarily.

AVCam Demo: Camera Controls. Exposure support. Exposure modes: AVCaptureExposureModeLocked, AVCaptureExposureModeContinuousAutoExposure. Use isExposureModeSupported: to determine if the AVCaptureDevice supports a given mode. The adjustingExposure property is key-value observable. Exposure point of interest and focus point of interest are mutually exclusive, as are focus mode and exposure mode. You must call lockForConfiguration: to set exposure mode or point of interest.
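Since adjustingExposure (like adjustingFocus) is key-value observable, watching it might look like the following sketch; the registration line and the log message are illustrative, not from the session:

```objc
// Register once, e.g. after setting up the device (context unused here).
[device addObserver:self forKeyPath:@"adjustingExposure"
            options:NSKeyValueObservingOptionNew context:NULL];

// Standard NSKeyValueObserving callback on the observer.
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    if ([keyPath isEqualToString:@"adjustingExposure"]) {
        BOOL adjusting = [[change objectForKey:NSKeyValueChangeNewKey] boolValue];
        NSLog(@"Exposure is %@", adjusting ? @"converging" : @"settled");
    }
}
```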

AVCam Demo: Camera Controls. White balance support. White balance modes: AVCaptureWhiteBalanceModeLocked, AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance. Use isWhiteBalanceModeSupported: to determine if the AVCaptureDevice supports a given mode. The adjustingWhiteBalance property is key-value observable. You must call lockForConfiguration: to set white balance mode.

AVCam Demo: Camera Controls. Flash support. Flash modes: AVCaptureFlashModeOff, AVCaptureFlashModeOn, AVCaptureFlashModeAuto. Use hasFlash to determine if the AVCaptureDevice has this feature. You must call lockForConfiguration: to set the flash mode.

AVCam Demo: Camera Controls. Torch support. Torch modes: AVCaptureTorchModeOff, AVCaptureTorchModeOn, AVCaptureTorchModeAuto. Use hasTorch to determine if the AVCaptureDevice has this feature. You must call lockForConfiguration: to set the torch mode. *Note* The torch only turns on if the device is associated with a running AVCaptureSession instance.
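Applying the pattern the slides describe, turning the torch on might be sketched as follows (error handling trimmed for brevity):

```objc
NSError *error = nil;
if ([device hasTorch] && [device lockForConfiguration:&error]) {
    device.torchMode = AVCaptureTorchModeOn;   // lights only while the
    [device unlockForConfiguration];           // owning session is running
}
```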

AVCaptureDevice: supported camera controls vary by model (iPhone 3G, iPhone 3GS, iPhone 4 back, iPhone 4 front) across focusMode, focusPointOfInterest, exposureMode, exposurePointOfInterest, whiteBalanceMode, flashMode, and torchMode.

AVCam Demo: Camera switching. Performance consideration: startRunning and stopRunning are synchronous. An AVCaptureSession may be reconfigured while running; use beginConfiguration and commitConfiguration. Follow along in Code Snippet SP21:
// Don't -stopRunning before reconfiguring.
[session beginConfiguration];
[session removeInput:frontFacingCameraDeviceInput];
[session addInput:backFacingCameraDeviceInput];
[session commitConfiguration];
// Changes are only committed when the outermost -commitConfiguration
// is invoked.

AVCam Demo: Movie Recording. Initiate a QuickTime movie recording by supplying a file URL and delegate:
[movieFileOutput startRecordingToOutputFileURL:myURL recordingDelegate:self];
One AVCaptureFileOutputRecordingDelegate method is mandatory:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    // Write resulting movie to the camera roll.
    // See Code Snippet SP24
}

AVCam Demo: Movie Recording. AVCaptureMovieFileOutput writing policy: You must pass a file-based NSURL. You may not pass a URL to an existing file; the movie file output does not overwrite existing resources. You must have permission to write to the URL specified.

AVCam Demo: Movie Recording. Setting limits: -maxRecordedDuration, -maxRecordedFileSize, -minFreeDiskSpaceLimit.
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    // Check the error AND its userInfo dictionary!
    BOOL recordingSuccess = YES; // assume success.
    if ( [error code] != noErr ) {
        id val = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
        if ( val )
            recordingSuccess = [val boolValue];
    }
}

AVCam Demo: Movie Recording. Early recording termination conditions: AVErrorDiskFull, AVErrorDeviceWasDisconnected, AVErrorMaximumDurationReached, AVErrorMaximumFileSizeReached, AVErrorSessionWasInterrupted. See Code Snippet SP24 for more detail.

AVCam Demo: Movie Recording. Metadata: You may set movie-level metadata at any time while recording. This allows slow metadata, such as GPS location, to be acquired after the recording has started.
NSMutableArray *metadata = [[NSMutableArray alloc] initWithCapacity:1];
AVMutableMetadataItem *item = [[AVMutableMetadataItem alloc] init];
item.keySpace = AVMetadataKeySpaceCommon;
item.key = AVMetadataCommonKeyLocation;
item.value = [NSString stringWithFormat:@"%+08.4lf%+09.4lf/",
    location.coordinate.latitude, location.coordinate.longitude];
[metadata addObject:item];
[item release];
movieFileOutput.metadata = metadata;
See Code Snippet SP25 for more metadata examples.

AVCaptureMovieFileOutput supported resolutions and bitrates (H.264 + AAC, except where noted by *):
Preset    | iPhone 3G                 | iPhone 3GS            | iPhone 4 Back         | iPhone 4 Front
High      | No video, *Apple Lossless | 640x480, 3.5 Mbps     | 1280x720, 10.5 Mbps   | 640x480, 3.5 Mbps
Medium    | No video, *Apple Lossless | 480x360, 700 kbps     | 480x360, 700 kbps     | 480x360, 700 kbps
Low       | No video, *Apple Lossless | 192x144, 128 kbps     | 192x144, 128 kbps     | 192x144, 128 kbps
640x480   | No video, *Apple Lossless | 640x480, 3.5 Mbps     | 640x480, 3.5 Mbps     | 640x480, 3.5 Mbps
1280x720  | No video, *Apple Lossless | No video, 64 kbps AAC | No video, 64 kbps AAC | No video, 64 kbps AAC
Photo     | N/A                       | N/A                   | N/A                   | N/A

AVCam Demo: Photos. AVCaptureStillImageOutput considerations: Uses a block-style completion handler to deliver image data as a CMSampleBuffer. The sample buffer contains useful still image metadata.
[stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
    completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer,
            kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments) {
            // Attachments may be read or additional ones written
        }
    }];

AVCam Demo: Photos. Performance considerations: Use -availableImageDataCVPixelFormatTypes and -availableImageDataCodecTypes to discover supported output formats. Set your desired output format using outputSettings. If your final destination still image format is JPEG, let AVCaptureStillImageOutput do the compression for you. Use +jpegStillImageNSDataRepresentation: to write a JPEG sample buffer to an NSData without recompression, merging in all metadata buffer attachments.
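Inside the completion handler shown on the previous slide, writing the JPEG out might look like this sketch (the temporary-directory destination is our placeholder):

```objc
// Let the still image output keep its own JPEG compression and merge
// metadata attachments, then write the bytes to disk.
NSData *jpegData =
    [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
NSString *path =
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"photo.jpg"];
[jpegData writeToFile:path atomically:YES];
```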

AVCaptureStillImageOutput supported resolutions and formats, by preset:
Preset    | iPhone 3G (yuvs, 2vuy, BGRA, jpeg) | iPhone 3GS (420f, 420v, BGRA, jpeg) | iPhone 4 Back (420f, 420v, BGRA, jpeg) | iPhone 4 Front (420f, 420v, BGRA, jpeg)
High      | 400x304   | 640x480   | 1280x720  | 640x480
Medium    | 400x304   | 480x360   | 480x360   | 480x360
Low       | 400x304   | 192x144   | 192x144   | 192x144
640x480   | N/A       | 640x480   | 640x480   | 640x480
1280x720  | N/A       | N/A       | 1280x720  | N/A
Photo     | 1600x1200 | 2048x1536 | 2592x1936 | 640x480

Common Capture Use Cases: Process YUV video frames from the camera. Control the camera, take photos, and record movies. Preview video from the camera to a Core Animation layer. Process PCM audio data from the microphone to draw a waveform.

Demo: PinchyPreview. Previewing images from the camera using AVCaptureVideoPreviewLayer.


PinchyPreview: a video AVCaptureDevice feeds an AVCaptureDeviceInput into an AVCaptureSession, which drives an AVCaptureVideoPreviewLayer.

PinchyPreview. AVCaptureVideoPreviewLayer considerations: Unlike AVCaptureOutput, it retains and owns its AVCaptureSession. Behaves like any other CALayer in a rendering tree. You may set the orientation property to tell the preview layer how it should rotate images coming from the camera. On iPhone 4, the preview layer supports mirroring, which is the default when previewing from the front-facing camera.

PinchyPreview. AVCaptureVideoPreviewLayer considerations: Supports three video gravity modes: AVLayerVideoGravityResizeAspect, AVLayerVideoGravityResizeAspectFill, AVLayerVideoGravityResize.
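A minimal sketch of wiring a preview layer to a session and choosing a gravity mode (the host view is assumed to exist):

```objc
AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.frame = view.bounds;                                // fill the host view
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // crop to fill
[view.layer addSublayer:previewLayer];
```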


AVCaptureVideoPreviewLayer: considerations when driving tap-to-focus using a preview. When mapping a focus touch from a video preview, you must account for: preview orientation; preview video gravity (aspect fit/fill/stretch); mirroring. Remember the sensor scan space runs from (0, 0) to (1, 1) along the scan direction. Refer to AVCam Demo.
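As a hedged illustration of the simplest case only: an unrotated, unmirrored preview using AVLayerVideoGravityResize, where view coordinates map linearly onto the (0, 0) to (1, 1) scan space. Every other combination needs the fuller mapping in the AVCam Demo sample.

```objc
// Hypothetical tap handler; recognizer, previewView, and device are
// assumed to exist elsewhere in the app.
CGPoint tap = [recognizer locationInView:previewView];
CGPoint poi = CGPointMake(tap.x / previewView.bounds.size.width,
                          tap.y / previewView.bounds.size.height);
if ([device isFocusPointOfInterestSupported] &&
    [device lockForConfiguration:NULL]) {
    device.focusPointOfInterest = poi;
    device.focusMode = AVCaptureFocusModeAutoFocus;
    [device unlockForConfiguration];
}
```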

Common Capture Use Cases: Process YUV video frames from the camera. Control the camera, take photos, and record movies. Preview video from the camera to a Core Animation layer. Process PCM audio data from the microphone to draw a waveform.

Demo: Wavy. Audio sample processing using AVCaptureAudioDataOutput.


Wavy: an audio AVCaptureDevice feeds an AVCaptureDeviceInput into an AVCaptureSession, which drives an AVCaptureAudioDataOutput that emits CMSampleBuffers.

AVCaptureAudioDataOutput performance considerations: setSampleBufferDelegate:queue: requires a serial dispatch queue to ensure properly ordered buffer callbacks. Note: don't pass dispatch_get_current_queue(). Emitted sample buffers contain interleaved 16-bit signed integer PCM samples, mono or stereo, at the audio hardware's sample rate. Unlike AVCaptureVideoDataOutput, there is no affordance for dropping samples. Be FAST in your delegate callback!

AVCaptureAudioDataOutput CMSampleBuffer usage: Contains no image buffer, unlike AVCaptureVideoDataOutput; uses CMBlockBuffer instead.
CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
char *dataPointer = NULL;
size_t lengthAtOffset, totalLength;
OSStatus err = CMBlockBufferGetDataPointer(blockBuffer,
    0, // offset
    &lengthAtOffset,
    &totalLength,
    &dataPointer);
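To draw a waveform or level meter, the delegate has to scan those PCM samples itself. A sketch of the scan in plain C (the helper name is ours; in the callback you would pass dataPointer and totalLength / sizeof(int16_t)):

```c
#include <stdint.h>
#include <stddef.h>

/* Peak absolute amplitude of interleaved 16-bit signed PCM samples,
   normalized to 0.0 .. 1.0. Helper name and normalization are our
   illustration, not AV Foundation API. */
float peak_level(const int16_t *samples, size_t count)
{
    int32_t peak = 0;
    for (size_t i = 0; i < count; i++) {
        int32_t v = samples[i];
        if (v < 0) v = -v;          /* abs(), safe in 32-bit */
        if (v > peak) peak = v;
    }
    return (float)peak / 32768.0f;  /* normalize against int16 range */
}
```

Keeping this loop tight matters: it runs inside the delegate callback, which (as the slide above warns) must return quickly.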

Multitasking Considerations: You may record/process audio in the background if you set the appropriate UIBackgroundModes in your Info.plist.
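For audio capture, the relevant Info.plist fragment looks like this:

```xml
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```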

Multitasking Considerations: Users see the double-height red status bar when your app runs the audio device in the background.

Multitasking Considerations: Background apps may not record/process video from the cameras. When backgrounded, apps running the camera receive AVCaptureSessionWasInterruptedNotification and AVCaptureSessionDidStopRunningNotification, and movie recordings in progress are stopped. When re-entering the foreground, apps running the camera receive AVCaptureSessionInterruptionEndedNotification and AVCaptureSessionDidStartRunningNotification. The running and interrupted states are also key-value observable properties.

AVCaptureConnections: the glue that holds the session, inputs, and outputs together.


Using AVCaptureConnections: video and audio AVCaptureDevices feed AVCaptureDeviceInputs into an AVCaptureSession; AVCaptureConnections link the session to its outputs (an AVCaptureStillImageOutput and an AVCaptureMovieFileOutput), while an AVCaptureVideoPreviewLayer previews from the session.

Purpose of AVCaptureConnection: Identifies a specific stream of captured media. Allows you to control exactly what is captured by enabling or disabling the connection. An audio connection allows you to monitor audio levels. A video connection allows you to affect the video delivered to the output: video orientation and video mirroring. *Note: AVCaptureVideoDataOutput does not support setVideoOrientation or setVideoMirrored on its connection.
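For example, locking recorded movies to portrait by iterating the movie file output's connections (a sketch; as noted above, AVCaptureVideoDataOutput connections would not support this):

```objc
for (AVCaptureConnection *conn in [movieFileOutput connections]) {
    if ([conn isVideoOrientationSupported]) {
        conn.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
}
```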

AVCaptureConnection as Status Monitor: AVCaptureConnection exposes the current state of the media stream while the session is running. Refer to Code Snippet SP27:
AVCaptureConnection *connection; // ... find an audio connection ...
NSArray *audioChans = connection.audioChannels;
for (AVCaptureAudioChannel *channel in audioChans) {
    float avg = channel.averagePowerLevel;
    float peak = channel.peakHoldLevel;
    // update level meter UI
}

AVCaptureConnection as an Identifier: An AVCaptureConnection instance refers to a specific stream in output delegate APIs, e.g. for AVCaptureMovieFileOutput:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
}

Finding AVCaptureConnections: AVCaptureOutput implements - (NSArray *)connections;
Example:
NSArray *movieFileOutputConnections = [movieFileOutput connections];

Disabling Connections. Follow along in Code Snippet SP26:
- (void)disableOutputConnection:(AVCaptureOutput *)output connectionWithMediaType:(NSString *)mediaType
{
    NSArray *connections = output.connections;
    for ( AVCaptureConnection *conn in connections ) {
        // Ask the current connection's input for its media type
        NSArray *inputPorts = conn.inputPorts;
        for ( AVCaptureInputPort *port in inputPorts ) {
            if ( [port.mediaType isEqual:mediaType] ) {
                // Found a match. Disable the connection.
                conn.enabled = NO;
                return;
            }
        }
    }
}

Summary: Process YUV video frames from the camera (AVCaptureVideoDataOutput). Control the camera, take photos, and record movies (AVCaptureDevice, AVCaptureStillImageOutput, AVCaptureMovieFileOutput). Preview video from the camera to a Core Animation layer (AVCaptureVideoPreviewLayer). Process PCM audio data from the microphone to draw a waveform (AVCaptureAudioDataOutput).

More Information: Eryk Vershen, Media Technologies Evangelist, evershen@apple.com. Apple Developer Forums: http://devforums.apple.com

Related Sessions: Discovering AV Foundation (Presidio, Tuesday 2:00 PM). Editing Media with AV Foundation (Presidio, Tuesday 3:15 PM). Incorporating the Camera and Photo Library in your App (Presidio, Thursday 9:00 AM). Discovering AV Foundation, repeat (Nob Hill, Thursday 4:30 PM).

Labs: AV Foundation Lab (Graphics and Media Lab C, Wednesday 9:00 AM). AV Foundation Lab (Graphics and Media Lab B, Friday 11:30 AM). Core Animation Lab (Graphics and Media Lab D, Thursday 3:15 PM).
