
Implementing Multi‑Camera Capture on iOS with AVCaptureMultiCamSession

This article explains how to implement dual‑camera capture on iOS 13+ using AVCaptureMultiCamSession, covering device limits, session configuration, code examples for camera setup, resolution adjustment, audio handling, system‑pressure monitoring, and performance tips for outdoor live streaming.

Huajiao Technology

With the rise of short‑video and live‑streaming apps, using multiple cameras simultaneously has become a crucial feature for iOS developers. iOS 13 introduced support for dual‑camera capture on newer iPhone and iPad models, enabling outdoor broadcasters to combine front‑ and rear‑camera feeds into a single stream.

Device and system requirements:

iPhone models with A12 chip or later (iPhone XS, XS Max, XR, etc.)

iPad Pro 2019 (A12X) or later

iOS 13 or newer

Multi‑camera session creation replaces the usual AVCaptureSession with AVCaptureMultiCamSession. The session must be configured with separate inputs and outputs for the front and back cameras, as well as the microphone.

- (void)configSession {
    // Check if multi‑camera is supported
    if (AVCaptureMultiCamSession.isMultiCamSupported == NO) {
        return;
    }
    // Create the multi‑camera session
    self.cameraSession = [[AVCaptureMultiCamSession alloc] init];
    [self.cameraSession beginConfiguration];
    if ([self configBackCamera] == NO) {
        [self.cameraSession commitConfiguration];
        return;
    }
    if ([self configFrontCamera] == NO) {
        [self.cameraSession commitConfiguration];
        return;
    }
    if ([self configMicrophone] == NO) {
        [self.cameraSession commitConfiguration];
        return;
    }
    [self.cameraSession commitConfiguration];
}

The configFrontCamera method demonstrates how to add a front‑camera input without automatically creating connections, then manually create and configure the AVCaptureConnection so that the preview layer is mirrored correctly.

- (BOOL)configFrontCamera {
    AVCaptureDevice *frontCamera = [self.class getCaptureDeviceWithPosition:AVCaptureDevicePositionFront];
    if (frontCamera == nil) { return NO; }
    NSError *error = nil;
    self.frontDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:frontCamera error:&error];
    if (error != nil || ![self.cameraSession canAddInput:self.frontDeviceInput]) { return NO; }
    [self.cameraSession addInputWithNoConnections:self.frontDeviceInput];

    self.frontVideoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    self.frontVideoDataOutput.videoSettings = @{ (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [self.frontVideoDataOutput setSampleBufferDelegate:self queue:self.dataOutputQueue];
    if (![self.cameraSession canAddOutput:self.frontVideoDataOutput]) { return NO; }
    [self.cameraSession addOutputWithNoConnections:self.frontVideoDataOutput];

    AVCaptureInputPort *port = [[self.frontDeviceInput portsWithMediaType:AVMediaTypeVideo sourceDeviceType:frontCamera.deviceType sourceDevicePosition:frontCamera.position] firstObject];
    AVCaptureConnection *frontConnection = [[AVCaptureConnection alloc] initWithInputPorts:@[port] output:self.frontVideoDataOutput];
    if (![self.cameraSession canAddConnection:frontConnection]) { return NO; }
    [self.cameraSession addConnection:frontConnection];
    [frontConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    [frontConnection setAutomaticallyAdjustsVideoMirroring:NO];
    [frontConnection setVideoMirrored:YES];

    self.frontPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSessionWithNoConnection:self.cameraSession];
    AVCaptureConnection *frontPreviewLayerConnection = [[AVCaptureConnection alloc] initWithInputPort:port videoPreviewLayer:self.frontPreviewLayer];
    [frontPreviewLayerConnection setAutomaticallyAdjustsVideoMirroring:NO];
    [frontPreviewLayerConnection setVideoMirrored:YES];
    if (![self.cameraSession canAddConnection:frontPreviewLayerConnection]) { return NO; }
    self.frontPreviewLayer.frame = CGRectMake(30, 30, 180, 320);
    [self.containerView.layer addSublayer:self.frontPreviewLayer];
    [self.cameraSession addConnection:frontPreviewLayerConnection];
    return YES;
}
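The getCaptureDeviceWithPosition: helper referenced above is not shown in the article; a minimal sketch using AVCaptureDeviceDiscoverySession (the method name simply mirrors the call site, and the wide-angle device type is an assumption) might be:

```objc
+ (AVCaptureDevice *)getCaptureDeviceWithPosition:(AVCaptureDevicePosition)position {
    // Discover the built-in wide-angle camera at the requested position
    AVCaptureDeviceDiscoverySession *discovery =
        [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                               mediaType:AVMediaTypeVideo
                                                                position:position];
    return discovery.devices.firstObject; // nil if no matching camera exists
}
```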

Because AVCaptureMultiCamSession does not support setSessionPreset:, each AVCaptureDevice must be configured individually for resolution and frame rate.

- (BOOL)reduceResolutionForCamera:(AVCaptureDevicePosition)position {
    for (AVCaptureConnection *connect in self.cameraSession.connections) {
        for (AVCaptureInputPort *inputPort in connect.inputPorts) {
            if ([inputPort.mediaType isEqualToString:AVMediaTypeVideo] && inputPort.sourceDevicePosition == position) {
                AVCaptureDeviceInput *videoDeviceInput = (AVCaptureDeviceInput *)inputPort.input;
                NSArray *formats = videoDeviceInput.device.formats;
                for (AVCaptureDeviceFormat *format in formats) {
                    if (format.isMultiCamSupported) {
                        CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
                        if (dimensions.width == 1280 && dimensions.height == 720) {
                            NSError *error = nil;
                            [self.cameraSession beginConfiguration];
                            if ([videoDeviceInput.device lockForConfiguration:&error]) {
                                videoDeviceInput.device.activeFormat = format;
                                [videoDeviceInput.device setActiveVideoMinFrameDuration:CMTimeMake(1, 15)];
                                [videoDeviceInput.device setActiveVideoMaxFrameDuration:CMTimeMake(1, 15)];
                                [videoDeviceInput.device unlockForConfiguration];
                                [self.cameraSession commitConfiguration];
                                return YES;
                            }
                            [self.cameraSession commitConfiguration];
                        }
                    }
                }
            }
        }
    }
    return NO;
}

Data callbacks remain the same as in previous versions; the delegate receives sample buffers from each video output.

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (output == self.frontVideoDataOutput) {
        // Front-camera frame: hand the pixel buffer to your renderer/encoder here
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    } else if (output == self.backVideoDataOutput) {
        // Back-camera frame: process independently of the front stream
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    }
}

Audio capture follows the same pattern: the front camera uses the front microphone, the rear camera uses the rear microphone, and a dedicated surround mic can be selected via AVAudioSession .
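Following that same no-connections pattern, the configMicrophone method called during session setup could look like the sketch below (the micDeviceInput and audioDataOutput property names are assumptions, not from the original):

```objc
- (BOOL)configMicrophone {
    AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    if (mic == nil) { return NO; }
    NSError *error = nil;
    self.micDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:mic error:&error];
    if (error != nil || ![self.cameraSession canAddInput:self.micDeviceInput]) { return NO; }
    [self.cameraSession addInputWithNoConnections:self.micDeviceInput];

    // Audio output, delegating sample buffers to the shared data queue
    self.audioDataOutput = [[AVCaptureAudioDataOutput alloc] init];
    [self.audioDataOutput setSampleBufferDelegate:self queue:self.dataOutputQueue];
    if (![self.cameraSession canAddOutput:self.audioDataOutput]) { return NO; }
    [self.cameraSession addOutputWithNoConnections:self.audioDataOutput];

    // Manually connect the mic's audio port to the audio output
    AVCaptureInputPort *port = [[self.micDeviceInput portsWithMediaType:AVMediaTypeAudio
                                                       sourceDeviceType:mic.deviceType
                                                   sourceDevicePosition:mic.position] firstObject];
    AVCaptureConnection *audioConnection = [[AVCaptureConnection alloc] initWithInputPorts:@[port]
                                                                                    output:self.audioDataOutput];
    if (![self.cameraSession canAddConnection:audioConnection]) { return NO; }
    [self.cameraSession addConnection:audioConnection];
    return YES;
}
```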

Important notes:

When adding inputs or outputs, use addInputWithNoConnections: and addOutputWithNoConnections: to avoid automatic connection creation.

Initialize AVCaptureVideoPreviewLayer with initWithSessionWithNoConnection:.

Manually add each AVCaptureConnection you need.

iOS supports a single session with multiple cameras; macOS can support multiple sessions.
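Once configuration commits successfully, the session is started like any other capture session. Because startRunning blocks until capture begins, it is usually dispatched off the main thread (sessionQueue here is an assumed serial dispatch queue):

```objc
dispatch_async(self.sessionQueue, ^{
    // startRunning blocks until all inputs are streaming, so keep it off the main queue
    [self.cameraSession startRunning];
});
```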

Running multiple cameras increases CPU, GPU, and battery load, which can trigger system‑pressure warnings. The pressure levels are:

AVCaptureSystemPressureLevelNominal   // normal
AVCaptureSystemPressureLevelFair      // slightly elevated
AVCaptureSystemPressureLevelSerious   // highly elevated
AVCaptureSystemPressureLevelCritical  // critically elevated
AVCaptureSystemPressureLevelShutdown // beyond critical

System pressure can be observed via KVO on the systemPressureState property.

- (void)addObserver {
    if (@available(iOS 11.0, *)) {
        [self.inputCamera addObserver:self forKeyPath:@"systemPressureState" options:NSKeyValueObservingOptionNew context:nil];
    }
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"systemPressureState"]) {
        if (@available(iOS 11.0, *)) {
            AVCaptureSystemPressureState *state = change[NSKeyValueChangeNewKey];
            NSDictionary *dict = @{ @"AVCaptureSystemPressureLevel" : state.level, @"AVCaptureSystemPressureFactors" : @(state.factors) };
            // react to high pressure here
        }
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}

When pressure becomes too high, you can mitigate it by lowering the frame rate or resolution, reducing other GPU/CPU work, or disabling one camera's input port (e.g., frontCameraInputVideoPort.enabled = NO;).
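As a sketch of that mitigation strategy, the KVO callback could hand the new state to a handler like the following (handleSystemPressure: is an illustrative name, not from the original):

```objc
- (void)handleSystemPressure:(AVCaptureSystemPressureState *)state {
    if ([state.level isEqualToString:AVCaptureSystemPressureLevelSerious]) {
        // Ease thermal load by dropping the front camera to ~10 fps
        AVCaptureDevice *device = self.frontDeviceInput.device;
        if ([device lockForConfiguration:nil]) {
            device.activeVideoMinFrameDuration = CMTimeMake(1, 10);
            device.activeVideoMaxFrameDuration = CMTimeMake(1, 10);
            [device unlockForConfiguration];
        }
    } else if ([state.level isEqualToString:AVCaptureSystemPressureLevelShutdown]) {
        // The system is about to stop capture anyway; shut down cleanly
        [self.cameraSession stopRunning];
    }
}
```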

For a complete example, see the MulitCameraTest repository on GitHub.

References:

Introducing Multi‑Camera Capture for iOS – WWDC 2019 (https://developer.apple.com/videos/play/wwdc2019/249/)

AVMultiCamPiP – Apple Documentation (https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/avmulticampip_capturing_from_multiple_cameras?language=objc)

Written by

Huajiao Technology

The Huajiao Technology channel shares the latest Huajiao app tech on an irregular basis, offering a learning and exchange platform for tech enthusiasts.
