Recording Video with AVFoundation in Swift

Low-level control of the entire capture pipeline, from camera selection to file writing

0  Overview

 Apple’s AVFoundation framework gives you low-level, fine-grained control over the entire capture pipeline—from selecting camera hardware to writing compressed movie files. Because it exposes every stage of the pipeline, you can build anything from a simple “Record” button to a pro-grade camera with live filters, manual exposure, and multi-camera picture-in-picture.

1  Prerequisites & Privacy

 Add the following keys to Info.plist, or the app will be terminated the first time it requests the camera or microphone: NSCameraUsageDescription, NSMicrophoneUsageDescription, and, if you save recordings to the Photos library as the examples below do, NSPhotoLibraryAddUsageDescription. Each value is the user-facing string shown in the permission prompt.

 Request authorization with AVCaptureDevice.requestAccess(for: .video) and .audio before configuring the session.
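
 A combined check might look like the following sketch; the helper name and completion-based flow are illustrative, not an Apple API:

func requestCaptureAuthorization(completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        // Camera already granted; movie recording still needs the microphone.
        AVCaptureDevice.requestAccess(for: .audio, completionHandler: completion)
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .video) { videoGranted in
            guard videoGranted else { completion(false); return }
            AVCaptureDevice.requestAccess(for: .audio, completionHandler: completion)
        }
    default:
        // Access was denied or restricted; direct the user to Settings.
        completion(false)
    }
}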

2  Anatomy of an AVCaptureSession

AVCaptureSession acts as the traffic-cop that routes media from one or more inputs (AVCaptureDeviceInput) to one or more outputs (e.g. AVCaptureMovieFileOutput).

 Always wrap configuration changes in beginConfiguration()/commitConfiguration() so the session applies them as a single atomic batch, rather than reconfiguring (and potentially dropping frames) after every individual change.


// 1. Create the session
let session = AVCaptureSession()
session.beginConfiguration()
session.sessionPreset = .high   // or .hd4K3840x2160 for 4K

// 2. Select devices
guard let cam = AVCaptureDevice.default(.builtInWideAngleCamera,
                                        for: .video, position: .back),
      let mic = AVCaptureDevice.default(for: .audio) else { fatalError() }

// 3. Wrap in inputs
let videoIn  = try AVCaptureDeviceInput(device: cam)
let audioIn  = try AVCaptureDeviceInput(device: mic)

// 4. Add inputs to the session
for input in [videoIn, audioIn] where session.canAddInput(input) {
    session.addInput(input)
}

// 5. Create & add an output
let movieOut = AVCaptureMovieFileOutput()
if session.canAddOutput(movieOut) { session.addOutput(movieOut) }

session.commitConfiguration()     // 6. Apply atomically

3  Discovering & Configuring Devices

AVCaptureDevice represents a physical (or virtual) camera or microphone. Use AVCaptureDevice.DiscoverySession to enumerate available cameras, then lock the device for configuration before changing focus, exposure, HDR, or torch settings.


// Switch between front & back cameras at runtime
func switchCamera(in session: AVCaptureSession) throws {
    guard let current = session.inputs.compactMap({ $0 as? AVCaptureDeviceInput })
                                   .first(where:{ $0.device.hasMediaType(.video) })
    else { return }

    let newPos: AVCaptureDevice.Position = current.device.position == .back ? .front : .back
    guard let newDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                  for: .video, position: newPos)
    else { return }

    let newInput = try AVCaptureDeviceInput(device: newDevice)

    session.beginConfiguration()
    session.removeInput(current)
    if session.canAddInput(newInput) {
        session.addInput(newInput)
    } else {
        session.addInput(current)   // keep the previous camera if the swap fails
    }
    session.commitConfiguration()
}
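
To make the discovery and lock/unlock dance concrete, here is a minimal sketch; the device types, focus mode, and torch level chosen below are only examples:

let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .builtInUltraWideCamera, .builtInTelephotoCamera],
    mediaType: .video,
    position: .unspecified)

for device in discovery.devices {
    print(device.localizedName, device.position.rawValue)
}

// Lock before touching focus, exposure, or torch; unlock when done.
if let camera = discovery.devices.first(where: { $0.position == .back }) {
    do {
        try camera.lockForConfiguration()
        if camera.isFocusModeSupported(.continuousAutoFocus) {
            camera.focusMode = .continuousAutoFocus
        }
        if camera.hasTorch && camera.isTorchModeSupported(.on) {
            try camera.setTorchModeOn(level: 0.5)   // illustrative level
        }
        camera.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration:", error)
    }
}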

4  Displaying a Live Preview

 Create an AVCaptureVideoPreviewLayer, add it as a sublayer of your view’s layer, and keep its frame and orientation in sync with interface rotation:


let preview = AVCaptureVideoPreviewLayer(session: session)
preview.videoGravity = .resizeAspectFill
preview.connection?.videoOrientation = .portrait
view.layer.insertSublayer(preview, at: 0)
preview.frame = view.bounds          // update this in viewDidLayoutSubviews

5  Recording with AVCaptureMovieFileOutput

AVCaptureMovieFileOutput writes H.264/HEVC QuickTime movies directly to disk and fires delegate callbacks for start, finish, and errors.


import AVFoundation
import Photos

class Recorder: NSObject, AVCaptureFileOutputRecordingDelegate {

    private let movieFileOutput = AVCaptureMovieFileOutput()
    private let session = AVCaptureSession()

    func startSession() {
        // (Set up inputs as in Section 2.)
        if session.canAddOutput(movieFileOutput) {
            session.addOutput(movieFileOutput)
        }
        session.startRunning()
    }

    func startRecording() {
        let tmpURL = FileManager.default.temporaryDirectory
                     .appendingPathComponent(UUID().uuidString)
                     .appendingPathExtension("mov")
        movieFileOutput.startRecording(to: tmpURL, recordingDelegate: self)
    }

    func stopRecording() { movieFileOutput.stopRecording() }

    // MARK: - Delegate
    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {

        if let error = error { print("Recording failed:", error) }
        else {
            // Save to Photos (requires the NSPhotoLibraryAddUsageDescription key)
            PHPhotoLibrary.shared().performChanges({
                PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: outputFileURL)
            }, completionHandler: { _, saveError in
                if let saveError = saveError { print("Could not save to Photos:", saveError) }
            })
        }
    }
}

6  Handling Interruptions & Errors

 Observe the AVCaptureSessionRuntimeError, AVCaptureSessionWasInterrupted, and AVCaptureSessionInterruptionEnded notifications to pause the UI and attempt recovery. The userInfo dictionary contains AVCaptureSessionErrorKey or AVCaptureSessionInterruptionReasonKey.
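
A sketch of the observer setup, assuming a session constant like the one built in Section 2; keep the returned tokens alive for as long as you want to observe:

var observers: [NSObjectProtocol] = []
let center = NotificationCenter.default

observers.append(center.addObserver(forName: .AVCaptureSessionRuntimeError,
                                    object: session, queue: .main) { note in
    if let error = note.userInfo?[AVCaptureSessionErrorKey] as? AVError {
        print("Capture runtime error:", error)
        // A media-services reset is usually recoverable by restarting the session.
    }
})

observers.append(center.addObserver(forName: .AVCaptureSessionWasInterrupted,
                                    object: session, queue: .main) { note in
    if let value = note.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
       let reason = AVCaptureSession.InterruptionReason(rawValue: value) {
        print("Session interrupted:", reason)   // e.g. phone call, multitasking camera use
    }
})

observers.append(center.addObserver(forName: .AVCaptureSessionInterruptionEnded,
                                    object: session, queue: .main) { _ in
    print("Interruption ended; resume the UI")
})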

7  Advanced Configuration Tips

High-Frame-Rate & HDR – Find a format whose videoSupportedFrameRateRanges includes the target rate (for example 240 fps), lock the device, then set activeFormat along with both activeVideoMinFrameDuration and activeVideoMaxFrameDuration (a sketch follows these tips).

HEVC or ProRes – Configure movieFileOutput.setOutputSettings(_:for:) with AVVideoCodecKey : AVVideoCodecType.hevc on devices that support HEVC encoding (check availableVideoCodecTypes first); ProRes capture is limited to recent Pro-model iPhones.

Multiple Cameras – Use AVCaptureMultiCamSession (iOS 13 +) to stream two sensors simultaneously (picture-in-picture).

Synchronizing A/V – For pro-grade sync or segmented recordings, use AVAssetWriter instead of AVCaptureMovieFileOutput. If you notice drift, verify you are not running two separate sessions for audio and video; this is a common pitfall discussed on the Apple Developer Forums.
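
For the first two tips, a minimal sketch; the 240 fps target, the helper names, and the HEVC check are illustrative, and availability varies by device (movieOut is the output created in Section 2):

import AVFoundation

func enableHighFrameRate(on device: AVCaptureDevice, fps: Double = 240) throws {
    guard let format = device.formats.first(where: { format in
        format.videoSupportedFrameRateRanges.contains { $0.maxFrameRate >= fps }
    }) else { return }                        // no matching format on this device

    try device.lockForConfiguration()
    device.activeFormat = format
    let frameDuration = CMTime(value: 1, timescale: CMTimeScale(fps))
    device.activeVideoMinFrameDuration = frameDuration
    device.activeVideoMaxFrameDuration = frameDuration
    device.unlockForConfiguration()
}

func preferHEVC(on movieOut: AVCaptureMovieFileOutput) {
    if let connection = movieOut.connection(with: .video),
       movieOut.availableVideoCodecTypes.contains(.hevc) {
        movieOut.setOutputSettings([AVVideoCodecKey: AVVideoCodecType.hevc], for: connection)
    }
}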

8  Complete Minimal Recorder Class


import UIKit
import AVFoundation
import Photos

final class CameraVC: UIViewController, AVCaptureFileOutputRecordingDelegate {

    private let session = AVCaptureSession()
    private let movieOut = AVCaptureMovieFileOutput()
    private var preview: AVCaptureVideoPreviewLayer!

    override func viewDidLoad() {
        super.viewDidLoad()
        configureSession()
        configurePreview()
        session.startRunning()   // blocks until capture begins; prefer a background queue in production
    }

    private func configureSession() {
        session.beginConfiguration()
        session.sessionPreset = .high

        guard
            let cam = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
            let mic = AVCaptureDevice.default(for: .audio),
            let camInput = try? AVCaptureDeviceInput(device: cam),
            let micInput = try? AVCaptureDeviceInput(device: mic)
        else {
            session.commitConfiguration()
            return   // e.g. a device without a back camera or microphone
        }

        if session.canAddInput(camInput) { session.addInput(camInput) }
        if session.canAddInput(micInput) { session.addInput(micInput) }
        if session.canAddOutput(movieOut) { session.addOutput(movieOut) }
        session.commitConfiguration()
    }

    private func configurePreview() {
        preview = AVCaptureVideoPreviewLayer(session: session)
        preview.videoGravity = .resizeAspectFill
        view.layer.addSublayer(preview)
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        preview.frame = view.bounds
    }

    @IBAction func recordTapped(_ sender: UIButton) {
        if movieOut.isRecording { movieOut.stopRecording() }
        else {
            let url = FileManager.default.temporaryDirectory
                            .appendingPathComponent(UUID().uuidString)
                            .appendingPathExtension("mov")
            movieOut.startRecording(to: url, recordingDelegate: self)
        }
    }

    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo url: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        guard error == nil else { print("Recording failed:", error!); return }
        PHPhotoLibrary.shared().performChanges({
            PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: url)
        }, completionHandler: nil)
    }
}

9  Best-Practice Checklist

  • session.startRunning() blocks until capture is live; call it on a dedicated background queue so the UI stays responsive.
  • Watch the thermal state via ProcessInfo.processInfo.thermalState and back off resolution or frame rate under pressure (see the sketch after this list).
  • Use Background Tasks (BGProcessingTaskRequest) to finish export when the user leaves the app.
  • Persist manual camera settings so the next launch restores user preference.
  • Gracefully handle devices that lack a front camera or specific presets.
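
A minimal thermal-state observer might look like the following; dropping to a cheaper preset is just one possible mitigation:

import Foundation
import AVFoundation

func observeThermalState(for session: AVCaptureSession) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: ProcessInfo.thermalStateDidChangeNotification,
        object: nil, queue: .main) { _ in
        let state = ProcessInfo.processInfo.thermalState
        if state == .serious || state == .critical {
            session.beginConfiguration()
            session.sessionPreset = .medium   // example mitigation, not a rule
            session.commitConfiguration()
        }
    }
}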

10  Further Reading & Samples

 • AVCam: Building a Camera App sample illustrates photos and movies with advanced controls.

 • Setting Up a Capture Session guide walks through every configuration phase.

 • API References: AVCaptureSession, AVCaptureDevice, AVCaptureDeviceInput, AVCaptureMovieFileOutput, and AVCaptureFileOutputRecordingDelegate for full property lists.