AVFoundation is Apple's framework for managing and processing time-based audiovisual media. Unlike higher-level APIs such as AVAudioPlayer, it offers granular control over audio and video through components like AVAudioEngine, AVAudioSession, AVAsset, and more. This guide focuses on **low-level audio tasks**: real-time audio playback, recording, effects processing, synthesis, and mixing using AVAudioEngine and related classes.
Before doing anything with audio on iOS, you must configure the shared audio session. (AVAudioSession is available on iOS, tvOS, and watchOS; macOS apps skip this step.)
```swift
import AVFoundation

// Configure the audio session for simultaneous playback and recording
let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(.playAndRecord, mode: .default)
    try audioSession.setActive(true)
} catch {
    print("Failed to set up audio session: \(error)")
}
```
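A robust app should also react when the system interrupts its session (for example, during a phone call). A minimal sketch of observing `AVAudioSession.interruptionNotification` (the handling logic shown is illustrative):

```swift
import AVFoundation

// Observe audio session interruptions so playback/recording can pause and resume.
NotificationCenter.default.addObserver(
    forName: AVAudioSession.interruptionNotification,
    object: AVAudioSession.sharedInstance(),
    queue: .main
) { notification in
    guard let info = notification.userInfo,
          let typeValue = info[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSession.InterruptionType(rawValue: typeValue) else { return }
    switch type {
    case .began:
        // Pause your players or stop recording here.
        print("Interruption began")
    case .ended:
        // Resume only if the system suggests it.
        if let optionsValue = info[AVAudioSessionInterruptionOptionKey] as? UInt,
           AVAudioSession.InterruptionOptions(rawValue: optionsValue).contains(.shouldResume) {
            print("Interruption ended; safe to resume")
        }
    @unknown default:
        break
    }
}
```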
We can use AVAudioPlayerNode to play audio buffers or files:
```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()

// Attach the player and connect it to the output
engine.attach(player)
engine.connect(player, to: engine.outputNode, format: nil)

guard let fileURL = Bundle.main.url(forResource: "example", withExtension: "mp3") else { return }
let audioFile = try AVAudioFile(forReading: fileURL)

try engine.start()

player.scheduleFile(audioFile, at: nil, completionHandler: nil)
player.play()
```
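For seamless looping, read the file into an AVAudioPCMBuffer and schedule it with the `.loops` option. A sketch as a helper function (the function name is illustrative):

```swift
import AVFoundation

// Reads an entire audio file into a PCM buffer and schedules it to loop forever.
func scheduleLoopingFile(on player: AVAudioPlayerNode, fileURL: URL) throws {
    let file = try AVAudioFile(forReading: fileURL)

    // Allocate a buffer large enough for the whole file.
    guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                        frameCapacity: AVAudioFrameCount(file.length)) else {
        return
    }
    try file.read(into: buffer)

    // .loops replays the buffer indefinitely with no gap between iterations.
    player.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)
}
```

Call this in place of `scheduleFile` before `player.play()` when you want the track to repeat.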
Use the engine's input node to capture microphone data into a file. (Remember to declare `NSMicrophoneUsageDescription` in Info.plist and request recording permission first.)
```swift
import AVFoundation

let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.inputFormat(forBus: 0)

let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
let fileURL = documents.appendingPathComponent("recording.caf")
let outputFile = try AVAudioFile(forWriting: fileURL, settings: format.settings)

// Write each captured buffer to the file as it arrives
input.installTap(onBus: 0, bufferSize: 4096, format: format) { (buffer, time) in
    do {
        try outputFile.write(from: buffer)
    } catch {
        print("Error writing buffer: \(error)")
    }
}

try engine.start()
```
Note: Always stop and remove the tap when recording is done to release resources.
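The teardown might look like this; a small sketch continuing from the recording example (the function name is illustrative):

```swift
import AVFoundation

// Removes the tap installed on the input node and stops the engine,
// releasing the microphone and file resources.
func stopRecording(engine: AVAudioEngine) {
    engine.inputNode.removeTap(onBus: 0)
    engine.stop()
}
```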
Attach multiple player nodes and connect them to a mixer node.
```swift
import AVFoundation

let engine = AVAudioEngine()
let mixer = AVAudioMixerNode()
let player1 = AVAudioPlayerNode()
let player2 = AVAudioPlayerNode()

engine.attach(mixer)
engine.attach(player1)
engine.attach(player2)

// Connect players to the mixer
engine.connect(player1, to: mixer, format: nil)
engine.connect(player2, to: mixer, format: nil)

// Connect the mixer to the output
engine.connect(mixer, to: engine.outputNode, format: nil)

let file1 = try AVAudioFile(forReading: Bundle.main.url(forResource: "track1", withExtension: "mp3")!)
let file2 = try AVAudioFile(forReading: Bundle.main.url(forResource: "track2", withExtension: "mp3")!)

try engine.start()

player1.scheduleFile(file1, at: nil, completionHandler: nil)
player2.scheduleFile(file2, at: nil, completionHandler: nil)

player1.play()
player2.play()
```
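Each player node exposes per-track `volume` and `pan` (via the AVAudioMixing protocol), and the mixer has a master `outputVolume`. A sketch of balancing the two tracks (the values shown are arbitrary):

```swift
import AVFoundation

// Set per-track gain and stereo placement, plus a master level on the mixer.
func balanceTracks(player1: AVAudioPlayerNode,
                   player2: AVAudioPlayerNode,
                   mixer: AVAudioMixerNode) {
    player1.volume = 0.8   // 0.0 (silent) ... 1.0 (full)
    player1.pan = -0.5     // -1.0 (hard left) ... 1.0 (hard right)
    player2.volume = 0.6
    player2.pan = 0.5
    mixer.outputVolume = 1.0  // master level for the whole mix
}
```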
Effect nodes such as AVAudioUnitReverb can be inserted anywhere in the audio chain.
```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let reverb = AVAudioUnitReverb()

reverb.loadFactoryPreset(.largeHall)
reverb.wetDryMix = 50 // 0 = fully dry, 100 = fully wet

engine.attach(player)
engine.attach(reverb)

// Connect: player → reverb → output
engine.connect(player, to: reverb, format: nil)
engine.connect(reverb, to: engine.outputNode, format: nil)

let audioFile = try AVAudioFile(forReading: Bundle.main.url(forResource: "example", withExtension: "mp3")!)

try engine.start()

player.scheduleFile(audioFile, at: nil, completionHandler: nil)
player.play()
```
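Effects can also be chained in series. A sketch routing a player through a delay and then a reverb (parameter values are illustrative):

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let delay = AVAudioUnitDelay()
let reverb = AVAudioUnitReverb()

delay.delayTime = 0.25   // echo delay in seconds
delay.feedback = 40      // percent of output fed back into the delay
delay.wetDryMix = 30
reverb.loadFactoryPreset(.mediumHall)
reverb.wetDryMix = 40

engine.attach(player)
engine.attach(delay)
engine.attach(reverb)

// Connect: player → delay → reverb → output
engine.connect(player, to: delay, format: nil)
engine.connect(delay, to: reverb, format: nil)
engine.connect(reverb, to: engine.outputNode, format: nil)
```

The same pattern extends to other effect units such as AVAudioUnitTimePitch or AVAudioUnitDistortion.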
You can build a basic tone generator by scheduling buffers manually:
```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()

engine.attach(player)
engine.connect(player, to: engine.outputNode, format: nil)

let sampleRate = 44100.0
let frequency = 440.0 // A4
let duration = 2.0
let frameCount = AVAudioFrameCount(sampleRate * duration)

let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)!
let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount)!
buffer.frameLength = frameCount

// Fill the buffer with one sine-wave sample per frame
let theta = 2.0 * Double.pi * frequency / sampleRate
for frame in 0..<Int(frameCount) {
    buffer.floatChannelData![0][frame] = Float(sin(theta * Double(frame)))
}

try engine.start()

player.scheduleBuffer(buffer, at: nil, options: [], completionHandler: nil)
player.play()
```
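The generator above can be refactored into a reusable helper that produces a tone buffer for any frequency; a sketch (the function name is illustrative):

```swift
import AVFoundation

// Builds a mono PCM buffer containing a sine tone at the given frequency.
func makeToneBuffer(frequency: Double,
                    duration: Double,
                    sampleRate: Double = 44100.0) -> AVAudioPCMBuffer? {
    guard let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate,
                                     channels: 1) else { return nil }
    let frameCount = AVAudioFrameCount(sampleRate * duration)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                        frameCapacity: frameCount) else { return nil }
    buffer.frameLength = frameCount

    // Phase increment per frame for the requested frequency.
    let theta = 2.0 * Double.pi * frequency / sampleRate
    for frame in 0..<Int(frameCount) {
        buffer.floatChannelData![0][frame] = Float(sin(theta * Double(frame)))
    }
    return buffer
}
```

Scheduling several such buffers back to back on a player node yields a simple melody.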
AVFoundation provides powerful low-level audio capabilities suitable for building professional audio applications on iOS and macOS. With AVAudioEngine, you can construct custom signal paths, record, mix, synthesize, and apply complex effects efficiently. Mastery of this framework lets you build apps such as audio workstations, real-time voice changers, and custom synthesizers.