1 · What Is the Web Audio API?

The Web Audio API is a high‑level JavaScript interface for creating, manipulating, and analyzing audio entirely in the browser, supporting everything from simple sample playback to complex modular synthesis and 3‑D spatialisation.

Its design is node‑based: audio nodes are connected into a directed graph that flows into one or more audio destinations (speakers, recordings, streams).

Note: The API is a W3C Recommendation and is actively maintained by the W3C Audio Working Group.

2 · Creating an AudioContext

2.1 Construct & Resume the Context


const ctx = new AudioContext({ sampleRate: 48000 }); // sampleRate is optional; omit it to use the hardware rate
await ctx.resume(); // required after a user gesture in most browsers

2.2 Lifecycle Methods

ctx.state may be "suspended", "running", or "closed". Transition with resume(), suspend(), and close() to coordinate CPU usage with page visibility.
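
For long‑running pages, a common pattern is to tie these transitions to visibility; a minimal sketch using the ctx from above:

// Suspend audio processing while the tab is hidden to save CPU
document.addEventListener('visibilitychange', async () => {
  if (document.hidden) {
    await ctx.suspend();
  } else {
    await ctx.resume();
  }
});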

3 · Audio Source Nodes

3.1 AudioBufferSourceNode


// Decode an ArrayBuffer, then play it once
const buf  = await ctx.decodeAudioData(arrayBuffer);
const src  = new AudioBufferSourceNode(ctx, { buffer: buf });
src.connect(ctx.destination);
src.start(0);                 // 0 (or no argument) means "as soon as possible"

3.2 MediaElementAudioSourceNode

Stream audio from an <audio> or <video> element:


const node = ctx.createMediaElementSource(myAudioEl);
node.connect(ctx.destination);

3.3 MediaStreamAudioSourceNode

Combine with getUserMedia() for live microphone / WebRTC input:


const stream   = await navigator.mediaDevices.getUserMedia({ audio: true });
const mic      = ctx.createMediaStreamSource(stream);
const analyser = ctx.createAnalyser();          // e.g. for level metering
mic.connect(analyser).connect(ctx.destination); // beware feedback on open speakers

3.4 OscillatorNode

Generate test tones or build synthesizers:


const osc = new OscillatorNode(ctx, { type: "sawtooth", frequency: 440 });
osc.connect(ctx.destination);
osc.start();
osc.stop(ctx.currentTime + 2);

4 · Routing & Gain Control

4.1 connect() and disconnect()

All nodes expose these methods for wiring the graph. connect() can target another node (optionally specifying output and input indices) or an AudioParam; disconnect() reverses any of those connections.
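
The main call shapes, using nodes that appear in the following subsections:

src.connect(gain);         // node → node (default output 0 → input 0)
src.connect(gain, 0, 0);   // explicit output and input indices
lfo.connect(gain.gain);    // node → AudioParam (for modulation)
src.disconnect(gain);      // remove one connection
src.disconnect();          // remove all outgoing connections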

4.2 GainNode


const gain = new GainNode(ctx, { gain: 0.5 });
src.connect(gain).connect(ctx.destination);
// fade‑in from 0.5 to 1.0 over 1 second
gain.gain.setValueAtTime(0.5, ctx.currentTime); // anchor the ramp's start point
gain.gain.linearRampToValueAtTime(1.0, ctx.currentTime + 1);

4.3 Scheduling Parameter Automation


const lfo = new OscillatorNode(ctx, { frequency: 5 }); // 5 Hz modulator
lfo.connect(gain.gain);              // modulate with another node
lfo.start();
gain.gain.setValueAtTime(1, ctx.currentTime);
gain.gain.exponentialRampToValueAtTime(0.001, ctx.currentTime + 4); // target must be non‑zero

5 · Filters & Built‑in Effects

5.1 BiquadFilterNode


const filter = new BiquadFilterNode(ctx, { type:"lowpass", Q:12, frequency:800 });
src.connect(filter).connect(ctx.destination);

5.2 DynamicsCompressorNode

Reduces the dynamic range of its input, which is useful for taming loud transients when many game sounds stack unpredictably.
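
A minimal sketch inserting a compressor on a master bus (masterBus stands for your own mix GainNode; the values shown are the spec defaults):

const comp = new DynamicsCompressorNode(ctx, {
  threshold: -24,  // dB level where compression begins
  knee: 30,        // dB range over which the curve softens
  ratio: 12,       // input/output ratio above the threshold
  attack: 0.003,   // seconds to respond to a transient
  release: 0.25,   // seconds to recover afterwards
});
masterBus.connect(comp).connect(ctx.destination);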

5.3 ConvolverNode

Load impulse‑response files for real‑time convolution reverb.
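
A minimal sketch, assuming an impulse‑response file at an illustrative URL:

const convolver = new ConvolverNode(ctx);
const resp = await fetch('impulses/hall.wav');   // illustrative IR path
convolver.buffer = await ctx.decodeAudioData(await resp.arrayBuffer());
src.connect(convolver).connect(ctx.destination);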

6 · Spatial Audio & 3‑D Panning

6.1 PannerNode

Position sources in XYZ space relative to the context's single AudioListener, whose position and orientation you also control.

6.2 HRTF Panning & Distance Models

Set a PannerNode's panningModel to "HRTF" for head‑related‑transfer‑function spatialisation; its distanceModel and cone* parameters control how level falls off with distance and direction.
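
A minimal sketch placing a source to the listener's right (coordinates are illustrative; some browsers still lack the listener AudioParams and need the legacy setPosition()/setOrientation() calls):

const panner = new PannerNode(ctx, {
  panningModel: 'HRTF',
  distanceModel: 'inverse',
  positionX: 2, positionY: 0, positionZ: -1,  // 2 m right, 1 m ahead
  coneInnerAngle: 60, coneOuterAngle: 180, coneOuterGain: 0.2,
});
ctx.listener.positionX.value = 0;  // listener at the origin…
ctx.listener.forwardZ.value = -1;  // …facing down the −Z axis
src.connect(panner).connect(ctx.destination);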

6.3 Ambisonics & FOA

Third‑party libraries (e.g. Omnitone) decode first‑ and higher‑order ambisonics into Web Audio graphs.

7 · Sample‑Accurate Timing

7.1 currentTime

The AudioContext's monotonically increasing clock (in seconds), shared by every node in its graph; it advances only while the context is running.

7.2 Scheduling Playback


const t = ctx.currentTime;   // read once so the pattern cannot drift
drum.start(t);               // kick
snare.start(t + 0.250);
hat.start(t + 0.500);

7.3 setTargetAtTime() vs. Ramps

setTargetAtTime() approaches its target exponentially with a given time constant and never quite reaches it, which makes it ideal for click‑free envelope releases; linearRampToValueAtTime() and exponentialRampToValueAtTime() hit their target exactly at the scheduled time.
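
A simple attack/release contrasting the two (times are illustrative):

const t = ctx.currentTime;
gain.gain.setValueAtTime(0, t);
gain.gain.linearRampToValueAtTime(1.0, t + 0.01); // attack: arrives exactly at t + 0.01
gain.gain.setTargetAtTime(0, t + 1.0, 0.05);      // release: 50 ms time constant, asymptotic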

8 · Real‑Time DSP with AudioWorklet

ScriptProcessorNode is deprecated; migrate to AudioWorklet for lower latency and better thread isolation.

8.1 Registering Processors

// filter-processor.js
class OnePoleProcessor extends AudioWorkletProcessor {
  static get parameterDescriptors() {
    return [{ name: 'cutoff', defaultValue: 0.5, minValue: 0, maxValue: 1 }];
  }
  process(inputs, outputs, parameters) {
    const input  = inputs[0][0] ?? [];  // first channel of first input (may be absent)
    const output = outputs[0][0];
    const k      = parameters.cutoff[0];
    this._z1 = this._z1 ?? 0;           // one sample of filter state
    for (let i = 0; i < output.length; i++) {
      // simple one-pole lowpass: y[n] = y[n-1] + k * (x[n] - y[n-1])
      this._z1 += k * ((input[i] ?? 0) - this._z1);
      output[i] = this._z1;
    }
    return true; // keep the processor alive
  }
}
registerProcessor('one-pole', OnePoleProcessor);

8.2 Loading the Worklet


await ctx.audioWorklet.addModule('filter-processor.js');
const op = new AudioWorkletNode(ctx, 'one-pole');
src.connect(op).connect(ctx.destination);

8.3 SharedArrayBuffer & RingBuffers

With SharedArrayBuffer and Atomics (available on cross‑origin‑isolated pages), you can build lock‑free ring buffers for zero‑copy communication between the worklet and the main thread, improving performance in metering and visualisation pipelines.
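
A sketch of the main‑thread side, assuming a hypothetical 'meter' processor that copies each rendered block into the shared memory (production code should prefer a vetted ring‑buffer library such as ringbuf.js):

// SharedArrayBuffer requires cross‑origin isolation (COOP/COEP headers)
const sab     = new SharedArrayBuffer(4 + 128 * Float32Array.BYTES_PER_ELEMENT);
const written = new Int32Array(sab, 0, 1);  // write counter, updated with Atomics
const block   = new Float32Array(sab, 4);   // most recent 128‑sample block

await ctx.audioWorklet.addModule('meter-processor.js'); // hypothetical module
const meter = new AudioWorkletNode(ctx, 'meter', { processorOptions: { sab } });
src.connect(meter);

(function draw() {
  if (Atomics.load(written, 0) > 0) {
    /* read `block` and paint a level meter */
  }
  requestAnimationFrame(draw);
})();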

9 · Recording Output

9.1 MediaRecorder


const dest = ctx.createMediaStreamDestination();
mix.connect(dest);                       // tap the master bus
const chunks = [];
const rec = new MediaRecorder(dest.stream, { mimeType: 'audio/webm' });
rec.ondataavailable = ev => chunks.push(ev.data);
rec.start();

9.2 Offline Rendering


const off = new OfflineAudioContext({ numberOfChannels: 2, length: 48000 * 5, sampleRate: 48000 });
/* build an identical graph on `off` */
off.startRendering().then(buf => { /* export buf as WAV */ });

10 · MediaStream & WebRTC Pipelines

Web Audio nodes can produce and consume MediaStream tracks, enabling effects chains before sending microphone data over WebRTC, or mixing remote peer tracks into 3‑D game audio.
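
For example, to run the microphone through an effects chain before it reaches a peer connection (pc stands for an existing RTCPeerConnection, filter for any effect node):

const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
const input  = ctx.createMediaStreamSource(stream);
const outDst = ctx.createMediaStreamDestination();
input.connect(filter).connect(outDst);  // arbitrary effects chain
pc.addTrack(outDst.stream.getAudioTracks()[0], outDst.stream);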

11 · Performance & Best Practices

12 · Project Templates & Examples

12.1 Modular Synth Starter

osc → filter → gain → ctx.destination with MIDI control & AudioWorklet LFO.
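
A sketch of the MIDI‑to‑frequency plumbing (Web MIDI support varies by browser; osc and gain refer to the chain above):

const midi = await navigator.requestMIDIAccess();
for (const input of midi.inputs.values()) {
  input.onmidimessage = ({ data: [status, note, velocity] }) => {
    if ((status & 0xf0) === 0x90 && velocity > 0) {        // note‑on
      osc.frequency.value = 440 * 2 ** ((note - 69) / 12); // MIDI note → Hz
      gain.gain.setTargetAtTime(velocity / 127, ctx.currentTime, 0.01);
    }
  };
}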

12.2 Live‑coding Looper

Combine MediaStream inputs, latency‑compensated scheduling, and an offline bounce exporter.

12.3 Spatialised Voice Chat

Route each peer to a PannerNode; update positions from game engine coordinates.
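
A per‑peer sketch (pc is that peer's RTCPeerConnection; peers is an app‑level Map, and updatePosition would be driven by the game loop):

pc.ontrack = ({ streams: [remote] }) => {
  const remoteSrc = ctx.createMediaStreamSource(remote);
  const panner    = new PannerNode(ctx, { panningModel: 'HRTF' });
  remoteSrc.connect(panner).connect(ctx.destination);
  peers.set(remote.id, panner);
};

function updatePosition(peerId, { x, y, z }) {
  const p = peers.get(peerId);
  p.positionX.value = x; p.positionY.value = y; p.positionZ.value = z;
}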

13 · Debugging & Automated Testing

14 · Further Reading & Spec Links