
1 · Digital Imaging Pipeline Overview

Digital video is a chain of discrete steps, each of which can be tuned or replaced in software:

  1. Photon capture → electrical charge on a CMOS/CCD photosite.
  2. Demosaicing the Bayer or X‑Trans pattern into RGB.
  3. Linear → logarithmic tone mapping (dynamic‑range compression).
  4. Color‑space conversion (e.g. RAW → RGB or Y′CbCr).
  5. Compression (intra‑frame & inter‑frame).
  6. Packetising into a container (.mp4, .mov, .mkv …).
  7. Transmission, storage, or real‑time playback.

Note — Each stage exposes programmable hooks: sensor registers, ISP parameters, encoder presets, transport latency budgets, & more.
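The stage boundaries above map naturally onto composable functions; a minimal Python sketch of the chain (stage names are stand-ins, not a real ISP API):

```python
# Structural sketch only: each stage is a stand-in that records its name,
# mirroring the numbered list above (all helper names are hypothetical).
def make_stage(name):
    def stage(frame):
        frame.append(name)   # a real stage would transform pixel data here
        return frame
    return stage

PIPELINE = [make_stage(n) for n in
            ("capture", "demosaic", "tone_map", "color_convert",
             "compress", "packetise", "transmit")]

def run_pipeline(frame):
    """Apply each stage in order; any one can be swapped out in software."""
    for stage in PIPELINE:
        frame = stage(frame)
    return frame

print(run_pipeline([]))   # lists the stages in execution order
```

Because each stage is just a callable, replacing the demosaic or encoder step is a one-line change — which is the point of treating the pipeline as discrete, tunable steps.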

2 · Sensor Architecture & Key Specifications

2.1 CMOS vs CCD

CMOS dominates modern cameras thanks to its lower power draw, fast on‑chip read‑out, and integrated ADCs, though most CMOS designs use a rolling shutter. CCD still appears in niche broadcast and scientific cameras where its inherently global shutter avoids rolling‑shutter artefacts such as skew and partial‑exposure banding.
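The cost of a rolling shutter can be estimated with simple arithmetic: the last sensor row is exposed roughly one frame‑readout time after the first, so a horizontally moving subject is sheared by (readout time × apparent speed). A sketch, with illustrative numbers:

```python
def rolling_shutter_skew_px(speed_px_per_s: float, readout_s: float) -> float:
    """Horizontal shear, in pixels, between the first and last sensor row.

    speed_px_per_s : apparent horizontal speed of the subject on the sensor
    readout_s      : time to read the full frame top-to-bottom
    """
    return speed_px_per_s * readout_s

# Illustrative (assumed) figures: a subject crossing at 2000 px/s with a
# 10 ms frame readout is sheared by about 20 px across the frame height.
print(rolling_shutter_skew_px(2000, 0.010))  # 20.0
```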

2.2 Critical Metrics

Note — On‑die PLLs & I²C registers expose most parameters; use v4l2‑ctl (Linux) or vendor SDKs to poke them.

3 · Color Science & Formats

3.1 RAW → RGB → Y′CBCR

Programmatic processing usually follows:

// Pseudocode
rawFrame  = sensor.read();                           // Bayer 12‑bit
rawLinear = applyBlackLevel(rawFrame);               // subtract the sensor's black offset first
rgbLinear = demosaic(rawLinear, 'VNG');              // edge‑aware interpolation to RGB
rgbGamma  = applyTransfer(rgbLinear, sRGB);          // linear → display‑referred
yuv420    = convertColor(rgbGamma, 'ITU‑R BT.709');  // RGB → Y′CbCr 4:2:0
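The applyTransfer step above can be made concrete. A pure‑Python sketch of the standard sRGB encode curve (the piecewise linear/power function from IEC 61966‑2‑1):

```python
def srgb_encode(linear: float) -> float:
    """Map a linear-light value in [0, 1] onto the sRGB transfer curve."""
    if linear <= 0.0031308:
        return 12.92 * linear                       # linear toe near black
    return 1.055 * linear ** (1 / 2.4) - 0.055      # power segment elsewhere

# Mid-grey (18 % linear) encodes to roughly 0.46 — the familiar "gamma lift".
print(round(srgb_encode(0.18), 2))
```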

3.2 Chroma Sub‑sampling

Scheme   Bandwidth   Use‑case
4:4:4    100 %       Keying / VFX
4:2:2    66 %        Broadcast / ProRes HQ
4:2:0    50 %        Streaming / H.264
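The bandwidth figures follow directly from the J:a:b notation: over a J‑pixel‑wide, two‑row block there are 2J luma samples plus 2(a + b) chroma samples (Cb and Cr each contribute a + b), against 6J samples at full 4:4:4. A quick check in Python:

```python
def chroma_bandwidth(j: int, a: int, b: int) -> float:
    """Fraction of full-bandwidth (4:4:4) samples used by a J:a:b scheme."""
    luma   = 2 * j          # J luma samples per row, over two rows
    chroma = 2 * (a + b)    # Cb and Cr each contribute a + b samples
    return (luma + chroma) / (6 * j)

for scheme in [(4, 4, 4), (4, 2, 2), (4, 2, 0)]:
    print(scheme, f"{chroma_bandwidth(*scheme):.0%}")
# (4, 4, 4) 100%  ·  (4, 2, 2) 67%  ·  (4, 2, 0) 50%
```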

4 · Interfaces & Transport Protocols

Note — Many “USB cameras” simply bridge MIPI + ISP + UVC ⇢ USB3.

5 · Controlling Camera Parameters in Code

5.1 Universal Controls

5.2 API Examples

/* Swift – AVFoundation */
let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                     for: .video,
                                     position: .back)!
try device.lockForConfiguration()
device.setExposureModeCustom(duration: CMTimeMake(value: 1, timescale: 120),
                              iso: 200, completionHandler: nil)
device.whiteBalanceMode = .locked
device.unlockForConfiguration()
// C++ – OpenCV (V4L2 backend)
cv::VideoCapture cap(0, cv::CAP_V4L2);
cap.set(cv::CAP_PROP_FRAME_WIDTH, 1920);
cap.set(cv::CAP_PROP_FPS,         60);
cap.set(cv::CAP_PROP_AUTO_EXPOSURE, 1); // many V4L2 drivers: 1 = manual, 3 = auto
cap.set(cv::CAP_PROP_EXPOSURE,     -4); // exposure units are driver‑specific on V4L2

6 · Capturing & Streaming Workflows

6.1 Real‑Time Preview

/* JS – getUserMedia */
const stream = await navigator.mediaDevices.getUserMedia({ video: { width: 1280 }});
videoElem.srcObject = stream;

6.2 Batch Recording with FFmpeg

$ ffmpeg -f avfoundation -framerate 30 -i "0" \
         -c:v libx264 -preset veryfast -crf 22 out.mp4
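Driving the same FFmpeg invocation from a script is often more maintainable than a shell one‑liner; a small Python sketch that assembles the argument list (device index and options mirror the command above):

```python
import subprocess

def build_ffmpeg_cmd(device: str = "0", fps: int = 30,
                     crf: int = 22, out: str = "out.mp4") -> list[str]:
    """Assemble the avfoundation capture command shown above as an argv list."""
    return ["ffmpeg",
            "-f", "avfoundation", "-framerate", str(fps), "-i", device,
            "-c:v", "libx264", "-preset", "veryfast", "-crf", str(crf),
            out]

cmd = build_ffmpeg_cmd()
# subprocess.run(cmd, check=True)  # uncomment on a machine with FFmpeg + a camera
print(" ".join(cmd))
```

Passing an argv list (rather than a shell string) avoids quoting bugs when filenames contain spaces.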

6.3 Low‑Latency NDI / WebRTC

Use NDIlib_send_create() or mediasoup‑client to inject encoded frames into production pipelines at roughly 100 ms glass‑to‑glass.
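A glass‑to‑glass figure like ~100 ms is really the sum of several per‑stage budgets; a back‑of‑envelope sketch (every number below is an illustrative assumption, not a measurement):

```python
# Illustrative per-stage latency budgets in milliseconds (assumed values).
BUDGET_MS = {
    "sensor + capture": 17,   # ~one frame interval at 60 fps
    "encode":           25,
    "network":          30,
    "decode":           15,
    "display":          13,
}

glass_to_glass = sum(BUDGET_MS.values())
print(f"{glass_to_glass} ms")   # 100 ms under these assumptions
```

Budgeting per stage this way makes it obvious which hop to attack first when the end‑to‑end number drifts.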

7 · Processing & Encoding Pipelines

7.1 GPU Shaders / Compute

Leverage Metal Performance Shaders, CUDA, or OpenCL for de‑bayer, noise‑reduction, and LUTs at >200 FPS.
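Of the three workloads named above, a 1D LUT is the simplest to illustrate. A pure‑Python sketch of the lookup itself — on a GPU the same indexing runs per pixel in a shader or kernel:

```python
def apply_lut_1d(pixels: list[int], lut: list[int]) -> list[int]:
    """Map each 8-bit pixel value through a 256-entry 1D look-up table."""
    return [lut[p] for p in pixels]

invert = [255 - i for i in range(256)]        # a trivial "negative" LUT
print(apply_lut_1d([0, 128, 255], invert))    # [255, 127, 0]
```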

7.2 Hardware Encoders

7.3 Transcoding Matrix

Profile          Bit‑rate @ 1080p 30 fps   Latency (≈)
H.264 Baseline   2–4 Mb/s                  < 100 ms
H.265 Main       1–2 Mb/s                  200–300 ms
AV1 Main         0.6–1.5 Mb/s              800 ms+
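Those bit‑rates translate directly into storage cost — at 2 Mb/s, one minute of 1080p30 H.264 is about 15 MB. The arithmetic:

```python
def recording_size_mb(bitrate_bps: float, seconds: float) -> float:
    """File size in (decimal) megabytes for a constant-bit-rate recording."""
    bits = bitrate_bps * seconds
    return bits / 8 / 1_000_000   # bits -> bytes -> megabytes

print(recording_size_mb(2_000_000, 60))  # 15.0
```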

8 · Synchronization, Buffers, Timing

9 · Containers, Metadata, Storage

Popular wrappers such as .mp4, .mov, and .mkv embed audio, timecode, and user data alongside the video essence.

Note — Use sidecar JSON/XMP if your metadata schema exceeds container support.
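Writing a sidecar is a one‑liner in most languages. A Python sketch that pairs a clip with a JSON file of the same stem — the field names here are hypothetical, not a standard schema:

```python
import json
import tempfile
from pathlib import Path

def write_sidecar(clip: Path, metadata: dict) -> Path:
    """Write metadata next to the clip as <stem>.json (hypothetical schema)."""
    sidecar = clip.with_suffix(".json")
    sidecar.write_text(json.dumps(metadata, indent=2))
    return sidecar

# Demo in a temporary directory so the sketch is self-contained.
with tempfile.TemporaryDirectory() as d:
    clip = Path(d) / "A001_C002.mov"
    clip.touch()
    side = write_sidecar(clip, {"timecode": "01:02:03:04", "take": 3})
    print(side.name, json.loads(side.read_text())["take"])
```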

10 · Performance Optimisation & Diagnostics

11 · Tooling & Debugging Suite

12 · Further Reading & Standards