Latency: Understanding Delay in Video Production Pipelines

Introduction

Latency — the delay between an event happening in front of a camera and the moment you see it on a screen — is a major concern in video production. It affects camera operators, IMAG screens, virtual production, live streaming, and esports broadcasts.

Even a few extra milliseconds can break the illusion, throw off timing, or hurt operator usability.

But latency isn’t caused by one device — it accumulates across an entire pipeline.
This article breaks down every source of delay, how they stack, how different technologies compare, and how you can design a low-latency workflow.


1. What Is Latency?

Latency is the time between input and output, measured in milliseconds (ms) or frames.

Total latency is the sum of delays across every stage of the pipeline:

Latency = Capture → Encode → Transport → Process → Display
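Because latency is quoted sometimes in frames and sometimes in milliseconds, it helps to convert between the two. A minimal sketch (the function name `frames_to_ms` is a hypothetical helper, not from any library):

```python
def frames_to_ms(frames: float, fps: float = 60.0) -> float:
    """Convert a delay in frames to milliseconds at a given frame rate."""
    return frames / fps * 1000.0

# One frame of delay at common production frame rates:
print(frames_to_ms(1, 24))   # ~41.7 ms
print(frames_to_ms(1, 60))   # ~16.7 ms
```

Note that the same "1 frame" of delay is 2.5× longer at 24 fps than at 60 fps, which is why frame counts alone can be misleading.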


2. Why Latency Matters

Camera Operators

If the monitor lags, a camera op can’t pull focus, pan, or track movement accurately.

IMAG (Image Magnification)

If screens in a venue show performers out of sync with their real movements, audiences notice immediately.

Virtual Production

Latency breaks the illusion: if the LED wall's background lags behind the camera's movement, the in-camera composite no longer feels real.

Live Streaming

Too much delay makes interactions awkward or impossible.

Esports

High latency makes viewing difficult and reduces competitive integrity.


3. Latency Sources in a Video Pipeline

Every device introduces delay — sometimes small, sometimes significant.

Below is a breakdown of each stage.


4. Camera Sensor & Internal Processing

A camera introduces latency before the signal ever leaves the body.

What causes camera latency? Sensor readout, debayering, internal image processing, and output buffering all add delay before the signal reaches the output connector.

Typical camera latencies

| Camera Type | Latency |
|---|---|
| Cinema cameras (ARRI, RED, Sony Cine) | 1–2 frames |
| Mirrorless cameras | 1–3 frames |
| Broadcast cameras | <1 frame |
| Smartphones | 3–10 frames |

Mirrorless cameras often have higher latency than broadcast bodies because they prioritize image processing over real-time response.


5. Transmission Medium: SDI vs HDMI vs IP Video

SDI Latency

Virtually zero (typically <1 microsecond).
SDI is an electrical digital signal — no compression, no buffering.

HDMI Latency

Also near-zero for raw transmission, but certain devices add slight processing delays (e.g., HDCP handshaking or internal format conversion).

IP Video Latency

IP video is where latency grows.

| Protocol | Latency |
|---|---|
| NDI Full Bandwidth | 1–2 frames |
| NDI HX3 | 20–50 ms |
| NDI HX | 50–200 ms |
| SRT | 120–2,000+ ms |
| RTMP | 2–5 seconds |
| WebRTC | 100–500 ms |

IP video is powerful, but not all formats are suitable for low-latency environments.
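One practical way to use the figures above is to filter protocols against a latency budget. A minimal sketch, assuming the worst-case values from the table (the `PROTOCOL_LATENCY_MS` table and `protocols_within` helper are illustrative, not a real API):

```python
# Approximate (min, worst-case) latencies in ms, from the table above.
PROTOCOL_LATENCY_MS = {
    "NDI Full Bandwidth": (17, 33),   # 1–2 frames @ 60 fps
    "NDI HX3": (20, 50),
    "NDI HX": (50, 200),
    "WebRTC": (100, 500),
    "SRT": (120, 2000),
    "RTMP": (2000, 5000),
}

def protocols_within(budget_ms: float) -> list[str]:
    """Return protocols whose worst-case latency fits the budget."""
    return [p for p, (_, worst) in PROTOCOL_LATENCY_MS.items()
            if worst <= budget_ms]

print(protocols_within(100))  # ['NDI Full Bandwidth', 'NDI HX3']
```

For an IMAG-style budget of ~100 ms, only the NDI variants survive; RTMP only fits multi-second streaming budgets.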


6. Switchers, Scalers, and Converters

Every device that touches a signal can add delay — frame synchronization alone often costs a full frame of buffering.

Typical Latencies

| Device Type | Latency |
|---|---|
| Hardware switchers (ATEM, TriCaster) | 0–1 frame |
| Video scalers | 1–3 frames |
| Frame synchronizers | 1–2 frames |
| Cross-converters (SDI↔HDMI) | 0–1 frame |
| Wireless SDI transmitters | 1–4 frames |
| Wireless HDMI transmitters | 3–10 frames |

Wireless = biggest latency offender

Wireless transmission requires:

  1. Encoding
  2. Packetization
  3. RF transmission
  4. Decoding

Latency compounds quickly.


7. Encoding & Streaming

Encoding compresses video for transmission, streaming, and recording.
Higher compression = higher latency.

Codec latency (approx)

| Codec | Latency |
|---|---|
| ProRes / DNx | 1–2 frames |
| H.264 | 1–4 frames |
| H.265 | 2–8 frames |
| AV1 | 10–50+ frames (high computational cost) |

Hardware encoders beat software encoders in latency by a wide margin.


8. Displays & Monitors

Monitors also introduce delay due to input processing, internal scaling, and panel response time.

Typical latencies

| Display Type | Latency |
|---|---|
| Professional SDI monitors (SmallHD, Flanders) | <1 frame |
| Broadcast studio monitors | 1–2 frames |
| Consumer TVs | 3–20 frames |
| Gaming monitors | 1–3 frames |
| LED walls | 2–6 frames |

LED walls often introduce delay because the processor must handle color processing, scaling, and pixel mapping across many panels.


9. Cumulative Latency — How It Adds Up

Latency stacks linearly.

Example pipeline:

  1. Camera: 2 frames
  2. Wireless transmitter: 3 frames
  3. Switcher: 1 frame
  4. LED processor: 3 frames
  5. LED wall: 2 frames

Total = 11 frames (~183 ms @ 60fps)

This is visible to the human eye and will break timing for IMAG and virtual production.
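The arithmetic above can be sketched directly. A minimal budget calculator using the example pipeline's stage delays (the `pipeline` values are the illustrative numbers from this section, not measurements):

```python
# Stage delays in frames, matching the example pipeline above.
pipeline = {
    "camera": 2,
    "wireless transmitter": 3,
    "switcher": 1,
    "LED processor": 3,
    "LED wall": 2,
}

fps = 60
total_frames = sum(pipeline.values())       # latency stacks linearly
total_ms = total_frames / fps * 1000
print(f"{total_frames} frames = {total_ms:.0f} ms @ {fps} fps")
# 11 frames = 183 ms @ 60 fps
```

Re-running the same sum at 24 fps gives ~458 ms for the identical hardware chain, which is why frame-rate choice matters as much as device choice.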


10. Latency Thresholds: What Is “Acceptable”?

Different applications have different tolerance.

| Application | Acceptable Latency |
|---|---|
| Virtual production (LED volumes) | <5 ms |
| Camera operating | <1 frame |
| IMAG (concerts) | <2–3 frames |
| Live broadcast switching | <3 frames |
| Esports / gaming | <10 ms |
| Streaming to internet | <5 seconds |
| Remote contribution | <200 ms |

Virtual Production has the strictest requirement — even a single dropped frame can break the illusion.
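Combining these thresholds with a measured pipeline total gives a quick go/no-go check. A minimal sketch, assuming the millisecond budgets from the table above (`THRESHOLDS_MS` and `fits_budget` are illustrative names, not a real API):

```python
# Acceptable end-to-end latency budgets in ms, from the table above.
THRESHOLDS_MS = {
    "virtual production": 5,
    "esports": 10,
    "camera operating": 17,      # <1 frame @ 60 fps
    "remote contribution": 200,
    "streaming": 5000,
}

def fits_budget(application: str, measured_ms: float) -> bool:
    """Check whether a measured end-to-end latency meets the budget."""
    return measured_ms <= THRESHOLDS_MS[application]

print(fits_budget("remote contribution", 150))   # True
print(fits_budget("virtual production", 150))    # False
```

The same 150 ms pipeline that is perfectly acceptable for remote contribution fails a virtual-production budget by a factor of 30.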


11. How to Build a Low-Latency Pipeline

Here are the top engineering rules.


🔹 Rule 1: Use SDI whenever possible

SDI is king.

In any serious production environment, SDI is preferred over HDMI.


🔹 Rule 2: Avoid wireless unless absolutely necessary

Wireless systems always add delay.

If wireless is needed, prefer wireless SDI systems over wireless HDMI, and keep the number of hops to a minimum.


🔹 Rule 3: Minimize frame synchronizers

Any device that syncs mismatched signals adds 1–2 frames.

Match cameras and frame rates where possible.
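A simple pre-flight check can flag when a frame synchronizer (and its 1–2 frame cost) will be unavoidable. A minimal sketch (`needs_frame_sync` is a hypothetical helper for illustration):

```python
def needs_frame_sync(sources: dict[str, float]) -> bool:
    """True if sources run at different frame rates and will therefore
    need a frame synchronizer, costing 1-2 frames of delay."""
    return len(set(sources.values())) > 1

print(needs_frame_sync({"cam A": 59.94, "cam B": 59.94}))  # False
print(needs_frame_sync({"cam A": 59.94, "cam B": 60.0}))   # True
```

Note that 59.94 and 60.0 count as a mismatch: mixing NTSC-rate and integer-rate sources forces sync buffering even though the rates look "the same" on a spec sheet.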


🔹 Rule 4: Avoid scaling if possible

Scaling (1080→4K, 4K→1080, 720→1080) is a hidden latency cost.

Use consistent resolutions.


🔹 Rule 5: Use NDI Full Bandwidth for low-latency IP video

NDI HX adds more delay than NDI Full.

Use HX only when bandwidth is limited.


🔹 Rule 6: Use appropriate monitors

Professional SDI monitors (Flanders, SmallHD, Sony) have very low latency.

Consumer TVs are the worst offenders.


🔹 Rule 7: LED Walls Require Special Care

Virtual production pipelines require genlocked signal chains and low-latency LED processing.

Most LED walls are 2–6 frames delay by default.
Use Brompton or Megapixel VR processors for best results.


12. Genlock & Synchronization

Latency isn’t just delay — sync matters too.

If camera → processor → LED wall → display are not perfectly in sync, you get tearing, stutter, and rolling artifacts.

Genlock ensures all systems operate on the same timing clock.


13. Measuring Latency

You can test latency by filming a running millisecond timer so that both the source and the display appear in the same shot, or by using a dedicated latency analyzer.

Typical methods involve capturing a timestamp on both ends.
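The timestamp-on-both-ends method can be sketched in a few lines. This simulation stands in for a real pipeline (the 50 ms `sleep` plays the role of the transport delay; `measure_latency`, `send_frame`, and `receive_frame` are all hypothetical names for illustration):

```python
import time

def measure_latency(send_frame, receive_frame) -> float:
    """Stamp a frame at capture, read the stamp back at the display
    end, and subtract. Returns latency in milliseconds."""
    send_frame({"ts": time.monotonic()})   # timestamp at the source
    frame = receive_frame()                # frame arrives at the display
    return (time.monotonic() - frame["ts"]) * 1000

# Simulated transport with a fixed ~50 ms delay:
_buffer = []
def send_frame(frame): _buffer.append(frame)
def receive_frame():
    time.sleep(0.05)
    return _buffer.pop(0)

print(f"{measure_latency(send_frame, receive_frame):.0f} ms")  # ~50 ms
```

In a real test the "timestamp" is usually a visible timecode burned into the picture, and the subtraction happens by comparing the burned-in value against a reference clock at the display.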


14. Latency Optimization Strategies

Checklist for lowest delay pipelines:

Use SDI

✔ Always priority #1

Avoid unnecessary conversions

✔ No HDMI → SDI → HDMI → SDI loops

Avoid wireless when possible

✔ Even the best adds 1–4 frames

Use professional equipment

✔ LED processors, SDI monitors, hardware encoders

Keep consistent formats

✔ Same frame rate
✔ Same resolution
✔ Same time base

Use genlock

✔ Especially for VP and LED walls

Upgrade your switchgear

✔ Use high-bandwidth SDI routers
✔ For IP: multicast-enabled, low-buffer network switches


15. Latency by Workflow Type

🎬 Cinema Production

Acceptable: <2 frames
Best tools: SDI → hardware wireless → SDI monitors

🎤 IMAG / Live Events

Acceptable: <3–5 frames
Best tools: SDI → switcher → LED processor

🎮 Esports

Acceptable: <10 ms
Best tools: Gaming monitors; SDI or HDMI direct feeds

📺 Live Broadcast

Acceptable: <3 frames
Best tools: SDI router → switcher → broadcast monitors

🌐 Streaming

Acceptable: 1–5 seconds
Best tools: SRT / RTMP / hardware encoder

🧱 Virtual Production (LED Volume)

Acceptable: <3–5 ms end-to-end
Best tools: SDI → genlocked cameras → Brompton/Megapixel processors


Conclusion

Latency is one of the most critical technical considerations in video production — and one of the least understood. Every component in a pipeline contributes to delay, but with the right design choices you can build an ultra-low-latency workflow suitable for anything from basic streaming to high-end virtual production.

✔ SDI = lowest latency
✔ HDMI = moderate latency
✔ NDI Full Bandwidth = low latency
✔ NDI HX/HX3 = medium latency
✔ LED walls = high unless optimized
✔ Wireless links = always add delay
✔ Scaling & sync = hidden latency killers
✔ Genlock = mandatory for VP and LED workflows

Understanding where latency comes from empowers cinematographers, engineers, and streaming professionals to build fast, responsive systems that preserve timing, realism, and viewer experience.