Introduction
Latency — the delay between an event happening in front of a camera and the moment you see it on a screen — is a major concern in video production. It affects:
- Live broadcasts
- IMAG (live concert screens)
- Esports
- Virtual production
- LED wall work
- Motion capture
- Live streaming
- Interactive productions
- Camera operating
Even a few extra milliseconds can break the illusion, throw off timing, or make a system unusable for operators.
But latency isn’t caused by one device — it accumulates across an entire pipeline.
This article breaks down every source of delay, how they stack, how different technologies compare, and how you can design a low-latency workflow.
1. What Is Latency?
Latency is the time between input and output, measured in:
- Milliseconds (ms)
- Frames of delay (e.g., 1 frame @ 60fps ≈ 16.67 ms)
Latency = Capture → Encode → Transport → Process → Display
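Converting between frames and milliseconds is the arithmetic this article leans on throughout, and it is easy to get wrong at mixed frame rates. A minimal sketch (the helper names are ours, not from any library):

```python
def frame_duration_ms(fps: float) -> float:
    """Duration of one frame in milliseconds at a given frame rate."""
    return 1000.0 / fps

def frames_to_ms(frames: float, fps: float) -> float:
    """Convert a delay expressed in frames to milliseconds."""
    return frames * frame_duration_ms(fps)

# 1 frame @ 60 fps ≈ 16.67 ms, as noted above
print(round(frames_to_ms(1, 60), 2))   # 16.67
# The same single frame costs far more at cinema frame rates:
print(round(frames_to_ms(1, 24), 2))   # 41.67
```

The same helpers make the pipeline totals later in this article easy to verify at 24, 30, or 60 fps.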
2. Why Latency Matters
Camera Operators
If the monitor lags, a camera op can’t pull focus, pan, or track movement accurately.
IMAG (Image Magnification)
If screens in a venue show performers out of sync with their real movements, audiences notice immediately.
Virtual Production
Latency breaks illusions:
- Camera tracking desynchronizes
- LED walls lag behind movement
- Unreal Engine cannot match perspective in time
Live Streaming
Too much delay makes interactions awkward or impossible.
Esports
High latency makes viewing difficult and reduces competitive integrity.
3. Latency Sources in a Video Pipeline
Every device introduces delay — sometimes small, sometimes significant.
Below is a breakdown of each stage.
4. Camera Sensor & Internal Processing
A camera introduces latency before the signal ever leaves the body.
What causes camera latency?
- Rolling shutter readout
- Debayering
- Noise reduction
- Image scaling
- Log/HDR processing
- Frame synchronizers
Typical camera latencies
| Camera Type | Latency |
|---|---|
| Cinema cameras (ARRI, RED, Sony Cine) | 1–2 frames |
| Mirrorless cameras | 1–3 frames |
| Broadcast cameras | <1 frame |
| Smartphones | 3–10 frames |
Mirrorless cameras often have high latency because they prioritize image processing over real-time response.
5. Transmission Medium: SDI vs HDMI vs IP Video
SDI Latency
Virtually zero (typically <1 microsecond).
SDI carries an uncompressed serial digital signal: no compression stage, no network buffering.
HDMI Latency
Also near-zero for raw transmission, but:
- Handshakes
- Color space conversions
- EDID negotiations
…can add slight processing delays in certain devices.
IP Video Latency
IP video is where latency grows.
| Protocol | Latency |
|---|---|
| NDI Full Bandwidth | 1–2 frames |
| NDI HX3 | 20–50 ms |
| NDI HX | 50–200 ms |
| SRT | 120–2,000+ ms |
| RTMP | 2–5 seconds |
| WebRTC | 100–500 ms |
IP video is powerful, but not all formats are suitable for low-latency environments.
6. Switchers, Scalers, and Converters
Each device that touches a signal can add delay, often a full frame of frame-sync buffering.
Typical Latencies
| Device Type | Latency |
|---|---|
| Hardware switchers (ATEM, TriCaster) | 0–1 frame |
| Video scalers | 1–3 frames |
| Frame synchronizers | 1–2 frames |
| Cross-converters | 0–1 frame (SDI↔HDMI) |
| Wireless SDI transmitters | 1–4 frames |
| Wireless HDMI transmitters | 3–10 frames |
Wireless = biggest latency offender
Wireless transmission requires:
- Encoding
- Packetization
- RF transmission
- Decoding
Latency compounds quickly.
7. Encoding & Streaming
Encoding compresses video for:
- Streaming
- Recording
- Network transport
- Wireless systems
Higher compression generally means higher latency, because more efficient codecs buffer and reference more frames before emitting output.
Codec latency (approx)
| Codec | Latency |
|---|---|
| ProRes / DNx | 1–2 frames |
| H.264 | 1–4 frames |
| H.265 | 2–8 frames |
| AV1 | 10–50+ frames (high computational cost) |
Hardware encoders beat software encoders in latency by a wide margin.
8. Displays & Monitors
Monitors also introduce delay due to:
- Scaling
- Color processing
- HDR tone-mapping
- Refresh synchronization
- Overdrive and frame interpolation
Typical latencies
| Display Type | Latency |
|---|---|
| Professional SDI monitors (SmallHD, Flanders) | <1 frame |
| Broadcast studio monitors | 1–2 frames |
| Consumer TVs | 3–20 frames |
| Gaming monitors | 1–3 frames |
| LED walls | 2–6 frames |
LED walls often introduce delay because the processor must handle:
- Scaling
- Color calibration
- HDR mapping
- Refresh driving
- LED module timing
9. Cumulative Latency — How It Adds Up
Latency is additive: each stage's delay simply sums along the chain.
Example pipeline:
- Camera: 2 frames
- Wireless transmitter: 3 frames
- Switcher: 1 frame
- LED processor: 3 frames
- LED wall: 2 frames
Total = 11 frames (~183 ms @ 60fps)
This is visible to the human eye and will break timing for IMAG and virtual production.
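The example above can be sketched as a simple sum. The stage names and frame counts mirror the list; the dictionary layout is just illustrative:

```python
FPS = 60
FRAME_MS = 1000.0 / FPS

# Per-stage delays in frames, taken from the example pipeline above
pipeline = {
    "camera": 2,
    "wireless transmitter": 3,
    "switcher": 1,
    "LED processor": 3,
    "LED wall": 2,
}

total_frames = sum(pipeline.values())
total_ms = total_frames * FRAME_MS
print(f"{total_frames} frames ≈ {total_ms:.0f} ms @ {FPS} fps")
# → 11 frames ≈ 183 ms @ 60 fps
```

Re-running the sum at 24 fps shows why cinema workflows feel the same frame count much more severely.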
10. Latency Thresholds: What Is “Acceptable”?
Different applications have different tolerances.
| Application | Acceptable Latency |
|---|---|
| Virtual production (LED volumes) | <5 ms |
| Camera operating | <1 frame |
| IMAG (concerts) | <2–3 frames |
| Live broadcast switching | <3 frames |
| Esports / gaming | <10 ms |
| Streaming to internet | <5 seconds |
| Remote contribution | <200 ms |
Virtual production has the strictest requirement: even a single frame of extra delay can break the illusion.
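The table above can double as a quick budget check when planning a pipeline. A hedged sketch: the budgets mirror the table, frame-based limits are converted assuming 60 fps, and the function name is ours:

```python
FRAME_MS_60 = 1000.0 / 60  # one frame at 60 fps ≈ 16.67 ms

# Acceptable end-to-end budgets in milliseconds, per the table above
BUDGETS_MS = {
    "virtual production": 5,
    "camera operating": 1 * FRAME_MS_60,
    "imag": 3 * FRAME_MS_60,
    "broadcast switching": 3 * FRAME_MS_60,
    "esports": 10,
    "streaming": 5000,
    "remote contribution": 200,
}

def within_budget(application: str, measured_ms: float) -> bool:
    """True if a measured end-to-end delay fits the application's budget."""
    return measured_ms <= BUDGETS_MS[application]

print(within_budget("imag", 40))      # True  (40 ms < 3 frames @ 60 fps)
print(within_budget("esports", 40))   # False (esports budget is 10 ms)
```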
11. How to Build a Low-Latency Pipeline
Here are the top engineering rules.
🔹 Rule 1: Use SDI whenever possible
SDI is king.
- No compression
- No network buffering
- No jitter
- Lowest latency transmission available
- Longer usable cable runs
- Locking connectors
In any serious production environment, SDI is preferred over HDMI.
🔹 Rule 2: Avoid wireless unless absolutely necessary
Wireless systems always add delay.
If wireless is needed:
- Use professional zero-delay SDI systems (Bolt 4K, Hollyland Mars 4K, Teradek Bolt 750/1500).
- Avoid consumer HDMI transmitters.
🔹 Rule 3: Minimize frame synchronizers
Any device that syncs mismatched signals adds 1–2 frames.
Match cameras and frame rates where possible.
🔹 Rule 4: Avoid scaling if possible
Scaling (1080→4K, 4K→1080, 720→1080) is a hidden latency cost.
Use consistent resolutions.
🔹 Rule 5: Use NDI Full Bandwidth for low-latency IP video
NDI HX adds more delay than NDI Full.
Use HX only when bandwidth is limited.
🔹 Rule 6: Use appropriate monitors
Professional SDI monitors (Flanders, SmallHD, Sony) have very low latency.
Consumer TVs are the worst offenders.
🔹 Rule 7: LED Walls Require Special Care
Virtual production pipelines require:
- Genlock
- Low-latency processors
- Frame-accurate sync
Most LED walls add 2–6 frames of delay by default.
Use Brompton or Megapixel VR processors for best results.
12. Genlock & Synchronization
Latency isn’t just delay — sync matters too.
If camera → processor → LED wall → display are not perfectly in sync, you get:
- Judder
- Rolling bars
- Mismatched motion
- Unreal Engine tracking errors
Genlock ensures all systems operate on the same timing clock.
13. Measuring Latency
You can test latency using:
- A stopwatch app filmed by the camera
- A timecode generator
- LED flash method
- Software latency meters
- Sync-One2 analysis tool
- VENICE & RED internal latency measurements
Typical methods involve capturing a timestamp on both ends.
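All of these methods reduce to comparing two clock readings at opposite ends of the pipeline. A sketch of that idea, with the trigger and detection hooks left as hypothetical placeholders (simulated here with a short sleep):

```python
import time

def measure_latency_ms(trigger_event, detect_event) -> float:
    """Fire a visible event (e.g. an LED flash) and time how long it
    takes to be detected at the far end of the pipeline.
    trigger_event and detect_event stand in for real I/O hooks."""
    t0 = time.perf_counter()
    trigger_event()    # e.g. flash an LED in front of the camera
    detect_event()     # e.g. block until the flash appears on the display
    t1 = time.perf_counter()
    return (t1 - t0) * 1000.0

# Demo with a simulated pipeline that takes ~50 ms to show the event
latency = measure_latency_ms(lambda: None, lambda: time.sleep(0.05))
print(f"measured: {latency:.0f} ms")
```

In practice the detection side is the hard part; dedicated tools like the ones listed above exist precisely to automate it accurately.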
14. Latency Optimization Strategies
Checklist for lowest delay pipelines:
Use SDI
✔ Always priority #1
Avoid unnecessary conversions
✔ No HDMI → SDI → HDMI → SDI loops
Avoid wireless when possible
✔ Even the best systems add delay; encoded links cost 1–4 frames
Use professional equipment
✔ LED processors, SDI monitors, hardware encoders
Keep consistent formats
✔ Same frame rate
✔ Same resolution
✔ Same time base
Use genlock
✔ Especially for VP and LED walls
Upgrade your switchgear
✔ Use high-bandwidth SDI routers
✔ For IP: multicast-enabled, low-buffer network switches
15. Latency by Workflow Type
🎬 Cinema Production
Acceptable: <2 frames
Best tools: SDI → hardware wireless → SDI monitors
🎤 IMAG / Live Events
Acceptable: <3–5 frames
Best tools: SDI → switcher → LED processor
🎮 Esports
Acceptable: <10 ms
Best tools: Gaming monitors; SDI or HDMI direct feeds
📺 Live Broadcast
Acceptable: <3 frames
Best tools: SDI router → switcher → broadcast monitors
🌐 Streaming
Acceptable: 1–5 seconds
Best tools: SRT / RTMP / hardware encoder
🧱 Virtual Production (LED Volume)
Acceptable: <3–5 ms end-to-end
Best tools: SDI → genlocked cameras → Brompton/Megapixel processors
Conclusion
Latency is one of the most critical technical considerations in video production — and one of the least understood. Every component in a pipeline contributes to delay, but with the right design choices you can build an ultra-low-latency workflow suitable for anything from basic streaming to high-end virtual production.
✔ SDI = lowest latency
✔ HDMI = moderate latency
✔ NDI Full Bandwidth = low latency
✔ NDI HX/HX3 = medium latency
✔ LED walls = high unless optimized
✔ Wireless links = always add delay
✔ Scaling & sync = hidden latency killers
✔ Genlock = mandatory for VP and LED workflows
Understanding where latency comes from empowers cinematographers, engineers, and streaming professionals to build fast, responsive systems that preserve timing, realism, and viewer experience.