Building a color pipeline for cross-platform releases
Shipping the same spot to YouTube, television, and social always used to make me nervous. I would nail the look in DaVinci Resolve, export a ProRes master, and then watch Instagram crush the shadows while TikTok pushed the skin tones toward magenta. After a few painful client calls I built a repeatable color pipeline to keep every deliverable on brand regardless of where it plays.
Start with calibrated monitors and a neutral suite
It sounds obvious, but you cannot lock in a look if your monitors are lying to you. I grade inside a neutral, low-light suite using a calibrated reference monitor set to Rec. 709 for broadcast and a secondary display matching sRGB for web work. Every shoot gets a MacBeth chart shot under key lighting so I can balance the camera profiles before I touch creative adjustments.
# get a balanced starting point: match the timeline to Rec. 709 using the MacBeth chart
resolve-match --input=timeline --target=rec709 --chart=macbeth
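When I want a second opinion on the balance before grading, a quick neutrality check on a frame grab of the chart does the job. This is only a sketch using Pillow; the file name, patch coordinates, and tolerance are placeholders, not part of my actual setup.
from PIL import Image
# sanity-check a chart frame grab: the neutral gray patch should read
# roughly equal R, G, and B before any creative work starts; the file name
# and crop box are placeholders for wherever the patch sits in your frame
frame = Image.open("chart_framegrab.png").convert("RGB")
gray_patch = frame.crop((812, 604, 860, 652))
pixels = list(gray_patch.getdata())
r, g, b = (sum(channel) / len(pixels) for channel in zip(*pixels))
print(f"gray patch averages R={r:.1f} G={g:.1f} B={b:.1f}")
if max(r, g, b) - min(r, g, b) > 5:
    print("warning: chart reads non-neutral, rebalance before grading")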
I also check renders on a consumer OLED TV and an iPhone before I ship. It is the fastest way to catch crushed detail or clipped highlights that somehow hid on the grading monitor.
Build show LUTs with intent
Once the base grade is consistent, I design a look LUT in Resolve's node tree, export it, and store it alongside the project in Git. I generate a second version that bakes in extra contrast and saturation for vertical deliverables, because social players tend to desaturate and brighten videos; a sketch of applying that variant to a 9:16 render follows the checklist below.
- Normalize footage with ACES IDTs.
- Balance skin tones using log wheels and selective qualifiers.
- Create a look LUT with Tone Mapping set to Luminance Mapping.
- Export variants for 16:9 and 9:16 formats.
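If you would rather apply the vertical LUT outside Resolve, FFmpeg's lut3d filter can bake it and reframe in one pass. Treat the following as a sketch: the file names are borrowed from the QC example further down, the crop assumes the action is center-framed, and the encode settings are just sensible defaults for social uploads.
import subprocess
# bake the look LUT into the vertical cut and reframe 16:9 to 9:16 with a
# centered crop; file names match the QC example below, encode settings are
# reasonable defaults rather than anything platform-mandated
subprocess.run(
    [
        "ffmpeg", "-y", "-i", "launch_spot_prores.mov",
        "-vf", "lut3d=launch_lut_v09.cube,scale=-2:1920,crop=1080:1920",
        "-c:v", "libx264", "-crf", "18", "-pix_fmt", "yuv420p",
        "-c:a", "aac", "-movflags", "+faststart",
        "launch_spot_vertical.mp4",
    ],
    check=True,
)
The -2 in the scale expression keeps the intermediate width even, and crop with no position arguments defaults to a centered window, so the reframe stays symmetrical.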
Saving LUTs per project lets my assistants jump in and create alt cuts without reinventing the look, and it keeps brand teams happy when they see the exact same palette across campaign touchpoints.
Automate QC before publishing
Before I deliver, I run a batch of checks through FFmpeg and my own Python scripts to confirm the renders stayed within broadcast-safe limits and that the color transforms were applied.
from qc import verify_levels, check_lut  # in-house QC helpers
master = "launch_spot_prores.mov"
vertical = "launch_spot_vertical.mp4"
# confirm the ProRes master stays inside Rec. 709 broadcast-safe levels
verify_levels(master, target="rec709")
# confirm the vertical deliverable actually had the look LUT applied
check_lut(vertical, look="launch_lut_v09.cube")
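The whole qc module does not fit in a blog post, but a levels check like this can lean on FFmpeg's signalstats filter. Here is a stripped-down sketch of that idea, not the actual verify_levels implementation, assuming an 8-bit video-range read where legal luma sits roughly between 16 and 235.
import subprocess
# a sketch of a broadcast-safe luma check, not the actual verify_levels:
# run signalstats through ffprobe and flag any frame whose luma leaves the
# 16-235 video range (after normalizing the read to 8-bit yuv420p)
def luma_in_legal_range(path: str, lo: int = 16, hi: int = 235) -> bool:
    graph = f"movie={path},format=yuv420p,signalstats"
    result = subprocess.run(
        [
            "ffprobe", "-v", "error", "-f", "lavfi", "-i", graph,
            "-show_entries",
            "frame_tags=lavfi.signalstats.YMIN,lavfi.signalstats.YMAX",
            "-of", "csv=p=0",
        ],
        capture_output=True, text=True, check=True,
    )
    for line in result.stdout.splitlines():
        parts = line.split(",")
        if len(parts) < 2:
            continue
        ymin, ymax = int(float(parts[0])), int(float(parts[1]))
        if ymin < lo or ymax > hi:
            return False
    return True
Running a check like this against the master before the platform-specific encodes means a bad transform gets caught once instead of once per deliverable.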
It takes five minutes and has saved me from embarrassing reposts more than once. If you are scaling your editing team, invest time in this stage: it is easier to fix a LUT than to explain a mismatched Instagram post to a global marketing team.
