Setting up color management template and strategy
I make mainly music videos. I use some camera footage (hacked GH2), but mainly, I do AE animations with fairly extreme colors blending constantly, so I really need to work out a proper color management strategy as I'm constantly dancing on the edge of oversaturation, etc.
I am (unfortunately) working with a cheap lcd tv as a monitor, but I've got the room nearly free of any glare, ambient daylight, etc. and I have calibrated the monitor to the best of my ability via colormunki.
Current project settings:
I know I need to up the bitrate, but not clear whether I should be using 16 or 32. Also, I already have very long render times with 8 bit, so does this get much worse as bitrate increases?
Next, as for working space, very confused on this, but I believe I should be working in sRGB and linearizing the workspace, correct? Does this give the best translation to YT, FB, etc. and help alleviate the haloing and other related color mixing issues? If so, do I need to alter something on output? Currently outputting with these settings:
I don't see anywhere to set color management profile on output, but it's my understanding that YT ignores this anyway, correct?
So, should I just go with this for my new project settings and call it a day?
If not, what am I missing? (I'm already avoiding QT player like the plague, so I'm assuming this means I don't need to mess with the legacy qt option)
[Greg Sage] "I know I need to up the bitrate, but not clear whether I should be using 16 or 32. Also, I already have very long render times with 8 bit, so does this get much worse as bitrate increases? "
This is not bit rate. This is bit depth, and it refers to the precision with which calculations are made.
With 8bpc, the range from black to white (per channel) is 0 to 255, with whole-number steps. This is "millions of colors," 16,777,216 of them to be precise.
With 16bpc, the range from black to white (per channel) is 0 to 32768, again with whole-number steps. This is "trillions of colors," 35,184,372,088,832 of them.
Note that black is still black, white is still white, red is still red, etc., and your final delivery encoding and display will probably be 8bpc anyway. Working in 16bpc just gets you shades in between the extremes for calculations. In practical terms, that means that doing something like adjusting the gamma up in a levels effect at the top of an effects stack and then pushing it back down later on will give you less crunching in the midtones, or stair-stepping in the histogram.
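To make the "crunching" concrete, here is a small sketch (not AE's actual internals) that raises gamma, quantizes to integer steps, then lowers gamma back, and counts how many distinct values survive the round trip. At 8bpc the quantization collapses values (gaps and spikes in a histogram); at AE-style 16bpc (0 to 32768) nearly everything survives:

```python
def gamma_roundtrip(levels, gamma=2.0):
    """Raise gamma, quantize to whole-number steps, lower gamma back.
    Returns how many distinct values survive the round trip."""
    maximum = levels - 1
    survivors = set()
    for v in range(levels):
        up = round(((v / maximum) ** (1 / gamma)) * maximum)   # gamma up, quantized
        down = round(((up / maximum) ** gamma) * maximum)      # gamma back down
        survivors.add(down)
    return len(survivors)

print(gamma_roundtrip(256))    # 8bpc: fewer than 256 distinct values remain
print(gamma_roundtrip(32769))  # 16bpc-style range: far less loss
```

The lost values are exactly the "stair-stepping in the histogram" described above.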
With 32bpc, we stop doing integer math and start doing floating point math. Black is 0 and white is 1, with the numbers between, below and above represented as real numbers with decimal points as necessary.
Although you can't display a white brighter than white (1), you can do intermediate calculations with it. This means that superbright highlights can be preserved across effects, even if they are brighter than can be displayed.
See here for more:
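A quick numeric sketch of the superbright idea (again, not AE's actual pipeline): double the brightness of a highlight, then halve it. Integer 8bpc math clips at white and can't get the detail back; float math carries the over-white value through the intermediate step:

```python
def clamp8(v):
    """Quantize and clamp to the 8bpc 0-255 range."""
    return max(0, min(255, round(v)))

pixel = 200                                    # a bright highlight in 8bpc

# 8bpc integer pipeline: doubling clips at white, so halving can't recover it
clipped = clamp8(clamp8(pixel * 2) * 0.5)      # -> 128, original value destroyed

# 32bpc float pipeline: 0.0 is black, 1.0 is white, overbrights survive
pixel_f = 200 / 255
recovered = (pixel_f * 2.0) * 0.5              # -> back to the original value

print(clipped, round(recovered * 255))
```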
Moving a project from 8bpc to 16bpc doubles your RAM requirements and improves precision, but doesn't radically alter the way calculations work; moving it from 8bpc to 32bpc quadruples RAM requirements and can drastically affect the way effects are calculated. CPU requirements do not scale the way RAM does: you're still doing the same number of calculations per second, you're just doing them with larger containers for the data.
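The rough frame-buffer math behind that RAM claim, for an uncompressed 1920x1080 RGBA frame (AE's real memory use is more complicated, but the scaling holds):

```python
def frame_mb(bytes_per_channel, width=1920, height=1080, channels=4):
    """Size of one uncompressed RGBA frame buffer in MB."""
    return width * height * channels * bytes_per_channel / 2**20

for label, bytes_per_channel in (("8bpc", 1), ("16bpc", 2), ("32bpc", 4)):
    print(f"{label}: {frame_mb(bytes_per_channel):.1f} MB per frame")
# 8bpc is ~7.9 MB; 16bpc doubles it; 32bpc float quadruples it
```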
[Greg Sage] "Next, as for working space, very confused on this, but I believe I should be working in sRGB and linearizing the workspace, correct?"
sRGB and Rec. 709 are very, very close. Rec. 709 is the normal profile for HD video work.
Linearizing the workspace does not affect how footage looks coming in or going out. It does affect the math used in compositing operations, which affects the look of your composites and effects.
Linear light compositing usually gives more natural-looking composites.
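You can see the difference with a 50/50 blend of black and white, sketched here with a pure 2.2 gamma power curve as a stand-in for the exact sRGB transfer function. Blending the gamma-encoded values directly gives one result; decoding to linear light, blending, and re-encoding gives a noticeably brighter one, which is closer to how light actually mixes:

```python
def to_linear(v):
    """Decode a gamma-encoded value to linear light (2.2 approximation)."""
    return v ** 2.2

def to_gamma(v):
    """Re-encode a linear-light value for display (2.2 approximation)."""
    return v ** (1 / 2.2)

black, white = 0.0, 1.0

# Naive blend of the gamma-encoded values:
gamma_mix = (black + white) / 2

# Linear-light blend: decode, mix, re-encode:
linear_mix = to_gamma((to_linear(black) + to_linear(white)) / 2)

print(round(gamma_mix, 3), round(linear_mix, 3))   # 0.5 vs ~0.73
```

That mismatch at blended edges is where the haloing and muddy color-mixing artifacts mentioned in the question come from.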
[Greg Sage] "I don't see anywhere to set color management profile on output, but it's my understanding that YT ignores this anyway, correct?"
This is done in the output module from the render queue. If you don't specify an output color management profile, then the working space will be used.
As for color management and YouTube, it's not just up to YouTube how this works. Color management is required at every stage of the pipeline to ensure consistent results. If your viewer's browser is not color-managed, it may not be consistent.
Designer & Mad Scientist at Keen Live [link]
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
@keenlive [twitter] | RenderBreak [blog] | Profile [LinkedIn]
Ok, thx. Wasn't sure about bit depth slowing things down. I have plenty of ram, it's the cpu cycles that are at a premium.
So I've switched to 16 bit, and linearizing with 709 workspace. Part I'm not sure about is output. I understand it's not all up to YouTube, but it's pointless for me to worry about it from there. I only control the point up until it's transcoded at YT. I just need the closest I can get to some sort of reference after it gets processed by YT.
Given my settings, and given that I am delivering only to FB, YT (or similar in future), can I improve on my current practice of outputting from AE like this:
and then generating delivery format in AME with this:
For instance, I'm not specifying color space. Could or should I be using different codec, etc?
The Animation codec only supports 8-bit color depth. Use ProRes 422 or 444 instead; these support 10-bit color depth.