I've been editing on a broadcast-friendly rig with a Kona 3, ADCs, and a makeshift HD studio monitor for the past few years. Now I'm working in an HD, web-only delivery environment.
I'll be doing sound design via Soundtrack Pro, and I'll need to hook up a good set of reference monitors for the audio when editing in FCP too. I plan to stick with 23.98, 29.97, and 59.94 unless there's a good reason not to.
Now, I suppose I don't need an HD studio monitor, since my deliverables will be in RGB rather than Rec. 709. But that's where I need some advice. If I'm editing and using a Kona 3 or the analog Kona card to route the audio out to a Big Knob and then to my reference monitors, shouldn't I also be monitoring the video via a studio monitor connected to the Kona, rather than the Viewer on my desktop, so that the sync is 100% accurate?
Are there any other suggestions for the best way to monitor video that's in a ProRes timeline, with the ultimate idea being to have the most accurate way to monitor for RGB computer screens?
[Brad Bussé]"Now, I suppose I don't need an HD studio monitor since my deliverables will be in RGB rather than rec. 709."
Not sure about this, because I'd guess you will be working with YUV codecs (H.264, Flash, ...).
Your deliverables won't be RGB.
[Brad Bussé]" shouldn't I also be monitoring the video via a studio monitor connected to the Kona rather than the Viewer on my desktop so that the sync is 100% inline?"
Only out of the KONA are you getting a proper picture, not from the GPU.
Thanks. I guess my thinking is that, yes, it's a YUV codec, but ultimately the end user is largely going to be viewing on an LCD, so the YUV codec is being converted to RGB via QuickTime, correct? I wonder if an sRGB-calibrated iMac display is accurate enough to master QuickTime/Flash-encoded videos intended only for the web. Obviously it won't account for people viewing on a CRT or plasma via a PS3, or for people with displays wider than 72% of the NTSC gamut (although is that even a concern, since the video is being viewed in an sRGB color space instead of Rec. 601 or Rec. 709?).
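For what it's worth, the YUV-to-RGB step the player performs is well defined. Below is a minimal sketch (not QuickTime's actual code, just the standard ITU-R BT.709 math for 8-bit video-range YCbCr) showing how a Y'CbCr pixel maps to the RGB values an LCD ultimately displays:

```python
def ycbcr709_to_rgb(y, cb, cr):
    """Convert one 8-bit video-range BT.709 Y'CbCr pixel to 8-bit RGB.

    Video range: Y' spans 16-235, Cb/Cr span 16-240 centered on 128.
    Coefficients follow ITU-R BT.709 (Kr = 0.2126, Kb = 0.0722),
    pre-scaled by 255/219 (luma) and 255/224 (chroma).
    """
    y = y - 16
    cb = cb - 128
    cr = cr - 128
    r = 1.1644 * y + 1.7927 * cr
    g = 1.1644 * y - 0.2132 * cb - 0.5329 * cr
    b = 1.1644 * y + 2.1124 * cb
    # Clamp to the 0-255 range a display expects.
    clamp = lambda v: min(255, max(0, round(v)))
    return (clamp(r), clamp(g), clamp(b))

# Reference white (235, 128, 128) maps to (255, 255, 255);
# reference black (16, 128, 128) maps to (0, 0, 0).
```

Note this only covers the matrix step; whether the result looks right on an sRGB iMac also depends on the gamma/transfer handling of the player, which is exactly where desktop previews and a calibrated broadcast monitor tend to disagree.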
"Only out of the KONA you are getting a proper picture. No from the GPU."
So even if you've rendered out the real-time GPU FX in the timeline, you're saying the GPU's representation of the Canvas is inherently inaccurate? Is this just in relation to the color space, or does it also mean the sync between the video in the Canvas and the audio out of the 3.5mm jack feeding the reference monitors is inaccurate?