
BMCC alternate workflow

COW Forums : Apple Final Cut Pro X Debates

Oliver Peters
BMCC alternate workflow
on Sep 3, 2012 at 12:36:33 am

I've done some testing with the same 5 John Brawley camera raw clips from the Blackmagic Cinema Camera and have come up with an interesting workflow using FCP X. Obviously BMD wants you to use Resolve to generate dailies, but the truth is that many photo apps will read CinemaDNG. I would imagine that when this camera gets out into the wild, most folks will simply shoot ProRes. Nevertheless the DNG files offer the advantage of greater latitude (and grading control) and over-sized images. This gives you some nice lossless reframing options with HD and even 2K timelines. Here's the workflow:

1. Open the CinemaDNG camera raw image files in Aperture. The shots are image sequences within a folder for each shot. (That's how the files are online, though I'm not sure if that's how the camera actually creates them.) I've imported each folder as a project in Aperture.

2. Adjust one frame in the Aperture project and then Lift & Stamp (like copy & paste attributes) the settings to apply them to all of the frames in that project. Repeat for each shot.

3. Export "versions" of the frames in the original size as new frames. I exported as TIFFs at original size. These should go to a different folder than the source folder. Repeat for each shot.

4. Launch QT Player 7 Pro and "open image sequence". Find the first frame of the shot you want (the new exported frame) and import. At this point QT will ask for the frame rate. Once opened, save this as a QT reference movie. Repeat for each shot.

5. Launch FCP X and import the QT refs into an Event. You have the option to transcode optimized and/or proxy media, and that's what I would recommend. However, performance - at least with only these 5 short clips - is fine on an 8-core tower using the QT reference movies directly. Remember, the underlying media is uncompressed TIFF at 2400 pixels wide. Adjust/fit/fill the frame size as desired.
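Since step 3 re-exports frames in place and each shot lives in its own folder, it's easy to lose a frame somewhere. Before building the QT refs, a quick script can confirm that every shot's export count matches its source. A rough Python sketch (the folder paths and per-shot layout are hypothetical, not something the workflow above prescribes):

```python
import os

# Assumed layout: one subfolder per shot under each root.
DNG_ROOT = "/Volumes/Media/BMCC/dng"    # hypothetical source shot folders
TIFF_ROOT = "/Volumes/Media/BMCC/tiff"  # hypothetical Aperture export folders

def count_frames(folder, ext):
    """Count files with the given extension in a shot folder."""
    return sum(1 for f in os.listdir(folder) if f.lower().endswith(ext))

def check_shots(dng_root, tiff_root):
    """Return a list of (shot, dng_count, tiff_count) tuples for every
    shot whose exported TIFF count does not match its DNG source count."""
    mismatches = []
    for shot in sorted(os.listdir(dng_root)):
        dng_dir = os.path.join(dng_root, shot)
        tiff_dir = os.path.join(tiff_root, shot)
        if not os.path.isdir(dng_dir):
            continue
        dng_n = count_frames(dng_dir, ".dng")
        tiff_n = count_frames(tiff_dir, ".tif") if os.path.isdir(tiff_dir) else 0
        if dng_n != tiff_n:
            mismatches.append((shot, dng_n, tiff_n))
    return mismatches
```

An empty result means every shot is safe to wrap as a QT reference movie.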

Now here's the interesting part. Go back to Aperture and change the settings for a shot (series of many frames). Then re-export new versions (replacing the earlier exports). The QT reference file will update to show the new color settings. When you relaunch FCP X, the clips in the Event and Project will also be updated. A few caveats, though. The filmstrip thumbnails are not updated. You'll see one updated frame in the filmstrip as you skim over it. Nevertheless linking is fine and the viewer shows the correct display. If you have created optimized or proxy media, you will have to manually delete these files first and then transcode again in order to reflect the updated color.

In any case, working only with the QT refs is fine on a fast machine, as best as I could test with only 5 clips. By comparison, I could AMA-link these same files into Symphony 6 and also import them into Premiere Pro. In both cases, performance stuttered, even at lower-resolution playback settings. Clearly FCP X has an advantage in this type of workflow, although Media Composer/Symphony does a good job importing image sequences in the "traditional" (non-AMA) fashion. The only issue I've hit in FCP X is that some plug-ins (FxFactory and their partners plus IRUDIS, so far) don't work on these clips because of the 2400-pixel width.

- Oliver

Oliver Peters Post Production Services, LLC
Orlando, FL
http://www.oliverpeters.com



Rob Mackintosh
Re: BMCC alternate workflow
on Sep 3, 2012 at 8:18:20 am

That's a useful workflow Oliver.

I found that when importing the raw DNG files, FCPX created optimized media whether or not you selected that option on import.
ProRes 422 files at the original frame size were created during playback and skimming in the event or project. Perhaps it is doing the same with the QT refs, and that would explain the good playback performance you are getting.

See here for more info http://forums.creativecow.net/readpost/335/40719

In fact I noticed this behaviour with a variety of stills: http://fcp.co/forum/4-final-cut-pro-x-fcpx/12820-color-correction-on-camera...

Sorry I am a few thousand miles from my computer so can't test this out myself.



Oliver Peters
Re: BMCC alternate workflow
on Sep 3, 2012 at 1:51:10 pm

[Rob Mackintosh] "I found that when importing the raw DNG files, FCPX created optimized media whether or not you selected that option on import."

It seems to do this with all still images. I found that out on another project and posted it in another thread.

[Rob Mackintosh] "Perhaps it is doing the same with the QT refs, and that would explain the good playback performance you are getting."

No, I don't think so. The only media I find in the event folders is what I told it to transcode or optimize. It's definitely playing the timeline only from the QT reference movies. This is verified when I replace the TIFFs with updated ones: the timeline updates accordingly, with no activity in the "100% meter" or the background tasks display.

- Oliver

Oliver Peters Post Production Services, LLC
Orlando, FL
http://www.oliverpeters.com




Rob Mackintosh
Re: BMCC alternate workflow
on Sep 4, 2012 at 7:31:01 am

I must have missed that thread, Oliver.

I'd be interested to see if it works with 16-bit TIFFs. I exported a BMC DNG (with adjustments) as a 16-bit TIFF from Aperture, imported it into FCPX, and compared it with the Aperture preview file (an 8-bit JPEG, quality 12, no chroma subsampling) that's brought in when you drag from the media browser. FCPX optimised both to ProRes 422 files of about the same size. The images looked identical on my monitor, with a slight difference in the waveform/histogram; you could push the exposure up and down on both images without much degradation. I noticed a difference when making a secondary color correction using the color board's HSL keyer. It felt a little more precise pulling a key from the 16-bit TIFF.

So at the very least, I think FCPX is processing 10-bit-plus media at 10 bits.


Return to posts index

Oliver Peters
Re: BMCC alternate workflow
on Sep 3, 2012 at 1:56:09 pm

[Rob Mackintosh] "I found that when importing the raw DNG files, FCPX created optimized media whether or not you selected that option on import."

PS: It does this with JPEGs and TIFFs, too. You will find them in the Events>Render files>High Quality Media folder.

- Oliver

Oliver Peters Post Production Services, LLC
Orlando, FL
http://www.oliverpeters.com



gary adcock
Re: BMCC alternate workflow
on Sep 3, 2012 at 2:37:33 pm

[Oliver Peters] "Launch QT Player 7 Pro and "open image sequence". Find the first frame of the shot you want (the new exported frame) and import. At this point QT will ask for the frame rate. Once opened, save this as a QT reference movie. Repeat for each shot."



I wanted people to note that handling the files in this manner will truncate the working color space down to QT Player's native 8-bit.

The frame-to-video workflow is currently handled better by Adobe Media Encoder CS6.
While allowing for the full bit depth during the still-to-motion conversion, it also lets you select the first frame of the image sequence and choose to make a QT movie; it will assemble the image sequence for you, and it can create ProRes for FCPX faster than the current Compressor workflows.

gary adcock
Studio37

Post and Production Workflow Consultant
Production and Post Stereographer
Chicago, IL

http://blogs.creativecow.net/24640

follow me on Twitter
@garyadcock






Oliver Peters
Re: BMCC alternate workflow
on Sep 3, 2012 at 2:55:29 pm

[gary adcock] "I wanted people to note that handling the files in this manner will truncate the working color space down to QT Player's native 8-bit."

Hmmm... Very good point. Aperture gives you the option of a 16-bit or 8-bit TIFF export. I only tested this with 8-bit TIFFs anyway, so I'm not sure what would happen with 16-bit TIFFs. I also don't really know whether the standard QT issues are a factor in FCP X or whether Apple has engineered some back-door workarounds. Clearly there are no problems with Alexa QuickTime recordings. Besides, if you bring the DNG files straight into FCP X, isn't the same thing happening? I doubt these are coming in at full, native bit-depth. There certainly is no access to raw adjustments within FCP X.

FCP X identifies the "compressor" of the QT reference files as TIFF. I was trying to avoid a double-encode for reasons of time and storage space. Obviously, a real shoot would amount to hundreds of thousands of frames of unedited footage. The solution, of course, would be if FCP X added a) raw settings controls, and b) the ability to ingest image sequences. In any case, the images looked very clean (no banding) in this very limited test. Another option is to use After Effects to create movie files.

- Oliver

Oliver Peters Post Production Services, LLC
Orlando, FL
http://www.oliverpeters.com



Rafael Amador
Re: BMCC alternate workflow
on Sep 3, 2012 at 4:12:33 pm

[gary adcock] "I wanted people to note that handling the files in this manner will truncate the working color space down to QT Player's native 8-bit."
Right, but QT Player is not processing the stills in this workflow at any moment. It only builds a reference file, which is then processed by FCPX (32b FP). The original codec/bit depth is preserved.
QT supports 16b RGB (TIFF/PNG still sequences and Microcosm QT Movies) even if QT Player can't process or display more than 8b.
Rafael

http://www.nagavideo.com



gary adcock
Re: BMCC alternate workflow
on Sep 3, 2012 at 4:30:42 pm

[Rafael Amador] "Right, but QT Player is not processing the stills in this workflow at any moment. It only builds a reference file, which is then processed by FCPX (32b FP). The original codec/bit depth is preserved."

It is my understanding that may have been true prior to the release of QT X, but under Mountain Lion the QT ref movie is handling the file as an 8-bit video file, so unless you re-create stills on output, you will be limited to 8 bits in your workflow.


[Rafael Amador] "QT supports 16b RGB (TIFF/PNG still sequences and Microcosm QT Movies) even if QT Player can't process or display more than 8b."

Supporting and correctly handling a bit depth are two completely different things here.

How is it possible to output more than 8 bits if that is all the player's engine supports?

gary adcock
Studio37

Post and Production Workflow Consultant
Production and Post Stereographer
Chicago, IL

http://blogs.creativecow.net/24640

follow me on Twitter
@garyadcock






Oliver Peters
Re: BMCC alternate workflow
on Sep 3, 2012 at 5:43:44 pm

[gary adcock] "It is my understanding that may have been true prior to the release of QT X, but under Mountain Lion the QT ref movie is handling the file as an 8-bit video file, so unless you re-create stills on output, you will be limited to 8 bits in your workflow."

What we don't know, though, is whether that applies to FCP X. This may only be a limitation of other software using the QT conversion. For example, Avid bypasses the QT conversion when it imports ProRes files directly, on Mac MC6.x, thanks to a licensing deal with Apple. It taps directly into the codec for a "fast import" (file rewrap). However, if you access QT files on MC5 or lower, the "hand-off" happens by way of the QT engine, which is handling the conversion.

- Oliver

Oliver Peters Post Production Services, LLC
Orlando, FL
http://www.oliverpeters.com



Oliver Peters
Re: BMCC alternate workflow
on Sep 3, 2012 at 5:46:37 pm

PS: "Bypassing QT" in FCP X (if in fact that's true) might also account for why real-time performance is significantly better in FCP X versus Avid AMA or Premiere Pro import.

- Oliver

Oliver Peters Post Production Services, LLC
Orlando, FL
http://www.oliverpeters.com



Rafael Amador
Re: BMCC alternate workflow
on Sep 3, 2012 at 11:22:41 pm

[gary adcock] "It is my understanding that may have been true prior to the release of QT X, but under Mountain Lion the QT ref movie is handling the file as an 8-bit video file, so unless you re-create stills on output, you will be limited to 8 bits in your workflow."
As Oliver describes his workflow, he is using QT 7, which hasn't been modified since Snow Leopard.
QT X has no "Open Image Sequence" option.
And he is doing "Save As... Reference Movie", where no picture processing is involved. "Save As..." just puts the stills in a QT container without affecting the content.

Things would be different if he did "Export", where there is always processing, even when exporting with the same codec, size, etc.
"Export" with QT Pro is always destructive.



[gary adcock] "Supporting and correctly handling a bit depth are two completely different things here. How is it possible to output more than 8 bits if that is all the player's engine supports?"


You work with 10b Uncompressed and ProRes (10b) QT files on a daily basis, don't you?
Where is the problem?
They are 10b QT files even if QT Player can only display them at 8b.
You can play them with QT and they are still 10b, even if the monitor will show just 8b.
You can edit (Cut, Copy, Paste, Add to Movie) and "Save As Reference Movie" or "Save As Self-Contained Movie" and they are still 10b.
The 8b crunching only happens when you EXPORT.

The Player (which, as you say, only DISPLAYS at 8b) and the QT engine (which only PROCESSES at 8b) are just two functions of QT Pro, but they don't have any role here.
The QT Player window that pops up whenever you open or save something with QT Pro is just a preview, so you can see what you are doing.
Another function is packing things into QT containers, which can be done at 16b per channel. All those functions are managed by a common GUI.
rafael

http://www.nagavideo.com




Walter Soyka
Re: BMCC alternate workflow
on Sep 6, 2012 at 3:13:30 pm

[Rafael Amador] "QT supports 16b RGB (TIFF/PNG still sequences and Microcosm QT Movies) even if QT Player can't process or display more than 8b."

I've just tested 16b Microcosm movies with Rafael -- and they don't seem to work in FCPX at all (black screen). Ae CS6 reads them correctly.

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Walter Soyka
Re: BMCC alternate workflow
on Sep 4, 2012 at 6:27:00 pm

[gary adcock] "I wanted people to note that handling the files in this manner will truncate the working color space down to QT Player's native 8-bit."

[Oliver Peters] "What we don't know, though, is whether that applies to FCP X. This may only be a limitation of other software using the QT conversion."


I did a quick test, and Gary's right: FCPX is truncating reference movies referring to 16-bit TIFFs.

However, Oliver is right, too -- FCPX is apparently not using QuickTime in its processing path. More on that in a moment.

Here's my test methodology -- and I welcome all thoughts and criticisms in case I've missed anything:

In After Effects, I created a black solid over a white solid, animated its opacity from 100% to 0% over 1024 frames, and rendered to a 16-bit TIFF sequence -- giving me 1024 sequential images of linearly increasing RGB values. This gives me 10 bits of precision for the test.

I opened them as an image sequence in QuickTime Player and saved a reference movie.

I imported the reference movie into FCPX and cut it into the primary storyline. Then I advanced one frame and cut it in again as a connected clip, then changed the composite mode to "difference." Finally, I added a blank title as an adjustment layer and cranked the gamma with a published Levels filter from Motion to exaggerate the result of the difference blend.

If the full depth of the original media were preserved, I'd see a consistent gray as a result of the exaggerated difference between any one frame and its neighbor. If it were truncated, I'd see 3 frames of black (no difference) followed by a flash of gray for a single frame (difference). This cadence is due to the one frame offset of two instances of the test movie and the division of 1024 (my 10-bit equivalent in a 16-bit render) / 256 (8-bit video) = 4. I see the latter, indicating truncation to 8-bit.
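That 4-frame cadence can be simulated in a few lines of Python; this is a pure quantization model standing in for the actual render path, not FCPX's real pipeline:

```python
def ramp(n_frames=1024):
    """Linear temporal ramp: frame i has normalized gray level i/(n_frames-1)."""
    return [i / (n_frames - 1) for i in range(n_frames)]

def quantize(levels, bits):
    """Round normalized levels to the given bit depth and back to floats."""
    maxv = (1 << bits) - 1
    return [round(v * maxv) / maxv for v in levels]

def difference_cadence(levels):
    """One-frame-offset 'difference' blend: 1 where neighboring frames
    differ, 0 where they are identical."""
    return [1 if levels[i] != levels[i - 1] else 0 for i in range(1, len(levels))]

# Full depth preserved: every neighboring pair differs -> constant gray.
full = difference_cadence(ramp())
# Truncated to 8-bit: runs of identical frames with a step every ~4th frame,
# since 1024 source steps collapse onto 256 output codes.
trunc = difference_cadence(quantize(ramp(), 8))
```

Counting the non-zero frames in `trunc` gives 255 steps (one per 8-bit code transition) instead of 1023, which is exactly the flash-every-fourth-frame pattern described above.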

Interestingly, I get a different result bringing the reference movie back into a QuickTime-using application like QuickTime Player or AE. In these, I don't see truncation; instead, I see dithering. (Of course, bringing the native TIFF files directly into AE shows the correct constant difference.)

While FCPX does not seem to be using QuickTime to read the reference movies (as Oliver suggested), it is still limited to an 8-bit processing path somewhere (as Gary suggested).

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Rich Rubasch
Re: BMCC alternate workflow
on Sep 4, 2012 at 9:00:10 pm

That is a genius post. I think we need a little icon for a post that is purely genius.

Rich Rubasch
Tilt Media Inc.
Video Production, Post, Studio Sound Stage
Founder/President/Editor/Designer/Animator
http://www.tiltmedia.com




Walter Soyka
Re: BMCC alternate workflow
on Sep 4, 2012 at 10:12:24 pm

[Rich Rubasch] "That is a genius post. I think we need a little icon for a post that is purely genius."

Ha, thanks, Rich. It can be right next to the button for posts that are overcomplicated...

While I had aimed to do the proof entirely in FCPX to eliminate external variables, this test can be greatly simplified or verified (if you trust AE, which I do) by simply importing the "temporal grayscale ramp" TIFF sequence reference movie into FCPX and immediately exporting it back out to a deep lossless format (like a 16-bit TIFF sequence). Open that in AE, work in 16bpc, hover the mouse over the footage with the info panel open, and step through one frame at a time. You'll see no change in the RGB values for a pixel over four frames in a row, indicating truncation as I explained above. Zooming in and bumping up the viewer's exposure, as well as mousing around with an eye in the info panel, will show no change within a single frame, indicating no dithering.

For the sake of completeness, creating the temporal ramp in FCPX (black solid over white solid, animating opacity from 100 to 0 over 1:09:23@29.97) and exporting as a TIFF sequence (correctly) shows a brightness change on every single frame in AE when checked with this same method, proving that the truncation is not in FCPX's rendering or exporting pipelines.

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Oliver Peters
Re: BMCC alternate workflow
on Sep 5, 2012 at 1:07:59 am

Walter,

Did you render this in X or only go by the real-time display? If rendered, did you test the options, like HQ, 4444 or 10-bit uncompressed? If so, does it alter the results?

- Oliver

Oliver Peters Post Production Services, LLC
Orlando, FL
http://www.oliverpeters.com



Walter Soyka
Re: BMCC alternate workflow
on Sep 5, 2012 at 2:37:27 am

[Oliver Peters] "Did you render this in X or only go by the real-time display? If rendered, did you test the options, like HQ, 4444 or 10-bit uncompressed? If so, does it alter the results?"

No render in my initial tests -- I looked at the real-time display for the first test (offset difference comparison), and I looked at no-render exports via "Share > Export image sequence..." for the second test (simple read in and write out).

Rerunning the test now, rendering to any of the options first doesn't change the results.

I want to correct something I wrote above for anyone playing along at home; 1024 frames at 29.97 fps is 34:03, not 1:09:23 as I wrote. I read the timecode off from the wrong project when writing the post, and clearly wasn't thinking about the math while I was writing -- apologies for any confusion!
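For anyone double-checking that arithmetic, here is a tiny sketch of the frame-index-to-timecode conversion. It uses non-drop counting at a nominal 30 fps; drop-frame correction only kicks in at minute boundaries, so it changes nothing in a clip this short:

```python
def frame_to_timecode(frame, fps=30):
    """Non-drop timecode label for a zero-based frame index at a nominal
    integer frame rate."""
    secs, ff = divmod(frame, fps)
    mm, ss = divmod(secs, 60)
    hh, mm = divmod(mm, 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# The last frame of a 1024-frame clip is index 1023:
label = frame_to_timecode(1023)  # "00:00:34:03"
```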

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Rafael Amador
Re: BMCC alternate workflow
on Sep 5, 2012 at 3:02:57 am

Hi Walter,
I think the test is consistent and shows that there is something wrong with how FCPX manages QT reference files, or at least their content.
Are you able to import the 16b TIFFs directly as stills (no QT ref) and see how they behave in FCPX?
rafael

http://www.nagavideo.com



Walter Soyka
Re: BMCC alternate workflow
on Sep 5, 2012 at 3:33:31 am

[Rafael Amador] "Are you able to import directly the 16b TIFF as stills (no QT Reff) and see how they behave in FCPX?"

When importing a series of stills to FCPX and exporting to deep formats from FCPX, verifying in AE shows no 8b truncation as it did with the reference movies.

Interestingly, FCPX's 16b TIFF export and the original 16b TIFF sources differ slightly on brightness values; I would have expected them to be identical. Perhaps there is something going on with FCPX's color management system that is foiling a perfect roundtrip?

I am also seeing noticeable shifts from the originals in AE with 10b Uncompressed 422, ProRes 422, and ProRes 4444. (10b Unc 422 and ProRes 422 look to be identical; perhaps a faulty RGB/YUV transform somewhere? Perhaps I'm doing something wrong?)

ProRes 4444 shows dithering in addition to the shift (but again no truncation). The dithering here strikes me as extremely odd.

And with that, I am back to having more questions than answers...

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Rafael Amador
Re: BMCC alternate workflow
on Sep 5, 2012 at 5:06:55 am

Walter, I have no FCPX installed, so I can't run any tests, but I would like to send you something to try.
It's a small file. May I post it to "info keenlive"?
rafael

http://www.nagavideo.com



Walter Soyka
Re: BMCC alternate workflow
on Sep 5, 2012 at 1:13:42 pm

Rafael -- reach me direct at walter at keenlive dot com.

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



gary adcock
Re: BMCC alternate workflow
on Sep 5, 2012 at 6:39:10 am

[Walter Soyka] "Interestingly, FCPX's 16b TIFF export and the original 16b TIFF sources differ slightly on brightness values; I would have expected them to be identical.

I am also seeing noticeable shifts from the originals in AE with 10b Uncompressed 422, ProRes 422, and ProRes 4444. (10b Unc 422 and ProRes 422 look to be identical; perhaps a faulty RGB/YUV transform somewhere? Perhaps I'm doing something wrong?)

ProRes 4444 shows dithering in addition to the shift (but again no truncation). The dithering here strikes me as extremely odd."



The 16-bit stuff is still really dicey in the Mac OS when going through different apps.

The color difference being seen between import and export is the difference between "Full RGB" and "SMPTE" output ranges.

No, you are not wrong about faulty transforms in the conversion.

gary adcock
Studio37

Post and Production Workflow Consultant
Production and Post Stereographer
Chicago, IL

http://blogs.creativecow.net/24640

follow me on Twitter
@garyadcock





Walter Soyka
Re: BMCC alternate workflow
on Sep 5, 2012 at 1:11:51 pm

[gary adcock] "The 16-bit stuff is still really dicey in the Mac OS when going through different apps. The color difference being seen between import and export is the difference between "Full RGB" and "SMPTE" output ranges."

I think there is more to it than that -- the shift varies according to format.

When compared to the TIFF originals, FCPX's 16b TIFF export shows the slightest change; 10b Unc 422 and ProRes 422 both show the same greater change, and ProRes 4444 shows a still greater change.

More on the ProRes 4444 outputs: a correction to the above. ProRes 4444 is not showing dithering as I had assumed; it's showing very faint, regular horizontal lines -- every eighth line is a touch brighter in all three channels (more pronounced in red and blue than in green). This is true in Motion 5 as well as AE CS6.

I will try to make time to collect some stills to share with the group here today so you can see what I'm talking about, but what started out as a quick test is quickly becoming a science project...

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Rob Mackintosh
Re: BMCC alternate workflow
on Sep 5, 2012 at 7:13:39 am

Thanks Walter for the comprehensive testing. I have gained a lot of technical knowledge from your posts over the last year.

As Oliver noted in an earlier post, FCPX optimizes all stills to ProRes 422. Do you think this may be influencing the results of your tests? I wouldn't have thought FCPX was using these files as an intermediate codec for rendering deep formats, but who knows. Starting with a 16-bit RGB file, transcoding it to a 10-bit YCbCr intermediate, processing it in 32-bit float linear RGB, then exporting as 16-bit RGB would seem less than optimal.

I noticed when rendering the BMCC DNGs on a ProRes 4444 timeline that ProRes 422 optimized media was simultaneously created in the event folder. I think the same occurred on export, except when using Compressor. I'm away from my computer so can't test this with TIFFs.



Walter Soyka
Re: BMCC alternate workflow
on Sep 5, 2012 at 1:18:38 pm

[Rob Mackintosh] "As Oliver noted in an earlier post, FCPX optimizes all stills to ProRes 422. Do you think this may be influencing the results of your tests? I wouldn't have thought FCPX was using these files as an intermediate codec for rendering deep formats, but who knows. Starting with a 16-bit RGB file, transcoding it to a 10-bit YCbCr intermediate, processing it in 32-bit float linear RGB, then exporting as 16-bit RGB would seem less than optimal."

Rob, I was working with only 1024 frames from black to white (ergo, 1024 discrete shades of gray) in that 16b TIFF original, so it should be able to be expressed in a 10b format. Repeating this experiment with a longer temporal ramp to purposefully violate that 10b barrier (say, 2048 frames/11b or 4096 frames/12b) would show if 10b intermediates are being used for render or not. I'll put it on the to-do list. More to come later.
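The arithmetic behind those longer ramps can be sketched in a few lines of Python (the helper names are made up for illustration):

```python
def ramp_frames(bits):
    """Frames needed for a temporal ramp that exercises every code value
    at the given bit depth (one frame per code)."""
    return 1 << bits

def expected_run_length(ramp_bits, pipeline_bits):
    """If a ramp with 2**ramp_bits steps passes through a pipeline that
    truncates to pipeline_bits, neighboring frames repeat in runs of this
    length (1 means every frame differs, i.e. no truncation visible)."""
    return max(1, 2 ** (ramp_bits - pipeline_bits))

# 1024-frame (10-bit) ramp through an 8-bit path: a step every 4 frames.
# 2048-frame (11-bit) ramp through a 10-bit path: a step every 2 frames.
# 4096-frame (12-bit) ramp would expose a 10-bit intermediate with runs of 4.
```

So a 2048-frame render is the cheapest way to tell a 10-bit intermediate apart from a deeper path.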

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Oliver Peters
Re: BMCC alternate workflow
on Sep 5, 2012 at 5:42:35 pm

Test patterns are nice, because they are definitive; however, I like to look at real world stuff to see what the practical implications are. So here's a slightly different test.

I took Shot #1 (close-up face shot of actress) and made a nice, punchy-but-pleasing image in After Effects from the DNG image sequence. Color correction was just tweaking the camera raw settings. I placed this into a 16-bit composition and exported a ProRes4444 file. Then I placed the same in an 8-bit composition and exported an 8-bit uncompressed YUV file.

Then I took these into FCP X and stacked them on a timeline, setting the top layer's blend mode to "difference". Black screen, so no visible differences.

I did the same in After Effects in both 8-bit and 16-bit compositions. Again, black screen throughout.

This is with rendered QT files. When I get a chance, I'll test QT refs, but the bottom line for me is that 8-bit vs. 10-bit vs. 16-bit doesn't seem to make much of a difference with these images.

- Oliver

Oliver Peters Post Production Services, LLC
Orlando, FL
http://www.oliverpeters.com



Oliver Peters
Re: BMCC alternate workflow
on Sep 5, 2012 at 9:14:59 pm

PS: One unknown in this discussion is whether the CinemaDNG raw importers are actually passing a 12-bit signal or truncating to 8bpc in the conversion to RGB.

- Oliver

Oliver Peters Post Production Services, LLC
Orlando, FL
http://www.oliverpeters.com



Walter Soyka
Re: BMCC alternate workflow
on Sep 5, 2012 at 9:23:56 pm

[Oliver Peters] "Test patterns are nice, because they are definitive; however, I like to look at real world stuff to see what the practical implications are."

The test was just designed to see if depth was preserved from 16b TIFF ref movies. It may or may not be a dealbreaker, depending on your workflow needs, but it's certainly worth being aware of when you select a workflow.


[Oliver Peters] "I took Shot #1 (close-up face shot of actress) and made a nice, punchy-but-pleasing image in After Effects from the DNG image sequence. Color correction was just tweaking the camera raw settings. I placed this into a 16-bit composition and exported a ProRes4444 file. Then I placed the same in an 8-bit composition and exported an 8-bit uncompressed YUV file."

I think that Adobe Camera Raw might be 8-bit only -- the color picker in it is on a 0-255 scale, not 0-32767. Does anyone know for sure?

Also, did you change the ProRes 4444 output module's depth to Trillions of Colors? Ae won't do this by default, even if you have deep sources, and even if you're working at 16 bpc, so it's very easy to accidentally dither down to 8bpc on output.


[Oliver Peters] "Then I took these into FCP X and stacked them on a timeline setting the top layer's blend mode to "difference". Black screen, so no visible differences. I did the same in After Effects in both 8-bit and 16-bit compositions. Again, black screen throughout."

If you push the viewer's exposure up in AE, you'll see the differences. Some may be dithering from the depth reduction, some may be compression, and some may be chroma sub-sampling on the 422 stuff, but they are there.
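The exposure-push trick is easy to model numerically: boost the absolute difference by a large gain and clip, and a one-code error at 8-bit becomes an obvious gray. A toy per-pixel sketch:

```python
def amplify_difference(a, b, gain=64.0):
    """Absolute difference of two normalized pixel values, boosted by
    `gain` and clipped to [0, 1] -- mimicking a viewer exposure push that
    makes sub-visible errors obvious."""
    return min(1.0, abs(a - b) * gain)

# A 1-code error at 8-bit (1/255 ~ 0.004) is invisible on its own...
tiny = abs(0.5 - (0.5 + 1 / 255))
# ...but after a 64x push it reads as a clearly visible gray (~0.25):
boosted = amplify_difference(0.5, 0.5 + 1 / 255)
```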

They will not be visually significant. Since you will more than likely end up with an 8-bit output or an 8-bit display, you're right that it's likely not important to preserve the extra depth for straight cuts. It will get smushed down on display eventually anyway.

Deep color is certainly useful for effects and grading, though, so if the depth is one of the reasons someone is considering using this camera, preserving it may be an important consideration when choosing a workflow.

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Oliver Peters
Re: BMCC alternate workflow
on Sep 5, 2012 at 11:08:44 pm

[Walter Soyka] "The test was just designed to see if depth was preserved from 16b TIFF ref movies. "

Didn't mean to sound like I was downplaying it. Certainly a very worthwhile and illuminating test.

[Walter Soyka] "Also, did you change the ProRes 4444 output module's depth to Trillions of Colors?"

That's a great question. I'll have to recheck. You may be right and I may have exported it incorrectly.

[Walter Soyka] "If you push the viewer's exposure up in AE, you'll see the differences."

Yes, I understand.

[Walter Soyka] "Deep color is certainly useful for effects and gradin"

I will say that it looks like you can push the grading pretty hard. I don't see the range of raw tweaks as you would get from RED in REDCINE-X PRO, however. I also see completely different raw interpretation (such as color temp) in Adobe versus Aperture versus Resolve.

- Oliver

Oliver Peters Post Production Services, LLC
Orlando, FL
http://www.oliverpeters.com



Walter Soyka
Re: BMCC alternate workflow
on Sep 6, 2012 at 1:36:12 am

[Oliver Peters] "Didn't mean to sound like I was downplaying it. Certainly a very worthwhile and illuminating test."

I didn't mean to sound like I thought you sounded like you were downplaying anything. I just wanted to point out that even though it's a completely synthetic test, it does have real-world implications.

I also didn't mean to imply that you need 10b to get great imagery -- just that there are some cases where you do want those extra 2 bits of precision.

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Shawn Miller
Re: BMCC alternate workflow
on Sep 6, 2012 at 12:28:09 am

[Walter Soyka] "I think that Adobe Camera Raw might be 8-bit only -- the color picker in it is on a 0-255 scale, not 0-32767. Does anyone know for sure?"

John Brawley's blog and Blackmagic's website both report that it's 12 bit.

Shawn




Oliver Peters
Re: BMCC alternate workflow
on Sep 6, 2012 at 12:38:46 am

[Shawn Miller] "John Brawley's blog and Blackmagic's website both report that it's 12 bit. "

Well, they say the raw files that the camera records are 12-bit. There are no real specifics about the conversion used by any of the CinemaDNG plug-ins and whether they preserve that. I would presume a photographic plug-in would, but that's only an assumption.

- Oliver

Oliver Peters Post Production Services, LLC
Orlando, FL
http://www.oliverpeters.com



Oliver Peters
Re: BMCC alternate workflow
on Sep 6, 2012 at 1:44:17 am

FWIW - I reran the tests, making sure the 16bpc AE comp export to ProRes 4444 was set to Trillions of Colors. Again, I'm not seeing anything when I difference-blend the 8-bit uncompressed QT over the ProRes 4444 in either AE or FCP X.

Also tried 16-bit TIFFs with the QT refs. That doesn't work. They don't like 16-bit TIFFs. The 16-bit TIFFs import directly without issue into both FCP X and Media Composer. So, although there may indeed be some truncation issues, the workflow is more than workable at this point.

- Oliver

Oliver Peters Post Production Services, LLC
Orlando, FL
http://www.oliverpeters.com



Walter Soyka
Re: BMCC alternate workflow
on Sep 6, 2012 at 1:56:02 am

[Oliver Peters] "Also tried 16-bit TIFFs with the QT refs. That doesn't work. They don't like 16-bit TIFFs."

I used 16b TIFFs in my test -- uncompressed TIFFs worked in QuickTime reference movies (with truncation), but LZW-compressed TIFFs did not.

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Oliver Peters
Re: BMCC alternate workflow
on Sep 6, 2012 at 2:07:24 am

[Walter Soyka] "but LZW-compressed TIFFs did not"

That must be the case. I'm exporting the TIFFs out of Aperture and they don't expose that setting in the export presets. I'll try another time with Lightroom.

- Oliver

Oliver Peters Post Production Services, LLC
Orlando, FL
http://www.oliverpeters.com



Jeremy Garchow
Re: BMCC alternate workflow
on Sep 6, 2012 at 2:36:39 am

Let's all send some feedback to get Apple to support image sequences in FCPX!

Who's with me?!?!?!?!?

Hip hip! Hooray!



Walter Soyka
Re: BMCC alternate workflow
on Sep 6, 2012 at 2:52:32 am

[Jeremy Garchow] "Let's all send some feedback to get Apple to support image sequences in FCPX! Who's with me?!?!?!?!? Hip hip! Hooray!"

Image sequences are ALWAYS the answer!

Of course, while we're at it, we could request direct support for the BMCC...

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Jeremy Garchow
Re: BMCC alternate workflow
on Sep 6, 2012 at 3:26:07 am

[Walter Soyka] "Of course, while we're at it, we could request direct support for the BMCC..."

Why stop there? BMDCC XYZ is an image sequence with RAW. Let's do it all!



gary adcock
Re: BMCC alternate workflow
on Sep 6, 2012 at 5:56:06 am

[Jeremy Garchow] "Why stop there? BMDCC XYZ is an image sequence with RAW. Let's do it all!"

Ah...

Most of the higher-end formats use some level of sequential frames.

That list includes ARRIRAW, DPX, and the BMCC's CinemaDNG format, but don't forget OpenEXR for the VFX guys.

gary adcock
Studio37

Post and Production Workflow Consultant
Production and Post Stereographer
Chicago, IL

http://blogs.creativecow.net/24640

follow me on Twitter
@garyadcock





Walter Soyka
Re: BMCC alternate workflow
on Sep 6, 2012 at 1:19:21 pm

[Jeremy Garchow] "Why stop there? BMDCC XYZ is an image sequence with RAW. Let's do it all!"

I'm in.

My point there, though, was that these advanced formats that you and Gary mention require much more than the ability to treat a set of sequential stills as a movie.

Plain vanilla image sequence support by itself would get you things like TIFFs, but not all image formats are created equal. For any of the RAW image formats, for example, you have to add debayering and RAW interpretation options. For OpenEXR, you need meaningful multi-channel image support (which brings with it a lovely can of worms on the compositing side).

I am merely cautioning that one should be careful what they feature request -- you might just get it!

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Jeremy Garchow
Re: BMCC alternate workflow
on Sep 6, 2012 at 2:00:56 pm

[Walter Soyka] "I am merely cautioning that one should be careful what they feature request -- you might just get it!"

I'm over it! :)

We are discussing an almost 3k camera for almost $3k that shoots 12bit RAW DNG.

Our cheap post tools should be able to work with the cheap production tools.

FCPX has a direct line to the Aperture library which has RAW capability. Changes in Aperture are immediately reflected in FCPX. Raw control panels should be "easy" to add. Those are huge quotation marks around easy, by the way.



gary adcock
Re: BMCC alternate workflow
on Sep 6, 2012 at 5:53:04 am

[Oliver Peters] "Also tried 16-bit TIFFs with the QT refs. That doesn't work. They don't like 16-bit TIFFs. "

Yes.
I have to admit that I have had some serious issues trying to work in 16-bit using every one of the available tools, especially when using full-range 16-bit DPX frames from the Sony F65.

The only tool whose conversion I was happy with was Adobe CS6 Media Encoder, which I used to make 4K 60p ProRes 422 versions of those DPX frames to do a rough edit from. Then I ran into the "nothing on the desktop really handles 4K resolution very well" problem.


This project was shot for a new product launch at IBC.

4096x2160 at 59.94 fps in ProRes 422 requires about 200 MB/s for playback.
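As a rough sanity check on that number, here's a back-of-envelope sketch (Python; it scales Apple's published ~147 Mbit/s ProRes 422 target for 1920x1080 at 29.97 linearly by pixel rate, which is only an approximation of how the codec actually behaves):

```python
# Back-of-envelope ProRes 422 data-rate estimate: scale the published
# 1920x1080 @ 29.97 target (~147 Mbit/s) linearly by pixel rate.
base_mbit = 147.0
base_rate = 1920 * 1080 * 29.97      # pixels per second at the base spec

target_rate = 4096 * 2160 * 59.94    # 4K at 59.94 fps
est_mbit = base_mbit * target_rate / base_rate
print(round(est_mbit / 8))           # MB/s, same ballpark as the quoted ~200 MB/s
```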

I cut the entire project in REAL TIME on my Retina MBP in FCPX over Thunderbolt using a Promise R6 array, generating about 18TB of content in 4 days. (I was using two laptops to create media due to a Mountain Lion issue: one ran 10.7 to handle specifics, like the Sony F65 viewer app that does not run under 10.8, but all the editing was done on my Retina.)

The final edited project weighs in at approx. 1.4TB for a 3:30 TRT when relinked back to the 16-bit DPX sequences.

gary adcock
Studio37

Post and Production Workflow Consultant
Production and Post Stereographer
Chicago, IL

http://blogs.creativecow.net/24640

follow me on Twitter
@garyadcock





Walter Soyka
Re: BMCC alternate workflow
on Sep 6, 2012 at 1:49:11 am

[Shawn Miller] "John Brawley's blog and Blackmagic's website both report that it's 12 bit. "

[Oliver Peters] "Well, they say the raw files that the camera records are 12-bit. There are no real specifics about the conversion used by any of the CinemaDNG plug-ins and whether they preserve that. I would presume a photographic plug-in would, but that's only an assumption."

I have confirmed that when AE uses Adobe Camera RAW to import these BMCC CinemaDNG files, it does not clip to 8b.

Opening one of the CinemaDNG files in Photoshop gives you customizable RAW workflow options for color space and depth that are unavailable to the user when opening them in AE. To figure out what AE was doing under the hood, I saved out a few variations of one of the frames (with no adjustments) from Photoshop, where I could customize the workflow options. I used combinations of 8b and 16b, and Adobe RGB and sRGB, saved them all out as 16b PSDs, and compared them to the same DNG imported into AE (with no adjustments). The 16b sRGB version matched perfectly, while the others all showed differences.
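That comparison approach can be scripted, too. Here's a minimal sketch (Python; the frames are stood in by flat integer lists rather than real decoded PSD/DNG data, so treat it as an illustration of the method only):

```python
def compare(reference, candidate):
    """Report how two decoded frames differ: bit-exact flag plus error stats."""
    diffs = [abs(a - b) for a, b in zip(reference, candidate)]
    return {"identical": max(diffs) == 0,
            "max_error": max(diffs),
            "differing": sum(d > 0 for d in diffs)}

a = list(range(65536))   # stand-in for a flattened 16-bit frame
b = a.copy()
b[0] += 1                # one code value off in one pixel
print(compare(a, a)["identical"], compare(a, b)["max_error"])  # True 1
```

Even a single-code-value mismatch flags the pair as non-identical, which is the level of strictness these "which interpretation matches" tests need.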

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Rafael Amador
Re: BMCC alternate workflow
on Sep 6, 2012 at 4:12:03 am

I've also been making some tests with Photoshop (importing as 8b and 16b) and then bringing the files into AE (difference blend), and there is no doubt that the .dng files are more than 8-bit depth.

[Walter Soyka] "I used 16b TIFFs in my test -- uncompressed TIFFs worked in QuickTime reference movies (with truncation), but LZW-compressed TIFFs did not."
That's an interesting finding.
Do you think the "For PC/For Mac" button in the TIFF export window could be causing these QT reference problems when not set properly?
(I can't test it myself.)

[Walter Soyka] "Also, did you change the ProRes 4444 output module's depth to Trillions of Colors? Ae won't do this by default, even if you have deep sources, and even if you're working at 16 bpc, so it's very easy to accidentally dither down to 8bpc on output."
When selecting ProRes 4444, the output module changes itself to Trillions of Colors, so by default you should be getting more than 8-bit (supposedly 12-bit with ProRes 4444).
rafael

http://www.nagavideo.com



gary adcock
Re: BMCC alternate workflow
on Sep 6, 2012 at 6:02:28 am

[Rafael Amador] "I've also been making some tests with Photoshop (importing as 8b and 16b) and then bringing the files into AE (difference blend), and there is no doubt that the .dng files are more than 8-bit depth."



Herein lies one of the issues in modern imaging.

Existing 16-bit workflows are designed around VFX pipelines; however, I am using CAMERA-GENERATED 16-bit files, not something created in Photoshop or Maya, and yes, I am seeing different responses with that media.

gary adcock
Studio37

Post and Production Workflow Consultant
Production and Post Stereographer
Chicago, IL

http://blogs.creativecow.net/24640

follow me on Twitter
@garyadcock





Rafael Amador
Re: BMCC alternate workflow
on Sep 6, 2012 at 12:12:51 pm

[gary adcock] "Existing 16-bit workflows are designed around VFX pipelines; however, I am using CAMERA-GENERATED 16-bit files, not something created in Photoshop or Maya, and yes, I am seeing different responses with that media."

I didn't make the files myself, but I've used the same John Brawley BMCC .dng files that Oliver has been using for his tests.
I guess that when working in Photoshop (or any other graphics application), the main caveat is choosing and keeping the proper Color Profile.
rafael

http://www.nagavideo.com



Jeremy Garchow
Re: BMCC alternate workflow
on Sep 6, 2012 at 1:36:03 pm

[Rafael Amador] "I guess that when working on Photoshop (or whatever other Graphic application), the main caveat is on choosing and keeping the proper Color Profile."

Ironically, FCPX seems to hint at having this capability.

We just need to know what is happening, and of course we need more options, see here:



[attached screenshot: giveuscontrolplease.png]

Jeremy



Walter Soyka
Re: BMCC alternate workflow
on Sep 6, 2012 at 3:40:50 pm

[Rob Mackintosh] "As Oliver noted in an earlier post, FCPX optimizes all stills to ProRes 422. Do you think this may be influencing the results of your tests? I wouldn't have thought FCPX was using an intermediate codec for rendering deep formats, but who knows. Starting with a 16-bit RGB file, transcoding it to a 10-bit YCbCr intermediate, processing it in 32-bit float linear RGB, then exporting as 16-bit RGB would seem less than optimal."

FCPX does not seem to be creating optimized media from my 16b TIFFs -- there is no "High Quality Media" folder in my Events folder.

I created a 16-bit temporal ramp (RGB values increase by 1 on a 0-32767 scale on each frame), saved out 16-bit TIFFs, imported them into FCPX, cut a few in one frame at a time (a poor man's image sequence), and exported a TIFF sequence.

FCPX preserved the full 16b depth, as shown by examining the output in FCPX. It was not clipped to 10b in the middle.

However, like the 10b TIFF test, the results were very close but not mathematically identical. The first error -- only a single bit in all channels -- occurred after 26 frames. Certainly tolerable in practical applications (though, going back to a completely hypothetical OpenEXR example, this would actually very, very slightly alter non-image data stored in image buffers), but it does indicate there's some transformation or quantization going on somewhere in the FCPX processing pipeline.
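For anyone who wants to reproduce the ramp, the generator is trivial (Python sketch; writing the actual 16-bit TIFF files would need an image library, which is left out here, but the stepping logic below is the heart of the test):

```python
# 16-bit temporal ramp: each frame is a constant RGB value stepping by 1 on a
# 0-32767 scale (as in the test above), mapped into a 0-65535 container (x2).
frames = [v * 2 for v in range(64)]  # one code value per frame

# Adjacent frames differ by only 2/65535 -- far below 10-bit precision
# (a 10-bit pipeline quantizes to steps of ~64), so a 10-bit bottleneck
# would merge neighboring frames into identical output.
deltas = {b - a for a, b in zip(frames, frames[1:])}
print(deltas)  # {2}
```

If the exported sequence still shows distinct values for every frame, the pipeline preserved more than 10 bits; merged frames would mean a shallower path somewhere.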

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Oliver Peters
Re: BMCC alternate workflow
on Sep 6, 2012 at 3:50:39 pm

[Walter Soyka] "FCPX does not seem to be creating optimized media from my 16b TIFFs -- there is no "High Quality Media" folder in my Events folder"

It would be within the render folder. But yes, I noticed that, too. Maybe it only does that for some graphic file formats, like JPEG.

BTW - to amend my earlier comment regarding the difference blends in X. I do actually see a difference on the waveform, but it's only in the 0-2% range.

- Oliver

Oliver Peters Post Production Services, LLC
Orlando, FL
http://www.oliverpeters.com



Rob Mackintosh
Re: BMCC alternate workflow
on Sep 7, 2012 at 5:20:00 am

Thanks for testing this out, Walter. It's good to know FCP X is preserving the bit depth. I must have had "create optimized media" selected when I imported the 16-bit TIFF. I'm going to do some further testing of FCP X's handling of still images when I get back home next week. If I discover anything interesting, I'll post it here.



gary adcock
Re: BMCC alternate workflow
on Sep 5, 2012 at 6:21:54 am

[Walter Soyka] "I did a quick test, and Gary's right: FCPX is truncating reference movies referring to 16-bit TIFFs. However, Oliver is right, too -- FCPX is apparently not using QuickTime in its processing path. While FCPX does not seem to be using QuickTime to read the reference movies (as Oliver suggested), it is still limited to an 8-bit processing path somewhere (as Gary suggested)."



Thanks, Walter, for the confirmation. Since I do all of this in hardware, I am seeing the differences on scopes, not in files. I was traveling to IBC and not able to reply in a timely manner.

I have been told that this is a legacy hook left over from the process used when creating QT reference movies.

gary adcock
Studio37

Post and Production Workflow Consultant
Production and Post Stereographer
Chicago, IL

http://blogs.creativecow.net/24640

follow me on Twitter
@garyadcock





John Heagy
Re: BMCC alternate workflow
on Sep 6, 2012 at 2:10:54 pm

If people appreciate inventive workflows like this enabled by QuickTime reference movies, please ask Apple to add reference movie creation to AVFoundation. The lack of reference movie support in AVFoundation is why FCPX can't export one.

If anyone wants to send feedback, I suggest using "Provide Final Cut Pro Feedback..." under the "Final Cut Pro" menu in FCPX, and the Apple feedback site for both OS X and QuickTime.

http://www.apple.com/feedback/quicktime.html

http://www.apple.com/feedback/macosx.html

Thanks
John



Rafael Amador
A couple of questions on Color Spaces
on Sep 7, 2012 at 4:13:36 pm

[Jeremy Garchow] "Ironically, FCPX seems to hint at having this capability.

We just need to know what is happening, and of course we need more options, see here:"


Not sure, Jeremy, but I think that set of Color Profiles should cover most needs, at least for the kind of stuff most of us work with.
Although I understand the theory behind Color Profiles and the need (at least for certain workflows/materials) for proper color management, I have a few questions on the practical side. For example:
- When importing QT video (SD/HD), is FCPX aware of the different color profiles?
- What about when exporting video? Does FCPX (32-bit RGB) apply the proper Color Profile (Rec. 601/709) to the export format (NTSC/PAL/HD)?

As long as we're still working with a lot of QT material that (in my understanding) does not have a proper embedded color profile:
- In an average video workflow (you know, with the normal cameras/codecs most of us are using: edit, color grading, some home-made graphics, and broadcast, DVD, or web delivery), does it make a real difference when Rec. 601/709 color profiles are improperly managed?

The last question is about sRGB/Rec. 709:
- Are they, let's say, interchangeable? I mean, if I'm making graphics in an application like Photoshop to be put on top of HD footage, is sRGB the correct working space?
I have many more questions, but if somebody gives me some answers to these, I promise to keep the rest to myself for a while.
rafael

http://www.nagavideo.com



Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 8, 2012 at 4:56:45 pm

[Rafael Amador] "Not sure, Jeremy, but I think that set of Color Profiles should cover most needs, at least for the kind of stuff most of us work with."

I guess I see it extended beyond color profiles and starting to use proper LUTs. With the proliferation of Log color sciences on almost any new and modern camera from cheap to astronomical, this will become 'de rigueur'. It'd be nice to select a bunch of clips and apply a LUT via this drop down. Since many productions have more than one camera per shoot, and more than one log color science, you could then apply different LUTs to different footage without having to apply a separate filter. At least, that's how I see it in my head.

[Rafael Amador] "- When importing QT video (SD/HD) is FCPX aware of the different color profiles? "

Why wouldn't it be?

[Rafael Amador] "- What about when exporting video? Does FCPX (32-bit RGB) apply the proper Color Profile (Rec. 601/709) to the export format (NTSC/PAL/HD)?"

There is no reason to apply a profile to everything. It really depends on your source material and of course your output. You shouldn't have to mess with this option for regular video much at all, if ever.

[Rafael Amador] "it makes a real difference when improperly managing Rec 601/709 Color Profiles?"

It can make a difference on how it looks, yes. For the most part, you won't have to touch anything unless you want to. In the case of this camera test we are talking about, with a proper raw importer you can select whatever profile you want. It is metadata.

[Rafael Amador] "- Are they, lets say, interchangeable? I mean, if I'm making graphics in an application like Photoshop, to be put on top of HD footage, is sRGB the correct Working Space? "

It depends on how you want it to look. There is no right or wrong unless you have to match footage to other footage, or use exact RGB colors for client logos or branding.



Rafael Amador
Re: A couple of questions on Color Spaces
on Sep 9, 2012 at 2:37:33 pm

[Jeremy Garchow] "I guess I see it extended beyond color profiles and starting to use proper LUTs. With the proliferation of Log color sciences on almost any new and modern camera from cheap to astronomical, this will become ........., you could then apply different LUTs to different footage without having to apply a separate filter. At least, that's how I see it in my head."
You are right. I'm still dealing with Color.

[Jeremy Garchow] "[Rafael Amador] "- When importing QT video (SD/HD) is FCPX aware of the different color profiles? "

Why shouldn't it?"

[Jeremy Garchow] "[Rafael Amador] "- What about when exporting video? Does FCPX (32-bit RGB) apply the proper Color Profile (Rec. 601/709) to the export format (NTSC/PAL/HD)?"

There is no reason to apply a profile to everything. It really depends on your source material and of course your output. You shouldn't have to mess with this option for regular video much at all, if ever."

You can't do a proper YUV>RGB conversion if you don't know the Color Profile of the YUV stuff, and QT files do not flag the color profile.

Most people take for granted that all HD is Rec. 709, and I guess the thousands of people shooting Canon DSLRs don't know that they record HD as Rec. 601.
If you import Canon footage into FCPX (or any other application) and FCPX treats it as Rec. 709, it is doing a wrong conversion to RGB.


[Jeremy Garchow] "[Rafael Amador] "- Are they, let's say, interchangeable? I mean, if I'm making graphics in an application like Photoshop, to be put on top of HD footage, is sRGB the correct Working Space?"

It depends on how you want it to look. There is no right or wrong unless you have to match footage to other footage, or use exact RGB colors for client logos or branding."

Right.
Imagine that I want to take some freeze frames from HD material (Rec. 709) into Photoshop, and then back to the NLE.
Photoshop has no idea about Rec. 709. Should I export the stills as sRGB to better match the video color?

When I send video to AE, the video looks the same in sRGB as in Rec. 709, and both look the same as the FCP canvas. If I do not set any Color Profile (working space), the colors are very different.
rafael

http://www.nagavideo.com



Walter Soyka
Re: A couple of questions on Color Spaces
on Sep 9, 2012 at 6:16:24 pm

[Rafael Amador] "Imagine that I want to take some freeze frames from HD material (Rec. 709) into Photoshop, and then back to the NLE. Photoshop has no idea about Rec. 709. Should I export the stills as sRGB to better match the video color?"

Photoshop does Rec. 709 -- but it's not available as a working space with the default color settings. If you click "More Options" in Photoshop's Color Settings dialog box, "HDTV (Rec. 709)" will become available as a working space.


[Rafael Amador] "When I send video to AE, the video looks the same in sRGB as in Rec. 709, and both look the same as the FC canvas. If I do not set any Color Profile (Working Space), the colors are very different."

Here's how Ae's color management system works:

If color management is not enabled, then Ae makes no effort to ensure accuracy or consistency. RGB values are simply passed through as in any other unmanaged workflow.

If color management is enabled, then Ae interprets a footage item according to its embedded profile (or the profile defined by the user in Interpret Footage), transforms it to the working space where all calculations are performed, then optionally transforms it again to the monitor's profile for display or to the output module's color management space for render. This preserves the appearance of colors throughout the Ae workflow.

If you correctly set your source and output profiles, it doesn't matter what your working space is set to -- unless you linearize (as this affects computations), and as long as it is large enough to contain the other spaces (to avoid clipping).

Practically speaking, if you have a single deliverable, I think it's a good practice to base your working space on your deliverable (Rec. 709 in your example) as Ae bases the default output module profiles on the working space.

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Walter Soyka
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 3:18:48 am

[Walter Soyka] "If you correctly set your source and output profiles, it doesn't matter what your working space is set to -- unless you linearize (as this affects computations), and as long as it is large enough to contain the other spaces (to avoid clipping)."

I worded this terribly, so I want to briefly revisit. I called out linearization specifically because it's so markedly different from traditional video color spaces, but that's misleading in that it falsely implies that only linearizing the working space changes the results of calculations.

As I mentioned above, the working space is the one common space into which all source colors are transformed for calculations (effects and blending).

With that in mind, imagine a source image with no applied effects or blends. If the working and display/output spaces are large enough, we can preserve the look of the original image from its source through the working space to the display and output spaces. Changing working spaces will not change the visual result (again, unless we choose a working space smaller than the output) because we are simply transforming the same perceived color through a series of different RGB representations in different color spaces.

Once we apply some effects or blends to the image that alter the image, changing the working space will change the output. Why? Because these effects and blends are really just math done on the RGB values for the pixels, not the colors those RGB values represent, and because different color spaces translate specific RGB values to actual perceived color differently.

However, there's no right or wrong choice for working space (again with the proviso that it is large enough), because adjustments like these are made subjectively. If you were working in a different working space, you'd simply have to make different adjustments to get your same desired end result.
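A tiny numeric illustration of that point (Python sketch; a plain gamma-2.2 curve is used as a stand-in for a real transfer function): the same 50/50 blend of two pixel values gives different results depending on whether the working space is linearized.

```python
a, b = 0.2, 0.8  # two gamma-encoded pixel values, normalized 0-1

# Blend in the encoded (nonlinear) working space:
blend_encoded = (a + b) / 2

# Blend in a linearized working space, then re-encode for display:
lin = lambda v: v ** 2.2
enc = lambda v: v ** (1 / 2.2)
blend_linear = enc((lin(a) + lin(b)) / 2)

print(round(blend_encoded, 3), round(blend_linear, 3))  # different results
```

Same inputs, same "blend 50/50" operation, different output pixels -- which is exactly why the working-space choice changes the look once effects and blends are involved.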

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 1:10:21 pm

[Rafael Amador] "You can't do a proper YUV>RGB conversion if you don't know the Color Profile of the YUV stuff, and QT files do not flag the color profile."

I think you might be overthinking this.

What do you mean by "proper"?

What's done is done. FCPX does not let you transform color on video files. The "Color Override" option goes away, and there's a non-changeable "color profile" field that simply lists HD or SD.

The footage that you have has been recorded and it looks the way it looks. In the case of the few Canon DSLRs that do shoot 601, there's zero reason to transform them to 709. I mean, you can if you want to, but there's really no reason to.

This goes back to my original post: I'd like more control of this for LUTs. With the BMCC, the footage is raw when shooting DNG. With raw, you have complete control over how to "develop" the image and can change it at any time. It would be nice if FCPX allowed this type of control on raw video formats as well. For Log video formats, it'd be nice to simply load a LUT through that drop-down, similar to loading a color profile.

[Rafael Amador] "If you import Canon footage to FCPX (or whatever other application) and FCPX treat it as Rec-709, is doing a wrong conversion to RGB."

But it's not "wrong". The camera should be shooting 709, but editing and outputting a 601 source in a 709 container won't do anything incorrectly. Even if you could transform the 601 to 709 in FCPX, that will simply cause the color to look different than when it was shot. You will probably have a more difficult time, as it will look one way in one application but not the next.

[Rafael Amador] "Imagine that I want treat some Freeze Frames from HD stuff (Rec-709) to Photoshop, and back to the NLE.
Photoshop has no idea about Rec-709. Should I export the stills as sRGB to match better the video color?"


No. Don't transform the color at all.

[Rafael Amador] "When I send video to AE, the video looks the same in sRGB than Rec-709, and both looks the same than FC canvas. If I do not set any Color Profile (Working Space), the color are very different."

I never color manage in Ae, as I find there's no reason to with regular video (that is to say, there's no reason to change what I am working on). If I am working with Log material, I will mostly use a LUT, as it also transforms gamma. Of course, broadcast monitoring helps with all of this: since you can see what you're really getting, there's no reason to transform the color space.

If you need to output to something different, like some sort of film or print stock, and transforming the color actually has some merit, then color management makes sense.

Jeremy



Walter Soyka
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 2:17:30 pm

[Rafael Amador] "You can't do a proper YUV>RGB conversion if you don't know the Color Profile of the YUV stuff, and QT files do not flag the color profile."

[Jeremy Garchow] "I think you might be over thinking this. What do you mean by "proper"?"

I assume he means preserving perceived color from the camera through post-production. Incorrectly interpreting a 601 Canon file as 709 is no different than using a wrong LUT for a specific camera in your previous example.


[Jeremy Garchow] "But it's not "wrong". The camera should be shooting 709, but editing and outputting a 601 source in a 709 container won't do anything incorrectly. Even if you could transform the 601 to 709 in FCPX, that will simply cause the color to look different than when it was shot. You will probably have a more difficult time, as it will look one way in one application but not the next."

If the Canon camera uses 601, then interpreting it as if were 709 (in other words, failing to transform it to 709 for HD editorial) will cause the color to look different. That's Rafa's point -- if you're assuming that the Canon is working in 709 when it is not, the color is wrong.

But yes, HD cameras should use 709, and it's puzzling that the 5D apparently uses 601.


[Jeremy Garchow] "I never color manage in Ae as I find there's no reason to with regular video (that is to say, there's no reason to change what I am working on). If I am working with Log material, I will mostly use a LUT as it also transforms gamma."

A LUT is a pre-computed transform from one specific color space to another.

A color profile describes a color space (like Rec. 709) or the response of a device (like your monitor) in terms of a common, device-independent color space like CIE XYZ.

Color management in Ae is unnecessary when your inputs and outputs use the same color space. If you have multiple inputs from different spaces (or multiple outputs in different spaces), then color management automates their transformation to a common space. You could do this manually by applying LUTs to all your items, but you'd need a different LUT for each combination of input and output spaces. For example, if you're working with a photo stored in Adobe RGB (1998) in an HD Rec. 709 project, you are not getting correct color unless you enable color management or have a LUT to transform from Adobe RGB to Rec. 709.
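
What color management automates can be sketched as matrix math. A minimal sketch, assuming linear-light values and the standard D65 matrices; gamma handling is omitted, so this is an illustration of the idea, not a full ICC pipeline:

```python
# Sketch of what color management automates: converting a pixel from
# Adobe RGB (1998) to Rec. 709 primaries by way of the device-independent
# CIE XYZ space. Standard D65 matrices; values assumed linear light
# (gamma decoding/encoding omitted), so this is illustrative only.

ADOBE_RGB_TO_XYZ = [
    [0.5767309, 0.1855540, 0.1881852],
    [0.2973769, 0.6273491, 0.0752741],
    [0.0270343, 0.0706872, 0.9911085],
]

XYZ_TO_REC709 = [
    [ 3.2404542, -1.5371385, -0.4985314],
    [-0.9692660,  1.8760108,  0.0415560],
    [ 0.0556434, -0.2040259,  1.0572252],
]

def apply_matrix(m, v):
    """3x3 matrix times a 3-vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def adobe_rgb_to_rec709(rgb):
    """Adobe RGB -> XYZ -> Rec. 709, the common-space hop a CMS performs."""
    return apply_matrix(XYZ_TO_REC709, apply_matrix(ADOBE_RGB_TO_XYZ, rgb))

# A fully saturated Adobe RGB green lands outside the smaller 709 gamut,
# which is why skipping the transform gives you the wrong color:
print(adobe_rgb_to_rec709([0.0, 1.0, 0.0]))
```

Note that white maps to white (both spaces share the D65 white point), but saturated primaries come out with negative components -- out of the 709 gamut -- which is exactly the mismatch a CMS or LUT has to handle.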


[Jeremy Garchow] "Of course, broadcast monitoring helps with all of this: you can see what you're really getting, and there's no reason to transform the color space."

For straight video, you can practically make up for technical incorrectness in the grade. For compositing from multiple sources, this is far less practical.

If you need to ensure consistency across an entire workflow, some form of color management is necessary.

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 3:00:35 pm

[Walter Soyka] "I assume he means preserving perceived color from the camera through post-production. Incorrectly interpreting a 601 Canon file as 709 is no different than using a wrong LUT for a specific camera in your previous example."

Yes, but that's a human interaction. If you don't touch a 601 recorded signal in a 709 container, nothing bad is going to happen, so there's nothing "improper". There's no reason to reinterpret this footage if working with normal video output.

[Walter Soyka] "If the Canon camera uses 601, then interpreting it as if were 709 (in other words, failing to transform it to 709 for HD editorial) will cause the color to look different. That's Rafa's point -- if you're assuming that the Canon is working in 709 when it is not, the color is wrong."

I guess "wrong" is subjective then. Older Canon DSLRs record in 601. Of course this is meant to be 709 so that part is certainly wrong. So, when I edit in 709, I am not changing the colors of the original media, the colors remain just as recorded. If I change to 709, then the colors won't be as recorded, and therefore they will be "wrong" even though they are technically now in the correct color space to match the container. You are applying an adjustment to something that doesn't need adjustment.

[Walter Soyka] "For example, if you're working with a photo stored in Adobe RGB (1998) in an HD Rec. 709 project, you are not getting correct color unless you enable color management or have a LUT to transform from Adobe RGB to Rec. 709."

Yes, and I mentioned that it is highly dependent on your input and output. Rafael was asking specifically with QT media and regular video pipelines.

With photos, and certainly raw photos, adding a color profile can certainly help you define an aesthetic look. The CinemaDNG files that we are talking about in this thread seem to come in as Adobe RGB (1998). You can override them to 709, and the colors are different. Is either of those right or wrong, or are they just different? That is the advantage of raw: you can define it, and since it's just metadata, it's not baked in and is changeable.

When you work with Red, you can define all different kinds of "color profiles". None of them are wrong, and none of them are right. Even when you are destined for 709 delivery, you don't have to choose the 709 color space.

[Walter Soyka] "A LUT is a pre-computed transform from one specific color space to another. "

OK. But it doesn't have to be a specific or standard color space. It can be whatever you want when you create your own LUT with something like the ARRI Look Creator. Sure, I can conform it to within the 709 space, but I don't have to. LUTs are different from color profiles, but the idea of transforming one space to another is not that much different between profiles and LUTs.
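
For what it's worth, the mechanics of a LUT are simple either way. Here is a minimal sketch of a 1D LUT with linear interpolation; the curve itself is invented purely for illustration, not any real camera or ARRI LUT (real color-space LUTs like LogC-to-709 are usually 3D, but the principle is the same):

```python
# Minimal 1D LUT: a small table of output values sampled at evenly
# spaced input positions, applied per channel with linear interpolation.
# The table below is an invented "lift the shadows" curve, purely for
# illustration -- it is not any real camera or ARRI LUT.

def apply_1d_lut(lut, x):
    """Look up x in [0, 1] against the table, interpolating linearly."""
    x = min(max(x, 0.0), 1.0)
    pos = x * (len(lut) - 1)      # fractional index into the table
    i = int(pos)
    if i >= len(lut) - 1:
        return lut[-1]
    frac = pos - i
    return lut[i] * (1.0 - frac) + lut[i + 1] * frac

shadow_lift = [0.05, 0.3, 0.55, 0.78, 1.0]  # 5-point curve

print(apply_1d_lut(shadow_lift, 0.0))   # 0.05
print(apply_1d_lut(shadow_lift, 1.0))   # 1.0
print(apply_1d_lut(shadow_lift, 0.5))   # 0.55 (middle sample)
```

The table can encode any mapping at all, standard or not, which is Jeremy's point; a color profile, by contrast, always relates values to a device-independent space.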

[Walter Soyka] "If you need to ensure consistency across an entire workflow, some form of color management is necessary."

As I said in an earlier response to Rafael, it really depends on your input and output.

Rafael is asking about regular video and 709 QT media. There really aren't many reasons to touch this material with differing color profiles.

Jeremy



Rafael Amador
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 3:16:48 pm

[Jeremy Garchow] "[Rafael Amador] "You can't do a proper YUV>RGB conversion if you don't know the Color Profile of the YUV stuff, and QT files do not flag the color profile."

I think you might be over thinking this.

What do you mean by "proper"?

What's done is done. FCPX does not let you transform color on video files. The "Color Override" option goes away, and there's a non-changeable "color profile" field that simply lists HD or SD.

The footage that you have has been recorded and it looks the way it looks. In the case of the few Canon DSLRs that do shoot 601, there's zero reason to transform them to 709. I mean, you can if you want to, but there's really no reason to. "


[Jeremy Garchow] "[Rafael Amador] "If you import Canon footage to FCPX (or whatever other application) and FCPX treat it as Rec-709, is doing a wrong conversion to RGB."

But it's not "wrong". The camera should be shooting 709, but editing and outputting a 601 source in a 709 container won't do anything incorrectly. Even if you could transform the 601 to 709 in FCPX, that will simply cause the color to look different than when it was shot. You will probably have a more difficult time as it will look one way in one application but not the next."

No, Jeremy, and there is no such thing as a 709 container. What is 709 or 601 is the content.
Anyway, I'm not talking about "if it looks OK, go on" (the good-enough option).
I'm talking about color accuracy, and about getting a picture on your computer as close as possible to what was in front of your camera when you shot.

If an application converts Rec-601 footage to RGB as if it were Rec-709, it is doing it wrong, because the color matrices of the two standards are different.
Just look at the luma coefficients. Applying the formula:
Y' = 0.299 R' + 0.587 G' + 0.114 B' (Rec-601)
is not the same as applying the formula:
Y' = 0.2126 R' + 0.7152 G' + 0.0722 B' (Rec-709)

So when you convert stuff that is Rec-601 as if it were Rec-709, the RGB colors you get are different.
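
The difference is easy to verify numerically; a quick sketch applying both published luma formulas to the same normalized R'G'B' pixel:

```python
# The same R'G'B' pixel produces a different luma value under the two
# standards, because the coefficients differ. Values normalized to 0-1.

def luma_601(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b     # Rec. 601

def luma_709(r, g, b):
    return 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709

# A pure green pixel:
print(luma_601(0.0, 1.0, 0.0))  # 0.587
print(luma_709(0.0, 1.0, 0.0))  # 0.7152
```

Since both sets of coefficients sum to 1, neutral greys (r = g = b) are unaffected; it is the saturated colors, greens and reds especially, that shift when footage is decoded with the matrix it was not encoded with.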

Applications like 5DtoRGB, aware of the fact that ALL Canon DSLRs record Rec-601, let you correct this when converting to ProRes, putting out ProRes QT files in the standard Rec-709.
It's not that the Canon's Rec-601 is wrong. It's that if your camera applied the first formula when going from RGB (CMOS sensor) to YUV (H.264 file), you should apply the same formula when converting your H.264 back to RGB in your NLE or graphics application.

The fact that you do not "manage color" doesn't mean that you are not applying a certain matrix. If you ingest H.264 that is Rec-601 and you do not color manage, the ProRes file you export will still be Rec-601.
rafael

http://www.nagavideo.com



Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 3:56:11 pm

[Rafael Amador] "No Jeremy, and there is nothing like a 709 container. What is 709 or 601 is the content."

I guess when I say 709 container, I mean the resulting HD file.

[Rafael Amador] "I'm talking about color accuracy and trying to get in your computer the picture the most closed possible to what was in front of your camera when you shot. "

OK. So how does transforming the 5D material from 601 to 709 reflect accuracy? It doesn't. You are changing, as you say, the content. As I mentioned, if you want to do that, you can, it's a choice, but there's no technical reason you need to, as the footage is what it is.

If you do need to completely transform the content for a differing display or output, then color management makes sense, like going to a different film stock, for example. Personally, I never need to do that.

[Rafael Amador] "So when you convert stuff that is Rec-601 as if it were Rec-709, the RGB colors you get are different."

Different from what? I guess I don't understand what you are trying to do.

If I bring in a 601 HD file to FCPX and export it to a ProRes movie, nothing has changed color wise so what are you saying needs to change or what is "improper" or "wrong"? Please give me a concrete example to work from.

[Rafael Amador] "Applications like 5DtoRGB, aware of the fact that ALL Canon DSLRs record Rec-601"

The new MkIII is 709.

[Rafael Amador] "It's not that the Canon's Rec-601 is wrong. It's that if your camera applied the first formula when going from RGB (CMOS sensor) to YUV (H.264 file), you should apply the same formula when converting your H.264 back to RGB in your NLE or graphics application."

But why? Why transform again? There's zero reason. If 5DtoRGB wants to do that, that's cool. They do other things with the highly compromised DSLR files as well, in terms of luma sampling.

[Rafael Amador] "The fact that you do not "manage color" doesn't mean that you are not applying a certain matrix. If you ingest H.264 that is Rec-601 and you do not color manage, the ProRes file you export will still be Rec-601."

So you put a can full of yellow paint inside of a white bucket, and the yellow inside is now white? No, it is yellow paint inside a white bucket.

It is not a 601 file. It is assumed 709; most everything will assume it's 709. It is 601 colors in a 709 space, just like putting an SD file in an HD timeline. Do you color manage that process? Why, and what good does it do?

Jeremy



Walter Soyka
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 4:28:12 pm

[Jeremy Garchow] "If you don't touch a 601 recorded signal in a 709 container, nothing bad is going to happen, so therefore there's nothing "improper". There's no reason to reinterpret this footage if working with normal video output."

I'm not really sure what you mean by "601 recorded signal in a 709 container."

If you fail to transform wrongly-recorded 601 to 709 for working in HD, something is improper -- you are not seeing the image as intended. It's like overriding an image tagged with Adobe RGB by interpreting it as sRGB.


[Jeremy Garchow] "I guess "wrong" is subjective then. Older Canon DSLRs record in 601. Of course this is meant to be 709 so that part is certainly wrong. So, when I edit in 709, I am not changing the colors of the original media, the colors remain just as recorded. If I change to 709, then the colors won't be as recorded, and therefore they will be "wrong" even though they are technically now in the correct color space to match the container. You are applying an adjustment to something that doesn't need adjustment."

If you can't set an input profile to 601, and instead interpret the colors as if they were encoded with 709 values when they were really encoded with 601 values, it is objectively wrong.

The literal RGB values would be as-recorded, but the colors they were intended to represent would not be.

As you mentioned, since you'll presumably be making subjective changes in the grade later, you can do this in a technically incorrect fashion without compromising the final result -- as long as the entire workflow is consistently technically incorrect. For example, if you took a file to another application for FX work, you'd have to make sure that none of them correctly transform from 601 to 709, or else you'll have an inconsistency.


[Jeremy Garchow] "With photos , and certainly Raw photos, adding a color profile can certainly help you define an aesthetic look. The CinemaDNG files that we are talking about in this thread seems to come in as AdobeRGB1998. You can override them to 709, and the colors are different. Are either of those right or wrong, or are they just different? That is the advantage of raw is that you can define it, and it is just metadata so it's not baked in and is changeable."

If the CinemaDNG spec relates sensor information to CIE XYZ (as I suspect but am not certain that it does), then changing the working space affects any adjustments to the image (i.e., the same literal numeric adjustments would give different results in different working spaces), but not the unadjusted image itself.

In this case, the correct profile to use with the image is the one in which you develop it.


[Walter Soyka] "A LUT is a pre-computed transform from one specific color space to another. "

[Jeremy Garchow] "OK. But it doesn't have to be a specific or standard color space. It can be whatever you want when you create your own LUT similar to the Arri Looks creator. Sure I can conform it to within the 709 space, but I don't have to. LUTs are different from color profiles, but the idea of transforming one space to another is not that much different in profiles and LUTs."

It doesn't have to be a standard space, but it does have to be a specific space. Color profiles are all generalized to a device-independent color space by definition; LUTs are not.


[Jeremy Garchow] "Rafael is asking about regular video and 709 QT media. There really aren't many reasons to touch this material with differing color profiles."

I think this conversation is specifically about the 601 Canon insanity. I generalized it a bit -- sorry if I made it sound like you were saying something you're not.

Practically speaking, you're right that you can interpret these incorrectly and still be ok (as long as you're consistent).

Technically speaking, Rafael is right that there is a right and a wrong way to do it, and using the wrong color profile for these source files means you're not getting the colors the camera intended.


Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 6:23:29 pm

[Walter Soyka] "If you can't set an input profile to 601, and instead interpret the colors as if they were encoded with 709 values when they were really encoded with 601 values, it is objectively wrong. "

But let's look at practically wrong.

You CAN'T set an input profile in FCPX, or Pr, or FCP Legend. So, what comes in is what goes out, as there are no transformations. If that's "wrong" to you and Rafa, so be it. I'll keep my version of right: that the input matches the output on passthrough renders. If I change anything in that pipeline, they don't match.

[Walter Soyka] "For example, if you took a file to another application for FX work, you'd have to make sure that none of them correctly transform from 601 to 709 or else you'll have an inconsistency. "

I would disagree. This is exactly what I do (don't touch it) and everything comes out looking like it went in, unless of course I don't want it to. Again, maybe the matching colors is wrong. It's easy and clients don't complain, so I guess I'm OK with it.



Oliver Peters
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 6:56:43 pm

My apologies, as I've been glazing over most of this discussion ;-) , but here are my 2 cents...

It's important to understand that color space specs are based on DISPLAY profiles. Rec 601 or 709 are not based on camera specs, but rather how to define the way an image looks on a TV or monitor. This information is part of the definition of a file or a codec's specs. In the case of Apple, QuickTime (and other converters using the QT kit) make certain assumptions about the correct profile when a file is imported. Apple relies on the manufacturer to create files that adhere to the spec they expect to see.

If you import a file into FCP X that uses a professional digital video codec (like uncompressed or ProRes) it will scale the values to correspond to Rec 709 color space. If you import a TIFF, FCP X knows the file has RGB ("full swing") values, but it will scale the levels into Rec 709 space ("studio swing") since it's being edited in an environment based on digital video standards. OTOH, the GUI display is actually being shown with an extended luma range designed for the RGB space of the computer display. If you import a ProRes file into FCP X, edit it to a project timeline and export it again as ProRes, I'm not sure this is the same "transparent" (media-copy-only) process that it used to be in FCP "legacy".
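
Oliver's full-swing/studio-swing point can be sketched numerically. This is a rough sketch of the standard 8-bit 16-235 level mapping only (ignoring chroma and any codec- or application-specific behavior, which, as he notes, varies):

```python
# Rough sketch of 8-bit level scaling between RGB "full swing" (0-255)
# and video "studio swing" (Y' in 16-235). Ignores chroma ranges and any
# codec- or application-specific behavior; illustration only.

def full_to_studio(v):
    """Map a full-range 0-255 code value into the 16-235 video range."""
    return round(16 + v * 219 / 255)

def studio_to_full(v):
    """Map a studio-range 16-235 code value back to 0-255."""
    return round((v - 16) * 255 / 219)

print(full_to_studio(0))    # 16  (RGB black -> video black)
print(full_to_studio(255))  # 235 (RGB white -> video white)
print(studio_to_full(126))  # 128
```

Note the asymmetry Oliver describes: an imported TIFF's 0-255 values get squeezed into 16-235 for the video pipeline, while the GUI viewer stretches levels back out for the computer display, so what you see on the desktop is not necessarily what the scopes (or a broadcast monitor) show.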

As an aside, I have been testing the display levels of various players using a series of 8-bit greyscale files as QT movies (uncompressed, ProRes, DNxHD, AVC-Intra). Absolutely no player and no compressed codec displays the correct gamma value. Only uncompressed is properly displayed in the players. IMHO Apple has chosen to alter the "proper" gamma for something that they feel looks better on their displays. Yet, by eye and by the scopes, all of these files appear correct in FCP X.

The application of camera LUTs (like an ARRI LogC-to-709 filter) is a mathematically correct conversion to change the LogC profile back into an appearance that matches the Rec 709 appearance of the image from the camera at the time of that recording. If this is an image that is going to be heavily graded anyway, the LUT may or may not be all that important. You can get back to that image (or reasonably close) with standard grading and no LUT. Odds are you are going to change the grading anyway to look different than the 709 appearance on set or in studio, so it probably doesn't matter whether you are using a LUT or not - other than ease of use.

- Oliver

Oliver Peters Post Production Services, LLC
Orlando, FL
http://www.oliverpeters.com



Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 7:11:22 pm

[Oliver Peters] " other than ease of use."

During the rough cut stage.

I typically strip a LUT for any final grading, but I certainly use a LUT for editorial and client reviews so I don't have to "regrade" the piece for every review.

Plus, in the case of Alexa, the Log-C to 709 LUT is what is typically used on set, so the client is familiar with it.



Oliver Peters
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 7:18:50 pm

[Jeremy Garchow] "I typically strip a LUT for any final grading, but I certainly use a LUT for editorial and client reviews so I don't have to "regrade" the piece for every review."

If my grading goes out-of-house, then I send the colorist the QuickTimes, so no LUTs baked anyway. I've found that for F3 S-log and C300 Canon log, a basic grade in FCP X's color board is more than adequate for both a rough cut grade and a final grade.

For ALEXA, I apply the Pomfort filter and the color board on top for a final grade. Pomfort has some built-in grading controls, too, so a tweak of both gets you a great image.

If I'm working in FCP7 with ALEXA, I add the Nick Shaw LUT, export baked "proxy edit" PRLT files. Offline with those and then access the PR4444 "clean" files for the final grade.

- Oliver


Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 8:34:06 pm

[Oliver Peters] "If I'm working in FCP7 with ALEXA, I add the Nick Shaw LUT, export baked "proxy edit" PRLT files. Offline with those and then access the PR4444 "clean" files for the final grade."

I happen to use the GlueTools plug-in, which works more or less in real time, even on my crappy lappy.

In X, I use Pomfort as well.

I also find grading out of Log in X to be superior, quality-wise, to FCP7. FCP7 turns to noise much more quickly. In 7, I still go to Color, so the LUT is for preview and review purposes only.



Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 7:12:50 pm

[Oliver Peters] "My apologies, as I've been glazing over most of this discussion ;-) "

Oh, and there's certainly no reason to apologize for this.



Rafael Amador
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 3:23:27 pm

[Walter Soyka] "If the Canon camera uses 601, then interpreting it as if it were 709 (in other words, failing to transform it to 709 for HD editorial) will cause the color to look different. That's Rafa's point -- if you're assuming that the Canon is working in 709 when it is not, the color is wrong.

But yes, HD cameras should use 709, and it's puzzling that the 5D apparently uses 601."


This is from the 5DtoRGB manual:

"Decoding Matrix: Choose the decoding matrix you want to use here. Normally, you would match the matrix that was used to encode the video to YUV (technically YCbCr). Generally speaking, BT.709 is for HD material and BT.601 is for SD. Canon HDSLRs use the BT.601 matrix, which is selected by default".

rafael


Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 3:56:51 pm

[Rafael Amador] ""Decoding Matrix: Choose the decoding matrix you want to use here. Normally, you would match the matrix that was used to encode the video to YUV (technically YCbCr). Generally speaking, BT.709 is for HD material and BT.601 is for SD. Canon HDSLRs use the BT.601 matrix, which is selected by default". "

See, even 5DtoRGB keeps the 601 matrix.

Jeremy



Walter Soyka
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 4:14:19 pm

[Jeremy Garchow] "See, even 5DtoRGB keeps the 601 matrix."

No, and that's Rafael's point. 5DtoRGB assumes 601 instead of 709 when decoding the file. If another app assumes 709 for all HD sources, it decodes these files incorrectly.


Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 6:05:36 pm

OK. I think we are getting lost on what "correct" and "wrong" is, or at least I am.

Recording the HD footage as 601 is "wrong", that's the problem. From there, it's up to you to make the footage look how you want it to look.

If transforming the DSLR footage seems like it's the "correct" thing to do, then by all means go for it.

You won't be able to do this in FCPX, anyway.



Walter Soyka
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 6:11:56 pm

[Jeremy Garchow] "OK. I think we are getting lost on what "correct" and "wrong" is, or at least I am. Recording the HD footage as 601 is "wrong", that's the problem. From there, it's up to you to make the footage look how you want it to look. If transforming the DSLR footage seems like it's the "correct" thing to do, then by all means go for it."

I'm with you.

Recording HD as 601 is wrong -- but interpreting 601-encoded HD as if it were 709-encoded HD is also wrong, and as my mother would remind me, two wrongs don't make a right.

But here's where your point about practicality comes in: if everyone does the second part wrong and you don't, then even though you're right, you're wrong (because now you're the inconsistent one). Sorry, mom.

I'm not in front of an FCPX machine now -- does the color space override do anything for stills?


Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 6:23:06 pm

[Walter Soyka] "I'm not in front of an FCPX machine now -- does the color space override do anything for stills?"

Yes, that was how this got started. Stills are the ONLY place the override works (at least that I've come across so far, and why I said that FCPX is hinting at other possibilities).

When using it, it transforms the color, but only in FCPX and not on the actual file (which goes back to my point of leaving things alone). On the sample BMD footage, Adobe RGB 1998 seems to be what was assigned. You change it to 709 and the look changes.

None of those decisions are "right" or "wrong". :)

When I render out a ProRes movie, the look I choose is baked into the file, which is now going to be assumed to be in 709 space. So when that happens, the incoming picture will look the same as the outgoing picture.

Jeremy



Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 7:19:46 pm

[Walter Soyka] "but interpreting 601-encoded HD as if it were 709-encoded HD is also wrong, "

Walter, this is kind of what I have been saying.

You DON'T interpret (or transform), you simply leave it alone.

Earlier, you and Rafa seemed to be saying you need to interpret the 601 colors to 709. I don't see why one would do this. You simply take the colors that were shot in 601 and carry on. I don't see the point of interpreting the 601 recorded material to 709, it does nothing but change the original colors to something different.

It will be assumed it is 709 and no colors will change. If you actually transform to 709, you will change the colors, and most likely you will only change them in that application unless you "Bake" the profile in to a new QT movie. This could cause some confusion if you would ever need to reconnect to the original unbaked movies for whatever reason, or you do another transcode and forget to transform the colors from the original.

Again, I'm losing track of right and wrong.



Rafael Amador
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 4:15:15 pm

[Jeremy Garchow] "See, even 5DtoRGB keeps the 601 matrix."
No.
5DtoRGB converts H.264 material to ProRes. It works in 32-bit float RGB (Y'CbCr > RGB > Y'CbCr), and it lets you select a matrix (601, 709 or even custom) according to the imported footage.
So with the Canon material it is doing: H264 (601) > RGB > ProRes (709).
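
That pipeline can be sketched with the standard (Kr, Kb) constants. This is a simplified model of the idea, not 5DtoRGB's actual code: normalized values (Y' in 0-1, Cb/Cr centered on 0), no studio-swing quantization, no chroma subsampling:

```python
# Sketch of the 5DtoRGB-style pipeline: decode Y'CbCr to RGB with the
# 601 constants, then re-encode to Y'CbCr with the 709 constants.
# Simplified model only: normalized values, no studio swing, no subsampling.

REC601 = (0.299, 0.114)     # (Kr, Kb)
REC709 = (0.2126, 0.0722)

def encode(rgb, k):
    """R'G'B' -> Y'CbCr using the given (Kr, Kb) constants."""
    kr, kb = k
    r, g, b = rgb
    y = kr * r + (1 - kr - kb) * g + kb * b
    return (y, (b - y) / (2 * (1 - kb)), (r - y) / (2 * (1 - kr)))

def decode(ycbcr, k):
    """Y'CbCr -> R'G'B' using the given (Kr, Kb) constants."""
    kr, kb = k
    y, cb, cr = ycbcr
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / (1 - kr - kb)
    return (r, g, b)

# Camera encoded with 601; decode with 601, re-encode with 709:
shot = encode((0.2, 0.6, 0.3), REC601)   # what the camera wrote
rgb = decode(shot, REC601)               # matched (correct) decode
rewrapped = encode(rgb, REC709)          # standard 709 ProRes out
print(decode(rewrapped, REC709))         # approximately (0.2, 0.6, 0.3)
```

Decoding the same data with the mismatched constants instead -- decode(shot, REC709) -- gives a visibly different RGB triple, which is the error Rafael is describing.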

The Canon is the only device that I know of that shoots HD as 601.
I shoot with a GH2, and that shoots Rec-709. If I left the default Rec-601 matrix (the application was first designed for the Canon), I would get wrong colors in the ProRes files.
rafael


Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 7:07:54 pm

[Rafael Amador] "I shoot with a GH2, and that shoots Rec-709. If I left the default Rec-601 matrix (the application was first designed for the Canon), I would get wrong colors in the ProRes files."

That's because you started with 709 in the first place and you are physically changing them to 601.

With the Canon workflow you start with 601.

You seem to think you need to transform the colors, I'm saying you don't.

When you bring it into an NLE, those colors aren't going to change, although you and Walter seem to think they will.

To each their own workflow, I guess.



Rafael Amador
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 7:41:52 pm

[Jeremy Garchow] "[Rafael Amador] "I shot with a GH2, and that shots Rec-709. If I'd leave the default Rec-601 matrix (the application was first designed for Canon), I will be getting wrong colors on the Prores files."

That's because you started with 709 in the first place and you are physically changing them to 601.

With the Canon workflow you start with 601.

You seem to think you need to transform the colors, I'm saying you don't.

When you bring it in to an NLE, those colors aren't going to change, although you and Walter seem to think they will.

To each their own workflow, I guess."

No, Jeremy.
It's about the two formulas I wrote above.
With the GH2 you get:
- In camera: RGB > YCbCr, applying formula 2 (Rec-709)
- In 5DtoRGB: YCbCr > RGB applying formula 2, then RGB > YCbCr applying formula 2.

With the Canon you get:
- In camera: RGB > YCbCr, applying formula 1 (Rec-601)
- In 5DtoRGB: YCbCr > RGB applying formula 1, then RGB > YCbCr applying formula 2.
Both processes are correct.
The oddity of the Canon being 601 is corrected when you apply the 601 matrix in 5DtoRGB.


[Jeremy Garchow] "Recording the HD footage as 601 is "wrong", that's the problem. "
I wouldn't say wrong, Jeremy. The point is that the application that converts it to RGB has to understand that, although it is HD, it is not Rec-709.
If you import the Canon stuff to FCPX and FCPX is aware that it is Rec-601, when you export the movie as ProRes, the ProRes file will have exactly the same colors as the original H.264.
But not only with ProRes: if you export as a TIFF sequence (with whatever color profile you want), the colors won't match the original.
The error is in the first YUV-to-RGB conversion.

It's similar to the issue that we've had for years with FC and interlaced material or alpha channels. FC by default sets any Animation-codec file as "upper first" even when it is lower, and takes any alpha channel as straight, even when it is premultiplied.
It's all about "interpreting footage".

If FCPX knows the proper color profile of the file you import, it will do the correct YUV > RGB conversion.
This is what the "Color Profile override" in FCPX is for, no?
That doesn't change the color space of the whole project, but of individual clips or stills, doesn't it?
I can't imagine FCPX (which offers very little customization on export) exporting a non-standard HD (Rec-601) file.
So my belief is that the "Override Color Space" option is there to fix those issues on import.
With proper color management and working in 32-bit, there shouldn't be any problem with that Canon stuff.
rafael


Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 10, 2012 at 8:11:30 pm

[Rafael Amador] "No, Jeremy.
It's about the two formulas I wrote above.
With the GH2 you get:
- In camera: RGB > YCbCr, applying formula 2 (Rec-709)
- In 5DtoRGB: YCbCr > RGB applying formula 2, then RGB > YCbCr applying formula 2.

With the Canon you get:
- In camera: RGB > YCbCr, applying formula 1 (Rec-601)
- In 5DtoRGB: YCbCr > RGB applying formula 1, then RGB > YCbCr applying formula 2.
Both processes are correct.
The oddity of the Canon being 601 is corrected when you apply the 601 matrix in 5DtoRGB."


I'm sorry, Raf. I'm not smart enough. You are losing me.

[Rafael Amador] "when you export the movie as ProRes, the ProRes file will have exactly the same colors as the original H.264."

Yes. You don't want this?

[Rafael Amador] "But not only with ProRes: if you export as a TIFF sequence (with whatever color profile you want), the colors won't match the original."

If you export with NO PROFILE, the colors will match. There is no reason to color profile this material, unless of course you want to for whatever reason.

[Rafael Amador] "It's similar to the issue that we've had for years with FC and interlaced material or alpha channels. FC by default sets any Animation-codec file as "upper first" even when it is lower, and takes any alpha channel as straight, even when it is premultiplied.
It's all about "interpreting footage"."


No. It's not the same.

Let me ask you this, when you add true 601 material (SD) to an HD timeline, do you mean to tell me you actually color profile the 601 material to 709? Why or why not?

If a file is CREATED as upper field first, and the NLE says it's lower field first (transform), and you are playing back through an upper-field pipeline (as through a capture card to a video monitor), that is an incorrect interpretation and your file will not play back properly.

If a file is created at 601 and played back in a 709 space (as through a capture card to a video monitor), there will be no shift in color, as the file is assumed 709. It does NOT transform the 601 to 709. There is no transforming that needs to happen.

The Canon files are not simply mislabeled. Misinterpreting a field order or alpha type is a totally different scenario.

[Rafael Amador] "This is what the "Color Profile override" in FCPX is for, No?"

I will say this again, the color profile override only works with stills. It does not work with video.

[Rafael Amador] "That doesn't change the Color Space of the whole project, but individual clips or stills. isn't it?"

Yes, just the stills that you choose to change.

[Rafael Amador] "With proper color management and working in 32-bit, there shouldn't be any problem with that Canon stuff."

Oy. The "proper color management" would have been to have shot it in 709 to begin with. After that, anything you do will be diverging from the original file. You of course can change it if you think that is the best way to go about it, but that is not the "right" way. You are simply transforming the colors for the sake of it, not to do anything "correctly" or match a color or display.

I think you will potentially cause way more trouble than is necessary.

Besides, I thought the older Canon stuff was RGB anyway, not YCrCb. It can't conform to 709 as the pixel count or frame rates weren't up to par, or something of that nature, but I don't really know.

Jeremy



Walter Soyka
Re: A couple of questions on Color Spaces
on Sep 11, 2012 at 4:21:53 am

[Jeremy Garchow] "The "proper color management" would have been to have shot it in 709 to begin with. After that, anything you do will be diverging from the original file."

I do agree that the 5D MkII should have used Rec. 709, but we don't have much control over that.

We need to define color for the purpose of this conversation. A color is not a specific RGB value; it's what's represented by a specific RGB value in a specific color space.

Color management is all about accuracy and consistency. It works by identifying the relationships between actual color and specific RGB values in various color spaces, and then doing math behind the scenes to preserve that actual color by changing RGB values as necessary when transforming from one space to another.

Let's use a real example -- the Creative COW orange (as observed on Bessie's snout above): RGB [255,156,0] in both sRGB and Rec. 709. That same orange is RGB [255,155,4] in Adobe's SDTV NTSC profile, [232,155,36] in Adobe RGB, [240,157,68] in DCI P3 Neutral at D55, and [255,152,41] in my MacBook Pro's custom display profile.

To accurately see the color that sRGB represents as [255,156,0], my graphics card has to send [255,152,41] to my monitor. Same orange, but different RGB values, depending on the color space.

When we talk about interpretation, we mean specifying the profile with which the colors in the asset are encoded so we know what actual colors the RGB numbers are supposed to represent.

When we talk about transforming a color from one space to another, we actually (and perhaps confusingly) mean keeping the perceived color the same. The transformation is mathematical, not perceptual, changing the RGB values as necessary to keep the displayed color the same (see above for how one orange has different RGB values in different color spaces).

So if we interpret files with the wrong color profile, or if we don't manage at all going from one profile to another, then by preserving the original file's RGB data, we are in fact diverging from the original file's intended actual colors.

In the real world, the 601/709 difference is very small -- probably imperceptibly small in all but the most saturated colors. Reading a little more on the Canon DSLR stuff, the huge difference when using 5DtoRGB is more likely due to the fact that Canon's wacky H.264 files are encoded full range, not video range.
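That "small in all but the most saturated colors" claim is easy to sanity-check from the published luma coefficients alone. Here is a rough Python sketch (normalized 0-1 values, no range scaling, math only -- not any application's actual pipeline) that encodes a color with the Rec. 709 matrix and decodes it with the Rec. 601 matrix, the exact mismatch being debated:

```python
# Luma coefficients from the two standards.
KR709, KB709 = 0.2126, 0.0722   # Rec. 709
KR601, KB601 = 0.299, 0.114     # Rec. 601

def rgb_to_ycbcr(r, g, b, kr, kb):
    """R'G'B' (0-1) -> Y'CbCr using the given luma coefficients."""
    y = kr * r + (1 - kr - kb) * g + kb * b
    cb = 0.5 * (b - y) / (1 - kb)
    cr = 0.5 * (r - y) / (1 - kr)
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    """Inverse of the above with the given coefficients."""
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / (1 - kr - kb)
    return r, g, b

red = (1.0, 0.0, 0.0)

# Matched matrices round-trip exactly (encode 709, decode 709).
same = ycbcr_to_rgb(*rgb_to_ycbcr(*red, KR709, KB709), KR709, KB709)

# Mismatched matrices shift the color (encode 709, decode 601).
shifted = ycbcr_to_rgb(*rgb_to_ycbcr(*red, KR709, KB709), KR601, KB601)
diff = max(abs(a - b) for a, b in zip(red, shifted))
print(shifted, diff)  # pure red shifts by roughly 10% in the worst channel
```

Pure primaries are the worst case; on typical footage the per-channel shift is far smaller, which lines up with the ~1% numbers reported later in this thread.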


[Jeremy Garchow] "Besides, I thought the older Canon stuff was RGB anyway, not YCrCb."

You can't do chroma subsampling with RGB-stored data, which is one of the best ways to reduce the data rate for visuals. Chroma sub-sampled H.264 such as Canon's is actually YCbCr, though most (all?) H.264 encoders expect RGB and transform to YCbCr as the first step of encoding and most (all?) decoders transform the YCbCr back to RGB.
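The subsampling point can be made concrete with a toy example. A sketch in plain Python (naive 2x2 box averaging -- real encoders use better filters) of 4:2:0: luma stays at full resolution while each chroma plane is averaged down by two in both dimensions, halving the raw sample count before entropy coding:

```python
def subsample_420(plane):
    """Average a chroma plane over 2x2 blocks (naive 4:2:0 sketch).

    `plane` is a list of rows of equal, even length."""
    out = []
    for y in range(0, len(plane), 2):
        row = []
        for x in range(0, len(plane[0]), 2):
            block = (plane[y][x] + plane[y][x + 1] +
                     plane[y + 1][x] + plane[y + 1][x + 1])
            row.append(block / 4.0)
        out.append(row)
    return out

w, h = 1920, 1080
full = 3 * w * h                       # 4:4:4: Y, Cb, Cr all full size
sub = w * h + 2 * (w // 2) * (h // 2)  # 4:2:0: full Y, quarter-size chroma
print(sub / full)  # -> 0.5: half the samples
```

This only works because Y'CbCr separates brightness from color; throwing away half of R, G, or B directly would damage the luma detail the eye is most sensitive to.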

Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog - What I'm thinking when my workstation's thinking
Creative Cow Forum Host: Live & Stage Events



Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 11, 2012 at 4:53:59 am

[Walter Soyka] "Let's use a real example -- the Creative COW orange (as observed on Bessie's snout above): RGB [255,156,0] in both sRGB and Rec. 709. That same orange is RGB [255,155,4] in Adobe's SDTV NTSC profile, [232,155,36] in Adobe RGB, [240,157,68] in DCI P3 Neutral at D55, and [255,152,41] in my MacBook Pro's custom display profile."

OK, but where did Bessie's orange start? Was it recorded as 601? ;)

Again, I'm not smart enough.

[Walter Soyka] "To accurately see the color that sRGB represents as [255,156,0], my graphics card has to send [255,152,41] to my monitor. Same orange, but different RGB values, depending on the color space."

This is my point. If you NEED to transform to a differing display/output, then color management makes sense.

In the case of Rafael's specific questions, there is no need to transform, unless of course you want to. If you want to, you will break the color management, though, as it will no longer be consistent. Isn't that the point, to keep orange the same color no matter where you go?

If you do transform, it will look different in Ae than it will in FCP or even Pr. Why would you ever want to do this in this specific example? I know when it is necessary, I just don't see it as necessary to Rafael's original questions.

And finally, do you profile SD material in an HD timeline? Why or why not?



Walter Soyka
Re: A couple of questions on Color Spaces
on Sep 11, 2012 at 5:42:23 am

[Jeremy Garchow] "OK, but where did Bessie's orange start? Was it recorded as 601? ;)"

That's a great question. In my example, I assumed sRGB (since PNGs can't be tagged).


[Jeremy Garchow] "In the case of Rafael's specific questions, there is no need to transform, unless of course you want to."

Right. And if Rafael wants the colors as intended by the camera, he must transform. However, having read more about the camera, I don't think that just using a 601 profile will be sufficient (see above and below on range).


[Jeremy Garchow] "If you want to, you will blow the color management though as it will now not be consistent. Isn't that the point, to keep orange the same color no matter where you go?"

There's consistency, and there's accuracy. You can be consistent without being accurate if you are consistently wrong.

I agree with you that having the same wrong orange everywhere is way better than having a right orange in one place and wrong oranges elsewhere. One bad orange does spoil the whole bunch.

Being both consistent and accurate is not a bad goal, and transforming the funky-color files properly to Rec. 709 may allow you to be both consistent and accurate.


[Jeremy Garchow] "If you do transform, it will look differently in Ae, than it will in FCP or even Pr. Why would you ever want to do this in this specific example? I know when it is necessary, I just don't see it necessary to Rafael's original questions."

That's because Ae does it right and FCP7 and Pr do it wrong. If you "burn in" the transform to 709 when you transcode, and everything else you have is 709, then you can just use an unmanaged workflow thereafter.

Check out some 5DtoRGB demos. The difference between using 5DtoRGB to decode the H.264 and QuickTime to decode it is staggering. This is hugely larger than the difference I would expect to see between straight up Rec. 601 and Rec. 709. As I mentioned before, I think the full range vs video range issue in decode is so much more significant than the 601/709 issue that it's not even funny. That said, I am not an expert on the 5D MkII color science by any means, and I don't know what's going on under the hood aside from what I'm reading tonight on the internet.
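The range issue is also easy to show numerically. A sketch (8-bit luma only, ignoring the chroma offsets; illustrative, not QuickTime's actual code path) of what happens when full-range data is run through a video-range (16-235) interpretation:

```python
def video_range_decode(code):
    """Interpret an 8-bit luma code as video range (16-235) and clip --
    what a video-range decode path does to full-range data."""
    return min(max((code - 16) / 219.0, 0.0), 1.0)

def full_range_decode(code):
    """Interpret the same 8-bit code as full range (0-255)."""
    return code / 255.0

# Full-range H.264 decoded as video range: everything below code 16
# crushes to black, everything above 235 clips to white, and the rest
# gets its contrast stretched.
for code in (8, 16, 128, 235, 250):
    print(code, full_range_decode(code), video_range_decode(code))
# e.g. code 8 -> 0.031 as full range, but 0.0 (crushed) as video range
```

That contrast stretch and shadow crush is a far bigger visual hit than the subtle hue shifts from a 601/709 matrix swap, which fits what the 5DtoRGB demos show.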


[Jeremy Garchow] "And finally, do you profile SD material in an HD timeline? Why or why not?"

No, because I am lucky enough to not have to deal with SD in HD timelines.

Going back to the 2011 SuperMeet, I was really stoked about getting color management plus 32b FP linear processing in FCP Awesome, because color management and compositing in NLEs was still so painful. Of course, if your NLE makes color management easy -- why not do it right?

For compositing, 3D, and mograph, I do usually work in a color-managed pipeline (often linear). I want color to be as consistent as possible across a project, no matter what platform, machine, or software I'm working with.

Sadly, all bets are off for final display outside our little bubbles.




Rafael Amador
Re: A couple of questions on Color Spaces
on Sep 11, 2012 at 9:38:51 am

[Walter Soyka] "[Jeremy Garchow] "Besides, I thought the older Canon stuff was RGB anyway, not YCrCb."

You can't do chroma subsampling with RGB-stored data, which is one of the best ways to reduce the data rate for visuals. Chroma sub-sampled H.264 such as Canon's is actually YCbCr, though most (all?) H.264 encoders expect RGB and transform to YCbCr as the first step of encoding and most (all?) decoders transform the YCbCr back to RGB."

Right, that is why when you try to edit H264 (YCbCr) in FCP and you conform the sequence to the footage, the sequence gets set to "Render in 8-bit RGB".

And if FCP treats H264 as RGB, we can have issues with whites/superwhites -- video range vs. full range, as Walter points out here:

[Walter Soyka] "Check out some 5DtoRGB demos. The difference between using 5DtoRGB to decode the H.264 and QuickTime to decode it is staggering. This is hugely larger than the difference I would expect to see between straight up Rec. 601 and Rec. 709. As I mentioned before, I think the full range vs video range issue in decode is so much more significant than the 601/709 issue "
rafael




Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 11, 2012 at 2:40:28 pm

[Rafael Amador] "And if FCP treat H264 as RGB we can be having issues of Whites/SuperWhites. Video range vs Full Range as Walter points here:"

I was wrong.

I think that the color space was sRGB, which makes it close to 709 in chromaticity...?

Overall, it is still closer to 601 due to pixel count or something? I have to find that article.

I've done tests with 5dtorgb with every single combination.

When compared to logged and transferred footage, it either looks very much the same (using a 709 matrix causes a very slight shift in the red range) or it ends up much darker, as if the gamma is off.

I have an older version of 5dtoRGB and the results are even worse, and the controls are very different.

I want to believe 5dtoRGB but I just can't find how it's doing anything different than the log and transfer plugin does already, or what Pr does natively, smoothing actions aside.

Adobe actually changed the code for mkIII clips to reflect the new 709 space now used by Canon.

I believe my eyes guys. If you can physically show me how transforming these clips makes a difference, I'll hear you out.

For now, I can place that movie in any software that I use without touching it, and it looks the same. Color managing it in Ae would only serve to change it for no reason, just as a different setting in 5dtoRGB will bake in a look I don't want. I know what you guys are saying, ideally you'd want everything to be "right" from the start, so do I.

DSLR footage is technically not right from the start, but in my practice and experience transforming it does nothing but sour the pipeline as you toss consistency out. The damage has been done.

I never use QuickTime Player to transcode anything, so there's that caveat.



Rafael Amador
Re: A couple of questions on Color Spaces
on Sep 11, 2012 at 3:20:56 pm

[Jeremy Garchow] "I've done tests with 5dtorgb with every single combination.

When compared to logged and transferred footage, it either looks very much the same (using a 709 matrix causes a very slight shift in the red range) or it ends up much darker, as if the gamma is off."

If the picture gets darker when you set "Video Range", that's normal.

[Jeremy Garchow] "I want to believe 5dtoRGB but I just can't find how it's doing anything different than the log and transfer plugin does already, or what Pr does natively, smoothing actions aside. "
For myself, the most interesting function of 5DtoRGB is the chroma re-sampling.
As you say, "the damage is done", but a wise chroma interpolation can help to mitigate the damage.
Video "plastic surgery" is part of our job.
rafael




Walter Soyka
Re: A couple of questions on Color Spaces
on Sep 11, 2012 at 3:29:38 pm

[Jeremy Garchow] "I believe my eyes guys. If you can physically show me how transforming these clips makes a difference, I'll hear you out."

I believe the math, and the math says it makes a difference.

However (and this is a very big however), that difference may be small enough to not be worth the trouble in all but the most color-critical applications.

Anyone got some good footage (any format) they can share with a single frame or small series of frames that shows a fairly broad gamut? I could do a test later to illustrate the difference and I don't want to use a synthetic image here.




Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 11, 2012 at 3:32:06 pm

I was going to offer some. I can get it to you later.



Walter Soyka
Re: A couple of questions on Color Spaces
on Sep 11, 2012 at 3:41:58 pm

[Jeremy Garchow] "I was going to offer some. I can get it to you later."

Cool, I'll keep an eye out for it. Thanks.




Walter Soyka
Re: A couple of questions on Color Spaces
on Sep 14, 2012 at 6:02:15 pm

I just wanted to follow up about the test footage from Jeremy.

As discussed above, the differences between interpreting his footage in 601 and 709, all other things being equal, are minimal: never more than 1% difference per channel. Boosting the saturation on this footage a bit pre-comparison pushes the difference all the way up to 2%. It is mathematically noticeable, but visually insignificant. In extreme examples, it would be possible to see some saturation clipping, but I didn't observe any of that in Jeremy's real-world test footage.

For this specific camera, I'd say the penalty for pretending the footage is Rec. 709 is negligible.

I didn't test 5DtoRGB itself, but rather compared the same footage, interpreted in the two different color spaces, in Ae. This suggests that the big difference between files from 5DtoRGB and other decoders is not simply applying the correct color matrices, but rather dealing with the full swing (or whatever other voodoo is going on within the black box that is QuickTime). If this kind of footage is important to you, I'd suggest you verify that your application is appropriately dealing with the range.

We went off on a few tangents on color geekery and I think we lost focus on the main point: color management is a useful feature that belongs in an NLE. It doesn't make a practical difference in this specific case, but you'd see much more noticeable differences with non-video space sources like stills or more advanced cameras with different color sciences.

Cheers,




Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 12, 2012 at 1:26:02 pm

Here is that article I mentioned.

http://colorbyjorg.wordpress.com/2011/01/14/canon-dslr-video-uses-bt-601-sd...

Rafa, I saw your new post above.

I appreciate you taking the time to put that together, but I understand the fundamental differences in the specs.

What is confusing to me is what you and I might think is "correct" or "wrong".

To me, correct is consistent. As long as the footage remains consistent throughout, I can manage it. Having to constantly change the "interpretation" on every single application doesn't make sense to me in regards to broadcast video. Even 5DtoRGB is consistent with every other application unless of course you like to change it for the worse! :)

All the applications I have used so far know how to handle older Canon footage to maintain consistency. My argument stands that you don't have to change anything.

Jeremy



Rafael Amador
Re: A couple of questions on Color Spaces
on Sep 12, 2012 at 5:20:46 pm

Hi Jeremy,
Thanks for the article.
For me it adds more darkness.

He says:
"So, why do Canon DSLR cameras use the BT.601 recommendation for SD video, instead of the BT.701 recommendation for HD video, while using the latter’s color space?! The answer is simple… The Canon DSLR video can not comply with the BT.701 recommendation in regard to frame rates and maybe even pixel count".

What I understand from the article is that the Canon doesn't follow either standard, or at least uses a kind of "hybrid" 601/709 standard, because what is clear is that Rec-601 is formulated only for PAL and NTSC.
On the other hand, the Canon files (at least its H264) are fully HD in terms of size/pixels and time base, unless the author is talking about the true 30fps that the Canon was producing when he wrote the article (Jan 14, 2011).

[Jeremy Garchow] "What is confusing to me is what you and I might think is "correct" or "wrong". "
Color spaces are languages to describe colors, so it doesn't matter which language you use or which path your "message" follows, as long as the final receiver understands it.
If the final receiver understands the message, for me the process is "correct", even if it hasn't followed the expected path.

[Jeremy Garchow] "Even 5DtoRGB is consistent with every other application unless of course you like to change it for the worse! "
What's the point of having an SD matrix as the default? Who shoots SD with a Canon?

What is clear is that there is something "odd" with the Canon footage, otherwise 5DtoRGB wouldn't need two different matrix options.
Unless the developer is giving us some bullshit.
rafa




Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 12, 2012 at 6:19:05 pm

[Rafael Amador] "Color spaces are languages to describe colors, so it doesn't matter which language you use or which path your "message" follows, as long as the final receiver understands it.
If the final receiver understands the message, for me the process is "correct", even if it hasn't followed the expected path."


Brother, this is exactly what I have been saying all along. There is simply no reason to transform this footage.

Do we now agree?

[Rafael Amador] "What is clear is that there is something "odd" with the Canon footage, other wise 5DtoRGB wouldn't need two different matrix options. "

They give you the option to change, but it uses 601 by default. Once rendered to a ProRes movie, THAT movie will be assumed 709, and every single application will also assume 709. The colors themselves have not changed from the original, even though they now have a 709 assumption.

[Rafael Amador] "Unless the developer is giving us some bullshit."

He's not bullshitting, but if you look at the 5DtoRGB version that's on the AppStore and compare it to the output of earlier builds, there is a lot that has changed. You can tell he learned along the way as well. And if the chroma smoothing is important to you, 5DtoRGB presents an advantage there.

I just did another test.

If you import original h264 DSLR footage into FCPX, the color profile is listed as 1-1-6, which is 601. (I was wrong, earlier.)

If you then use FCPX to transcode the movie to ProRes (Make optimized media) and reimport that clip, it has a color profile of 1-1-1, which is 709. The colors of each of them are identical.

This is further proof that FCPX (and others) are handling this correctly and there's no user action that needs to happen. There is a 601 to 709 path, and the colors remain the same. I had thought that everything was just assuming 709, but really the initial footage is handled at 601 in the applications I have tested, and then rendered out to a file that will be assumed 709, so there's no need to specifically transform to 709.
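For anyone wondering what those "1-1-6" and "1-1-1" numbers are: they're the QuickTime 'nclc' color-tag triple (primaries-transfer-matrix). A small lookup sketch, covering only the handful of codes relevant to this thread (numbering per the QuickTime spec / ISO/IEC 23001-8), shows why 1-1-6 reads as "709 primaries and transfer, but a 601 matrix" -- exactly the hybrid described above:

```python
# Partial code tables from the QuickTime 'nclc' color parameter atom
# (same numbering as ISO/IEC 23001-8); only the entries relevant here.
PRIMARIES = {1: "BT.709", 5: "BT.601 625 (PAL)", 6: "BT.601 525 (SMPTE 170M)"}
TRANSFER = {1: "BT.709", 7: "SMPTE 240M"}
MATRIX = {1: "BT.709", 6: "BT.601 (SMPTE 170M)"}

def describe_nclc(tag):
    """'1-1-6' -> human-readable primaries/transfer/matrix."""
    p, t, m = (int(x) for x in tag.split("-"))
    return (f"primaries={PRIMARIES.get(p, p)}, "
            f"transfer={TRANSFER.get(t, t)}, matrix={MATRIX.get(m, m)}")

print(describe_nclc("1-1-6"))  # the Canon H.264 as imported
print(describe_nclc("1-1-1"))  # the optimized ProRes: full BT.709
```

So the camera files carry a 601 matrix tag while the optimized ProRes carries plain 709, which matches the observation that the colors stay identical through the transcode.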

And we all lived happily ever after.



Rafael Amador
Re: A couple of questions on Color Spaces
on Sep 13, 2012 at 3:39:04 am

[Jeremy Garchow] "If you import original h264 DSLR footage into FCPX, the color profile is listed as 1-1-6, which is 601. (I was wrong, earlier.)

If you then use FCPX to transcode the movie to ProRes (Make optimized media) and reimport that clip, it has a color profile of 1-1-1, which is 709. The colors of each of them are identical."

Hallelujah!!
That's all we needed to know to see that FCPX is managing the Canon stuff properly.
But yours is the first news on that.
That means that the H264 files are properly tagged, and FCPX is able to know the color of the Canon H264 (even if it was not the expected Rec-709) and apply the correct matrix (601) to go back to RGB.

Then both options in 5DtoRGB make sense. It seems that 5DtoRGB cannot tell the color specs of the footage, so you must manually set the matrix you need for decoding.

My point was that the ProRes from FCPX or 5DtoRGB being Rec-709 means little if those applications haven't used Rec-601 when importing the Canon stuff. Yes, they will be Rec-709, but the colors won't match the originals from camera. Whenever you convert RGB to YUV with whatever matrix, you need to apply the same matrix to go back to RGB if you want to keep the same colors.


[Jeremy Garchow] "They give you the option to change, but it uses 601 by default. Once rendered to a ProRes movie, THAT movie will be assumed 709, and every single application will also assume 709. The colors themselves have not changed from the original, even though they now have a 709 assumption."

When you say "they give you the option to change":
it's not about options, it's about applying the correct matrix.
If you use Rec-709 with Canon footage, you are changing the colors of your clips.

If I use the default 5DtoRGB matrix with my GH2 footage, the resulting ProRes will indeed be Rec-709, but the colors won't match the original H264 files.

What now remains fully inconsistent is the article you linked.

rafael




Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 13, 2012 at 3:41:35 am

[Rafael Amador] "If I use the default 5DtoRGB matrix with my GH2 footage, the resulting ProRes will indeed be Rec-709, but the colors won't match the original H264 files."

What do the GH2 settings default to?



Rafael Amador
Re: A couple of questions on Color Spaces
on Sep 13, 2012 at 5:11:30 am

With the GH2 you must choose the Rec-709 matrix.
You have to do it manually.
If you want, I can send you a short clip so you can check how it shows up in FCPX (601 or 709).
rafa




Jeremy Garchow
Re: A couple of questions on Color Spaces
on Sep 13, 2012 at 5:12:18 am

[Rafael Amador] "If you want, I can send you a short clip so you can check how it shows up in FCPX (601 or 709)."

Yes, that would be great.



© 2017 CreativeCOW.net All Rights Reserved