Why not use DisplayPort 1.2 directly for DCI monitoring?
Hi guys !
I'm building a computer (hackintosh) with a GTX 680 graphics card for use with Premiere CS6 and Resolve.
I will have Thunderbolt, HDMI, and DisplayPort 1.2, and I plan to get a grading monitor with a DCI-P3 gamut, like a DreamColor or an Eizo CG246W.
The problem is I absolutely don't understand why I have to use an output card (DeckLink or AJA) plus a converter box like an HDLink Pro just to feed the monitor's DisplayPort input!
The native DisplayPort 1.2 output of the GTX 680 is 10-bit, so why use a special card to output a dual-link HD-SDI 4:4:4 signal and another box to convert it back to DisplayPort?!
And would a dual-link HD-SDI signal into a DCI monitor (like the FSI CM170, for example) be better, or would the quality be the same through that same monitor's DisplayPort input?
Thanks for your help.
I want to clarify that I already know the purpose of an output card like a DeckLink:
it's needed to output only the color space (Rec. 709, Rec. 601, DCI-P3...)
of the footage, without the operating system interfering with its own color environment.
But isn't it possible to just use a LUT, or some software trick, to pass only that exact and precise color space through a regular 10-bit DisplayPort, for example?
And without using more "useless" cards and hardware??
And about color grading more generally: since the beginning of my career I have seen so many new formats and color spaces that the fact that the DCI-P3 gamut is "only" an emulation of a xenon bulb doesn't make me confident...
soon we will use xvYCC, or Rec. 2020, or who knows what else...
and I'm not George Lucas... I don't want to fully regrade my features every 5 years!! ;)
So why not try to make a good and definitive color space and working environment?
If the only colors a human can see are in the CIE 1931 XYZ color space, why not just grade in that environment and only convert afterwards, like we do when encoding our master feature for DVD, the web, etc.?
Now, monitors with their huge xvYCC gamut (for example) can allow more than CIE 1931, if I understood well... so why not?
(I have heard about ACES, but I'm not really sure it is the same approach.)
[John Fletcher] "it's necessary to output only the colorspace"
That is not the only purpose of a professional I/O card. I would suggest giving some of the I/O card manufacturers a call so they can walk you through all of the benefits of a pro I/O card, but when we choose to use one, the color space consideration is usually the last, though still important, concern, as these cards do so much more than that...
[John Fletcher] "through a regular 10-bit display port, for example?"
To the best of my knowledge the Mac OS still does not support 10 bit display port output, it has always been limited to 8 bit in the past. Glad to be corrected if this has recently changed.
[John Fletcher] "soon we will use xvYCC, or Rec.2020, or I don't know what else..."
This I highly doubt and more importantly it ignores the entire premise of the DCI specification with respect to XYZ coding and signal paths.
[John Fletcher] "So, why not trying to make a good and definitive colorspace and working environment ?"
They have and this was the specific aim of DCI's use of XYZ color space and signal coding. They have created a standard that from a color space perspective is specifically geared towards ending this sort of color space obsolescence.
[John Fletcher] "Now, the monitor with their huge xvYCC (in example) can allow more than CIE1931 if I understood well... so, why not ?"
No, as you stated yourself a few lines earlier CIE1931 encompasses all colors visible to the human eye. Moreover keep in mind that when we are talking about three-primary RGB display devices the largest color space triangle you can carve out of CIE1931 is always going to be smaller than the entire CIE1931 color space. You will not find a display that can replicate all real world colors visible to the human eye.
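As a quick illustration of that last point (my own sketch, not from the white papers): you can check whether a chromaticity point lies inside a display's gamut triangle with a simple signed-area test. The DCI-P3 primary and white-point coordinates below are standard published values, and the ~500 nm spectral point is approximate.

```python
# Rough illustration: a highly saturated spectral color falls outside the
# DCI-P3 triangle on the CIE 1931 chromaticity diagram, so no three-primary
# display with these primaries can reproduce it.

def sign(p, a, b):
    """Signed area test: which side of edge a->b the point p lies on."""
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def inside_triangle(p, tri):
    """True if chromaticity point p lies inside (or on) triangle tri."""
    d1 = sign(p, tri[0], tri[1])
    d2 = sign(p, tri[1], tri[2])
    d3 = sign(p, tri[2], tri[0])
    has_neg = (d1 < 0) or (d2 < 0) or (d3 < 0)
    has_pos = (d1 > 0) or (d2 > 0) or (d3 > 0)
    return not (has_neg and has_pos)

# DCI-P3 primaries as (x, y) chromaticities
P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

spectral_cyan = (0.0082, 0.5384)   # approx. 500 nm point on the spectral locus
dci_white = (0.314, 0.351)         # DCI white point

print(inside_triangle(spectral_cyan, P3))  # False: visible, but outside P3
print(inside_triangle(dci_white, P3))      # True
```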
If you are looking for some good resources to help better illustrate these points I would suggest reading some of Omnitek's concise white papers on these topics, this one in particular would be a good starting point:
I have no affiliation with Omnitek other than being a customer, but I think their concise white papers can be more accessible as a quick read than digging through the full DCI spec. The full DCI spec is also well written in my opinion and a good read if you have the time.
FSI (Flanders Scientific, Inc.)
First, thanks a lot Bram for correcting all my mistakes.
And could you clarify one important point...
are CIE1931 and XYZ the same thing or not??
Or is DCI-P3 a sub-color-space inside the XYZ color space,
which is in the end a sub-color-space of the "ultimate" CIE1931?
CIE1931 --> XYZ --> DCI-P3 ???
And if the XYZ color space is the summit, what percentage of it is DCI-P3?
So would it be possible to have a "higher" DCI-P4 coming next?
Just a few more questions:
"To the best of my knowledge the Mac OS still does not support 10 bit display port output"
Just out of curiosity, when Mac or Windows can output 10-bit, does that mean it will be possible to use only DisplayPort for output?
I know the cards are important for lots of things, but in this specific case of monitoring a precise color space, do you think one day it could be done without all those detours?
But for now, with current monitors, will the signal be better going directly from a DeckLink card to an SDI input,
or to a DisplayPort input (or HDMI) through a DeckLink card and an HDLink converter?
"They have and this was the specific aim of DCI's use of XYZ color space and signal coding."
That's great to know that obsolescence will be obsolete (like Jim Jannard says every 10 minutes!)...
but if DCI is the final goal for grading, why do Sony (and others)
continue to invent new color spaces like xvYCC?
( http://www.sony.net/SonyInfo/technology/technology/theme/xvycc_01.html )
I know it's marketing stuff, but as you can see (figure 4.3), their Sony GxL laser projector can output 97% of the Munsell color system!
And RED will also release a laser projector (REDRAY) soon...
so their color space will be larger than DCI-P3, right?
"When we are talking about three-primary RGB display devices the largest color space triangle you can carve out of CIE1931 is always going to be smaller than the entire CIE1931 color space."
Thanks for the clarification! I didn't understand how a display could possibly go beyond that!! Ultraviolet television for robots! ;)
So... the most important (dumb) question (for me):
DCI-P3 is clearly the way to go, and it will stay for the next decades?
And thanks for that great Omnitek link.
I tried to read the full DCI spec before... but...
I don't work for FSI, so... it's more or less Greek to me! ;)
But I try to learn fast!!
[John Fletcher] "CIE1931, and the XYZ are the same thing or not ?"
No, I think it would be misleading to say they are the same thing. They are closely related.
[John Fletcher] "CIE1931 --> XYZ --> DCI-P3 ?"
No, XYZ is not a subset of CIE1931. Again, the Omnitek White Paper should clear this up for you and explain the benefit of encoding values as XYZ.
[John Fletcher] "And for monitors, the signal will be better directly with decklink and SDI, or with display port with decklink and HDlink converter "
Always better IMHO to go straight from an SDI output device to an SDI input device (display). I know you are looking for ways 'around' using the pro I/O card, but I really see this as having been a greater concern or worthwhile endeavor in the past when I/O cards were relatively expensive. However, professional I/O card options have become so phenomenally inexpensive in the last couple of years that I just think it is the wrong place for professional editors and colorists to try and save money. There are good cards available for a good price and this takes the guesswork out of trying to build 'workaround' or 'good enough' solutions.
Also, we need to be careful not to confuse our terms and technology here and to understand what you realistically need for your workflow. If you are color grading in Resolve you are not going to be monitoring an XYZ signal, your output will be RGB. Additionally, you will not be monitoring a 12 bit signal, which would be the bit depth of the signal if monitoring an XYZ signal. What you will be monitoring is a 10 bit RGB signal. Now you can monitor in DCI P3 if you like or in Rec 709 even, but there is going to be a point where your content needs to be transformed into a DCP deliverable. The final DCP will be 12 bit XYZ, but it is important to understand that you will not be working/monitoring in 12 bit XYZ.
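For what it's worth, here is a small Python sketch of that final DCP encoding step, the 12-bit X'Y'Z' code value formula. The 2.6 exponent and the 52.37 cd/m² normalizing constant are my reading of the DCI specification, so treat them as assumptions rather than gospel:

```python
# Sketch of the DCI 12-bit X'Y'Z' encoding step (hedged: the 2.6 gamma,
# 52.37 cd/m^2 normalizing constant, and 48 cd/m^2 peak white are my
# reading of the DCI spec).

def dci_encode(value, norm=52.37, gamma=2.6, bits=12):
    """Encode an absolute XYZ tristimulus value (cd/m^2) to a DCI code value."""
    max_cv = 2 ** bits - 1                 # 4095 for 12-bit
    v = max(0.0, min(value / norm, 1.0))   # clamp to the legal range
    return int(max_cv * v ** (1.0 / gamma))

print(dci_encode(48.0))   # peak white luminance -> code value 3960
print(dci_encode(0.0))    # black -> 0
```

Each of the X, Y, and Z components gets the same treatment, which is why the deliverable ends up as a 12-bit-per-component signal even though you graded and monitored at 10-bit RGB.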
There is a nice article here that may also be useful to you: http://www.thefilmbakery.com/blog/creating-a-feature-film-dcp-using-opendcp
FSI (Flanders Scientific, Inc.)
Another question for you: are you actually going to be grading for digital cinema a majority of the time? Or is there a lot of TV destined work for you as well?
FSI (Flanders Scientific, Inc.)
"I know you are looking for ways 'around' using the pro I/O card"
Yes, previously, I asked your help because I needed to buy a laptop
(without thunderbolt), and of course I couldn't add an I/O card.
But, now, I try to build a dedicated computer for color correction.
So, to be fully honest, actually the problem for me is not really the I/O card
(which are not really expensive as you point), but more the display....
even if I must admit $3295 is really cheap for a quality monitor like the CM-170W ! ; )
But that's not really the subject of this debate.
"If you are color grading in Resolve you are not going to be monitoring an XYZ signal, your output will be RGB."
Excuse me for insisting on this, but I carefully (I hope!) read the links you gave,
and it's clearly stated that XYZ cannot be used as-is, because XYZ "colors" don't physically exist
and have to be translated into RGB coordinates for a monitor to display them.
So... if in any grading software all output must be RGB,
and nothing can really be done in XYZ / 12-bit, why does Scratch say it can output in XYZ
while Resolve can't, and why can an I/O card like the AJA Kona only output 10-bit over 3G-SDI??!
Or are we talking about "another" RGB, and I'm wrong?
OK, so... more prosaically... what card and what software
can grade in XYZ at 12-bit? (And I can start saving...)
"are you actually going to be grading for digital cinema a majority of the time?"
It would be nice... but as a director I don't make a feature every week... ;)
So, yes, digital cinema only... but unfortunately not the majority of the time.
Of course I wouldn't allow myself to bother everyone with XYZ
if Rec. 709 could satisfy me for everyday production. ;)
[John Fletcher] "So... in any grading software, if all output must be on RGB,"
No, that is not correct. You can and equipment does output in XYZ. Moreover, plenty of displays can and do accept XYZ signals. Again, I think you may be a bit confused as to what XYZ means and what devices are supposed to do with it, probably because I am doing a terrible job of explaining it, sorry!
The thing about an XYZ formatted signal is that it can be mapped to a variety of color gamuts in a minimally objectionable way; that is largely the point, and how it helps to combat obsolescence by not being tied to one specific 'traditional' color gamut. Remember, XYZ is actually a distinct way of coding signal information. It is not RGB and it is not YCbCr. It is true that displays are largely RGB based devices, but this does not mean they cannot accept XYZ signals any more than it means they cannot accept YCbCr signals, which of course they do. The display simply needs to know that the signal being sent is XYZ and then map it accordingly to the target color space (e.g. DCI P3); this is not impossible by any means and is done regularly by many devices.
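To illustrate that mapping step, here is a rough Python sketch (my own illustration, certainly not actual monitor firmware) of how an XYZ signal can be mapped to a display's native RGB primaries: derive the RGB-to-XYZ matrix from the primaries' chromaticities and white point, then invert it. The DCI-P3 primaries and DCI white point used below are standard published values.

```python
# Sketch: build a display's RGB->XYZ matrix from its primaries and white
# point, then invert it to map incoming XYZ values to native RGB.
import numpy as np

def xy_to_XYZ(x, y):
    """Chromaticity (x, y) to XYZ tristimulus with Y normalized to 1."""
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

def rgb_to_xyz_matrix(primaries, white):
    """3x3 matrix mapping linear RGB to XYZ for the given primaries/white."""
    M = np.column_stack([xy_to_XYZ(*p) for p in primaries])
    S = np.linalg.solve(M, xy_to_XYZ(*white))  # scale so RGB(1,1,1) maps to white
    return M * S

P3_PRIMARIES = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
DCI_WHITE = (0.314, 0.351)

M = rgb_to_xyz_matrix(P3_PRIMARIES, DCI_WHITE)
M_inv = np.linalg.inv(M)  # XYZ -> linear P3 RGB

# Sanity check: the white point should map to RGB (1, 1, 1)
white_rgb = M_inv @ xy_to_XYZ(*DCI_WHITE)
print(np.round(white_rgb, 6))
```

A real device also has to handle the transfer function and out-of-gamut values, but the core of "knowing the signal is XYZ and mapping it to the target color space" is a matrix transform like this.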
If you have a professional DCP player it is sending out XYZ to a professional digital cinema projector that accepts XYZ. More lower cost flat panel displays will also begin accepting XYZ signals (FSI monitors currently have this as beta support and other high-dollar solutions already do this as well). If you have something like a DVS clipster it can already output XYZ. I strongly suspect that more of the 'economical' I/O options will in the future also support XYZ output.
XYZ output is actually not even the most challenging part from what I have been told, the bigger challenge for most manufacturers seems to be 12 bit output (needed for XYZ) as most of the popular I/O cards on the market can only do 10 bit output. You cannot output (for realtime monitoring) in XYZ in Resolve (yet), but it seems like a logical evolution of the software. My purely speculative guess is that this will come almost immediately if/when a BMD card with 12 bit output capabilities is released.
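To put a number on the 10-bit vs. 12-bit difference, a trivial sketch of the quantization step sizes:

```python
# 12-bit gives roughly 4x the code values of 10-bit over the same signal
# range, so each quantization step is about 4x finer.

def step(bits):
    """Size of one code-value step for a full-range 0..1 signal."""
    return 1.0 / (2 ** bits - 1)

print(round(step(10) / step(12), 2))  # ~4.0: 12-bit steps are ~4x finer
```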
I hope that helps. I am not by any stretch of the imagination the world's leading authority on DCI and XYZ. I am mostly familiar with it from a display perspective as we have begun to implement support of XYZ and I have gleaned a good bit of information from our engineers as we have rolled this out. I saw your post unanswered and know that can be frustrating so wanted to provide what info I could, but there is a good reason there are companies that specialize purely in DCPs and Digital Cinema workflows because it can be a very esoteric subject.
FSI (Flanders Scientific, Inc.)
Nooooo!!! You're a great teacher!! Really!!!
But... like I said, you can't learn Greek in a few days!
I've been editing for almost 15 years, but I know I'm really far from absolute knowledge.
Bluefish444 is the only manufacturer I found that makes 12-bit I/O cards.
And BMD just released the DeckLink 4K Extreme at 10-bit, so we'll have to wait, I think.
Maybe AJA, with a new Kona...
So, to summarize, I just have to build my computer and... wait for a 12-bit I/O card! ;D
I'll continue with my Rec. 709 until then...
To finish, just a question about your monitors.
I'm really interested in the CM-170S, but you said the DCI-P3 is beta,
so can you confirm (even with an approximate % if you prefer!)
that it will be accurate enough to use for color grading?
With a 12-bit I/O card of course!! ;)
And out of curiosity, why does your monitor process video at 12-bit but only display at 10-bit?
[John Fletcher] "I'm really interested in the CM-170S, but you said the DCI-P3 is beta"
No, sorry if that was not clear. Both the CM-170W and LM-2461W have a DCI P3 color gamut selection, this has been in the units for quite some time now and is not beta. What is currently in beta is XYZ signal support. Remember, DCI P3 and XYZ signal format are distinct things. The monitor can operate in DCI P3 color gamut whether the signal is RGB, XYZ, or YCbCr.
You have been able to monitor RGB signals in DCI P3 since the CM-170W was released in April. Now we can also support XYZ format signals in addition to RGB and YCbCr signals. These can be used in conjunction, though they don't have to be, so you can for example accept an XYZ signal and map it to DCI P3.

People have been using our monitors in DCI P3 mode for some time now with RGB signal paths. At least this way they are working in the same color gamut that the deliverable will eventually be seen in. This way the creation of the DCP has less of an 'unknown' characteristic for the colorist, as the data is more or less just being formatted for correct viewing from a Digital Cinema Player and the color gamut of the theatrical devices will be more or less the same as the color gamut the colorist was working in throughout the process. This is in contrast to another popular workflow where people grade in Rec 709 through the entire workflow until creation of the DCP. That type of workflow is more likely to require a final pass on an actual P3 device when all is said and done, but if you don't have access to a P3 capable device for the majority of your work this can be a perfectly manageable workflow... some people even prefer to do it this way as they work with partners or systems/software they feel offer a very good Rec 709 to DCI transfer.
FSI (Flanders Scientific, Inc.)
Yes, sorry about that last question...
I meant XYZ, not DCI-P3...
sorry, my brain is tired!!
Well... thanks a lot for all these explanations Bram!
It was really helpful. ;)
(And I promise I'll stop with my DisplayPort ramblings!!)