Budget Grading Setup: will a DVI connection support 10-bit color?
Hey y’all, this is my first forum post here, and I thought this wonderful online community could help me answer a few questions.
I’m currently putting together a budget-friendly color grading suite for film projects. After hours of research, I believe I’ve found everything I need to make this setup happen.
My goal is to grade my film projects in DaVinci Resolve on my mid-2012 MacBook Pro. Because this MacBook’s display (and really any laptop display) shouldn’t be trusted for color grading, I purchased the BenQ SW2700PT monitor. Is it an Eizo or a Flanders? No. But for the entry-level/semi-professional work I’ll be doing, and on the budget I had, I think this monitor will suffice and it supports everything I need. The monitor will be calibrated to correct any color inaccuracies it shipped with out of the box.
Next, I need to get a reliable color signal from where the grading happens (the MacBook) to the monitor (the BenQ). After researching, my only real option seemed to be the Blackmagic DeckLink Mini Monitor 4K PCIe card (the BenQ doesn’t quite do 4K, but hey, more resolution headroom never hurts).
However, since the MacBook doesn’t have a PCIe slot, I would need to house the card in an external enclosure connected to the MacBook over Thunderbolt 2. I decided on the Sonnet Echo Express SE1 Thunderbolt 2 Expansion Chassis (ECHO-EXP-SE1).
Now here’s the tricky part. I want to relay at least a 10-bit color signal from my MacBook through the DeckLink to the BenQ monitor (both the DeckLink and the BenQ support 10-bit; does the MacBook itself even matter here?). Because the DeckLink only outputs HDMI 2.0 or SDI, I had to find a way to convert one of those signals to one of the BenQ’s three inputs: HDMI 1.4, DisplayPort 1.2, or dual-link DVI (DVI-DL). My thinking was this: HDMI 1.4 is a no-go because it fails to support a 10-bit signal. DisplayPort 1.2 is out (for now) because I can’t find a reliable supplier of an HDMI 2.0 to DisplayPort converter. So my last hope seems to be an HDMI 2.0 to DVI-DL adapter, which I found for a very reasonable price on Monoprice. According to the specs on the site, the adapter will relay a whopping 16-bit color depth at resolutions up to 4K (more than I need).

So it seems like I’m set. But assuming my setup is otherwise sound and everything works as it should (or as I think it should), will the DVI input on the monitor accept this HDMI-to-DVI signal effortlessly and actually produce the 10 bits I’m hoping for on screen? Or, semi-noob that I am, have I missed or overlooked something that would keep trustworthy 10-bit color grading from happening with this setup?
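In case it helps anyone poke holes in my plan, here’s the rough back-of-envelope math I did (sketched in Python) comparing the raw payload a 10-bit 2560x1440@60 signal needs against what dual-link DVI’s pixel clock can carry. The timing numbers are my approximations (CVT-RB-style blanking, 2x165 MHz dual-link clock), not figures from BenQ’s or Blackmagic’s spec sheets, so please correct me if they’re off.

```python
# Back-of-envelope: does a 10-bit 2560x1440 @ 60 Hz signal fit in dual-link DVI?
# Assumptions (approximate, not from any vendor spec sheet):
#   - CVT-RB-ish total frame timing of about 2720 x 1481 pixels
#   - dual-link DVI carries 2 x 165 MHz pixel clocks, 8 bits per channel per link

h_total, v_total, fps = 2720, 1481, 60
pixel_clock_needed = h_total * v_total * fps  # pixels per second (~241.7 MHz)

dvi_dl_pixel_clock = 2 * 165e6  # 330 MHz worth of pixels across both links

# Raw payload comparison in bits per second
needed_10bit = pixel_clock_needed * 30     # 10 bits x 3 color channels
dvi_dl_capacity = dvi_dl_pixel_clock * 24  # DVI links carry 8 bits per channel

print(f"Pixel clock needed:  {pixel_clock_needed / 1e6:.1f} MHz")
print(f"10-bit payload:      {needed_10bit / 1e9:.2f} Gbit/s")
print(f"DVI-DL raw payload:  {dvi_dl_capacity / 1e9:.2f} Gbit/s")

# The raw numbers fit, but DVI's TMDS format is defined around 8 bits per
# channel per link, so whether 10 bits actually get through depends on the
# adapter and monitor doing something beyond the base spec -- which is
# exactly my question.
```

So by raw bandwidth alone it looks feasible, which is why I’m asking whether the DVI input itself is the weak link rather than the cable math.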
Thanks in advance! 😁
Sonnet Chassis: https://www.sonnettech.com/product/echoexpressse1.html
HDMI to DVI Adapter: https://www.monoprice.com/product?p_id=2404