I'm looking for clarification / understanding of the different ways you can monitor video on an HDTV while editing in FCP or Avid on a Mac Pro. I understand that it's "better" to output via a card such as a Blackmagic Intensity Pro, but I don't know why this is so.
Here are three ways to do it.
1. DVI out to HDMI into HDTV, using Desktop Cinema Preview (basically a second monitor, full screen)
2. PCIe to Blackmagic Intensity Pro to HDMI to HDTV (or similar card)
3. FireWire/Thunderbolt to Matrox MXO2 to HDMI to HDTV (or similar box)
I currently do #1, but I understand this is inferior to #2 and #3. Why is this?
What is the tangible, actual difference in the quality and accuracy of the video signal?
Will I be able to see the difference? Also what's the difference between #2 and #3?
It's not only the output that needs to be standards-compliant, but also the monitor.
If you are dealing with video and you want accuracy, you need to output a CCIR-compliant signal (CCIR 601, now ITU-R BT.601, for SD; CCIR 709, now ITU-R BT.709, for HD). Your computer's video card won't do that, so the DVI-out signal won't have the proper color profile.
The Blackmagic and the Matrox will do it, but you'll need to feed that signal to a monitor that can properly display it, and a consumer HDTV won't.
You only get that on a properly adjusted Pro monitor.
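To see why the 601 vs. 709 distinction matters in practice, here's a rough sketch (my own illustration, not anything these cards literally run): the two standards weight the color channels differently when forming luma, so material encoded for one standard but decoded with the other comes out with shifted colors.

```python
# Hypothetical illustration: luma weights differ between the SD and HD standards.

def luma_601(r, g, b):
    # CCIR 601 / ITU-R BT.601 luma coefficients (SD)
    return 0.299 * r + 0.587 * g + 0.114 * b

def luma_709(r, g, b):
    # ITU-R BT.709 luma coefficients (HD)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# The same pure-green pixel produces noticeably different luma
# under each standard, so decoding with the wrong matrix shifts
# brightness and hue:
print(luma_601(0.0, 1.0, 0.0))  # 0.587
print(luma_709(0.0, 1.0, 0.0))  # 0.7152
```

A monitoring card that outputs a compliant signal, feeding a monitor that decodes with the matching matrix, is what avoids this kind of mismatch.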
Normally, TVs are set up to make the picture look "beautiful," not accurate.
My 42" Panasonic has an option to display the picture "as it is," and it really looks very close to what I get on my monitor. However, brightness, contrast, chroma and phase can't be fine-tuned the way they can on the monitor.