Just purchased a new 23" high-res LCD monitor for our edit bay. I'd like your opinion on whether we bought a monitor that's too good (thus too sexy) for our needs...
We'll be using:
- Avid Adrenaline HD 8-bit
- DVC-Pro HD format (1080i, 8-bit)
----Camera's manual says it records 1080i at a resolution of 1920x1080.
----Deck's manual (AJ-HD150p) says it has a "Sample x Effective Line" of 1280x1080. (Nowhere does it actually say it plays 1920x1080).
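For what it's worth, the two figures quoted from the manuals imply a horizontal upscale on playback, assuming the deck's 1280 samples per line get stretched back out to the camera's 1920-pixel raster (that assumption is mine, not something either manual states):

```python
# Figures quoted from the camera and deck manuals above.
camera_width, camera_height = 1920, 1080  # camera's stated 1080i recording raster
deck_width, deck_height = 1280, 1080      # deck's "Sample x Effective Line"

# Implied horizontal scale factor if the deck's 1280 samples per line
# are stretched back to a full 1920-pixel line for display.
h_upscale = camera_width / deck_width
print(h_upscale)  # -> 1.5
```

So if that assumption holds, every line the deck plays out would be scaled up 1.5x horizontally before it reaches the monitor.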
Now, our new HD 1080i LCD monitor has over 1900 lines of resolution - enough for high-end HD evaluation. Trouble is, when we feed it video from the deck, the monitor seems too critical. Is that possible?
We set up a test where we sent HD video from the deck to two HD monitors: the LCD and a 4-year-old consumer-model CRT (with about 1200 scan lines). The shot was a sunrise over a mountain range. Both had really nice pictures, yet the LCD showed a 1-pixel "ghosting" on top of the mountain range that resembled the NTSC ghosting you used to see on old tube cameras. This was NOT present on the CRT. We put in other video of interviews, interiors, and exteriors, and the ghosting is present on all of them. It almost looks like the sharpness is turned up too high - even though we have sharpness on its lowest possible setting.
Is DVC-Pro HD (8-bit, 100 Mbps) too low a resolution for this monitor? Is there such a thing?