We calibrate using color bars and the "Blue Only" feature that most good CRT monitors have. The short question is this: do you calibrate a monitor for HD using 75% chroma bars or 100% chroma bars?
I have received both answers from video engineers, so I am looking for proof of why one is right and the other is wrong.
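For background, the reason "Blue Only" calibration works at all is that in standard color bars the four bars that contain blue (white, cyan, magenta, blue) all carry the *same* blue amplitude, so they should appear equally bright when the monitor's chroma control is set correctly. A minimal sketch (the bar definitions are the standard full-saturation R'G'B' primaries, scaled by the bar amplitude):

```python
# Nominal R'G'B' values for the seven color bars (fractions of full scale).
bars = {
    "white":   (1, 1, 1),
    "yellow":  (1, 1, 0),
    "cyan":    (0, 1, 1),
    "green":   (0, 1, 0),
    "magenta": (1, 0, 1),
    "red":     (1, 0, 0),
    "blue":    (0, 0, 1),
}

def scaled(level):
    """Scale each bar's R'G'B' by the bar amplitude (0.75 or 1.00)."""
    return {name: tuple(level * c for c in rgb) for name, rgb in bars.items()}

for level in (0.75, 1.00):
    blue_channel = {name: rgb[2] for name, rgb in scaled(level).items()}
    # In blue-only mode, white/cyan/magenta/blue should match in brightness,
    # and yellow/green/red should be black:
    print(f"{level:.2f} bars, blue channel: {blue_channel}")
```

Note that the blue-containing bars match each other at either amplitude, which is why the calibration procedure itself works with both 75% and 100% bars; the question is which amplitude the monitor should be lined up to.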
If we calibrate to 75% bars (as in SD), then we gain 25% more gamut. Also, a still imported into an HD or SD project retains the same visual luma/chroma ratio, which sounds correct to me.
If we calibrate to 100% bars (unlike SD), then we gain more bandwidth to describe colors: the highest and lowest broadcast-safe chroma in SD looks exactly like the highest and lowest broadcast-safe chroma in HD, but we have more volts to describe the chroma in between. We do not gain any gamut, though, because no saturation above the SD 75% level is allowed. Also, a still imported into an SD project will have a different luma/chroma ratio than if it were imported into an HD project.
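The "more volts for the chroma in between" point can be made concrete: the Cb/Cr excursions of a bar scale linearly with the bar amplitude, so 100% bars drive the chroma channels to larger values than 75% bars do. A sketch using the Rec.709 luma coefficients (the normalization divisors below are the standard ones that put Cb/Cr in the range ±0.5):

```python
# Rec.709 luma coefficients.
KR, KG, KB = 0.2126, 0.7152, 0.0722

def ycbcr(r, g, b):
    """Convert nonlinear R'G'B' (0..1) to Y'CbCr with Cb/Cr in +/-0.5."""
    y = KR * r + KG * g + KB * b
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    return y, cb, cr

for level in (0.75, 1.00):
    # A fully saturated blue bar at the given amplitude:
    y, cb, cr = ycbcr(0.0, 0.0, level)
    print(f"{level:.2f} blue bar: Y' = {y:.4f}, Cb = {cb:+.4f}, Cr = {cr:+.4f}")
```

Because the conversion is linear, the 75% bar's chroma excursion is exactly 0.75 times the 100% bar's, which is the whole substance of the disagreement: the same displayed saturation can be represented by different signal amplitudes depending on which bar level the monitor was matched to.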