I am doing some rigorous testing with my Canopus ADVC-100. I was recording the analog outputs of the ADVC-100 to my Panasonic NV-FJ630 VCR and found that all captures from my VCR of the ADVC-100's footage have white overshoot: when the captured DV file is loaded in Sony Vegas with the video scopes set to "Studio RGB (16-235)", the waveform monitor in luminance mode shows overshoot above IRE100 most of the time. There is no clipping indicated on the scope, as the maximum white level only reaches about IRE105, but when I compare against what I see from the VCR on a CRT TV, a small amount of highlight detail is lost in the ADVC's captures. The blackest black is IRE7.5, which is correct for American NTSC, so only the whites are affected. I've tried playing other pre-recorded and commercial tapes on the VCR, and their white levels max out at IRE100 as normal.
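For reference, here is a rough sketch of how I understand the scope's scale: assuming Vegas' "Studio RGB (16-235)" waveform maps 8-bit code values linearly with 16 at IRE0 and 235 at IRE100 (the function names below are just mine for illustration), an IRE105 overshoot lands around code 246, which is why the DV file itself shows no clipping even though the signal is out of legal range:

```python
def code_to_ire(code):
    """Map an 8-bit luma code value to IRE on a 16-235 studio scale."""
    return (code - 16) / (235 - 16) * 100

def ire_to_code(ire):
    """Inverse mapping: IRE back to an 8-bit code value (16-235 = 0-100 IRE)."""
    return 16 + ire / 100 * 219

print(round(ire_to_code(7.5)))   # NTSC setup black -> 32
print(round(ire_to_code(105)))   # the observed overshoot -> 246, still below 255
print(round(code_to_ire(235)))   # legal white -> 100
```

So on this assumption the overshoot fits inside the 8-bit container with headroom to spare, and the lost highlight detail I see must be clipped in the analog-to-DV conversion rather than in the file.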
I know that I can adjust the levels of the captured DV file to bring the whites back into legal range, but as I said, there were small yet noticeable losses in highlight detail, so I want to look for a solution before resorting to a proc amp.
The strange thing is that when using Vegas' "Print to tape" function, I chose to output a color bar test pattern leader to record to the VCR, and when I re-captured the pattern from the VCR, the levels were correct: maximum white at IRE100 and maximum black at IRE7.5.
Does anyone have experience with this? Thanks in advance.
EDIT: As a further check, outputting PAL footage from the ADVC-100, recording it onto the VCR, then re-capturing it shows correct levels for PAL (IRE0 at digital 16 and IRE100 at digital 235).