A client of mine has set up an AJA Io LD on a graphics workstation for uncompressed 4:2:2 10-bit capture.
The system it's installed on doesn't have Final Cut Pro - they're using AJA's VTR utility (not a bad program - I just wish it did batch capture).
They are a Digibeta house and everything is set up over SDI. The Io works very well capturing and laying back to tape - source tape, capture, and laid-back capture all look identical on a scope. The issue shows up when they take the footage into After Effects: there is a slight clamping of color (it looks like certain color values are being clipped). All of the footage is video legal. I have tried changing the project's working color space from None to NTSC and to SMPTE-C (a suggestion from a website that I'm going to research further - I'm not that familiar with SMPTE-C) to see if that was somehow causing chroma limiting.
Right now, my thought is that After Effects isn't really processing the file at 10-bit - that it's actually working at 8-bit and dithering. That's a shot in the dark, but I can't think of anything else that could cause it.
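For what it's worth, here's a quick sketch of what an 8-bit bottleneck would do to 10-bit data. The values are made up for illustration, but the arithmetic is the real issue: truncating to 8 bits collapses every four neighbouring 10-bit code values into one, which reads as banding or flattened gradients.

```python
# Hypothetical sketch: 8-bit truncation of 10-bit video code values.
# Four neighbouring 10-bit codes (all inside the legal 64-940 range):
codes_10bit = [500, 501, 502, 503]

# Truncating to 8 bits (drop the two low bits) collapses all four
# into a single code value - the subtle gradation is gone.
codes_8bit = [c >> 2 for c in codes_10bit]
print(codes_8bit)  # [125, 125, 125, 125]
```

If the project were genuinely 10-bit end to end, those four codes would stay distinct.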
Ultimately it isn't something that will cause our tapes to fail a network QC, but I would like to know what's causing it.
If there's no FCP on the system, what codec are you capturing to? I would suggest AJA's 10-bit RGB codec, since After Effects works in RGB. If you are using a YUV codec, the YUV-to-RGB conversion can introduce gamma and level problems.
Also, what footage are you comparing? Are you watching the tape and then comparing that to what you captured? Are you monitoring analog component? What color space is everything set up for (including your Digibeta deck)? You should be monitoring the whole chain the same way (SMPTE N10 or Beta component levels) - a mismatch there could account for color shifts as well.