
8 bit vs 10 bit colour on Geforce Cards.

COW Forums : DaVinci Resolve

marcus lyall
8 bit vs 10 bit colour on Geforce Cards.
on Jan 27, 2013 at 5:52:30 pm

This is probably a stupid question (and possibly the wrong forum for it), particularly as Blackmagic supports GeForce cards.

I'm reading that GeForce cards don't support an internal 10-bit pipeline. I'm not sure if this only applies to the Mercury Playback Engine, to SDI output via Quadro cards... or what?

Obviously Resolve is using 10 bit colour or higher...
And I can't believe that Premiere would limit you to working in 8 bit on a Geforce card.


I'm building some new workstations and upgrading some Macs, mostly for Premiere, but I'm also looking at what parts to rebuild my Resolve system with. I'm buying cards for three older Macs plus a couple of new PCs, so this is a fairly pricey decision.

Dealers will talk about Quadros being more reliable, better warranty... and a bunch of other hooey.

I actually don't mind buying a Quadro if there's a good reason; I'd rather have one card that works than buy two. But I'm wondering whether all this talk of Quadros being more reliable has any truth to it.

Lack of 10 bit support would push me over to Quadro, but not sure if 'extra reliability' would. Thoughts?



Peter Chamberlain
Re: 8 bit vs 10 bit colour on Geforce Cards.
on Jan 28, 2013 at 1:31:57 am

Hi, please refer to our configuration guides for details on some GPUs and their slot location for use with Resolve. It's true that Quadro cards have lower power consumption and create less heat, and that's due to having fewer cores. The GTX series is therefore faster, but you need to think more about heat and power or the cards will fail.

Resolve processes in 32-bit floating point, so 8/10 bit limitations might be based on other companies' processes, not ours.
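
As a rough illustration of what those bit depths mean (a toy Python sketch with my own numbers, nothing from the Resolve codebase): 8-bit output gives 256 code values per channel and 10-bit gives 1024, while 32-bit float processing resolves millions of levels, so any quantisation happens only at the output stage.

# Illustrative numbers only: code values per channel at each output depth,
# versus the precision of 32-bit float processing.
for bits in (8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit output: {levels} code values, step = 1/{levels - 1}")

# float32 carries a 24-bit significand (23 stored bits + 1 implicit bit),
# i.e. about 16.7 million distinguishable levels within a unit range,
# so the internal maths is far finer than either output format.
print("float32 significand levels:", 2 ** 24)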

Peter



sushil sharma
Re: 8 bit vs 10 bit colour on Geforce Cards.
on Jan 29, 2013 at 4:24:03 am

Hi,
The whole 8-bit/10-bit fuss comes from a simple misunderstanding. When we talk about the graphics card, we are talking about display precision, i.e. the colour depth of the GUI monitor. The confusion comes from Autodesk, because some Autodesk systems like Smoke and Lustre use the GUI card (via a Quadro or Quadro SDI card) to output realtime deliverables as well, so 8 bit vs 10 bit matters there.
The GUI monitor does not need to be more than 8 bit.
In Resolve, though, the GUI is only for GUI operations; it is never meant to feed a VTR or carry the graded output. Output for grade monitoring or a VTR feed is always routed through a video card, i.e. a Blackmagic DeckLink card (or, on some older DaVincis, a DVS card), and both are 10-bit compatible.
Another big difference between GeForce and Quadro cards is the double precision floating point units. Those units help with some 3D and compute workloads, so Quadro wins there, but DaVinci does its calculations in single precision floating point (which is 32 bit, much more than 8 or 10 bit), so a GeForce is sufficient for colour correction maths. On top of that, GeForce cards have higher clock rates and are much more attractively priced.
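
To see why single precision is enough, here is a small sketch (toy gain/offset values and synthetic data, nothing measured inside Resolve): run the same grade-style operation in float32 and float64, quantise both to 10-bit code values, and the two outputs differ by at most one code value.

import numpy as np

# Toy demonstration: the same gain + lift operation in single and double
# precision, both quantised to 10-bit output code values.
rng = np.random.default_rng(0)
chan32 = rng.random(1_000_000, dtype=np.float32)      # synthetic channel data
chan64 = chan32.astype(np.float64)

graded32 = np.clip(chan32 * 1.2 - 0.05, 0.0, 1.0)     # made-up "grade"
graded64 = np.clip(chan64 * 1.2 - 0.05, 0.0, 1.0)

out32 = np.round(graded32 * 1023).astype(np.int32)    # 10-bit code values
out64 = np.round(graded64 * 1023).astype(np.int32)

# The two pipelines agree to within at most a single 10-bit code value,
# which is why GeForce's weaker double-precision hardware doesn't matter here.
print("max code-value difference:", np.abs(out32 - out64).max())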
The internal calculations are always done at 32 bit and then rescaled to 8 bit or 10 bit for output. You can check this with a simple experiment: grade something down towards black until you can barely see the image, then grade it back up. The picture is recovered almost to its original state, which proves the internal calculations preserve far more data than we can see on screen. Only at the end of the whole process does the computing part of the GeForce or Quadro card pass the data either to the output side of the DeckLink for accurate output, or to the display side of the GeForce/Quadro for an approximate picture in the GUI windows.
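
Here is that experiment as a quick sketch in code (a toy simulation with made-up gain values, not Resolve's actual pipeline): crush a tonal ramp towards black and bring it back up, once in a 32-bit float pipeline and once with an 8-bit quantisation step in between.

import numpy as np

# Toy simulation of the "grade down, then grade back up" test described above.
# The gain values are made up; the point is what quantisation does in between.
gain_down, gain_up = 1.0 / 100.0, 100.0
image = np.linspace(0.0, 1.0, 11, dtype=np.float32)   # synthetic tonal ramp

# 32-bit float pipeline: the crushed values keep their precision,
# so grading back up recovers the original ramp.
float_roundtrip = (image * gain_down) * gain_up

# 8-bit pipeline: quantise to 0..255 after crushing, then grade back up.
# Most of the tonal detail has been thrown away and cannot be recovered.
crushed_8bit = np.round(image * gain_down * 255.0) / 255.0
int_roundtrip = np.clip(crushed_8bit * gain_up, 0.0, 1.0)

print("original ramp:   ", image)
print("float round trip:", np.round(float_roundtrip, 3))   # matches the original
print("8-bit round trip:", np.round(int_roundtrip, 3))     # heavily posterised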
As far as I understand, Quadro is recommended for the GUI only because it has more robust drivers for GUI display, plus some silly marketing reasons. For GPU processing, GeForce is always the better value, provided you put the GeForce cards in a separate enclosure with extra cooling and a big power supply to feed them, as they are power hungry. A GeForce 580 gives better performance in Resolve than a Quadro 6000 that costs ten times as much.
Hope it helps.




marcus lyall
Re: 8 bit vs 10 bit colour on Geforce Cards.
on Feb 7, 2013 at 12:45:53 pm

OK, I think I get it....
My thought was that if I'm doing 10 bit work in Premiere and I'm using a Geforce card, then when it renders using a GPU-accelerated effect, it'll be doing this in 8 bit not 10 bit.

But I guess what might be happening is that I'll be watching an 8 bit preview when I play something in realtime.
And when it renders, it's using a different pipeline which works in 10 bit.

I'm assuming this is the case, as otherwise any non-Quadro machine would be working solely in 8 bit, which can't be right.



