Does anyone have any experience with the performance of Red Giant Denoiser III on various GPUs? I have some 4K footage from a client that is noisy. If I take just one of the clips from the sequence (the footage is in 4GB clips), apply the effect, and start it rendering, it takes 12 hours on my NVIDIA Quadro K4000 card (around 450 processing cores and 3GB of memory). If I upgrade to a card with significantly more cores and memory, will the effect be able to make good use of it and speed way up? Or is my card already handling all the processing the effect can throw at it, so that spending $700 on a new card won't make much difference? I was thinking of an NVIDIA 10-series card, with several thousand cores and 11GB of memory.
I went ahead and purchased a new graphics card. I thought someone out there might be interested in the results.
I bought the EVGA GeForce GTX 1080 Ti SC GAMING Black Edition graphics card to replace my Quadro K4000 after the rendering issues I was having with the Red Giant Denoiser III effect. For a 40-minute seminar video on the K4000, the render was only about 30% done after six hours, had made no further progress for the previous two hours, and showed 14 hours remaining (increasing one-for-one with elapsed time!). After installing the 1080 Ti, the same video rendered in 3.5 hours. Awesome! And the card is pretty much silent.