
2019 Hardware Encoding not supported on $3k 12-core xeon chip?

COW Forums : Adobe Premiere Pro

Brad Bussé
2019 Hardware Encoding not supported on $3k 12-core xeon chip?
on Dec 5, 2018 at 9:57:39 pm

Which Intel chips support the new hardware encoding feature for exports in CC 2019? I just tried using it but it's greyed out and says my hardware isn't compatible. I'm on a Dell Precision 7920 with, at the moment, just the single chip installed which is a Xeon Gold 5118. This chip alone sells on B&H right now for over $3k. How is this chip not supported?

I have the latest Win10, just did a clean install of the latest NVIDIA drivers for the GTX 1080 that's installed, and changed from 2-pass to 1-pass encoding to be compatible with hardware encoding, as per:

Still no go. I'm still on 2018 for my main projects, but I've been testing 2019. I haven't done side-by-side comparisons, but 2019 does seem speedier in real-time preview performance in the timeline. The Essential Sound upgrades are welcome, though they have crashed the app a couple of times. The Lumetri upgrades are also welcome, with no crashes from those. But overall, 2019 is very unstable. It often crashes when all I'm doing is the simplest of tasks, like shuttling the timeline with ProRes HQ footage and no FX. Sometimes the program appears not to be open even though it is, so I have to force quit it from Task Manager. So far, at least, it has let me force quit, unlike 2018, which often will not.



Oliver Peters
Re: 2019 Hardware Encoding not supported on $3k 12-core xeon chip?
on Dec 5, 2018 at 11:50:01 pm

Hardware encoding refers to accelerated H.264/H.265 encoding. It relies on architecture specific to the Core-series chips, so no Xeons.
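
[Editor's note] The Core-vs-Xeon split comes down to Intel Quick Sync, the encode block on Intel's integrated GPU: mainstream Core i3/i5/i7/i9 parts generally include an iGPU, while Xeon Scalable parts like the Xeon Gold 5118 do not. As a rough sanity check (not an authoritative capability query; some Xeon E3/E-series chips do ship with an iGPU), you can eyeball the CPU brand string:

```python
def likely_has_quick_sync(cpu_brand: str) -> bool:
    """Heuristic guess at Quick Sync support from the CPU brand string.

    Quick Sync lives on the integrated GPU, which Xeon Gold/Silver/
    Platinum parts lack. This is a first-pass heuristic only -- some
    Xeon E3/E-series chips DO have an iGPU, and an iGPU can also be
    disabled in the BIOS or absent from the motherboard's outputs.
    """
    brand = cpu_brand.lower()
    if "xeon" in brand:
        return False  # Xeon Scalable chips lack the iGPU Quick Sync needs
    return "core(tm)" in brand  # Core i3/i5/i7/i9 families

# The poster's chip vs. a typical desktop Core part:
print(likely_has_quick_sync("Intel(R) Xeon(R) Gold 5118 CPU @ 2.30GHz"))   # False
print(likely_has_quick_sync("Intel(R) Core(TM) i7-8700K CPU @ 3.70GHz"))   # True
```

Intel's ARK pages are the authoritative source: if the listing shows no "Processor Graphics," there is no Quick Sync, regardless of price or core count.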

- Oliver

Oliver Peters - oliverpeters.com



Brad Bussé
Re: 2019 Hardware Encoding not supported on $3k 12-core xeon chip?
on Dec 11, 2018 at 4:27:41 am

Is the architecture specific to the Core series "Quick Sync"?

What's the thought behind leveraging the dedicated hardware encoding in the Intel Core series rather than the dedicated hardware encoders in GPUs?



© 2018 CreativeCOW.net All Rights Reserved