Any word on new GPU support in VP13?
Anyone know if the release of Vegas Pro 13 will now support the NVIDIA Kepler-based GPU cards?
Perhaps 'support' is the wrong term. Maybe 'take advantage of the Kepler architecture to deliver the stable, high-performance preview and rendering' that users of newer NVIDIA graphics cards apparently experience in other software applications.
This is all Sony's saying right now. Nothing about hardware.
[Doug Jackson] "Anyone know if the release of Vegas Pro 13 will now support the NVIDIA Kepler-based GPU cards?
Perhaps 'support' is the wrong term. Maybe 'take advantage of the Kepler architecture to deliver the stable, high-performance preview and rendering' that users of newer NVIDIA graphics cards apparently experience in other software applications."
Just keep in mind, there's only so much Sony can do here. Consider the various issues:
OpenCL: Vegas itself is OpenCL-only. OpenCL went to version 1.2 in November of 2011 and OpenCL 2.0 in November of 2013. I wouldn't be surprised if both Vegas 11 and 12 were still under the OpenCL 1.1 API. I'm not sure a great deal of the changes would be relevant to Vegas, anyway... it would be interesting to hear the Vegas engineers' take on that. In short, OpenCL itself has been stable, even when manufacturers' drivers haven't always been. And my experience with AMD suggests stability is completely possible... I have had no GPU issues at least since Vegas 12.
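As an aside on telling which API level a driver actually claims: the OpenCL spec mandates that the version strings returned by clGetPlatformInfo/clGetDeviceInfo begin with "OpenCL <major>.<minor>" followed by vendor-specific text. A minimal Python sketch of picking that apart (the sample strings below are just illustrative of typical AMD and nVidia suffixes):

```python
import re

def parse_cl_version(version_string):
    """Extract (major, minor) from a CL_PLATFORM_VERSION / CL_DEVICE_VERSION
    string. The Khronos spec mandates the leading 'OpenCL <major>.<minor>';
    everything after that is vendor-specific and can be ignored."""
    match = re.match(r"OpenCL (\d+)\.(\d+)", version_string)
    if match is None:
        raise ValueError("not an OpenCL version string: %r" % version_string)
    return int(match.group(1)), int(match.group(2))

# Vendor suffixes vary wildly, but the leading version token does not:
print(parse_cl_version("OpenCL 1.1 AMD-APP (898.1)"))  # (1, 1)
print(parse_cl_version("OpenCL 1.2 CUDA"))             # (1, 2)
```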
MainConcept: This needs to be updated, but hopefully also fixed for real. The current MainConcept AVC codec hard-wires support for only the GPUs they felt like supporting. So, for example, OpenCL does nothing for nVidia, even though it ought to work (if perhaps slower than with the CUDA option). And neither CUDA nor OpenCL works with any nVidia Kepler or ATi GCN GPU. It most likely could; those chips are just not on the list. Vegas 13 desperately needs to add modern GPU support here, and ideally arbitrary GPU support, since that's the entire point behind OpenCL in the first place.
nVidia: there are two big issues with nVidia, which are totally out of Sony/Vegas's control: their commitment to OpenCL, and their crippling of select performance features.
It's been observed on various OpenCL torture tests that Kepler cards pretty much keep pace with recent ATi cards, eg, they hold their position, better or worse, as you go across benchmarks. But every so often, they hit a brick wall. It's been conjectured that this might be due to nVidia favoring CUDA over OpenCL. But...
...As has been posted here, nVidia has intentionally crippled some specific OpenGL operations in their consumer/desktop versus workstation-class GPU drivers. This isn't a case of the pro drivers being more "optimized"; there are clear cases in which some OpenGL ops run at 1/10th speed in the consumer drivers, and most of that speed can be regained by carefully coding around the bad ops.
You can see some of this here: http://www.tomshardware.com/reviews/geforce-gtx-titan-opencl-cuda-workstati...
The GTX Titan is nVidia's fastest desktop card, and in some benchmarks, it rules. In others, particularly some non-gaming OpenGL and OpenCL tests, it gets blown away by the high-end AMDs, and even by older nVidia cards. The degree to which it loses suggests some real issues with these cards as a system -- whether it's hardware or software. And that's totally isolated from Vegas performance.
So these days, nVidia also sells, along with consumer/desktop "GeForce" and workstation "Quadro" cards, their "compute" cards, the Tesla series. As with all nVidia cards, at any technology node they're basically using the same chips, with more memory and perhaps some slight tweaks on the higher-end parts. Given that they've clearly been crippling OpenGL on the desktop cards (something AMD has not done yet, far as we know), it's certainly not without precedent that they're doing the same, on both workstation and desktop drivers, to keep the prices higher for their "compute" cards.
Otherwise, how do you explain my favorite benchmark of all time: page 21 of that review, 2nd benchmark down, "Surface Smoothing"... my $300-three-years-ago Radeon HD6970 is 5.3 TIMES (!) faster than the $1600 (today) Titan. Sure, the HD6970 only beat the Titan on a few benchmarks, but the fact that it beat the Titan at all is the point. That's like an AMD Phenom beating an Intel i7-4960X... it just doesn't happen. On any benchmark. There is something going on here, and not necessarily something Sony has any power to change.
I wouldn't hold out for MainConcept to be fixed anytime soon. DivX and MainConcept were purchased by Parallax Capital Partners and StepStone Group in March. (Think- Gordon Gekko)
Rovi press release
MannMade Digital Video
SCS has to be worried about this a little. They are dependent on Mainconcept for nearly every decoder and encoder in Vegas.
I'm not sure it's the sale to Parallax that's the problem. Rovi bought MainConcept back in 2011. I can't swear to what MainConcept is telling their paying customers right now. But looking at the web site, they're supporting exactly the same set of circa-2011 GPUs that they did in Vegas 11 and Vegas 12. And nothing past that. So basically, nothing was improved under Rovi these past three years, far as they're admitting.
Sony really needs to deal with this problem. Not to mention the coming need for HEVC and the WebM formats. You don't fall behind all at once; it's a little bit at a time, and it just eats away.
The tablet interface stuff looks kinda cool, but it won't matter to me until it drops on Android. I'll see whether it matters to me then. I am a tablet user, but I don't see commenting/previewing on the tablet as being important. Trendy, but is it useful?
That's not something I'm currently using. Rendering to advanced video CODECs as fast as possible, that stuff's regular everyday work.
I love it when Dave Haynie posts! He has so much in-depth knowledge on tiny specifics.....I just love it!
On a side note- yesterday I was perusing the Vegas Pro forum on Sony's website and I noticed that somebody there copies Dave's posts from here and pastes them over there as their own. Now THAT is flattery!!
Not sure what is happening with Nvidia but check out the following with Vegas 13 running realtime 4K with the AMD Firepro W9100.
See: AMD FirePro W9100 4k video solution pics from NAB 2014
You all sitting down? I believe the cards list at $3900 each :(
[Chris Young] "Not sure what is happening with Nvidia but check out the following with Vegas 13 running realtime 4K with the AMD Firepro W9100"
I can do realtime 4K on my Radeon HD6970, at least for some formats. But of course, Vegas 12 and 13 ought to go faster with a faster OpenCL card. The only real hold-back is the Main Concept AVC CODEC, and I'm starting to wonder if I even care. At some point in the lifetime of this PC, I'm going to want a 4K monitor, and that means a GPU upgrade. Of course, the old card could stick around.. I have four GPU slots.
The Firepro W9100 ($3999, 16GB GDDR5, 2,816 stream processors) is AMD's "pro" version of the "Hawaii" GPU, which is also found on the Radeon R9 290X ($630-$699, 4GB GDDR5, 2,816 stream processors, 1GHz) and doubled on the Radeon R9 295X2 ($1499... two Hawaii processors on the same card, thus 5,632 stream processors, 1018MHz, 8GB GDDR5). It is not obvious where the Firepro would trump the 290X running Vegas. Obviously, the extra RAM is a big concern for high-end mechanical CAD and perhaps 3D animation. And dual GPU doesn't help in a single instance of Vegas.
There's also the R9 290, which drops the stream processor count to 2560 (sounds like they're binning Hawaii chips with slight defects) and the clock to 947MHz... for $400-$450.
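For rough comparison shopping, the standard back-of-envelope peak-FP32 figure for these GCN cards is stream processors x clock x 2 ops (assuming the usual one fused multiply-add per stream processor per clock). A quick sketch using the specs quoted above:

```python
def peak_gflops(stream_processors, clock_mhz):
    # Each GCN stream processor can retire one fused multiply-add
    # (2 floating-point ops) per clock, so peak FP32 throughput is:
    return stream_processors * clock_mhz * 2 / 1000.0  # GFLOPS

# Specs as quoted in the thread above:
cards = {
    "Radeon R9 290X":  (2816, 1000),
    "Radeon R9 295X2": (5632, 1018),
    "Radeon R9 290":   (2560,  947),
}
for name, (sps, mhz) in cards.items():
    print("%-16s %7.0f GFLOPS peak FP32" % (name, peak_gflops(sps, mhz)))
```

That puts the 290X around 5.6 TFLOPS on paper; real Vegas throughput is, of course, bounded by the CODECs long before it hits those numbers.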
Thx for the heads up on the other card specs. I've made a note of them just in case I go mad and build a 4K-capable Vegas 13 box. Tempted, but I've not had one client remotely interested in 4K. Most go 'What is 4K?' So it's very hard to justify business-wise when it's only me who wants it. Note 'wants' rather than 'needs' it :(
Here's the thing.. when I moved to 2K/HD, it was [a] because I'm just that kind of a geek, and [b] because I could get way better DV results from a modest-priced HDV camcorder than I could from improving my DV camera to any level I could afford. And then, when HDV became more important (I started actually using it for paid projects in 2006), there it was, already with a functional toolchain (well, ok, no ideal disc format yet), and me already knowing it.
I'm thinking 4K goes the same way. There are already 4K displays going for under $500... this Seiki 39" monitor is probably not good enough as your primary video-oriented display, but these things are big among computer nerds, used as desktop displays. The cameras are getting interesting: Sony's 1" sensor camera, the Panasonic GH4, the new m43 models from JVC, even that crazy-low-light Sony Alpha 7S, all seem likely to be within my rather modest price range... still haven't decided if I'm ever going back to a pure fixed-lens camcorder... lots of the DSLR workflow is improving, and lots of the stupid limitations are being removed.
Not there yet, but I expect to be there before it becomes critical. In fact, I'm shooting a short 4K (well, more like 5.5K) video right now... using a DSLR and an external shot controller.. a time lapse of my forest going from winter to spring.
The other thing about 4K... cropping. I used that quite a bit in the DV days, where I could make cropping decisions in post on the HDV content, and you'd never know in the final DV result. That got me thinking that maybe I should always be shooting at a higher resolution than delivery. That seems to be where Red and some of the high-end guys are going, now that 4K is a regular delivery format in cinema.
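A quick sketch of the headroom that shoot-high/deliver-low workflow buys you, assuming the standard frame sizes (UHD 3840x2160, 1080p 1920x1080, HDV 1440x1080, NTSC DV 720x480):

```python
def crop_headroom(source_wh, delivery_wh):
    """How far a delivery-sized crop window can be panned inside the
    source frame without scaling up (i.e. staying at >= 1:1 pixels)."""
    sw, sh = source_wh
    dw, dh = delivery_wh
    if dw > sw or dh > sh:
        raise ValueError("delivery frame larger than source")
    return sw - dw, sh - dh  # max horizontal / vertical offset, in pixels

# Shooting UHD and delivering 1080p leaves a full frame-width of pan room:
print(crop_headroom((3840, 2160), (1920, 1080)))  # (1920, 1080)
# The old HDV-to-DV case from the post above:
print(crop_headroom((1440, 1080), (720, 480)))    # (720, 600)
```

(HDV's 1440x1080 uses non-square pixels, so the horizontal figure is storage pixels, not display width, but the principle is the same.)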