Vegas 14 (any build): multicam edit and GPU/CPU usage while showing full frame on external monitor
Hi dear Cow members!
Recently I had a complaint which I sent to MAGIX support too. I got an answer about as useful and informative as being told "it usually rains in England".
My problem, which I already mentioned in a topic somewhere here, is that when I activate multicam mode and want to see the output fullscreen on the second monitor, the GPU isn't utilized at all.
Windows 7 Home-premium 64 bit with 8GB RAM, running on i5-3570 CPU. The GPU is a Radeon 6970 with 2GB RAM.
Certainly not a top-notch machine, but it performs decently otherwise (I get slightly faster-than-realtime fullHD renders).
When I edit a simple project with fullHD AVC footage, I get smooth playback from the timeline; CPU load is 10..60%, GPU load is 7..30%. Both change constantly and depend on the complexity of the timeline too; this is normal.
(I read the load values from Windows task manager, and Techpowerup GPU-z).
Now, when I switch to multicam mode, the GPU usage drops to zero, the CPU goes near 100%, and the preview gets stuttery, about 6..9 fps for a 2-cam shot.
When, still in multicam, I disable the second monitor output, GPU usage returns and the preview gets smooth again. (But doing so I don't see the program output, only the "cameras" in the cyan border ☹ )
With a 2-cam multicam track the load values show somewhat, but not significantly, higher percentages than in simple edit mode. I think this load should be kept during full-screen output as well? (Especially since in simple edit mode, switching full screen on the secondary monitor does not seem to influence the GPU/CPU load at all.)
Recently I had to work on a 3-cam shot, and there the preview fps was unbearable, no matter where the preview quality was set... Even the old Vegas 10 outperformed Vegas 14 in this regard, if not significantly.
So please, if anyone can confirm this (I suspect buggy) behavior, that multicam edit mode with full-frame output to the secondary monitor basically disables GPU acceleration, please notify MAGIX.
If you don't experience this phenomenon, please tell me which GPU you have.
Thanks in advance.
László

I gave up on GPU processing long ago... I have a stress-free workflow based mainly on CPU processing. GPU disadvantages far outweigh its benefits in most situations; I don't have the time to battle with this often inconsistent technology.
Steve Rhoden (Cow Leader)
Film Maker & VFX Artist.
Owner of Filmex Creative Media.
Samples of my Work and Company can be seen here:
If you want help with that configuration, here are some things to look at.
There are reasons that Vegas recommends i7 hardware over i5. Vegas dedicates 8 threads to AVCHD decode alone, and then it has all the other processes too, along with Windows threads. A HyperThreaded CPU really does help with all of this.
Here is a link to where your CPU sits in comparison with i7 class hardware, and 1000 points here is significant.
Determine how fast your memory is operating with a Memtest86+ boot disk, or with Winsat Mem. If your memory bandwidth is below 12GB/s you need to figure out why. This might be due to low-cost memory being installed by the manufacturer, or you are not operating in dual-channel mode.
If you upgrade your memory, pull out all of the existing memory and upgrade with only 2 identical DIMMs of the desired total memory. I would do at least 16GB if not 32GB, since the remaining extra memory goes into disk cache, which is what you want for multi-cam work. 12-24GB/s is much faster than accessing a HDD at 100MB/s. According to the Intel Ark specs on your CPU, you can handle DDR3-1600MHz, but your motherboard memory controller might handle faster speeds. Run Memtest86+ on any memory configuration, especially at speeds above the stock 1600MHz. There should be zero errors no matter how long you run it, even overnight.
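If you want a quick sanity check of that 12GB/s figure without rebooting into a test disk, a rough single-threaded copy benchmark can be sketched in Python (the function name and buffer sizes here are just illustrative; dedicated tools like Memtest86+ or `winsat mem` measure much closer to the hardware limit, and one Python thread will report well below it):

```python
import time

def mem_bandwidth_gbs(size_mb=256, repeats=5):
    """Rough lower-bound memory bandwidth estimate from timed buffer copies."""
    buf = bytearray(size_mb * 1024 * 1024)
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        copy = bytes(buf)  # one full read pass + one full write pass over the buffer
        best = min(best, time.perf_counter() - t0)
        del copy
    # each copy moves size_mb read + size_mb written, so 2 * size_mb total traffic
    return (2 * size_mb / 1024) / best

print(f"~{mem_bandwidth_gbs():.1f} GB/s (single-threaded lower bound)")
```

A result far below a few GB/s would point at a configuration problem (single-channel mode, downclocked modules); the dedicated tools above give the authoritative number.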
Since your motherboard is likely only capable of PCIe 2.0, I would verify with GPU-Z that the device is actually interfacing at 16X speeds. 8X PCIe 2.0 is pretty weak when you compare it to the system memory speeds. You want to be able to move data as fast as possible between system memory and GPU memory. The 6970 is an XT GPU, so you are already maxed on the compute units for that series. OpenCL (GPU accel) should work well with Vegas, but the CPU might be holding things back. Hard to tell. I would make sure to disable the Intel GPU in the BIOS if possible.
As for the codec in use on the timeline, test Cineform or XDCAM-EX material for multi-cam projects. Make sure audio is uncompressed only, no MP3/AC3 audio.
Just some thoughts on things to try.
Hi László, just a small detail that might or might not be relevant if you are getting new memory, and it also depends on your system. If you get faster memory than, say, 1600, for example 1866, then unless you set this in the UEFI BIOS, usually under XMP, your system will still only be running at 1600.
As I said, it's a small detail only. I had been running my own system incorrectly until I came across this by chance. It would appear it's quite common for a lot of users to have souped-up systems with faster memory, but not be fully utilising it.
Thanks for your answers guys! ☺
I feel a bit misunderstood. I know that my system is not top-of-the-line, and my question was about GPU usage in a (not so) special scenario.
Now I know, Steve, that you don't use the GPU at all. I could work this way as well, but then I still don't see any benefit of Vegas 14 over Vegas 10 other than that V14 is GPU accelerated, so why did I pay for the upgrade? I would be better off spending my money on a HW upgrade 😉
Just finished a render this morning: a 23-minute-long fullHD AVC render done in 16 minutes. With CPU only this would definitely take much longer. So I want to continue using the GPU... 😉
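For what it's worth, those numbers work out to roughly 1.4x realtime; the trivial arithmetic, sketched in Python with the figures from the post:

```python
footage_min = 23   # length of the fullHD AVC program
render_min = 16    # wall-clock render time with GPU acceleration on
speed_factor = footage_min / render_min
print(f"{speed_factor:.2f}x realtime")  # 1.44x realtime
```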
In my case, the bottleneck is not the amount or speed of memory. Vegas 14 is totally capable of decoding and smoothly playing a 3-cam shot.
Here's a screenshot of it (look at the load and preview fps):
That clearly shows that the GPU is used, the preview runs at 25 fps, and the CPU is not maxed ☺
This is the load I would expect anyway.
Now when I switch on the external preview, the situation changes:
GPU usage drops. I don't expect the i5-3570 alone to be able to keep up a smooth preview on a 3-cam shot like this. But there's a whole lot of computational power in the GPU which would help a lot (the situation before is the proof), and now that GPU power is just sitting there unused. This is what I find weird.
BTW, if I disable GPU acceleration I get the same poor preview performance, which to me shows that Vegas just does not use the GPU when multicam edit and external preview are both on.
Is this a general phenomenon with any type of (supported) GPU, or is it just my setup/driver/specific GPU?
Did you try any other codecs like XDCAM or Cineform? Those work differently from the way AVCHD decodes.
The issue does seem odd. Have you tried different monitor outputs off the card for the 2nd monitor? On my system, I output the full preview to HDMI via a DisplayPort adapter, while two other desktop monitors are on the DVI ports.
You are certainly CPU bound. Have you tried playback with other Preview settings? Draft/Half, Auto, or Preview/Auto?
Do you need full-quality playback while editing? I tend to favor FPS over image quality, knowing that the quality will be there in the final, and switch to Best/Full temporarily when placing graphics or other things that need it.
Thanks for your input!
[Aaron Star] "Did you try any other codecs like XDCAM or Cineform?"
Not yet, but I'll give it a try tomorrow. I'll come back with the results.
I don't need very good preview quality when in multicam. Of course I tried to lower the preview quality; I even went down to Draft/Quarter and bypassed all effects. That resulted in an ugly pixelated picture but with the same preview fps - LOL.
I have 2 identical monitors attached to the video card: one with DVI-D to the card's DVI output, the second via DisplayPort (with an appropriate DVI adapter, as the monitor has only DVI and D-SUB inputs).
There's also a HDMI output on this card, which is empty; I don't use it.
If you switch to multicam, don't you see a significant change in GPU/CPU load?
[Aaron Star] "Those work different from the way AVCHD decodes."
Aaron, you have a point ☺
I have never dealt with proxies so far, but before converting anything to XDCAM, I told Vegas to create video proxies for a recent multicam project. That seemed to be the easiest thing to try...
That's only a 2-cam shot, and it's already finished, but it struggled just the same.
So I picked the 2 main AVCHD streams in this project and asked Vegas to create proxies.
That took a while, but I didn't want to interrupt it, and wanted to see what happens.
So I read the news and had a coffee (actually 2) ☺
At the end I had 2 huge sfvp0 files in the project folder.
And bingo, multicam playback performance is just OK, and guess what, the GPU is taking its part in it:
(Note that external preview is ON)
I'm not sure how these proxies are encoded or at what resolution; they definitely have less detail. I don't care, for doing the multicam edit this does the job.
Setting the preview quality to anything above "Preview" results in using the original material; I see the details coming back ☺
So I learned the lesson: I need the proxies for multicam.
Thank you Aaron!