VBR vs. CBR for best HD output?
by John Matchett on May 12, 2009 at 1:45:12 pm
I'm trying to come to a definitive decision about rendering templates for lossless 1080 & 720 HD output, for use at trade shows etc. My .veg files have a lot of large hi-res files which use crop/track motion effects for animation, as well as native HD video.
Does anyone know whether VBR or CBR is best in terms of absolute quality? I know this is probably one of those "How long is a piece of string?" questions, but I've been playing with the various settings for some weeks now. I'm still not sure and I'm getting bored now!
One thing I have cured - by putting all the settings to their highest quality and fiddling with the keyframe rates - is the smoothness of the track-motion and transition effects. I rarely have a specific problem, although I had one clip recently which seemed to freeze momentarily during a complex multi-layer video/audio section using VBR; using a CBR rendering template solved this.
I'd appreciate it if anyone has any experience they’d like to share about this.
Re: VBR vs. CBR for best HD output? by John Rofrano on May 12, 2009 at 6:24:51 pm
> I'm trying to come to a definitive decision about rendering templates for lossless 1080 & 720 HD output, for use at trade shows etc.
I don't think you really mean lossless, because that would require uncompressed HD video, which would be an extremely large amount of data. I believe you mean "best quality possible".
> Does anyone know if there VBR or CBR is best in terms of absolute quality.
VBR has the potential to yield better quality for the same number of bits. CBR will give the best quality when you can run at the maximum bitrate.
To understand why this is, first you have to understand what they both mean because absolute quality depends on bitrate:
Variable Bit Rate means that the encoder can vary the number of bits used to represent each frame so that an overall average bits-per-frame is achieved. It does this by stealing bits from frames with less information to encode (that don't need them) and giving them to frames with more information to encode (that do need them).
Constant Bit Rate means that each frame uses the same amount of bits regardless of whether it needs them or not.
If you encode a video at 6Mbit CBR and 6Mbit VBR, the VBR can have better quality. This is because it will vary the bitrate, perhaps giving some frames as much as 8 or 9Mbit while others only get 3 or 4Mbit, but in the end it will average out to 6Mbit. CBR, on the other hand, will never give any frame more than 6Mbit. If there are frames that need more than 6Mbit to encode, CBR will look worse than VBR.
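To make the arithmetic concrete, here's a toy sketch (the per-frame "needs" are made-up numbers, not real encoder measurements) showing how CBR caps demanding frames while VBR averages out to the same target:

```python
# Toy illustration of CBR vs VBR bit budgets (hypothetical numbers,
# not real encoder output). Target average: 6 Mbit per frame.
TARGET = 6.0

# Bits each frame "needs" to encode cleanly; they average out to 6 Mbit.
needs = [3.0, 4.0, 8.0, 9.0, 6.0]

# CBR: every frame gets exactly the target, no more, no less.
cbr = [TARGET for _ in needs]

# VBR: each frame gets what it needs, as long as the average stays on target.
vbr = list(needs)

# Frames starved under CBR are the ones that needed more than 6 Mbit.
starved = [n - TARGET for n in needs if n > TARGET]

print(sum(vbr) / len(vbr))   # average bitrate is still 6.0 Mbit
print(starved)               # the bits CBR couldn't supply to the hard frames
```

Both streams spend the same 30 Mbit in total; the difference is just where the bits go.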
Having said that... if you are making DVDs and the DVD spec says that 9Mbit is the highest bitrate you can use, then CBR at 9Mbit is the best quality you can get because each frame will use all 9Mbit. So if your project will fit on the media at the max bitrate, CBR will give you the best quality. If it won't, VBR 2-Pass should yield better results at the same bitrate.
BTW, VBR 2-Pass is a method of making two complete scans of the media: the first pass calculates the bits needed for each frame and formulates a budget for where to best spend them; the second pass actually does the encoding. It takes some of the guesswork out of budgeting bits, which is why VBR 1-Pass can look worse than CBR if bits are not allocated efficiently.
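As a rough sketch of the budgeting idea (the per-frame "complexity" scores below are invented for illustration; a real encoder would measure something like this during pass 1):

```python
# Sketch of 2-pass VBR budgeting (illustrative only, not a real encoder).
# Pass 1 would measure how hard each frame is to encode; here we just
# assume some complexity scores came out of that pass.
complexity = [1.0, 2.0, 4.0, 2.0, 1.0]

# Total bit budget: a target average of 6 Mbit per frame.
total_bits = 6.0 * len(complexity)

# Pass 2 spends the budget in proportion to each frame's complexity,
# so hard frames get more bits and easy frames give some up.
budget = [total_bits * c / sum(complexity) for c in complexity]

print(budget)       # the hardest frame gets the biggest share
print(sum(budget))  # the whole budget is spent: average is still 6 Mbit/frame
```

The point of the first pass is simply that the encoder knows the whole list of complexities before it spends a single bit, instead of guessing frame by frame.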
Re: VBR vs. CBR for best HD output? by John Matchett on May 13, 2009 at 8:15:46 am
Thanks for taking the time for your detailed reply - it was appreciated.
By the sound of it the 2-pass VBR will probably be the best bet for my projects. We're hoping to put in one of the new Core i7 machines soon, so the extra time taken to run the second pass shouldn't be too noticeable.