I have a short film about 9 minutes long and I want to put it on a DVD-Video at the maximum quality possible. If Wikipedia is to be trusted, I have 9.8 Mbit/s to share between the video and audio. My audio is 48 kHz 16-bit stereo PCM (1.536 Mbit/s), which leaves 8.264 Mbit/s for the video.
As compression is not an issue (my content is only 9 minutes long), would a CBR encode at 8.2 Mbit/s be better than a VBR encode with an average bitrate of, say, 8 Mbit/s and a maximum of 8.2 Mbit/s?
Other than efficiency, is there any other upside to choosing VBR over CBR?
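For reference, the arithmetic behind the figures in the question can be checked with a few lines of Python (the 9.8 Mbit/s figure is treated here as a shared budget, as the question assumes):

```python
# DVD-Video bitrate budget check, using the figures from the question.
AUDIO_SAMPLE_RATE = 48_000   # Hz
BIT_DEPTH = 16               # bits per sample
CHANNELS = 2                 # stereo PCM

# Uncompressed PCM bitrate: sample rate x bit depth x channels.
pcm_audio_bps = AUDIO_SAMPLE_RATE * BIT_DEPTH * CHANNELS  # 1_536_000 b/s

# Budget assumed in the question: 9.8 Mbit/s shared by audio and video.
shared_budget_bps = 9_800_000
video_budget_bps = shared_budget_bps - pcm_audio_bps

print(pcm_audio_bps / 1e6)    # 1.536 Mbit/s
print(video_budget_bps / 1e6) # 8.264 Mbit/s
```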
The combined bitrate ceiling for audio and video is 10.08 Mbit/s... 9.8 Mbit/s is the limit for video only.
The problem is that most encoders aren't very accurate to the settings you give them. Even CBR is not very constant when you view your bitrate in an analyzer.
It's always good practice to be conservative. I'd set it at around 7.7 to 8 Mbit/s CBR if you use AC3 audio. If you must use PCM, I'd use 6.7 or so. Basically you need to stay away from the 9.8 Mbit/s ceiling... any spikes in bitrate can be an issue.
VBR is used when you must allocate bitrate across a longer piece... no worries for you here.
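As a quick sanity check of the headroom those conservative numbers leave under the 10.08 Mbit/s combined ceiling (the 448 kbit/s AC3 rate below is an assumption for illustration; your encoder may use a different rate):

```python
COMBINED_CEILING_BPS = 10_080_000  # DVD-Video combined audio+video ceiling

pcm_bps = 48_000 * 16 * 2  # 1_536_000 b/s, 48 kHz 16-bit stereo PCM
ac3_bps = 448_000          # assumed AC3 bitrate (illustrative only)

# Video rates suggested above: ~8 Mbit/s with AC3, ~6.7 Mbit/s with PCM.
for label, audio_bps, video_bps in [("AC3", ac3_bps, 8_000_000),
                                    ("PCM", pcm_bps, 6_700_000)]:
    headroom = COMBINED_CEILING_BPS - audio_bps - video_bps
    print(f"{label}: {headroom / 1e6:.3f} Mbit/s headroom for spikes/overhead")
```

Either way you keep well over 1.5 Mbit/s of slack for encoder overshoot and mux overhead.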
Also, the notion that players widely balk at bitrates above 7.5 Mbit/s on burned media is complete BS.
Eric, I've always understood that VBR is the best way to go regardless of the space available. Encoding in BitVice, for example, with VBR you get a two-pass encode, which should give higher quality than a one-pass CBR encode. This is what the development engineer at Innobits (makers of BitVice) tells me.
With VBR you have to set an average figure, along with a low and a high, and you're right: you don't want your average much higher than 7.5 Mbit/s if you want to be absolutely certain that all players will play the disc every time.
Players have certainly improved in this area. But as a battle-scarred veteran of the beginnings of DVD, I can tell you that many early players - and practically all laptops in the early days - tended to choke on anything higher than 7 Mbit/s or so. I'm guessing some still do.
Thanks for all of the advice so far. I've done a bit of testing with AME and I've come to learn that you are spot on, Eric: encoders are incredibly inaccurate. I've done some test exports, as I said, and I'm a bit confused at this stage. Maybe someone can help me clear up what this all means:
As you can see, for this test the bitrate was set at 8 Mbit/s.
And here, I set it at 1.5 Mbit/s.
The audio tracks on both files were AC3.
I'm curious as to why the overall bitrate (and file size) for the 1.5 Mbit/s version is higher than for the clip at 8 Mbit/s. Also, the overall bitrate stays more or less the same regardless of my settings.
EDIT: I think it has something to do with the way AME is muxing the files. In CBR mode it adds padding packets to maintain a constant bitrate. Is it an issue if the displayed overall bitrate is over the 10.08 Mbit/s ceiling?
The issue is that the values reported by MediaInfo aren't exact, because it doesn't parse the actual file. It either takes the value in the file header, or it estimates one by taking the total file size, subtracting the audio size, and dividing by the run time.
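That estimation method can be sketched in a few lines (the file sizes below are hypothetical, and the function name is mine; note the estimate lumps mux overhead and padding in with the video, which is one reason the reported figure drifts from the encoder setting):

```python
def estimated_video_bitrate(total_bytes, audio_bytes, duration_s):
    """Estimate video bitrate the way described above:
    (total size - audio size) / runtime, in bits per second.
    Mux overhead and padding are counted as video, so this overshoots."""
    return (total_bytes - audio_bytes) * 8 / duration_s

# Hypothetical 9-minute file: 560 MB total, with AC3 audio at 448 kbit/s.
duration_s = 9 * 60                         # 540 seconds
audio_bytes = 448_000 * duration_s // 8     # 30_240_000 bytes of AC3
total_bytes = 560_000_000

est = estimated_video_bitrate(total_bytes, audio_bytes, duration_s)
print(f"{est / 1e6:.2f} Mbit/s")  # ~7.85 Mbit/s for these made-up numbers
```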