Jeff Hartman
Re: ATEM Audio level difference between analog and digital outputs?
on Dec 23, 2011 at 2:18:27 pm

Just a bit of background to help you interpret what's going on...

Unlike most units of measurement, decibels are a relative measure, always compared to something else. It's not like measuring X volts or saying something is N inches long; rather, it's like saying a signal is three times more than... something. So a reading in decibels is not just the bare number; you also need to know what that "something" is -- the reference.
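
To make the "relative" point concrete, here's a quick Python sketch (the function name is my own, purely for illustration): decibels for voltage are 20 times the log of a ratio, so the number means nothing until you know what's on the bottom of that ratio.

    import math

    def ratio_to_db(voltage_ratio):
        # For voltage, decibels are 20 * log10 of the ratio
        # between the signal and some stated reference.
        return 20 * math.log10(voltage_ratio)

    print(ratio_to_db(2.0))   # twice the reference voltage: about +6 dB
    print(ratio_to_db(0.5))   # half the reference voltage: about -6 dB

Same signal, different reference, different number -- which is why the reference always has to be named.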

In the world of analog equipment, one use of decibels is to describe the voltage of an audio signal; the reference used is the voltage you get when the signal delivers one milliwatt of power into a 600 ohm load (about 0.775 volts). That measurement is usually shown as dBm (the "m" signifying "milliwatt"). Most professional analog audio gear nowadays uses +4dBm as the nominal operating level; back when I started in television, +8dBm was the more common reference level. Generally speaking, most analog equipment can handle peak levels around +24dBm, which means there's about 20dB of headroom above the nominal operating level before clipping begins. The exact point of clipping varies from device to device, so a systems designer needs to be mindful to avoid serious mismatches that would lead to early distortion.
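
If you'd rather see that reference as arithmetic, here's a rough sketch (dbm_to_volts is just an illustrative name, not anything from a real device's API): 0dBm is the voltage that dissipates 1 milliwatt in 600 ohms, and every dB above that scales the voltage by 10^(1/20).

    import math

    def dbm_to_volts(dbm, load_ohms=600):
        # 0 dBm = 1 mW into the load, so V = sqrt(P * R);
        # higher levels scale the voltage by 10^(dBm/20).
        v_ref = math.sqrt(0.001 * load_ohms)   # ~0.775 V for 600 ohms
        return v_ref * 10 ** (dbm / 20)

    print(dbm_to_volts(4))    # nominal +4 dBm: about 1.23 V RMS
    print(dbm_to_volts(24))   # typical +24 dBm clip point: about 12.3 V RMS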

Now if you look at a piece of analog equipment with a VU meter -- a console or tape recorder will do -- you'll see a scale in decibels with zero about two thirds of the way to the right. This is measuring level with respect to the nominal operating level. So if you have a mixer that has a +4dBm reference level as its standard, you will see +4dBm when the meter reads 0; drop the level so the meter reads -10dB, and the output voltage will drop to -6dBm. Raise the level to +6dB, and the output becomes +10dBm. Where analog VU metering falls short is in representing peaks; very few analog meters go anywhere near the typical clipping point of +20dB VU. But in the analog world, it's pretty unusual to spend any real time there; moreover, a lot of analog devices can handle these overloads fairly gracefully.
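
The VU-to-dBm arithmetic is nothing more than addition, as a little sketch shows (again, my own function name, assuming a +4dBm-referenced device):

    def vu_to_dbm(vu_reading, reference_dbm=4):
        # A VU meter reads relative to the nominal operating level,
        # so the output level is just reference + reading.
        return reference_dbm + vu_reading

    print(vu_to_dbm(0))     #  0 VU  -> +4 dBm
    print(vu_to_dbm(-10))   # -10 VU -> -6 dBm
    print(vu_to_dbm(6))     #  +6 VU -> +10 dBm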

Digital equipment, on the other hand, does not suffer overload well at all. There's no room for "just a little bit more"; once the numbers hit the maximum value, that's as far as things will go. And voila: instant clipping. This is such a serious issue that a great many digital devices measure audio in terms of that brick-wall limit: the measurement is dBfs, decibels with respect to Full Scale. Since the reference is the maximum possible value, every dBfs measurement is zero or a negative number (which is the first thing that tends to hang people up). As a general rule, most digital gear assumes that the normal operating level is 20dB below the maximum, so on these devices -20dBfs will show as 0 VU and will produce +4dBm on an analog output.
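
Assuming that common -20dBfs = 0 VU = +4dBm alignment (the function below is just my sketch of the arithmetic, not anything built into the ATEM), the mapping between the two scales is one more addition:

    def dbfs_to_dbm(dbfs, alignment_dbfs=-20, nominal_dbm=4):
        # The offset from the alignment point carries straight
        # across to the analog side.
        return nominal_dbm + (dbfs - alignment_dbfs)

    print(dbfs_to_dbm(-20))   # alignment tone -> +4 dBm (0 VU)
    print(dbfs_to_dbm(0))     # digital full scale -> +24 dBm peak

Notice that full scale lands right at that +24dBm figure from earlier: the 20dB of digital headroom and the roughly 20dB of analog headroom line up by design.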

So... how does this play out in the real world? If you have a typical system with a mix of analog and digital equipment and throw tone on your mixer so that the VU meter shows 0dB, you should expect to measure a voltage of +4dBm on an analog output; feed it into a digital recorder, and you should see a reading of -20dBfs. This is entirely normal, and is exactly what you should be shooting for.
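
Going the other direction works the same way (same caveat: a sketch under the -20dBfs = 0 VU = +4dBm alignment assumption):

    def dbm_to_dbfs(dbm, nominal_dbm=4, alignment_dbfs=-20):
        # Predict the digital meter reading for a given analog level.
        return alignment_dbfs + (dbm - nominal_dbm)

    print(dbm_to_dbfs(4))    # 0 VU tone at +4 dBm -> -20 dBfs
    print(dbm_to_dbfs(24))   # +24 dBm peak -> 0 dBfs, the brick wall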

I could go on about the differences between average and peak metering, but that part is fairly intuitive and not directly pertinent to the issue you're having.

Regards,

Jeff

Jeff Hartman
Engineering Project Manager
Newport Television, Northeast

