A question about dBVU and dBFS and distortion
Posted: Fri Mar 13, 2009 5:38 am
Hi Joe (et al),
There has been a lot of controversy about gain staging with regard to digital vs. analog gear. To summarize, here's what I've learned:
Most analog gear is designed to run at 0dBVU, with headroom above that generally +18dBVU to +22dBVU (allowing for some variation).
Most meters inside the sequencing software measure dBFS, and 0dBFS generally = +18dBVU to +22dBVU (again, allowing for variation).
Factually, I'm pretty sure I've got that right.
So here's where things get a bit troublesome. If my analog gear is designed to run at an optimum signal level of 0dBVU, then when I am using my sequencer's meters (in the computer software) I should see about -18 to -22 dBFS depending upon calibration, right?
Wouldn't running hotter than that be simply increasing distortion in the analog device? In other words, if I'm reaching say -4dBFS on my digital track meter, I'd be running around +14dBVU. Isn't that about 14dBVU too hot for optimum performance?
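To make the arithmetic above concrete, here is a minimal sketch, assuming a 0 dBFS = +18 dBVU calibration (the offset varies by converter; 18 to 22 are just the commonly quoted endpoints, and the function name is mine, not from any standard library):

```python
def dbfs_to_dbvu(dbfs_level, calibration=18.0):
    """Convert a digital meter reading (dBFS) to the analog level (dBVU),
    given a calibration where 0 dBFS corresponds to +calibration dBVU."""
    return dbfs_level + calibration

print(dbfs_to_dbvu(-18.0))  # 0.0  -> nominal analog operating level
print(dbfs_to_dbvu(-4.0))   # 14.0 -> 14 dB above the analog sweet spot
```

So a -4 dBFS peak on the digital meter really does imply roughly +14 dBVU through the analog chain under that calibration, which is the heart of the question.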
People generally try to run the digital meters up near 0dBFS to get "the hottest signal possible" but again... doesn't that overdrive the optimum level of the preamp or other analog device?
I'm asking because I know that Joe (and probably some others among you) likely worked on analog studio gear before the age of digital. Was it typical to run your preamps or other analog gear at +whatever dBVU all the time, or did you guys shoot for 0dBVU?
Consider the can of worms open!
Aric Keith