Old 09-03-2014, 01:17 AM   #81
Reno.thestraws
Human being with feelings
 
Reno.thestraws's Avatar
 
Join Date: Nov 2009
Location: Belgium
Posts: 10,474
Default

The Lavry summary:

- affirms the value of setting the output stage of analog gear earlier in the signal chain to optimum levels

That's the whole point.

But if you can't?

How do you set the output stage of analog gear on a Fireface 800?

You can't, because there's no output or trim control on the preamps, or even a VU meter. (Digicheck permits it, but if you set your preamp at 0 VU you'll be at -15 or -9 dBFS RMS in your DAW, depending on the alignment you choose.)

So, if you want to be sure you don't overload the analog part of your signal chain, you have to rely on your digital levels.

And, like I said before:

0 VU = 1.23 V = +4 dBu (the optimal sweet spot of analog gear) = -18 dBFS RMS (standard, but can vary from gear to gear)
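
A quick back-of-the-envelope check of those figures in Python (my own sketch; the +22 dBu full-scale reference is just an illustrative assumption -- your converter's spec may differ):

Code:
# Rough sanity check of the figures above. 0 dBu is defined as 0.7746 V RMS;
# the +22 dBu full-scale reference is an assumption for illustration only --
# it varies from interface to interface.
import math

def dbu_from_vrms(v_rms):
    return 20 * math.log10(v_rms / 0.7746)

def dbfs_from_dbu(dbu, full_scale_dbu):
    # A converter whose analog full scale (0 dBFS) sits at full_scale_dbu.
    return dbu - full_scale_dbu

print(round(dbu_from_vrms(1.228), 1))   # ~ +4.0 dBu, i.e. 0 VU
print(dbfs_from_dbu(4.0, 22.0))         # -18.0 dBFS with a +22 dBu reference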
__________________
http://www.residenceemilia.com
Reno.thestraws is offline   Reply With Quote
Old 09-03-2014, 04:20 AM   #82
clepsydrae
Human being with feelings
 
clepsydrae's Avatar
 
Join Date: Nov 2011
Posts: 3,409
Default

Quote:
Originally Posted by Reno.thestraws
So, you didn't test the input, but the output. That means that you REDUCE the volume of the output (and so the SNR) before reinjecting.
I see your point: yes, the SNR goes down when reducing the output level due to the noise floor of the output, and I should have isolated for the SNR reduction on the input only. Let's do your version:

Quote:
- Play an 8 kHz tone on your speaker (not too loud)
- Place an SM57 or a similar dynamic microphone in front of your speaker (DO NOT MONITOR THE INPUT or you'll get feedback)
- Plug the SM57 into a preamp
- Create a new track in REAPER with the input set to your SM57 (do not monitor the input!)
- Set the preamp to achieve a level of -0.5 dBFS in REAPER on the track input and record
- Now mute the track you just recorded and create a new one
- Set the preamp to achieve a level of -18 dB RMS in REAPER on the track input and record
- Normalize the two recorded tracks and make your analysis again
I have done this now, with an AKG C-535-EB into a preamp on the FireStudio. (The 535 is a condenser -- is there a reason to favor a dynamic? Unfortunately I only have condensers...) Same basic result: the hotter signal has less noise. There is not as big a difference as before, but it's still worse at the lower level. (Of course, all this noise is effectively silent.)



I presume that besides the pre-amp stage there is another analog component in the ADC that adds a constant amount of noise, regardless of the gain (the Lavry tech mentioned "intermediate stages" between the input and the actual sampling AD converter). So you turn down the preamp, and along with it the signal, but there is other noise that does not change, and the SNR is reduced.
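
For anyone who wants to put numbers on the comparison rather than eyeballing a spectrum, here's a rough sketch of one way to do it (hypothetical filenames, and not the exact analysis pictured here): peak-normalize each take, then measure everything except the 8 kHz tone, so the figure includes noise plus harmonics.

Code:
# Rough sketch for comparing residual noise+distortion of the two takes;
# "take_hot.wav" / "take_quiet.wav" are hypothetical filenames. The absolute
# number depends on FFT scaling, so only the difference between files matters.
import numpy as np
from scipy.io import wavfile

def residual_db(path, tone_hz=8000.0, notch_hz=200.0):
    rate, data = wavfile.read(path)
    x = data.astype(np.float64)
    if x.ndim > 1:
        x = x.mean(axis=1)                     # fold to mono
    x = x / np.max(np.abs(x))                  # peak-normalize, as in the test
    power = np.abs(np.fft.rfft(x * np.hanning(len(x)))) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / rate)
    keep = np.abs(freqs - tone_hz) > notch_hz / 2   # drop bins around the tone
    return 10 * np.log10(power[keep].mean())

print(residual_db("take_hot.wav"))     # peaks near -0.5 dBFS
print(residual_db("take_quiet.wav"))   # around -18 dBFS RMS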

I'm glad you had me do this second test to isolate to the input: the noise issue (which was never a significant problem) is less than we saw before, but it is still there.

All I want to show is that the signal doesn't get worse at -0.1, and that seems pretty clear both from these tests and the listening tests I've done, not to mention the four manufacturers I've been able to track down and the other authorities quoted.

Quote:
Originally Posted by Reno.thestraws View Post
So, if you want to be sure you don't overload the analog part of your signal chain, you have to rely on your digital levels
I understand what you mean: it's a point also made by others here, that getting levels at -18 dBFS RMS (or whatever particular calibration) may be a good idea for some or most analog gear coming before the ADC.

But a lot of people advocating for -18 dBFS RMS present it the opposite way: they say that you need to do it because "hitting your converters too hard makes it sound bad, distorts their input, etc". That's the point I'm trying to get at.

And it's looking increasingly like that part is in fact wrong. We just need to stop saying it backwards.

I propose this rephrasing:

Quote:
Originally Posted by rephrasing
Recording to levels at -18 dBFS RMS is generally a great idea; it doesn't change anything as far as your ADC is concerned (except adding an insignificant amount of noise, and possibly an even less significant amount of THD), but it leaves you a safe amount of headroom and tends to result in you setting appropriate output levels from other analog gear going into your ADC, and depending on your plugins it may result in more convenient levels once in the DAW; however, some analog gear may sound better outputting hotter or quieter, so use your ears, and don't worry about 'driving your ADC too hard' -- as far as any contemporary and reasonable ADC is concerned, the harder the better; just prioritize the optimal levels for the analog gear before the ADC, and make sure you have enough headroom to guarantee that clipping won't happen.
clepsydrae is online now   Reply With Quote
Old 09-03-2014, 04:57 AM   #83
Reno.thestraws
Human being with feelings
 
Reno.thestraws's Avatar
 
Join Date: Nov 2009
Location: Belgium
Posts: 10,474
Default

The result seems weird. Did you leave the tone volume at the same level through your speakers for both takes? It's near impossible to get more noise by recording with less gain on the preamp.

If you turned down the speaker volume, reduced the output gain, or forgot to mute take 1, you falsified the results.
__________________
http://www.residenceemilia.com
Reno.thestraws is offline   Reply With Quote
Old 09-03-2014, 05:01 AM   #84
Lawrence
Human being with feelings
 
Join Date: Mar 2007
Posts: 21,551
Default

Quote:
Originally Posted by clepsydrae View Post
Hmmm, well, it's the latter that I'm on about... In terms of not-ancient ADCs made by companies that anyone has heard of, I'm not convinced that there is such a thing as "overdriving unfavorably" if you're peaking at the analog input such that the digital signal exhibits peaks just under 0 dBFS.
Are you actually reading what you're responding to? I specifically talked about overdriving a preamp, not the ADC.

Quote:
Sorry to nit-pick, but just in case anyone is confused by this: AFAIK the RMS will be ~71% of a sine test tone, and AFAICT the meter in Reaper at least reflects this as well (unless it's displaying with a "display offset", which it does by 14dB by default):
Yes, that is a really huge "propeller beanie" nitpick. We're talking reality, not the physics lab. Case in point, here is a 440 Hz sine wave where the peak and RMS are - for all practical purposes - metering the same visually. The white line is RMS...



I don't need to reduce that to scientific numbers on a chart to make the basic point about the difference between that and the gap in a snare drum signal. That's another problem with digital recording: too many people worrying about minutiae that have no real practical context.

Last edited by Lawrence; 09-03-2014 at 06:36 AM.
Lawrence is offline   Reply With Quote
Old 09-03-2014, 05:11 AM   #85
Reno.thestraws
Human being with feelings
 
Reno.thestraws's Avatar
 
Join Date: Nov 2009
Location: Belgium
Posts: 10,474
Default

Quote:
Originally Posted by Lawrence View Post
Are you actually reading what you're responding to? I specifically talked about overdriving the preamp, not the ADC.
Same for me.

For the ADC, there's no difference between recording a -1 peak level signal and a -18 peak signal. The difference is in the analog part of the chain, before the ADC.


When you try to reach -1 on your ADC instead of -18, you use more gain on your preamp.

More gain = more noise
__________________
http://www.residenceemilia.com
Reno.thestraws is offline   Reply With Quote
Old 09-03-2014, 06:59 AM   #86
Lawrence
Human being with feelings
 
Join Date: Mar 2007
Posts: 21,551
Default

Quote:
Originally Posted by Reno.thestraws View Post
More gain = more noise
I understand what you mean above but some may misinterpret it.

The stuff we typically do (close mic'd pop recording) rarely even requires really high preamp input gain and the input signal is typically so far above the equipment noise level as to not matter... unless the front end is generating some unusual noise levels for some reason.

Like that Real Traps guy said, "if you can't hear it". You could mix a Hitler speech into your song at a certain low level and nobody would even hear it. See "Subliminal Advertising".

But to your correct point, a good (less noisy?) pro preamp will stand up much better in cases where you actually need high input gain and the source signal is not 6" from the mic ... where the really quiet or much quieter sections of that track are 'all preamp gained way, way up' and nothing else. We don't run into those situations because we don't typically need that much input gain to record stuff that's right up on the mic.

In a larger pro studio room an ambient mic might be a long way away from the source(s) and gained way up. In that case you'd maybe actually hear the Audiobox or similar pre-amp self-noise in the monitors.

Last edited by Lawrence; 09-03-2014 at 07:04 AM.
Lawrence is offline   Reply With Quote
Old 09-03-2014, 07:22 AM   #87
Reno.thestraws
Human being with feelings
 
Reno.thestraws's Avatar
 
Join Date: Nov 2009
Location: Belgium
Posts: 10,474
Default

A lot of microphones need a lot of gain, e.g. the SM7B, which is my go-to mic for vocals. On a Fireface preamp, or any built-in preamp, you have to push it more than 75% of the way up to reach -18 dBFS. If I push harder to get "near 0" I clearly hear a difference.

As I said before, if you know what you're doing, then OK, forget about recording levels and trust your ears. But if you work with amateur hardware and amateur monitoring, it's a safe bet to record with headroom and avoid unwanted noise that can be unnoticeable on one track but quickly becomes a problem when layering dozens of tracks.
__________________
http://www.residenceemilia.com
Reno.thestraws is offline   Reply With Quote
Old 09-03-2014, 07:45 AM   #88
Lawrence
Human being with feelings
 
Join Date: Mar 2007
Posts: 21,551
Default

Quote:
Originally Posted by Reno.thestraws View Post
A lot of microphones need a lot of gain, e.g. the SM7B, which is my go-to mic for vocals. On a Fireface preamp, or any built-in preamp, you have to push it more than 75% of the way up to reach -18 dBFS.
Sure, but (obviously) a close mic'd signal needs less input gain to reach the same signal levels, so the point there was of a comparative nature. But yeah, I hear you and understand.

But (of course) how much gain you need for a close mic'd vocal kinda directly depends on how loud the singer is singing.

Last edited by Lawrence; 09-03-2014 at 07:50 AM.
Lawrence is offline   Reply With Quote
Old 09-03-2014, 08:18 AM   #89
karbomusic
Human being with feelings
 
karbomusic's Avatar
 
Join Date: May 2009
Posts: 29,260
Default

Quote:
Originally Posted by Lawrence View Post
Sure, but (obviously) a close mic'd signal needs less input gain to reach the same signal levels, so the point there was of a comparative nature. But yeah, I hear you and understand.

But (of course) how much gain you need for a close mic'd vocal kinda directly depends on how loud the singer is singing.
Not directed at anyone, especially not you, since you've sort of already said what I am about to say earlier... Back in the day, before armchair PhDs were all the rage (again, no one here)...

We would follow certain procedures (whatever they might be), and the result of doing so took care of all this extra minuscule inspection and constant double-thinking of everything. That means no one has to decide any of this and, most importantly of all, no one needs to assume or worry about what does or doesn't matter under what conditions, period. In other industries they do it so that the 1-in-1000 exception doesn't kill you; here it's so you don't have to second-guess every step you take all the time.

Who on earth other than the guy actually testing the equipment in the R&D lab or someone suffering from OCD would not want to do that?
__________________
Music is what feelings sound like.
karbomusic is offline   Reply With Quote
Old 09-03-2014, 08:26 AM   #90
Lawrence
Human being with feelings
 
Join Date: Mar 2007
Posts: 21,551
Default

Right. Here's the old school approach...

"Ok, that sounds good, now lets bring up the ambient mics. Wait, I hear some noise, lets track that down. Ok, that preamp choice is maybe too noisy for this particular use, let's swap it out with another one."

No drama, no complicated math formulas, just humans using their ears, and monitors gained up high enough to allow their ears to hear any low level issues before they hit the record button. Easy. No minutia involved.

Of course, the above is a hypothetical, as any pro engineer would never put up something he's not already familiar with, like putting Art Tube Pres on all the ambient mics or whatever.

But generally speaking, that's the single biggest difference between "them" and us, that we can't wait to "mix" and post our songs on the net so we end up spending hours fixing stuff that we should have fixed on the front end before recording it, while they don't record anything until it already sounds good. Probably explains why their monitor mixes sound better than our final mixes.

Last edited by Lawrence; 09-03-2014 at 08:39 AM.
Lawrence is offline   Reply With Quote
Old 09-03-2014, 08:55 AM   #91
Bristol Posse
Human being with feelings
 
Bristol Posse's Avatar
 
Join Date: Jan 2011
Location: Southern California
Posts: 642
Default

Quote:
Originally Posted by Lawrence View Post
But generally speaking, that's the single biggest difference between "them" and us, that we can't wait to "mix" and post our songs on the net so we end up spending hours fixing stuff that we should have fixed on the front end before recording it, while they don't record anything until it already sounds good. Probably explains why their monitor mixes sound better than our final mixes.
Tend to agree with this and also thank clepsydrae for the OP and interesting discussion

that's the problem with the net and all these opinions or guidelines that turn into hard and fast rules parroted without any regard for context or original intent

People end up focusing on unnecessary minutiae and missing the larger picture while they search for magic bullets that don't really exist
Quote:
Originally Posted by Bruce Lee
It's like a finger pointing away to the moon. Don't concentrate on the finger or you will miss all that heavenly glory
Bristol Posse is offline   Reply With Quote
Old 09-03-2014, 09:16 AM   #92
BobF
Human being with feelings
 
BobF's Avatar
 
Join Date: Apr 2013
Posts: 699
Default

I agree. Thanks for the OP, the detailed presentation and the great discussion. Despite what appears to be the final summary, there are a few little gems in this thread that can be put to practical use.
BobF is offline   Reply With Quote
Old 09-03-2014, 02:40 PM   #93
clepsydrae
Human being with feelings
 
clepsydrae's Avatar
 
Join Date: Nov 2011
Posts: 3,409
Default

Quote:
Originally Posted by Reno.thestraws View Post
The result seemed weird.
Why? There is going to be a noise floor inherent in the analog part of the ADC, AFAIK. Recall that Apogee said that lowering the input to the ADC will result in more noise, and that Jim Williams said, "With a wide dynamic operating area, levels are not important anymore. Only the residual background noise will vary. Here I hear no difference in +4 or +12 levels, but any noise is reduced at those higher operating levels."

Quote:
Did you leave the tone volume at the same level through your speakers for both takes? It's near impossible to get more noise by recording with less gain on the preamp.
I did not alter the outgoing signal in the DAW, I didn't alter the outgoing signal strength with any knobs, I didn't touch the mic or the speaker... I didn't even shift in my chair. I stopped the recording, muted the newly-recorded track, armed the other track, set the gain knob so that peaks went to -0.1, and recorded.

Quote:
Originally Posted by Lawrence View Post
Are you actually reading what you're responding to? I specifically talked about overdriving a preamp, not the ADC.
Sorry for misreading -- given the history of the thread and how many people have talked about overdriving the analog side of the ADC, I was biased to interpret what you wrote incorrectly.

I'm also skeptical that on modern preamp/ADC combo boxes (like the PreSonus FireStudio series, other prosumer products, and high-end products designed to be "clean") driving the preamps harder will matter any more than driving the ADCs, but that's another thread (for someone else to start. :-) )

Quote:
Yes, that is a really huge "propeller beanie" nitpick. We're talking reality, not the physics lab.
People might interpret what you wrote to mean that peaks of a sine wave are the same as RMS, and that might have real repercussions if they are calibrating gear, etc. 71% is not the same as 100% "for all practical purposes", AFAIK, but it's not my area of expertise. [update: I was wrong here -- in the audio world, true RMS is biased 3 dB in an "RMS" meter, although not in Reaper's master meter by default. See below]

Quote:
Originally Posted by Lawrence
Case in point, here is a 440 Hz sine wave where the peak and RMS are - for all practical purposes - metering the same visually. The white line is RMS...

Not sure what meter you're using, but here's what the difference looks like on Reaper's default master track (also 440Hz):



Perhaps you have a 3dB gain offset being applied? Engaging a 3dB offset makes them appear the same in Reaper's master meters.

Quote:
Originally Posted by Reno.thestraws View Post
When you try to reach -1 on your ADC instead of -18, you use more gain on your preamp.
First, I never have suggested gaining up a preamp to achieve -0.5 just for the sake of getting that level into the ADC. I have suggested that if someone is really worried about noise, the evidence I've gathered suggests that recording hotter into the ADC is better. Obviously if that hurts the signal because the analog gear ahead of the ADC performs worse, as with your SM7b example, or as with my old PreSonus Firebox which had preamps that would add a ton of digital hashy noise crap when they were turned all the way up, then it's not worth the small noise improvement at the ADC.

But the point is that if you have analog gear running at optimal levels and you go into your ADC and the level is peaking at -0.5 dBFS, there is no reason you need to turn anything DOWN in order to get to -18 dBFS RMS: as far as I can tell, there will be no advantage to the audio quality, provided that clipping doesn't happen, and it seems that there certainly won't be any magical improvement in tonality, etc etc.

Quote:
Originally Posted by Reno.thestraws
More gain = more noise
More gain also = more signal; less gain = less noise and less signal, so the overall SNR doesn't necessarily change as simplistically as this formulation may suggest.

Let's accept that a preamp will have lower SNR as the gain goes up. Even so, when you then go into the input stage of an ADC, hotter = less noise (basic gain staging), so these two dynamics are working against each other: a hotter preamp = worse SNR from the preamp and better SNR at the analog part of the ADC. Which one will win? On my PreSonus FireStudio Mobile, it's clear that turning the preamp down to achieve -18 dBFS RMS resulted in a microscopic bit of increased noise, no audible differences, and no obvious change in THD.
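
To make the "which one wins" question concrete, here's a toy model in Python with entirely made-up numbers (an 80 dB source SNR that I assume stays constant with gain, and a fixed -110 dBFS analog floor inside the ADC). It isn't a measurement of any real interface, just an illustration of why a fixed ADC floor favors recording hotter:

Code:
# Toy model only -- made-up numbers, not a measurement of real gear.
import math

def to_power(db):
    return 10 ** (db / 10.0)

def recorded_snr_db(signal_dbfs, source_snr_db=80.0, adc_floor_dbfs=-110.0):
    """signal_dbfs: level the preamp delivers to the ADC input.
    source_snr_db: SNR of mic+preamp output (assumed constant with gain here).
    adc_floor_dbfs: fixed analog noise inside the ADC, independent of gain."""
    signal = to_power(signal_dbfs)
    source_noise = to_power(signal_dbfs - source_snr_db)  # rides up/down with gain
    adc_noise = to_power(adc_floor_dbfs)                  # does not move
    return 10 * math.log10(signal / (source_noise + adc_noise))

print(round(recorded_snr_db(-18.0), 1))   # ~79.7 dB
print(round(recorded_snr_db(-0.5), 1))    # ~80.0 dB -- marginally better hotter

If the preamp's own SNR really does degrade significantly at high gain, that pulls the other way, which is exactly why the net result has to be measured per device.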

Last edited by clepsydrae; 09-03-2014 at 08:47 PM.
clepsydrae is online now   Reply With Quote
Old 09-03-2014, 03:03 PM   #94
Bristol Posse
Human being with feelings
 
Bristol Posse's Avatar
 
Join Date: Jan 2011
Location: Southern California
Posts: 642
Default

Quote:
Originally Posted by clepsydrae View Post
Not sure what meter you're using, but here's what the difference looks like on Reaper's default master track (also 440Hz):



Perhaps you have a 3dB gain offset being applied? Engaging a 3dB offset makes them appear the same in Reaper's master meters.
Here's a quote from Bob Katz over at Gearslutz that might help with this but in a nutshell you need to apply the 3dB offset or your RMS meters are "wrong"

Quote:
Originally Posted by bob katz View Post
There is an official AES and IEC-supported standard for RMS meter calibration. All RMS meters should support that standard or else they are wrong. I won't get into the technical and scientific explanations for why the IEC 61606:1997 standard is the correct one to use, as all you want to do is find a meter that's correct.

There are a number of terminologies that people try to use to describe a meter which is correctly-calibrated. Some people speak about "sine wave-based" and "square-wave based" calibration, but I find that confusing. Here's how to test if the meter is correct:

1) Send in a sine wave whose peak level is x. The RMS level in dBFS should read the SAME as the peak. If it reads 3 dB lower, it does not meet the standard.

2) Download the -20 dBFS RMS pink noise signal from our site. Digital Domain-Downloads

If this signal reads (on the average) -20 dBFS on your RMS meter, then you can feel reasonably comfortable the meter is accurate and conforms to the standard. This test is not 100% correct because it does not test the actual RMS algorithm, it only tests how the meter behaves with two different test signals, but at least this test should show that the meter reads correctly with these two types of signals.

If the meter proves to be off, you will be doing the folks a service by citing the IEC standard and telling them that it does not meet the standard. Tell them to get in touch with me if they want to argue the science behind it, and I'll explain to them the historical precedents and the logic behind it.

As for the Roger Nichols, I understand there was a controversy over who owns the company. RN Digital (the new company) produces a meter called "Inspector XL" and the K-System meters in there meet the standard. Wavelab since version 6 is correct. Previous versions were 3 dB off.

I hope this helps,


Bob
Full thread here http://www.gearslutz.com/board/maste...rms-meter.html
Bristol Posse is offline   Reply With Quote
Old 09-03-2014, 04:15 PM   #95
clepsydrae
Human being with feelings
 
clepsydrae's Avatar
 
Join Date: Nov 2011
Posts: 3,409
Default

Quote:
Originally Posted by Bristol Posse View Post
Here's a quote from Bob Katz over at Gearslutz that might help with this but in a nutshell you need to apply the 3dB offset or your RMS meters are "wrong"
Thanks for the correction -- I understand now from that thread that there is a conventional "RMS" in the audio world that is 3 dB different from a true RMS. Good to know about.

I'm curious why Reaper's RMS meter doesn't default to using this offset? Anyway, it's a small side issue.
clepsydrae is online now   Reply With Quote
Old 09-04-2014, 12:19 AM   #96
clepsydrae
Human being with feelings
 
clepsydrae's Avatar
 
Join Date: Nov 2011
Posts: 3,409
Default

Focusrite has gotten back (they are responding here to the same email as Apogee and RME, posted above). Same story:

Quote:
Originally Posted by Focusrite
The -18dBFS is usually recommended as this is mathematically the same as 0VU in an analog environment. Turning signals up louder than this is thus louder than you would usually use.

However the sound characteristics of the unit should remain the same at all levels, unless clipping, just louder or softer.

While -18dBFS is a good goal for an incoming signal, many people go louder or softer as needed. That level just leaves good headroom for later signal amplification, such as compression, mixing and mastering techniques.

If bringing in audio you are not going to alter, such as from a vinyl record, the level you suggest, -0.5 or so with no clipping ever occurring, should work fine, just don't clip!
clepsydrae is online now   Reply With Quote
Old 09-06-2014, 09:18 PM   #97
Aesis
Human being with feelings
 
Join Date: Jan 2011
Posts: 445
Default

Finally someone actually did a real test. Thank you Clepsydrae.

You should use a clean pre for the test and no mics at all, though; the second harmonic is so strong that it's fairly difficult to see how much of that the higher gain adds. Also, 8 kHz is too high a frequency for detecting the non-aliasing harmonics. But this is a great start, thanks again.
Aesis is offline   Reply With Quote
Old 09-09-2014, 11:06 PM   #98
yep
Human being with feelings
 
Join Date: Aug 2006
Posts: 2,019
Default

Sorry, I didn't read the whole thread.

A lot of DAW and soundcard manuals tell you to record as close to 0dB as possible before clipping for best sound quality. Most of them are wrong.

Sound is analog before it is digital. The same devices that tell you to record as hot as possible often lack the internal current-handling capacity to deal with signals close to 0 dB. The analog front end can clip and distort well before the AD converter registers a digital "over".

Everyone has to suss this out for themselves. If you record a vocal track or a DI bass and the sound is clipped and distorted, then pointing to the unlit "clip" LED is pointless. You can't rely on lights to tell you whether it sounds good; they just help point out technical conditions. If it's clipped and distorted, you can't point to the unlit LED and prove it sounds good.
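
If it helps to visualize, here's a crude simulation (my own, with arbitrary numbers) of an analog stage whose clip point corresponds to -3 dBFS: the recorded waveform is already flat-topped, yet the digital peak never reaches 0 dBFS and no "over" is registered.

Code:
# Crude illustration only: an analog stage that hard-clips at a level
# equivalent to -3 dBFS, followed by an ideal converter. The numbers are
# arbitrary; real analog stages tend to round off rather than clip this cleanly.
import numpy as np

fs = 48000
t = np.arange(fs) / fs
source = 0.9 * np.sin(2 * np.pi * 100 * t)   # what the player actually produced

analog_ceiling = 10 ** (-3 / 20)             # this stage flat-lines at "-3 dBFS"
after_analog = np.clip(source, -analog_ceiling, analog_ceiling)

peak_dbfs = 20 * np.log10(np.max(np.abs(after_analog)))
flattened = int(np.sum(np.abs(after_analog) >= analog_ceiling * 0.999))
print(f"digital peak: {peak_dbfs:.1f} dBFS")   # -3.0 dBFS -- no digital over
print(f"flat-topped samples: {flattened}")     # plenty of clipping anyway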
yep is offline   Reply With Quote
Old 09-10-2014, 10:33 AM   #99
clepsydrae
Human being with feelings
 
clepsydrae's Avatar
 
Join Date: Nov 2011
Posts: 3,409
Default

Quote:
Originally Posted by yep View Post
Most of them are wrong.
I appreciate that manufacturers may be wrong about this and that the ground-truthed reality may differ from expectations, but until someone somewhere on the internet posts an example I have to remain highly skeptical that it is a risk that deserves mentioning.

This thread collected explicit responses from RME, Lavry, Apogee, Focusrite, and Jim Williams from Audio Upgrades, and quoted an article from Sound on Sound, all of which denied that hotter signals will make any difference, and a couple of which pointed out ways in which it will be marginally better. (Most of them also favored -18 dBFS audio-RMS average levels into the interface, but for reasons unrelated to "overdriving" the ADC input.)

My own gear (PreSonus Firestudio Mobile) tests as expected: slightly more noise when recording lower, no other perceptible difference.

Quote:
Everyone has to suss this out for themselves. If you record a vocal track or a DI bass and the sound is clipped and distorted, then pointing to the unlit "clip" LED is pointless. You can't rely on lights to tell you whether it sounds good, they just help point out technical conditions. If it's clipped and distorted, you can't point to the unlit LED and prove it sounds good.
Agreed. But until one of these people with one of these units posts an example, and a person with the same unit can repeat their test, I'm left assuming that this concern of distortion at hot levels is a myth, personally. There's no presented evidence for it being true (as far as I can find) and a fair amount against.

For my personal purposes, I'm defining "evidence" here as one of:
- people running a test, describing it, and posting the results
- a manufacturer of a device stating something about it
- a professional electrical audio engineer stating something about it

I'm discounting "evidence" of these forms:
- someone on a forum saying it is so
- someone on a forum saying they have heard it but not posting an example
- a famous recording engineer saying it isn't so
- a famous recording engineer saying it is so*

*that last one, however, is why I'm interested to figure it out. I have heard it repeated by people with Serious Experience, and my instinct is to trust those people who have bucketloads more experience than I do. However, even famous/experienced people can be guilty of confirmation bias and of accepting things without testing them.

edit: Sound on Sound was quoted above, and they admittedly don't fall into my "evidence" category, but what I know of their writing made me consider them borderline qualified, so I included them.
clepsydrae is online now   Reply With Quote
Old 09-11-2014, 01:10 PM   #100
ashcat_lt
Human being with feelings
 
Join Date: Dec 2012
Posts: 7,271
Default

It is a fact that many preamps have an intrinsic noise floor which is completely independent of gain. It is also true that most decent preamps don't actually make much (if any) more noise just because they're adding gain. Obviously, they will bring up the level of any noise they are fed. This means that yes, in fact, many preamps have better internal S/N ratio at reasonable to high gain levels than at lower levels. I came to this information when I was kind of shooting the other way - thinking about recording everything (including microphones) at unity and gaining up only once it gets to digital. At a certain point on everything I tried, the noise became more of a problem at lower gain levels.

But I did run my own version of the sort of test outlined above just the other day. I don't have pictures or samples or exact numbers, but...

My studio "converter" is actually a Fostex D2424LV light-piped to a PCI card in my computer. I have 16 channels as inputs and 8 channels as outputs on a patchbay. The first 8 inputs are normalled to the pres in a Nady PRA-8. Not super awesome any way around, but it all works pretty well for my needs.

The only way I could figure to actually make this work with what I had on hand was to run a test tone out two of the outputs at a level close to 0dbfs, through a stereo DI (I think the -20db pad was on, on top of the step down from the transformer itself), into the first two preamp channels, and then back into Reaper. I then tweaked the preamp gains so that channel 1 was hitting RMS around -18dbfs, and channel 2 was as loud as possible without hitting 0. Recorded, normalized peaks, and brought them up through SPAN.

There was no visible difference in harmonic content between the two. There was some distortion, but it's impossible to say where except, I would guess "before the preamp", since it's the same in both. It might be the DAC, but I personally suspect the transformer in the DI.

The noise results are interesting, though. In my case, the higher gain preamp had quite a bit less noise in the mids and highs, but there was a big bump of low frequency something else that is almost distressing (except that these are pretty extreme conditions, and it's still rather low compared to the signal).

I tried this also with a full-range commercial mix, with similar results. While it was playing, there was no audible difference at all, no visible changes to frequency content. A null test leaves a "residue" of the original sound which can easily be explained by the difference in the noise levels.
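
For anyone wanting to try the same sort of null test, here's roughly how I'd script it (hypothetical filenames, and it assumes the two takes are already sample-aligned, which is the fiddly part in practice):

Code:
# Minimal sketch of a null test on two peak-normalized takes; "hot.wav" and
# "quiet.wav" are hypothetical names, and sample alignment is assumed.
import numpy as np
from scipy.io import wavfile

def load_normalized(path):
    rate, data = wavfile.read(path)
    x = data.astype(np.float64)
    if x.ndim > 1:
        x = x.mean(axis=1)
    return rate, x / np.max(np.abs(x))     # normalize peaks, as in the test

rate_a, a = load_normalized("hot.wav")
rate_b, b = load_normalized("quiet.wav")
n = min(len(a), len(b))
residual = a[:n] - b[:n]                   # polarity-flip-and-sum
residual_db = 20 * np.log10(np.sqrt(np.mean(residual ** 2)) + 1e-12)
print(f"null residue: {residual_db:.1f} dB RMS (lower = closer match)")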


And just because I have this info that kind of fits somewhere in this discussion, but not really specifically anywhere: from trying to calibrate a couple of different converters with my board, I know that the board itself starts to break up and sound pretty nasty as you push the levels through it up toward +18 dBVU. Whether the gain comes at the channel or at the master fader, when the meters get up toward the top it clips hard and fast and ugly, and it doesn't really matter much whether the ADC after it clips or not, because it already sucks.


Edit to add -
OK, a bit more. The difference in noise really isn't huge. It's visible on SPAN in this particular test, but in actual practice - at least in my room - the actual noise into the preamp (ambient room noise mostly, microphone self-noise less so, and cable noise barely at all) is the real bottom limit. Nothing I personally record through a microphone is acoustically enough louder than the room noise that the preamp's noise floor is really an issue. In fact, I've gone so far as to plug an SM58 via XLR>TS cable directly into the line input on the converter, and there was not enough noise to make it unusable. Would have been too much for an audiobook or unaccompanied spoken word performance, but it was inaudible in the rather dense mix where I put it.

Likewise, I run my passive electric guitars (buffered via a pedal) into the line inputs as a matter of course. They're all different of course, but typically come in around -30dbfs or so, and I just gain them up as necessary to hit whatever amp sim comes after. The S/N ratio of even a fully shielded guitar with humbuckers is so poor that the "compromise" just isn't. OTOH, I started out trying to do it "the right way" by running through a passive DI, to a mic preamp, and then in, and that was too much noise, especially when pumping into high gain guitar type effects.

Last edited by ashcat_lt; 09-11-2014 at 01:49 PM.
ashcat_lt is offline   Reply With Quote
Old 09-11-2014, 05:49 PM   #101
clepsydrae
Human being with feelings
 
clepsydrae's Avatar
 
Join Date: Nov 2011
Posts: 3,409
Default

Thanks a bunch for the report-back. If I'm reading you right, it sounds pretty much in-line with what I saw as well.

Re: the low-end bump you saw... as I mentioned in an earlier post, even in my limited experience I have definitely seen lower-end preamps that added (higher-end) noise besides just the noise floor when gained way up (I cited my old PreSonus Firebox... don't know if it was defective or what) -- maybe a similar thing is happening with your gear. When you were gaining the preamps differently, how high was the gain on the "hot" channel? Just curious if it was extremely high or if the level-setting was more a scenario of turning the quieter channel down to get -18.
clepsydrae is online now   Reply With Quote
Old 09-11-2014, 06:03 PM   #102
ashcat_lt
Human being with feelings
 
Join Date: Dec 2012
Posts: 7,271
Default

Quote:
Originally Posted by clepsydrae View Post
...when you were gaining the preamps differently, how high was the gain on the "hot" channel? Just curious if it was extremely high or if the level-setting was more a scenario of turning the quieter channel down to get -18.
Well, my preamps don't have actual markings around the gain knobs, and I didn't really try too hard to figure it out. There was something like 30 or 40 dB of attenuation through the DI box, I think, and it was starting at just less than 0 dBFS to begin with, so that's probably about 30 or 40 dB to get it back up there, right? The answer to the question is complicated by the fact that for some reason (which I can't remember) I had changed the position of the knobs on the pots, so that the little line doesn't actually point to where it's at, and all the channels are different... The louder channel was maybe 2/3 of the way up, while the other was not quite half, if I remember correctly from sort of eyeballing and trying to interpret through all that weirdness...
ashcat_lt is offline   Reply With Quote
Old 09-12-2014, 09:42 PM   #103
yep
Human being with feelings
 
Join Date: Aug 2006
Posts: 2,019
Default

Quote:
Originally Posted by clepsydrae View Post
...For my personal purposes, I'm defining "evidence" here as one of...
You don't have to take my word for anything, but just by way of example, here are some of the places that you could achieve clipping/distortion on, say, a DI bass track, without lighting up the "clip" LED:

- Clipping at the pickup. E.g., if the pickups are set too close to the strings, or if the player is hitting the strings harder than the setup is intended for, the coil-windings on the pickup magnets will clip/overload.

- Clipping of the onboard electronics or preamp (if active). Same as above, except the clipping happens in components downstream of the pickups.

- Clipping at the DI or instrument preamp.

- Clipping on the analog front-end of the AD converter (this is still an analog signal processing component, before it converts to digital).

Any or all of the above could be flatlining well before the signal gets converted to ones and zeroes. And an analog flatline is no different from a digital one. And that's just on a DI bass track with no mics, processing, etc.

Maybe someday I will have the time to provide "evidence" of this by forcing all these gain-stages into clipping and documenting it on an oscilloscope or some such. But it's real, and it happens all the time.

Sometimes clipping sounds good, some bass-players use it deliberately to get a hard-edged sound. Clipping is essentially the same thing as extremely hard limiting.

The reason to specifically try and avoid digital clipping is because the digital medium is no longer able to accurately record the source material above 0dBFS (and it usually sounds bad).
yep is offline   Reply With Quote
Old 09-12-2014, 10:56 PM   #104
clepsydrae
Human being with feelings
 
clepsydrae's Avatar
 
Join Date: Nov 2011
Posts: 3,409
Default

Quote:
Originally Posted by yep View Post
here are some of the places that you could achieve clipping/distortion on, say, a DI bass track, without lighting up the "clip" LED:
Yes -- I'm not debating any of those places as being potentially very relevant to the sound at hot levels, clipping or not, with the possible exception of the ADC, which you mentioned possibly breaking up or clipping without the light going off. If you have an example of an ADC being driven within its stated parameters, without any clip light tripping (where provided), but still sounding different at hot levels, that would be very interesting. But don't take time with the other stages -- I at least am not addressing those -- I know you didn't have time to read the thread (don't blame you), but the scope here is only the ADC analog stage and whether there is intrinsic value in -18 aside from the other clear benefits it has for level matching and workflow, etc.

(I'm a little suspicious that the -18 recommendation is a tad overblown for some of those other pre-ADC stages, e.g. modern "clean" preamps, but I don't have the experience to have an actual opinion on that.)

Incidentally, MOTU has gotten back as well (same email as RME et al.). Same story again:

Quote:
Originally Posted by MOTU
With MOTU audio interfaces, the only difference in audio quality at different input levels is dynamic range (noise floor). Distortion and frequency response are the same at any level.

Yes, level matching between analog gear can make a difference to sonics, so the MOTU interface is only half the question. The other half of the question is: Is there an optimum level output from the device that is sending into the MOTU Interface?
clepsydrae is online now   Reply With Quote
Old 10-06-2014, 03:31 PM   #105
Magicbuss
Human being with feelings
 
Join Date: Jul 2007
Posts: 1,957
Default

Quote:
Originally Posted by Bristol Posse View Post
I generally shoot for line level RMS of 0VU (+4dBu) because I have an extensive analog tracking/mixing chain

How that shows up in reaper depends on which converters I happen to be using and how I have them calibrated

It can be anywhere from -19dBFS (RMS) to -9dBFS (RMS). I don't care about peaks so long as I don't clip

if you're not using a lot of analog gear in tracking and mixing then it becomes less relevant, unless you are using really bad interfaces that add a lot of noise and THD at higher levels or have poor S/N performance

I think the argument in the DAW ITB realm is mostly about retaining headroom and keeping the faders close to unity in mixing for easier use due to better resolution and fine adjustment closer to unity on the Fader

Also Waves and UAD (maybe Slate and others too) for the most part use -18dBFS as simulated 0VU for their emulation plugins, where they will give the most linear performance with the best S/N ratio (like analog gear)
I haven't read all 3 pages yet, but I agree with the above. After my own experience and watching the Kenny Gioia and puremix vids on this subject, my mind on this is clear.

I record into my Daw at around -18dbfs RMS because:


1) There is no benefit to recording hotter. Those of you pushing way into the yellow on your digital peak meters can you tell me WHY??? In the old days we did it to maximize S/N. In 24 bit digital that is completely unnecessary.

2) Many plugin makers seem to be using -18 dB RMS as a standard for nominal input gain. You most definitely CAN clip the input of some plugins if you hit them too hard. I'd like to keep my tracks in the sweet spot for processing. That sweet spot seems more and more likely to be -18 dB RMS.


3) I think we can all agree that you can record a pristine audio track at volumes higher than -18db rms. But after recording 40 tracks at -18db rms I will probably NOT clip my master fader when I bring up the raw mix with the faders zeroed. 40 tracks at -10db rms most certainly WILL. So I will end up having to turn everything down anyway (which is what Kenny and others bitch about having to do before they can mix).
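
The back-of-the-envelope math behind point 3, assuming the tracks are roughly uncorrelated so their power adds (real material is burstier than a steady sum, so treat it as a rough bound):

Code:
# Rough estimate only: uncorrelated tracks sum in power, raising the combined
# RMS by about 10*log10(N) dB over a single track.
import math

def summed_rms_dbfs(per_track_rms_dbfs, n_tracks):
    return per_track_rms_dbfs + 10 * math.log10(n_tracks)

print(round(summed_rms_dbfs(-18, 40), 1))   # about -2 dBFS RMS -- tight but workable
print(round(summed_rms_dbfs(-10, 40), 1))   # about +6 dBFS RMS -- the master will clip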

I start a mix with my faders at zero. If something is too hot I bring it down with a trim plug (Satson usually) so that the track is around 0 VU (calibrated to -18). Drums I like to PEAK around -9dBFS. After the gain staging is done, THEN I start moving faders. A lot of us were taught to gain stage this way in the analog days, and I think it's as valid as ever, even ITB.

Last edited by Magicbuss; 10-06-2014 at 03:36 PM.
Magicbuss is offline   Reply With Quote
Old 10-06-2014, 04:25 PM   #106
clepsydrae
Human being with feelings
 
clepsydrae's Avatar
 
Join Date: Nov 2011
Posts: 3,409
Default

Quote:
Originally Posted by Magicbuss View Post
1) There is no benefit to recording hotter. Those of you pushing way into the yellow on your digital peak meters can you tell me WHY??? In the old days we did it to maximize S/N. In 24 bit digital that is completely unnecessary.
This is more or less covered somewhere in this monster thread, but: 24bit relieves us of the S/N issue with respect to the digital noise floor (which was basically moot to begin with), but it does nothing to address the noise floor inherent in the analog stage of the ADC, so it still makes sense to record hotter IF you care about noise (note that most of the interface manufacturers stated this as well in their replies). Granted, that alone is not a good enough reason in most contemporary cases, given the other reasons to record at -18, and given how low the noise is likely to be, but it can sometimes still be relevant, especially on budget gear. E.g. back when I used a Presonus Firebox there was enough hashy interference/noise stuff in the preamps that it made an easily-audible difference to record as hot as possible. (On my current Firestudio Mobile it's not nearly as much an issue.) That was one example where taking the -18 advice would have been a mistake: I was using neither analog processing ahead of the interface that cared about the level, nor plugins that had a "sweet spot", so recording at -18 just made my recordings sound noisy.

Quote:
2) Many plugin makers seem to be using -18 dB RMS as a standard for nominal input gain. You most definitely CAN clip the input of some plugins if you hit them too hard. I'd like to keep my tracks in the sweet spot for processing. That sweet spot seems more and more likely to be -18 dB RMS.
Yeah, I can't speak to the "many" characterization, since I use mostly free plugins and iZotope Alloy 2. None of the plugins I use seem to be aligned in any noticeable way for -18, so it becomes moot for me personally... I just haven't seen this trend as strongly as some say, but I take it from those with broader experience.

Quote:
3) I think we can all agree that you can record a pristine audio track at volumes higher than -18db rms. But after recording 40 tracks at -18db rms I will probably NOT clip my master fader when I bring up the raw mix with the faders zeroed. 40 tracks at -10db rms most certainly WILL. So I will end up having to turn everything down anyway (which is what Kenny and others bitch about having to do before they can mix).
This is another argument that I accept but which doesn't seem to apply to my workflow. Even though I record as hot as is comfortable (i.e. no risk of clipping, but hotter than -18 dBFS RMS) I still have to bring everything up in Reaper before I can really use it. Maybe I record peaky stuff more than the next guy, or perhaps my monitors are gained way lower than most people's, I'm not sure. But I'm often eager to get the compressor on a track so I can gain it up and hear it without having to turn the monitors up. I like my monitors set so that when I'm mastering a song and the levels are nice and hot, the levels are a bit too loud out of the monitors (and thus require attenuating them some). That seems a good convenient middle ground for me, but when the raw -18 tracks come in, they are too quiet to work with, so recording hotter lets me postpone the gaining-up of the track if it's convenient to do so.
clepsydrae is online now   Reply With Quote
Old 10-06-2014, 09:31 PM   #107
DuraMorte
Human being with feelings
 
Join Date: Jun 2010
Location: In your compressor, making coffee.
Posts: 1,165
Default

Quote:
Originally Posted by clepsydrae View Post
If you have an example of an ADC being driven within its stated parameters and without any clip light tripping (where provided) but still sounding different at hot levels, that would be very interesting.
Try your old 8kHz test at 40Hz, and see what happens. But don't just look at the frequency graph; also check the waveforms.

A 40Hz sine wave gained up to -.1dBFS at the input could saturate the input and cause sagging, flatlined garbage... and the clip LED will never light up.
__________________
To a man with a hammer, every problem looks like a nail. - yep
There are various ways to skin a cat :D - EvilDragon
DuraMorte is offline   Reply With Quote
Old 10-06-2014, 10:33 PM   #108
clepsydrae
Human being with feelings
 
clepsydrae's Avatar
 
Join Date: Nov 2011
Posts: 3,409
Default

Quote:
Originally Posted by DuraMorte View Post
Try your old 8kHz test at 40Hz, and see what happens. But don't just look at the frequency graph; also check the waveforms.

A 40Hz sine wave gained up to -.1dBFS at the input could saturate the input and cause sagging, flatlined garbage... and the clip LED will never light up.
Here is the 40Hz recorded at -0.2dBFS (as close as I could get without occasional clipping):



Here is the 40Hz recorded at -18 (peaks, so maybe -21 RMS):



...looks like the -18 signal resulted in a distortion spike at roughly 520Hz and just a bit more noise in the high frequencies. The distortion spike is in line with what Jim Williams said about most converters having less THD at hotter levels (see post #41).
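
In case anyone wants a number rather than an eyeballed spectrum, something like this would give an approximate THD+N figure for each take (my own sketch, hypothetical filenames; it measures all energy outside the 40 Hz fundamental relative to the fundamental):

Code:
# Approximate THD+N: ratio of everything outside the fundamental to the
# fundamental itself. "sine_40hz_hot.wav" / "sine_40hz_minus18.wav" are
# hypothetical filenames.
import numpy as np
from scipy.io import wavfile

def thd_plus_n_db(path, fundamental_hz=40.0, tolerance_hz=5.0):
    rate, data = wavfile.read(path)
    x = data.astype(np.float64)
    if x.ndim > 1:
        x = x.mean(axis=1)
    power = np.abs(np.fft.rfft(x * np.hanning(len(x)))) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / rate)
    fund = np.abs(freqs - fundamental_hz) <= tolerance_hz
    return 10 * np.log10(power[~fund].sum() / power[fund].sum())

print(thd_plus_n_db("sine_40hz_hot.wav"))
print(thd_plus_n_db("sine_40hz_minus18.wav"))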

The waveforms look identical (this is a representative slice):

clepsydrae is online now   Reply With Quote
Old 10-06-2014, 10:53 PM   #109
ivansc
Human being with feelings
 
Join Date: Aug 2007
Location: Near Cambridge UK and Near Questembert, France
Posts: 22,754
Default

Quote:
Originally Posted by Reno.thestraws View Post
Did you let the tone volume at the same level trough your speakers for both takes? It's near impossible to have more noise by recording with less gain the preamp.

If You had down the speakers volume; or reduce de output gain or forget to mute take 1, you false the results
Au jour d'hui, nous speakons franglais!

Highly amused to see your usually impeccable English getting garbled for a change! It's usually ME doing it to French.

ivansc is offline   Reply With Quote
Old 10-07-2014, 07:05 AM   #110
DuraMorte
Human being with feelings
 
Join Date: Jun 2010
Location: In your compressor, making coffee.
Posts: 1,165
Default

Quote:
Originally Posted by clepsydrae View Post
snipped for convenience
Interesting result. I'm a little surprised, tbh. My interface probably wouldn't accomplish such a clean sine wave at 40Hz.

With those results (assuming they're concrete and repeatable), I think we can safely make the following statement(s):
1) Recording with average levels around -18dBFS can help with integration with hardware, allow plugins designed to emulate hardware enough headroom to function properly, allow the mixer to better utilize the resolution of the faders, and give your performers some space to breathe (dynamically speaking) before clipping occurs.
2) However, "sound quality" MAY OR MAY NOT be a valid reason to record at average levels of -18dBFS, depending on interface quality, preamp quality, etc.

Can we all pretty much agree to those terms?
__________________
To a man with a hammer, every problem looks like a nail. - yep
There are various ways to skin a cat :D - EvilDragon
DuraMorte is offline   Reply With Quote
Old 10-07-2014, 11:19 AM   #111
clepsydrae
Human being with feelings
 
clepsydrae's Avatar
 
Join Date: Nov 2011
Posts: 3,409
Default

Quote:
Originally Posted by DuraMorte View Post
Interesting result. I'm a little surprised, tbh. My interface probably wouldn't accomplish such a clean sine wave at 40Hz.
Hey, test it out and post! :-) I got my interface for like $80 on eBay, so it's not some high-end piece of gear or anything, and given what all the manufacturers in this thread said, and what other technicians have said, it sounds like the expected result in general, to me.

Quote:
With those results (assuming they're concrete and repeatable), I think we can safely make the following statement(s):
1) Recording with average levels around -18dBFS can help with integration with hardware, allow plugins designed to emulate hardware enough headroom to function properly, allow the mixer to better utilize the resolution of the faders, and give your performers some space to breathe (dynamically speaking) before clipping occurs.
2) However, "sound quality" MAY OR MAY NOT be a valid reason to record at average levels of -18dBFS, depending on interface quality, preamp quality, etc.

Can we all pretty much agree to those terms?
Yeah, I could agree with that. The way I summed it up was in post #82, above:

Quote:
Originally Posted by me
Recording to levels at -18 dBFS RMS is generally a great idea; it doesn't change anything as far as your ADC is concerned (except adding an insignificant amount of noise, and possibly an even less significant amount of THD), but it leaves you a safe amount of headroom and tends to result in you setting appropriate output levels from other analog gear going into your ADC, and depending on your plugins it may result in more convenient levels once in the DAW; however, some analog gear may sound better outputting hotter or quieter, so use your ears, and don't worry about 'driving your ADC too hard' -- as far as any contemporary and reasonable ADC is concerned, the harder the better; just prioritize the optimal levels for the analog gear before the ADC, and make sure you have enough headroom to guarantee that clipping won't happen.
clepsydrae is online now   Reply With Quote
Old 11-08-2014, 12:26 PM   #112
JamesNV
Human being with feelings
 
Join Date: Feb 2013
Posts: 4
Default

Quote:
Originally Posted by clepsydrae View Post
Hey, test it out and post! :-) I got my interface for like $80 on eBay, so it's not some high-end piece of gear or anything, and given what all the manufacturers in this thread said, and what other technicians have said, it sounds like the expected result in general, to me.
I have a Firestudio Mobile as well. On the specs it says the reference level for 0 dBFS is +10 dBu.

Does that suggest the FSMobile was designed to record a bit hotter? For example, the regular Firestudio lists its reference level at +18dBu.

Just trying to wrap my head around this.
JamesNV is offline   Reply With Quote
Old 11-08-2014, 12:54 PM   #113
clepsydrae
Human being with feelings
 
clepsydrae's Avatar
 
Join Date: Nov 2011
Posts: 3,409
Default

Quote:
Originally Posted by JamesNV View Post
I have a Firestudio Mobile as well. On the specs it says the reference level for 0 dBFS is +10 dBu.

Does that suggest the FSMobile was designed to record a bit hotter? For example, the regular Firestudio lists its reference level at +18dBu.
Yeah, isn't that interesting.

It seems fair to phrase it as you did: from the perspective of the analog gear ahead of the interface, recording into a Firestudio you would want resultant RMS peaks around -18 dBFS, whereas you'd want them at -10 dBFS on the Firestudio Mobile.

Put another way, a signal at 0 dBu (the analog "sweet spot" in the generalized, oversimplified sense) coming into the FS registers as -18dBFS, and coming in at 0 dBu into the FSM registers at -10dBFS.

(This distinction matters only to the analog gear ahead of the converter, of course, and any analog-simulating plugins you might use, etc etc, see long thread above...)
clepsydrae is online now   Reply With Quote
Old 11-08-2014, 05:41 PM   #114
JamesNV
Human being with feelings
 
Join Date: Feb 2013
Posts: 4
Default

Quote:
Originally Posted by clepsydrae View Post
Put another way, a signal at 0 dBu (the analog "sweet spot" in the generalized, oversimplified sense) coming into the FS registers as -18dBFS, and coming in at 0 dBu into the FSM registers at -10dBFS.
Stranger still, I've read the analog 'sweet spot' is 0 VU, which is supposedly equivalent to +4 dBu... So does that mean the 'sweet spot' for the FSMobile is -6 dBFS? That doesn't leave much headroom. Or... should I not worry about it too much, record the FSMobile relatively hot and then trim it down later according to need? (Maybe using a plugin like the HorNet VUMeter?)

Not sure if I'm confusing input with output.
JamesNV is offline   Reply With Quote
Old 11-08-2014, 06:12 PM   #115
clepsydrae
Human being with feelings
 
clepsydrae's Avatar
 
Join Date: Nov 2011
Posts: 3,409
Default

Quote:
Originally Posted by JamesNV View Post
Stranger still, I've read the analog 'sweet spot' is 0 VU, which is supposedly equivalent to +4 dBu... So does that mean the 'sweet spot' for the FSMobile is -6 dBFS? That doesn't leave much headroom. Or... should I not worry about it too much, record the FSMobile relatively hot and then trim it down later according to need?
See post 23 from this thread: http://forum.cockos.com/showpost.php...8&postcount=23

(That graph encapsulates more than what I know about the question.)

My information is that analog gear is variable enough that it all depends on what you're going through on the way to the DAC, and really you just need to use your ears to know what's best. (i.e. maybe the "sweet spot" is somewhere else entirely for a particular piece of gear.)

For developing a rough rule of thumb to go by in general, I think what you're trying to match with your DAC levels is 0 dBu. But I'm not sure: maybe engineers design for 0dBVU (+4dBu) as the sweet spot. Frankly I'd be surprised if it's really that precise of an issue for any piece of gear out there.

If you're just wondering about what the FSM "wants", the answer is to record as hot as possible without any risk of clipping, which often will translate to RMS peaks around -6 or so (very rough ballpark, depends on the signal, etc.) The only reason to worry about this -18 stuff is to make sure that analog stages ahead of the FSM (compressors, processors, etc, not talking about mics here) are outputting at a level they are comfortable with, and as said, it really depends on that gear. Or so I'm told.

Quote:
(Maybe using a plugin like the HorNet VUMeter?)
A tad less convenient, but you can use Reaper's master meter in the mixer panel: right-click and choose "Peak+RMS", set the display offset to 0, and set the display gain to 3, then watch the meters on the outside. For unknown reasons they are not available elsewhere except with plugins. There must be a JS plugin that shows RMS levels?
clepsydrae is online now   Reply With Quote
Old 11-21-2014, 03:26 PM   #116
thompal
Human being with feelings
 
Join Date: Aug 2014
Posts: 9
Default

Quote:
Originally Posted by Bristol Posse View Post
Here's a quote from Bob Katz over at Gearslutz that might help with this but in a nutshell you need to apply the 3dB offset or your RMS meters are "wrong"

1) Send in a sine wave whose peak level is x. The RMS level in dBFS should read the SAME as the peak. If it reads 3 dB lower, it does not meet the standard.
This is where I get lost in the discussion. The peak voltage of a sine wave is 1.414 times the RMS voltage of that same sine wave. Put another way, the RMS voltage = .707 x peak voltage.

So, why should a meter show the same RMS and peak values???
thompal is offline   Reply With Quote
Old 11-21-2014, 03:55 PM   #117
clepsydrae
Human being with feelings
 
clepsydrae's Avatar
 
Join Date: Nov 2011
Posts: 3,409
Default

Quote:
Originally Posted by thompal View Post
This is where I get lost in the discussion. The peak voltage of a sine wave is 1.414 times the RMS voltage of that same sine wave. Put another way, the RMS voltage = .707 x peak voltage.

So, why should a meter show the same RMS and peak values???
It's just a standard bias that audio meters use... mathematically, the RMS of a sine wave is ~0.707 of its peak, but "RMS" in the audio context has been defined so that the "RMS" (I often write it "audio-RMS" for attempted clarity) of a sine wave is the same as its peak.

I once found an explanation somewhere, but I'm coming up empty right now. Maybe it had something to do with making the lives of techs easier when they were calibrating? Send a test signal with a known peak voltage and calibrate the RMS meter to 0, or something to that effect? Can't recall. But my vague memory is that it wasn't all that interesting; just a legacy of something or other, and now it's the standard, and meters adhere to it for the sake of consistency between gear, as with so many things in the audio world.

Unfortunately in his post Bob Katz just says "I won't get into the technical and scientific explanations for why the IEC 61606:1997 standard is the correct one to use, as all you want to do is find a meter that's correct."
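
A quick numerical illustration of the convention (my own example, not from Katz's post): the mathematical RMS of a sine is about 3 dB below its peak, and the sine-referenced "audio RMS" reading adds that 3 dB back so peak and RMS match for sines.

Code:
# The mathematical RMS of a full-scale sine is ~3 dB below its peak; the
# sine-referenced "audio RMS" convention adds ~3 dB so the two read the same.
import numpy as np

t = np.arange(48000) / 48000.0
sine = np.sin(2 * np.pi * 440 * t)                     # peak = 1.0 -> 0 dBFS

true_rms_db = 20 * np.log10(np.sqrt(np.mean(sine ** 2)))
audio_rms_db = true_rms_db + 3.01                      # sine-referenced convention

print(f"mathematical RMS: {true_rms_db:.2f} dB")       # about -3.01 dB
print(f"'audio' RMS reading: {audio_rms_db:.2f} dB")   # about 0 dB, same as the peak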
clepsydrae is online now   Reply With Quote
Old 11-23-2014, 04:56 AM   #118
xpander
Human being with feelings
 
xpander's Avatar
 
Join Date: Jun 2007
Location: Terra incognita
Posts: 7,670
Default

Quote:
Originally Posted by thompal View Post
So, why should a meter show the same RMS and peak values???
Why should a meter show the same RMS and peak values when using sine waves? It's an old standard reference.
i need a correct RMS meter
https://www.gearslutz.com/board/4154742-post4.html

Last edited by xpander; 11-23-2014 at 05:11 AM. Reason: more exact question, maybe.
xpander is offline   Reply With Quote