Posted to rec.audio.high-end
Compression vs High-Res Audio

On 9/30/2010 10:28 AM, Arny Krueger wrote:


> This is so bad to me that it hurts my head when I read it. If you mismatch
> the amplitude of two signals by 1 dB, the difference signal is 10%. Yet
> neither signal need have any added nonlinear distortion at all. You just got
> the levels a bit wrong. And this is aside from the inaudible phase shift
> issue that I raised above. So now *the great man* would appear to be
> talking trash on two levels. Ouch!
>
> I gotta stop, this sort of gross technical ignorance in high places makes my
> head hurt. Hopefully a verbatim report would be more reasonable.
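
As a quick check of the 1 dB figure quoted above: a 1 dB level mismatch between
two otherwise identical signals leaves a residual of 1 - 10^(-1/20), about 0.11,
i.e. roughly a 10% difference signal, with no nonlinearity anywhere. In Python,
just to confirm the arithmetic:

# fraction of the signal left over when two otherwise identical
# signals are mismatched in level by 1 dB
print(1 - 10 ** (-1 / 20))   # ~0.109, roughly a 10% difference signal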



It is exceedingly easy to match the levels when calculating distortion plus noise.
One can also, if one wishes, correct the phase. However, in my experience
doing computerized MP3 tests using LAME, neither is necessary. If one
tells LAME to use 320 kbps, the difference between the files is very small
and mostly noise. If one uses a fixed bitrate of 96 kbps, the distortion
is fairly large and not all of it is noise. The only question is "at what bitrate
does it actually become audible in double-blind tests?" At 96 kbps
I can hear the difference, and the difference signal is substantially
nonlinear distortion.
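
For anyone who wants to try this, here is a rough sketch in Python of that kind
of difference-signal test. It assumes the LAME command-line encoder is on the
PATH and that "original.wav" is a PCM WAV file; the file names, the bitrate,
and the crude delay and level alignment are illustrative choices, not my exact
procedure.

# Rough sketch of a difference-signal test with LAME (illustrative only).
import subprocess
import numpy as np
from scipy.io import wavfile

SRC = "original.wav"   # hypothetical input file

# Encode at a fixed bitrate, then decode back to PCM with LAME itself.
subprocess.run(["lame", "-b", "96", SRC, "test.mp3"], check=True)
subprocess.run(["lame", "--decode", "test.mp3", "decoded.wav"], check=True)

_, a = wavfile.read(SRC)
_, b = wavfile.read("decoded.wav")
a = a.astype(np.float64)
b = b.astype(np.float64)
if a.ndim > 1:                      # mix to mono to keep the sketch short
    a, b = a.mean(axis=1), b.mean(axis=1)

# The MP3 encode/decode chain inserts a fixed delay; find it by looking
# for the lag that best lines the two signals up over an early chunk.
chunk = a[: 1 << 17]
scores = [np.dot(chunk, b[k:k + len(chunk)]) for k in range(4096)]
delay = int(np.argmax(scores))
b = b[delay:]
n = min(len(a), len(b))
a, b = a[:n], b[:n]

# Least-squares gain match (the level matching), then the difference signal.
gain = np.dot(a, b) / np.dot(b, b)
diff = a - gain * b

# Residual distortion-plus-noise relative to the original, in dB.
print("residual: %.1f dB" % (10 * np.log10(np.sum(diff ** 2) / np.sum(a ** 2))))

Writing diff out to a WAV file and listening to it can make the difference
between "mostly noise" and "nonlinear distortion" easier to hear than the
single dB number suggests.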

Doug McDonald