Posted to rec.audio.pro,rec.audio.tech
From: Scott Dorsey
Subject: History Lesson: 600 ohm balanced line

In article ,
wrote:
> Hello Everyone,
>
> I am looking for a Greybeard of sorts. I have recently been thrown
> into the audio realm, particularly testing with semiconductor PAs,
> and I am curious to know where the 600 ohm impedance originated.


If you have open-wire transmission lines with two 18 ga. wires about
five inches apart on the telephone pole, you have a line with a 600
ohm characteristic impedance. This was the standard telephone circuit
well into the 1920s, and as a result the phone company adopted 600 ohm
lines and termination for almost everything.
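That figure can be sanity-checked with the standard formula for a lossless parallel-wire line in air, Z0 = 120 * acosh(D/d), where D is the center-to-center spacing and d the wire diameter. The sketch below plugs in the post's numbers (18 AWG is about 1.024 mm in diameter, 5 inches is 0.127 m); the bare-wire ideal comes out a bit above 600 ohms, and real pole lines with insulators and sag run somewhat lower.

```python
import math

def parallel_wire_z0(spacing_m: float, wire_diameter_m: float) -> float:
    """Characteristic impedance of a lossless parallel-wire line in air.

    Standard formula: Z0 = 120 * acosh(D / d), with D the
    center-to-center spacing and d the wire diameter.
    """
    return 120.0 * math.acosh(spacing_m / wire_diameter_m)

# 18 AWG wire is roughly 1.024 mm in diameter; 5 inches is 0.127 m.
z0 = parallel_wire_z0(0.127, 1.024e-3)
print(f"Z0 = {z0:.0f} ohms")  # a little above the nominal 600 ohms
```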

A sidelight: 20 ga twisted pair with thick cotton insulation tends to
be around 150 ohms characteristic, so the phone company also used that
as a standard, starting in the teens. For many years, CBS Radio used
150 ohms as their transmission line standard, so their equipment would
not interoperate with the rest of the industry without adding more
matching transformers. A lot of gear still had 150 ohm taps well into
the seventies.
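Those matching transformers are simple to size: impedance transforms as the square of the turns ratio, Zp/Zs = n**2. A minimal sketch, using the 600/150 ohm case from the paragraph above:

```python
import math

def turns_ratio(z_primary: float, z_secondary: float) -> float:
    """Transformer turns ratio needed to match two impedances.

    Impedance transforms as the square of the turns ratio:
    Zp / Zs = n**2, so n = sqrt(Zp / Zs).
    """
    return math.sqrt(z_primary / z_secondary)

# Bridging a 600 ohm line to CBS's 150 ohm plant takes a 2:1 transformer.
print(turns_ratio(600.0, 150.0))  # 2.0
```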

> For example, most testing I have done is with 4 to 8 ohms with
> PAs and 16 or 32 ohms with headphones for portable audio
> (computing, MP3, cell phones), and there is generally no need for
> impedance matching.


Right, in the modern world almost everything has a high-Z input and a
low-Z output, and you don't care about the cable characteristic impedance
unless you are running cables for tens of miles (as the telcos do).
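The reason length is what matters: a cable only behaves as a transmission line once it is a meaningful fraction of a wavelength long. A quick back-of-the-envelope check at the top of the audio band (assuming a typical velocity factor of about 0.66 for insulated pair, which is an assumption, not a measured value for any particular cable):

```python
# How long does a cable have to be before characteristic impedance
# matters at audio frequencies?
C = 299_792_458.0        # speed of light in vacuum, m/s
velocity_factor = 0.66   # assumed typical value for insulated twisted pair
f = 20_000.0             # top of the audio band, Hz

wavelength = velocity_factor * C / f
print(f"wavelength at 20 kHz: {wavelength / 1000:.1f} km")
# A common rule of thumb treats a line as "electrically short" below
# about a tenth of a wavelength -- roughly a kilometer here, which is
# why studio runs ignore cable impedance but telco long lines cannot.
print(f"lambda/10: {wavelength / 10_000:.2f} km")
```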

> I have managed to piece together some basic information from multiple
> Google searches that 600 ohms originated with POTS and was adopted
> by the pro audio crowd decades ago, but I would like some more
> 'historical' information about when, why, and how.
> What prompted this question is that another group uses an HP 8903B,
> which has either a 50 ohm or 600 ohm output impedance, to test analog
> CMOS audio switches; 600 ohms is selected for THD+N measurements.
> The philosophy behind the impedance difference intrigued me and has
> led me on a search to understand where the 600 ohm standard came
> from and why some equipment only offers this option.
> Any tips, notes, or thoughts will be greatly appreciated.


You want goofy, look up where the 50 and 75 ohm transmission line
standards came from...
--scott

--
"C'est un Nagra. C'est suisse, et tres, tres precis."