Wire We Doing This
The ins and outs of cable and wire...
By Paul Stamler

It’s only wire. Or is it?

The world of audio splits into two camps on the subject of wire and cable. On the one hand, most audiophiles claim to hear massive differences between cables, shelling out remarkable sums for ever more exotic configurations. (I believe the record is $15,000 for a 1-meter stereo interconnect.)

The other side says this is tommyrot, that except for a few effects produced by resistance and capacitance, wire is wire and the mega-priced jewels beloved of audiophiles sound no different than their Radio Shack equivalents. Many—but not all—audio professionals fall into the latter camp.

In this article, I’ll try to steer a path between these two positions, garnering useful information from both. I should state right now that I consider myself a recovering audiophile, and a moderate on the subject of cables: I do hear differences between them, but I’m skeptical about the more extreme claims—and I certainly wouldn’t spend the price of a good used car on them.

Let’s start with stuff everyone agrees on.

Chanting Ohm

Back in the 1960s, “speaker wire” meant a long roll of thin, cheap wire, usually about 22 gauge, that you bought at the hardware store or hi-fi joint. (Most stereo stores still stock the stuff.) What effect will a moderate run of this wire have on the sound of your monitors? (The following discussion is slightly simplified to protect the innocent.)

First principle: all wire has resistance—electrons don’t flow through copper perfectly freely. The resistance of a wire is inversely proportional to its cross-sectional area; electrons flow more easily through heavier wire, just as water flows more freely through a larger hose. Each conductor of that cheap “speaker wire” has a resistance of 0.01646 Ohms per linear foot.

Let’s say your amplifier-to-monitor run is 40 feet per side—that’s about what it would be in a medium-sized room with the monitors on the wall and the amp in a rack, with enough wire to go over a doorway. The resistance per conductor will then be 0.6584 Ohms; since speaker wire has two conductors, the total circuit resistance will be twice that, or 1.3168 Ohms. This resistance is in series with the monitor.
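For those who like to check the arithmetic, here it is in a few lines of Python. The 22-gauge resistance per foot is the wire-table figure quoted above; the 12-gauge figure is the standard table value for that gauge.

```python
# Copper resistance per foot for two common AWG gauges
# (values from standard wire tables).
OHMS_PER_FOOT = {22: 0.01646, 12: 0.001588}

def speaker_cable_resistance(awg, run_feet):
    """Round-trip resistance of a two-conductor speaker run."""
    return 2 * run_feet * OHMS_PER_FOOT[awg]

# 40 feet of cheap 22-gauge "speaker wire":
print(speaker_cable_resistance(22, 40))   # about 1.32 ohms in series
```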

Giving a damp

The first effect on the monitor’s sound will be subtle. Modern loudspeakers are designed with the assumption that the impedance driving them will be very low, a fraction of an Ohm. (Indeed, this is what most solid-state amplifiers provide.) The speaker’s bass tuning is designed using that assumption.

In a closed box speaker, the bass tuning is defined by the “Q” of the system; lower Q speakers are taut in the bass, higher Q speakers are warmer and less defined. If a closed box system is designed for a Q of 0.8 (slightly warmer than neutral, but not excessively so), using our 40' of cheap wire between that system and the amplifier might shift the Q to 0.9; the sound will be warmer and slightly muddier, with greater ringing on transients. This effect may be subtle, but it means you’re not hearing the monitor as the maker intended—which may or may not worry you.
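For the curious, here’s a rough sketch of the Q arithmetic. It assumes the textbook relation that series resistance scales the electrical Q by (Re + Rcable)/Re; the driver parameters below (Qes, Qms, and voice coil resistance Re) are hypothetical values I’ve picked to give a system Q of 0.8, not measurements of any real monitor.

```python
# Hedged sketch: how cable resistance shifts a closed-box system Q.
# Assumes the standard relations Qes' = Qes * (Re + Rs) / Re and
# 1/Qtc = 1/Qes + 1/Qms. Driver parameters are hypothetical.

def total_q(qes, qms):
    """Combine electrical and mechanical Q into the system Q."""
    return 1.0 / (1.0 / qes + 1.0 / qms)

def q_with_cable(qes, qms, re_ohms, r_cable):
    """System Q after adding cable resistance in series with Re."""
    qes_shifted = qes * (re_ohms + r_cable) / re_ohms
    return total_q(qes_shifted, qms)

QES, QMS, RE = 1.0, 4.0, 6.5               # hypothetical driver: Qtc = 0.8
print(total_q(QES, QMS))                   # driven from near-zero impedance
print(q_with_cable(QES, QMS, RE, 1.3168))  # with 40' of cheap wire in series
```

With this particular set of made-up parameters the Q lands near 0.92, in the same ballpark as the 0.8-to-0.9 shift described above.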

More serious is the effect on vented box speakers, which include most of the monitors sold for studio use. A vented box is a juggling act between the resonance and Q of the woofer and the tuning of the box, as controlled by the size and depth of the vent. Change the woofer’s Q by inserting that cheap wire and you can easily wind up with a mistuned box: a whopper one-note upper bass resonance (“juke box bass”), a lack of deep bass extension, or both.

Divide and conquer

The other cable effect on monitor performance stems from the fact that a loudspeaker is not a simple load. Take a look at Figure 1, which plots the impedance of a typical loudspeaker system. (To simplify things a bit, impedance is a device’s resistance to alternating current, such as an audio signal.)

This is nominally an 8 Ohm device, but the impedance dips to about 4.8 Ohms at its minimum, rises at the crossover point, hits two big peaks in the bass (corresponding to the resonances of the woofer and the vented box), and also rises at the high frequencies (due to the inductance of the tweeter’s voice coil). Eight Ohms, my foot.

Let me introduce you now to something called a ‘voltage divider’ (Figure 2). This is simply a pair of resistors, one of which is in series with the signal (R1), the other shunted between the signal and ground (R2). “Vin” is the signal level you put into the divider; “Vout” is what comes out. The formula for calculating the output is:

Vout = Vin x R2 / (R1 + R2)

In our monitor hookup, R2 represents the speaker’s impedance at a given frequency, while R1 is the resistance of that long hunk of cheap wire. At one of the bass humps, where the impedance is 25.2 Ohms, Vout = Vin x 25.2 / (25.2 + 1.3168), or Vin x 0.949. This is a loss of 0.45 dB—not much, you say.

But at the lowest point of the speaker’s curve, where the impedance is 4.8 Ohms, Vout = Vin x 4.8 / (4.8 + 1.3168), or Vin x 0.785, a loss of 2.1 dB. This represents an additional 1.7 dB loss compared to the level at the bass hump; you can calculate similar figures for every point in the speaker’s impedance curve.
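The divider arithmetic is easy to script; any differences in the last digit versus the figures above are just rounding.

```python
import math

def divider_loss_db(r_series, z_load):
    """Loss (in dB) of a resistive divider: cable resistance
    in series, speaker impedance as the shunt load."""
    return -20.0 * math.log10(z_load / (z_load + r_series))

R_CABLE = 1.3168                       # 40' round trip of cheap 22-gauge
print(divider_loss_db(R_CABLE, 25.2))  # at the bass hump: ~0.44 dB
print(divider_loss_db(R_CABLE, 4.8))   # at the impedance dip: ~2.1 dB
```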

What you get when you do that is a pale echo of the impedance curve, but now it charts a change in frequency response (Figure 3). In effect, you’re superimposing a fixed and involuntary eq curve on your monitor speakers, and it will be audible.

This is not what you want in the system on which you’ll judge your mixes.

We begin to see one reason powered monitors have become so popular: they’re predictable, as the designer knows exactly what will connect the amplifier to the speakers.

We also begin to see one answer to the question “what’s the best speaker cable?” The answer is “a short, fat one.”

Let’s run the numbers again, but this time with nice, hefty 12 gauge cable. Now the resistance of a 40' run is only 0.13 Ohms, less than 1/10 of the cheap wire’s. And the effect on frequency response is proportionally lower; the change in response between the 25.2 Ohm hump and the 4.8 Ohm dip is now only 0.19 dB. By shortening the cable to 5' (placing the amp between the monitors) this drops to a negligible 0.023 dB.
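Here’s the same arithmetic packaged so you can compare gauges and lengths directly. The resistance-per-foot figures are standard wire-table values; the hump and dip impedances are the ones from the Figure 1 discussion, and the output is the unwanted “eq” the cable imposes between those two frequencies.

```python
import math

OHMS_PER_FOOT = {22: 0.01646, 12: 0.001588}  # standard wire-table values

def response_spread_db(awg, run_feet, z_hump=25.2, z_dip=4.8):
    """Level difference the cable creates between the speaker's
    impedance hump and dip -- the involuntary eq curve."""
    r = 2 * run_feet * OHMS_PER_FOOT[awg]
    loss = lambda z: -20.0 * math.log10(z / (z + r))
    return loss(z_dip) - loss(z_hump)

print(response_spread_db(22, 40))  # cheap wire, 40' run: ~1.7 dB
print(response_spread_db(12, 40))  # 12 gauge, 40' run: ~0.18 dB
print(response_spread_db(12, 5))   # 12 gauge, 5' run: ~0.023 dB
```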

Like I said, short and fat.

A question of atmosphere

While we’re talking about speaker cables: does it do any good to buy cables made from higher grade, oxygen-free copper? Or is this just a scam to part gullible audiophiles from more cash?

Let’s look at copper. It’s a metal that’s easily oxidized; it’s quite happy to turn from pure copper metal to copper oxide. Copper oxide, unlike metallic copper, is not a conductor, but a semiconductor; in fact it was used to make diodes way back when.

Most speaker cables are made up of dozens (or sometimes hundreds) of thin strands rather than single conductors. As current passes through the cable it also jumps from strand to strand—provided the path is clear. A layer of copper oxide on the surface of each strand provides a barrier, and “nonlinearities” can be generated…a long way of saying ‘distortion.’ This phenomenon is called interstrand rectification, and the way to prevent it is to keep oxygen away from the wires. Usually this is taken care of by the cable’s insulation.

Except for one thing: metallic copper, as it comes from most refineries, contains a bit of oxygen. As the wire ages, this oxygen combines with the copper to form the dreaded oxides—not only on the surface of the strands, but within the strands themselves. This means the electrons, as they course through the wire jumping from crystal to copper crystal, must run a gauntlet of copper oxide regions that grows more difficult as the wire gets older.

The cure for that is to use special refining techniques to remove as much oxygen as possible from the copper before it’s drawn into wire. Most higher quality speaker cables (from companies like Monster, Apature, etc.) are now made from oxygen-free copper (which is actually very low oxygen rather than oxygen free, but we’ll let the hyperbole pass). There may be no audible difference between these cables and similar gauge conventional cables when both are new, but I think you’ll hear the difference in five years or so.

With your shield, or on it

Let’s talk about microphone cables now, and particularly about shielding. Contrary to what you may have heard, the shield on a mic cable will not keep out 60 Hz hum.

Oh really? Why not? And if not, why have it at all?

First off, a lot of stuff that we think is 60 Hz hum is actually something else. Much of the time it’s buzz from cheap SCR light dimmers or fluorescent light ballasts. This consists of short, sharp spikes that repeat at the rate of 60 Hz (the standard line frequency of the USA, Canada, and Japan; the rest of the world uses 50 Hz). There’s very little actual 60 Hz content, but there are plenty of harmonics and lots of radio frequency crud, known in the trade as radio frequency interference (RFI). Cable shielding does help screen out this type of RFI.

Another major source of 60 Hz trouble is the vertical sync pulse of television signals, which keeps the picture from rolling. In the interlaced TV systems of the USA, Canada, and Japan this pulse repeats 60 times per second—once per field—with harmonics at 120 Hz, 180 Hz, etc. This signal modulates a radio frequency signal (VHF or UHF, depending on the offending station); if it’s picked up by a mic cable then amplified by a mic preamp, it can show up in the audio signal as hum. Again, cable shielding can help screen out this form of RFI.

But what about real 60 Hz hum?

Induction ceremony

Hum is typically coupled into a microphone cable inductively. Think back to your high school science course—do you remember how a transformer works? Alternating current passes through a primary coil, which induces a similar current in the transformer’s secondary even though there’s no electrical connection between them. The coupling happens because the current in the primary generates an electromagnetic field, which in turn induces a current to flow in the secondary.

When a studio has a hum problem, the effect is the same as in a transformer, but the primary is a device that radiates a 60 Hz electromagnetic field—say, a wall wart or the power transformer in a guitar amp. And the secondary is, unfortunately, your mic cable. The shielding can’t do doodly to stop it. In fact, the only way to block electromagnetic induction is to encase your cable in a conduit made of steel or, perhaps, mu-metal, an alloy that’s particularly effective at stopping electromagnetic fields.

Not practical. We need another way.

A delicate balance

The other way is to use “balanced” inputs on mic preamps. Look at Figure 4a, representing the input of a microphone preamp (free-standing or in a console—it doesn’t matter).

This is a “differential” input; there are two separate signal input terminals on the preamp in addition to a ground connection. The preamp is not sensitive to the individual voltages on the two signal terminals, but only to the difference between them.

Microphones usually operate in “push-pull”; their two signal outputs produce the same signal voltages but in mirror image, so that as one output goes up the other goes down by the same amount. (Notice the two little waveforms next to the wires in the figure.) Because the signal is push-pull it creates a difference in voltage between the two preamp inputs, which is duly amplified by the preamp and sent down the line as audio.

What if identical signals with the same polarity appeared at both inputs (Figure 4b)? Because the signal goes up and down at the same time on both terminals, and to the same degree, there’s never a difference in voltage between the two terminals—and the preamp is designed to amplify only differences, regardless of the absolute level at each input. So the preamp ignores this signal, which is identical at both terminals and is therefore called common mode.

The trick, of course, is to ensure that the signals we want are differential, while the signals we don’t want are common mode. We do that by making the microphones push-pull (differential mode) and by placing the two signal-carrying wires in the mic cable as close to one another as possible. In that way, we hope, any hum that is induced into the wires by stray wall warts will be induced equally in both wires (common mode), and will therefore be rejected by the preamp.
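Here’s a tiny numerical demonstration of the idea, using made-up levels: a small 1 kHz “signal” in push-pull, plus a much larger 60 Hz “hum” induced identically on both legs.

```python
import math

FS = 48000  # sample rate for the toy waveforms

def differential_input(sig_amp=0.01, hum_amp=0.5, n=480):
    """What a differential preamp sees: the push-pull signal
    survives, while the common-mode hum cancels."""
    out = []
    for i in range(n):
        t = i / FS
        sig = sig_amp * math.sin(2 * math.pi * 1000 * t)
        hum = hum_amp * math.sin(2 * math.pi * 60 * t)
        v_plus = +sig + hum      # "push" leg plus induced hum
        v_minus = -sig + hum     # "pull" leg plus the same hum
        out.append(v_plus - v_minus)   # the preamp amplifies only this
    return out

peak = max(abs(v) for v in differential_input())
print(peak)   # twice the signal amplitude; the 0.5 V hum is gone
```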

It behooves us to make the two induced signals as close to identical as possible to maximize the preamp’s rejection. This means keeping the wires close, usually twisted around one another inside the shield. But what happens when the hum gets really bad?

I’ll make you a star, baby

A couple of decades ago, manufacturers introduced a new type of audio cable called Star Quad. This contained four signal conductors, two for push and two for pull, braided together rather than simply twisted. The manufacturers claimed that the combination of more conductors and tight braiding yielded much more effective rejection of hum.

Boy, they weren’t kidding. I’ve been in situations where the hum field was bad enough that even the best conventional cables produced audio that sounded like hives full of droning bees, but Star Quad cables stopped the hum dead. If anything, the makers’ claims were understated. The stuff is something of a pain to work with if you solder your own cables, but the results justify the effort (and there are Star Quad cables sold already made up). Originally Star Quad cables were only produced by Canare, but now Mogami and Belden make them as well. In severe hum fields a Star Quad cable may prove the difference between usable recordings and junk.

Onward and upward

Most of what I’ve talked about so far is uncontroversial; cable resistance, Star Quad design, and the like are accepted concepts among all factions of audio. Now, however, it’s time to go into the twilight zone where the wild arguments roam, where the wire effects aren’t accepted by everyone as real. Audiophiles claim huge differences between cables, waxing rhapsodic over sounds other listeners don’t hear at all. The same battle goes on in the recording community; workshops by audio pros at AES meetings have often pooh-poohed the idea that competently designed cables sound different.

What’s going on?

I need to stake out my own position here: I do hear differences from cable to cable, including microphone cables, that I don’t think are in my imagination. I also hear differences when I use “audiophile” cables as interconnects, although I often don’t like what I hear. Again, what’s going on?

I’m a rationalist; I don’t think audio phenomena have mystical causes or arise without a cause. But I also don’t think we’ve discovered every possible measurement that can be made on audio equipment, so the fact that two devices measure identically on a particular test does not mean that there is no difference. (To take an absurd example, just because two preamps are 19" wide and 1U high does not imply that they should sound alike—size is not necessarily correlated with sound quality.)

Our ears are exquisite measuring devices, and they sometimes seem to be measuring quantities that we haven’t yet figured out how to measure electronically.

Particularly when dealing with microphone cables and interconnects, I suspect most conventional measurements are irrelevant to perceived sound quality. Neither are the explanations of cable manufacturers particularly helpful; either they’re written in purple, mystical prose with no technical basis, or the technical discussion is most charitably described as dubious. I’m persuaded that most cable manufacturers, or at least the people who write their ad copy, have no clue why the cables sound the way they do.

That doesn’t mean, however, that the differences don’t exist. It also doesn’t mean that there aren’t rationally describable causes for these effects. I’ll spend most of this article looking at possible causes, with the warning that these remain thoroughly speculative until and unless there are data to validate or disprove them.

Onward, into the fray!

The oddness of audiophiles

Let me start by explaining a statement I made a few paragraphs ago, when I mentioned not liking what I hear from many audiophile cables. To understand what I mean, you need to know something about the audiophile subculture as it has developed over the last four decades.

Modern audiophilia began, to all intents and purposes, with a man named J. Gordon Holt. After reviewing equipment for several mainstream audio magazines, he founded a small quarterly called Stereophile in the early 1960s. He reviewed audio equipment based not on its measurements in the laboratory, but instead on the way it sounded to his well trained ears.

Holt had good qualifications; he’d been making high quality recordings since the first tape recorders became available (most of the discs folksinger Richard Dyer-Bennet released in the 1950s and 1960s were recorded by Holt, and they’re among the most natural sounding recordings I’ve heard). He always judged gear based on the closest possible approximation to live sound, and attended live, unamplified concerts as often as possible. (It also didn’t hurt that he was and is a superb writer with a remarkable gift for integrating the esthetic and the technical.)

Stereophile was somewhat erratic about its publication dates in the first decade of its existence, and in frustration a group of readers led by Harry Pearson founded another magazine, The Absolute Sound. Its focus was similar—they reviewed equipment and recordings based on sound, not measurements—but its prose was more florid; they would refer to a “chocolate” midrange, and an amplifier was once described as sounding like a peppermint drop after an ice cream sundae. They also brought in a different emphasis, because they often spent more time discussing the equipment’s spatial characteristics than its tonal qualities.

Let me amplify on this, if you’ll pardon the expression. During the 1970s, as electronic and loudspeaker designs matured, it became possible for an audio system to reproduce a remarkable degree of the “space” contained in a good recording. When everything worked properly it was as if someone had sawed off the end of your listening room and sewed it to a concert hall, or a jazz club, or whatever the venue was that contained the original performance. Where once a stereo image contained only a left-right panorama, the better gear suddenly let you hear depth.

I remember the first system I heard that could do this, some KEF reference monitors driven by a Conrad-Johnson tubed amp and preamp. The sensation was jaw-droppingly remarkable, and most beguiling.

The Absolute Sound fell head over heels for this new sound and its spacious possibilities; reviews went on for paragraph after descriptive paragraph about the “soundstage” available from a given piece of gear, with somewhat less attention paid to the tonal characteristics (tight vs. loose bass, sweet vs. edgy treble, etc.). Eventually Stereophile, under new ownership and with minimal participation from Holt, similarly changed focus, although to a lesser degree.

The result was bizarre: equipment, particularly loudspeakers, received rave reviews on the basis of its splendid soundstage, despite the fact that the instruments it reproduced sounded nothing like any instruments ever manufactured. I remember one pair of “audiophile” speakers that shall remain mercifully nameless, with no bass below 100 Hz, a squawky midrange, a roller coaster frequency response, and treble that sounded like someone had placed a toilet paper tube in front of the tweeter. Boy, could they image, though—and they got raves in the audiophile press.

Getting back to cables (I’m sure you were wondering when I would), the “space” revolution hit the world of audio cables as well. Audiophile wire, both for interconnects and speakers, imaged better and better—but tonal characteristics sometimes bordered on the bizarre. High frequencies in particular took on a “sizzle” that could become singularly fatiguing after a few hours.

If you’ve read many of the pieces I’ve pounded out for this magazine, you’ll know that sizzly treble is one of my bugaboos. Real instruments (at least real acoustic instruments) don’t have it, and neither do singers who still have all their teeth. It’s present in lots of gear, though (particularly in this digital age), and it tends to multiply as you move down the audio chain.

Suburban legends

But how can a plain old piece of wire affect sound, anyway? Before I get there, let me talk about a few ways in which audio cables probably don’t affect the signals passing through them. In this discussion I’m mostly talking about microphone cables and interconnects (the wires that connect line-level inputs and outputs to one another), rather than speaker cables.

First, let’s look at resistance. In last month’s installment I showed how the series resistance of a too-thin speaker wire could have unfortunate effects on the speaker’s sound. The same argument, however, doesn’t normally apply to microphone and line-level signals.

A typical microphone cable might contain 100’ of twisted pair 22 gauge conductors. My wire table (Radio Amateur’s Handbook, 1956 Edition) says this gauge of wire has 16.46 milliohms of resistance per foot. There are about 220 feet of cable in all (two conductors, and I’m allowing an additional 10% for the twist). That works out to about 3.62 ohms series resistance for both conductors.

A typical microphone preamp will have a minimum input impedance of 1000 ohms. This forms a voltage divider (see last month’s article) with the cable’s series resistance, but the total loss is only 0.03 dB, which is inconsequential. Even if the preamp’s input varies from 500 ohms to 2000 ohms (unlikely but conceivable), the maximum frequency response deviation would be negligible—about 0.047 dB.

Line inputs typically have an input impedance of 10 kilohms (10K), so the loss from a 100’ cable running to a line input would be 0.003 dB, which is pretty close to zilch. So cables’ DC resistances, at least in reasonable project studio lengths, should not be a factor.
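Scripting the same checks, using the same 16.46 milliohms-per-foot figure and the 10% twist allowance:

```python
import math

def loss_db(r_cable, z_input):
    """Divider loss of cable resistance against an input impedance."""
    return -20.0 * math.log10(z_input / (z_input + r_cable))

# 100' of 22-gauge twisted pair, both conductors, +10% for the twist:
r_mic_cable = 2 * 100 * 1.10 * 0.01646   # about 3.62 ohms

print(loss_db(r_mic_cable, 1_000))    # mic preamp input: ~0.03 dB
print(loss_db(r_mic_cable, 10_000))   # line input: ~0.003 dB
# Worst-case response spread if the preamp's input impedance
# swings between 500 and 2000 ohms:
print(loss_db(r_mic_cable, 500) - loss_db(r_mic_cable, 2_000))  # ~0.047 dB
```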


There’s another type of impedance that some cable makers assert is important in audio: the “characteristic impedance” of the cable. This is a bit esoteric; it derives from “transmission line” theory, usually a province of radio engineering, and it refers to an impedance affected by the cable’s inductance and capacitance. In a transmission line, characteristic impedance is matched from beginning to end: source, connector, cable, and termination impedances are all the same. If this is done properly, the signal passes smoothly and cleanly without internal reflections or signal losses.

Transmission line theory, however, really only applies when the length of the cable approaches one wavelength of the signal; for shorter lengths of cable none of this stuff makes a difference. Even the most conservative engineers ignore transmission line effects for cables shorter than 1/10 wavelength.

So what’s the wavelength we’ll be dealing with in audio? Remember that these are electrical (really electromagnetic) waves we’re looking at, not acoustic waves. Electromagnetic waves (light, radio, electrical signals) propagate at the velocity “c” (as in E = mc²), which is 300,000 km/sec in a vacuum, somewhat slower in other materials. Let’s be conservative and say that the signal in wire travels at 60% of its speed in vacuum, or 180,000 km/sec.

The highest frequency we’re likely to deal with in audio is about 40 kHz, the bandwidth of a good analog reel-to-reel recorder or a 96 kHz digital system, and the rolloff point of some wideband microphones. At 40 kHz the wavelength in wire of a single cycle is 4.5 km; the higher the frequency, the shorter the wavelength, so in practice this is the shortest wave we’ll be generating. We said that transmission line effects don’t matter in cables less than 1/10 wavelength, so in practice we shouldn’t have to worry about any cable less than about 450 meters long—about 1500’, or a bit over 1/4 mile.
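The wavelength arithmetic in code form; the 60% velocity factor is the same conservative guess used above.

```python
C_KM_PER_S = 300_000          # speed of light in vacuum, km/s
VELOCITY_FACTOR = 0.60        # conservative guess for signal speed in wire

def fuss_threshold_m(freq_hz, fraction=0.10):
    """Cable length (meters) below which transmission-line effects
    are conventionally ignored (1/10 of a wavelength)."""
    wavelength_m = C_KM_PER_S * 1000 * VELOCITY_FACTOR / freq_hz
    return wavelength_m * fraction

print(fuss_threshold_m(40_000))     # 40 kHz audio: 450 m (~1500')
print(fuss_threshold_m(6_000_000))  # a 6 MHz digital signal: 3 m
```

Note how quickly the threshold collapses once the bandwidth climbs into the megahertz range; that’s why the rules change for digital cabling.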

Most of us will never deal with audio cables that long unless we’re wiring a large concert hall or broadcast facility. So in practice, characteristic impedance won’t be an issue for our audio cables.

Note, please, that this does not apply to cables carrying digital signals! They have bandwidths of several megahertz, and transmission line issues become vital—hence the strict specification of 75 ohm characteristic impedance for S/PDIF interconnects and 110 ohm ditto for AES digital connections. Deviate from these specs and all sorts of nasty things happen to your signal—which is why you can’t just use a cheap RCA audio cable to hook up digital signals. You may recall that this came up in last issue’s article on digital networking.

All right, we’ve swept the irrelevancies out of the way; what are some things that could affect audio signals as they pass through the wire?

Never say dielectric

An audio cable consists of wires (one, two, or maybe three), each wrapped in insulation, with some sort of shield surrounding them. These are encased in a jacket, usually insulating. The shield may be braided, spiral-wrapped strands, metal foil, or some combination of these. In the early days of audio the insulation was cloth or rubber (some audiophiles still swear cloth wire sounds best); these days most cable uses one or another type of plastic.

Electrically speaking, a cable is a capacitor: two conductors separated by a “dielectric,” which is essentially a fancy name for an insulator. That capacitor is connected across the output of the microphone or other audio source (preamp output, for example), and acts as a load on that source.

All dielectrics are not alike, and so the capacitors made from them (including cables) are not all alike either. It can be shown in laboratories that capacitors made from certain dielectrics exhibit a form of misbehavior called “dielectric absorption.” When a signal, particularly an asymmetrical signal of the sort common in real-world audio, passes through the capacitor, some of the electrical charge is retained by the dielectric, to be released a few milliseconds later. It’s as if the dielectric is sticky and doesn’t want to let the electrons go. As a result the sound becomes smeared, with mushy and muddy bass and sometimes poorer imaging as low-level localization cues are smeared over.

This misbehavior has been known for decades. Indeed, dielectric absorption was a major concern in the design of early analog to digital converters. Walter Jung, in a series of brilliant experiments in the 1980s, showed that capacitors of identical value but made from different dielectrics performed differently when tested with asymmetrical signals, and that the performance differences matched the anecdotal reports of audiophiles quite nicely.
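Dielectric absorption is often modeled as a slow R-C branch hidden in parallel with the main capacitance. Here’s a toy simulation of the classic bench demonstration (charge a capacitor, short it briefly, open the terminals, and watch the voltage creep back up); all component values are hypothetical, chosen only to make the effect visible, not measured data for any real dielectric.

```python
# Toy lumped model of dielectric absorption ("soakage").
# Hypothetical values: main cap plus a slow hidden branch
# holding 5% of the capacitance behind a large resistance.
C_MAIN = 1.0e-6     # main capacitance, farads
C_DA = 0.05e-6      # hidden "absorbed charge" branch
R_DA = 100e3        # slow branch resistance, ohms
DT = 1e-4           # simulation time step, seconds

def voltage_after_short():
    v_da = 1.0                       # long soak: both branches at 1 V
    # Short the terminals for 1 ms; the hidden branch can only
    # discharge through R_DA, so most of its charge survives.
    for _ in range(int(1e-3 / DT)):
        v_da -= DT * v_da / (R_DA * C_DA)
    # Open circuit: the trapped charge flows back into the main
    # capacitance and the terminal voltage creeps up from zero.
    v_main = 0.0
    for _ in range(int(0.1 / DT)):
        i = (v_da - v_main) / R_DA
        v_main += DT * i / C_MAIN
        v_da -= DT * i / C_DA
    return v_main

print(voltage_after_short())   # a few percent of the original 1 V
```

In a cable, that same sticky charge sits across the signal path, returning a delayed fraction of the waveform a few milliseconds late.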

Lab tests have shown that there is a distinct ranking in capacitor dielectric materials, again correlating well with what listeners have observed. Of commonly found cable insulators, the ranking (in increasing order of quality) goes rubber < polyvinyl chloride (PVC) < polyethylene < polypropylene < Teflon.

So why don’t we all use Teflon insulated cables and be done with it?

Teflon has one minor problem and one major one. The minor one is stiffness; cables made with Teflon insulation aren’t particularly flexible, a problem for microphone cables but not necessarily an issue for interconnects, particularly if they’re behind the rack and out of sight.

The major problem stems from Teflon’s high melting point (which, of course, is why it’s so nice to solder around). When wire is made, typically drawn copper is passed through a die where molten insulation is added; the insulation solidifies onto the wire, and Bob’s your uncle. With Teflon, however, the insulation must be so hot that it accelerates the process of oxidation in the copper (see Part 1 of this article, last issue), to the detriment of the conductor. Accordingly, it has become standard practice to build Teflon insulated cables using silver-plated (or solid silver!) wire, rather than plain copper.

As you’d imagine, this can get expensive—especially for solid silver conductors. But there’s another issue with silver-plated wire, and this is one of those bits of voodoo for which I’ve never heard a plausible explanation: most silver wires sound bright when compared with copper wires. With a good dielectric they can image wonderfully, but the high frequency sizzle I mentioned earlier is almost always present to some degree.

Why? I don’t know, and I’m persuaded that no one else knows either. But I’m prepared to swear it’s there, and so are legions of audiophiles. (The difference being that many of them like the sound, and I don’t.)

Most studios (commercial or project) don’t use much Teflon cable; good microphone cables are usually polyethylene (Gotham) or PVC (Canare, Mogami, Monster), while a few interconnect cables use polypropylene insulation (which also tends to be stiff); Belden 8450 and 8451 are common examples.

Is this a microphone or a cable?

In chasing down possible sources of audible differences between cables, one that’s seldom considered is microphonics. It’s not well known, but in some circumstances cables can act as microphones, albeit very weak ones.

If you connect a cable to a dummy source at one end and a mic preamp at the other, then hit it with a wooden mallet, you may see a noticeable glitch emerging from the preamp. Some cables are, of course, worse than others in this regard. There’s one stiff cable with foil shielding that’s routinely laid under carpeting for burglar alarm installations; stepping on the cable generates enough signal to trigger the alarm.

I think it’s barely possible that differences in microphonic tendencies may provide another mechanism for audible cable differences, although I’d be hard-pressed to figure out how to test this. I actually use the burglar-detecting cable mentioned above in interconnects (because it has exceptionally neutral tonal characteristics), and I wonder whether its less-than-pinpoint imaging may be related to microphonics, which could smear low-level perceptual cues.

Here I freely confess to having entered the realm of pure speculation, but I think it may be worth pursuing as we hunt for rational causes to explain the messages we get from our ears. Here’s a less speculative issue.

Soap box time

If you’ve read my articles over the past several years, you know that one of my pet hobby horses is radio frequency interference (RFI). It’s been shown in laboratory tests that even when obnoxious disc jockeys and taxi dispatchers aren’t audible in the mix, the RFI that carries them can intermodulate with audio signals, causing distortion and “hard sounding” highs. Keeping RFI out of the audio path is thus a vital part of creating a clean signal chain.

I’m gratified to see audio manufacturers picking up on these concerns (which are not, of course, original with me). Reviewing the Apogee Rosetta A/D converter and the new Mackie XDR preamps, I noticed that both use heroic levels of RFI proofing and shielding, both in the audio input circuits and in AC connections. I’m persuaded that part of the reason these components sounded better than their competitors was that they rejected RFI more effectively.

How bad is the problem? In your house, while you’re recording you might have a fluorescent light burning in the basement, a computer tracking the audio, a taxi in the street calling HQ, several radio and TV stations blanketing the area with pap, a neighbor using a cell phone, her kids playing with their radio-controlled fire truck, and Uncle Louie’s parole tracking anklet. (Okay, maybe you live in a duller neighborhood than mine.)

Cables differ in the effectiveness of their RFI shielding. Last weekend I had occasion to run sound from an external mic preamp to an unbalanced board input, and heard a slight hint of radio garbage. Curious, I switched from a braided shield cable to a foil shielded cable with 100% coverage—and the noise immediately got quieter. Easily heard difference. (We were recording about a mile from a 50kW FM station’s tower.)

Unfortunately, foil shields aren’t particularly flexible, so they’re not appropriate for microphone cables or other cables that will be moved around a lot. Instead, they’re usually used in semi-permanent setups, such as wiring harnesses that will be left in place most of the time.

Among microphone cables, one manufacturer—Gotham—has found an unusual solution to the problem of combining flexibility with maximum shielding. In addition to the usual shield (they use spiral wrapped strands, very easy to undo and solder) they employ a conductive plastic material for their outer sheath, rather than the usual insulator. This material—they call it a “Reusen layer”—provides extra shielding without compromising the flexibility of the cable. In my experience Gotham cables are as good at rejecting RFI as Star Quad cables are at hum. Now if someone would combine the two ideas...

Warning: the unusual grounding scheme of Gotham cables can occasionally cause grief. Certain microphone preamps, notably including the Rolls/Bellari series, don’t like it at all, buzzing like fruit flies when a Gotham cable is plugged in. For those preamps another brand of cable is definitely in order.

I find two things worthy of note in talking about cables and RFI. The first is that one of the major differences I hear between cables is the “hardness” or “softness” of the treble—precisely the sort of effect one might expect from differing amounts of RFI leaking into the gear and causing intermodulation effects. This isn’t scientific proof by any means, but I think it’s worth some attention.

The other is an observation about my fellow engineers. To be blunt about it, I’ve found that the degree to which professional engineers accept audible differences between cables is inversely proportional to their age and status within the profession. The older and more successful you are, the more likely you are to dismiss the notion of cable differences as nonsense. (Although a significant number of old-line engineers have broken with conventional wisdom and switched to audiophile-influenced cables, such as Monster’s professional line.)

Is this rank fuddy-duddyism—old dogs with ears too sclerotic to learn new tricks? I doubt it; many of these folks’ careers have stretched from big bands to hip-hop, and you don’t do that if you’re a stick-in-the-mud. But there’s something technical that’s worth pointing out.

Older and more successful engineers spend most of their time working in very high-end studios, using very high-end equipment from manufacturers like Neve, Studer, and their cohorts. Now, there are many differences between that level of gear and the stuff people like you and me use, but one of the biggest is that most of it is transformer-coupled. It was and is standard in high-end studio gear for mic preamps, board inputs and outputs (even aux busses), compressors, EQs, whatever, to be connected through balanced, floating, high-quality transformers.

And what’s one important difference between transformer-coupled equipment and transformerless? The former is far more resistant to RFI; a good input transformer retains RFI rejection well up into the FM region, whereas transformerless gear, even balanced, is often susceptible. I find that provocative, to say the least.

Odds, ends and personal tastes

A few bits worth mentioning: for a while, so-called “Litz” wire had a vogue in the audiophile community. Litz wire consists of multiple, separately insulated thin strands that are braided together in complex ways. It was designed to have extremely low inductance, allowing it to pass high frequencies (including radio frequencies) more easily.

It did, all right; the combination of low inductance and high capacitance in the Litz speaker wires that proliferated in the 1970s was a tricky load that caused the outputs of some amplifiers to become unstable and oscillate at high power, quickly frying some expensive audiophile tweeters. There aren’t many Litz speaker cables any more, but Litz wire still shows up in some pricey interconnects and tone arm wires (for those too young to remember tone arms and turntables, we’ll talk about them some other time).
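A back-of-envelope calculation suggests how this goes wrong. The numbers below are pure assumptions for illustration (not measurements of any real amplifier or cable): a highly capacitive cable can resonate with an amplifier's output inductance just above the audio band, exactly where a marginal feedback loop is most inclined to oscillate.

```python
import math

# Hypothetical values for illustration only; neither is a measurement.
L_out = 2e-6     # assumed amplifier output inductance, henries
C_cable = 10e-9  # assumed total cable capacitance, farads (high, Litz-style)

# The series inductance and shunt capacitance resonate at
# f = 1 / (2 * pi * sqrt(L * C))
f_res = 1 / (2 * math.pi * math.sqrt(L_out * C_cable))
print(f"{f_res / 1e6:.2f} MHz")  # prints "1.13 MHz"
```

A resonance around 1 MHz is invisible on a music signal but sits right where some amplifiers of that era had little remaining phase margin, which is consistent with the fried-tweeter stories.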

Is Litz wire audibly different from more conventional configurations? In my experience, it is (and as usual, I don’t like it). Why this should be, I freely confess to not knowing.

In general, audiophile cable manufacturers make much of the physical configuration of their cables’ conductors. Unfortunately, most of them can offer no scientifically valid reasons why their configuration might be better than another maker’s, so I remain agnostic. I will report that I have heard some good cables with unusual configurations, and some equally good ones that are as conventional as a Ford Escort.


I mentioned personal tastes; I might as well lay out my own preferences here so you’ll know my prejudices and can allow for them. Can I hear differences? Yes, although they’re low-level rather than dramatic; I’d say the audible differences are less than those produced by different preamps, and way less than using different microphones, but particular cables do add their own flavor to the mix.

As I’ve said before, I like my overall sound to be neutral, without a hyped or tizzy top end, and while I certainly don’t mind good imaging, I place tonal honesty and lack of offense first and foremost. My choices in cables stem from those tastes.

For microphones, my standard cables are Gotham (when I bought mine they were sold as Neumann cables). They’re ever so slightly softer than cables like Canare and Mogami, which I find useful in taming the slight hardness still endemic to digital recordings and condenser microphones, and of course they’re highly resistant to radio garbage. I also keep several Star Quad cables (from Canare, because I have a handy local source for it in bulk) in the kit, just in case I’m faced with difficult hum problems.

Interconnects are usually made up from Belden 8450, a cable with solid-conductor internal wires rather than stranded (much easier to solder), polypropylene insulation, and foil shielding with an internal drain wire. It’s not something I’d want to flex hundreds of times, but for wires that don’t get a lot of abuse it’s reasonable. The insulation melts awfully easily, though, making soldering trickier. The tonal quality is quite neutral, with tight, clean bass and unzippy highs. Speaker cable is Monster Powerline III, which I don’t think they make anymore, but it still sounds fine on my main listening system.

What I’d love to see is a cable combining aspects of all my favorites: Star Quad construction à la Canare/Mogami, a conductive outer layer and internal ground wire like the Gotham cables, and if possible polypropylene insulation. It should also have gold XLR connectors already soldered on, so I don’t have to—and it should come in decorator colors for visual delight and channel identification. Oh, and it should be available wired unbalanced, but using the same cable.

Anybody out there want to negotiate a patent pool?

Winding up

(Well, what else would you do with cables?)

I’d be remiss in ending this article without reminding you that the way you handle cables makes a big difference in their reliability, if not their sound. Do wind cables properly: hold the cable in your left hand, connector pointing toward you, form a loop by rolling the cable between the thumb and finger of your right hand, then lay it into your left hand. Make another loop and lay it gently over the first; if it’s kinked, go back and roll some more, until it loops smoothly and easily. When you’re done, tie the coiled cable neatly with a Velcro tie wrap rather than twisting one end around the rest. And NEVER “elbow wrap” cables—it puts a permanent kink in them and drastically increases the stress on inside conductors, leading to early failure.

Also, if you make up your own cables, use good connectors—Neutrik or Switchcraft, and it’s probably a good idea if they’re gold-plated to minimize oxidation problems. Treat all connectors every few months with a contact cleaner/preservative combination such as Caig’s DeOxit and PreservIt, or their ProGold treatment for gold connectors.

Hopefully this discussion leaves you feeling like you’re in the loop. I’ve tried to conduct it properly to the best of my capacity, although I know I’ll meet some resistance. Oh well, you can always upbraid me.

Or call out the coppers.

Paul J. Stamler was doing fine until the last paragraph of this article, which got the Editors so badly wired that they cut him short. It came as a terrible shock.

