
Ask an Expert! Amplifier Wattage

Posted on January 3, 2013 at 3:25 PM

I’m sure that a lot of readers have a variety of A/V questions. Like which is the best cable to use? Or what is the best surround format? Or why was my cable box apparently forged in the very bowels of Hell? Whatever.


I received an e-mail query from a customer the other day, and when my standard answer of, "Shut up! You're stupid!" didn’t seem to satisfy his curiosity, I thought I would turn to some industry experts to provide a more, umm, expert opinion on the matter.


Here was his question:


"I do have a question that is not related to my equipment. It was part of a recent debate. I gave my son my old receiver. He installed it and played a movie and said he heard popping sounds from his speakers and was convinced the receiver was too much for his speakers and that he had damaged them . It only happened on one movie. Personally I think he's like his old man and will use any excuse to upgrade his system components. But, my question is: If I attach a 100 watt amp to speakers that are advertised for 75 watts and play the amp at moderate to low levels will the speakers be damaged?


Amplifier wattage is one of those things that seems to trip a lot of people up. Also, it is one of those numbers – like contrast ratios on a TV – that is frequently inflated by marketing.


When considering amplifier wattage, the first thing you need to realize is that not all watts are created – or rather measured – equally. Some manufacturers will rate an amplifier with 1 channel driven, while others will rate it with ALL channels driven (a much more robust way of measuring). Further, some will rate at only one frequency – say 1 kHz – while others will measure across the full 20 Hz – 20 kHz range; again, a much more revealing test. Some manufacturers – cough, Pioneer, cough – rate amps at 6 ohms (yielding a higher number) rather than at the far more common 8 ohms.
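
To see how much those test conditions matter, here's a quick sketch (all the numbers are made up for illustration, not pulled from any real receiver's spec sheet) of how one and the same amp can honestly wear three very different wattage badges:

```python
# Hypothetical amplifier, for illustration only -- not any real product's specs.
# Say the output stage can swing about 28 V RMS with one channel driven, but
# the power supply sags to roughly 25 V RMS with all five channels working.
V_ONE_CHANNEL = 28.0    # volts RMS, single channel driven
V_ALL_CHANNELS = 25.0   # volts RMS, all channels driven (supply sag)

def watts(v_rms, impedance_ohms):
    """Continuous power into a resistive load: P = V^2 / R."""
    return v_rms ** 2 / impedance_ohms

# The very same amp, three different spec-sheet numbers:
print("1 ch driven into 6 ohms:  ", round(watts(V_ONE_CHANNEL, 6)), "W")   # ~131 W
print("1 ch driven into 8 ohms:  ", round(watts(V_ONE_CHANNEL, 8)), "W")   # ~98 W
print("All ch driven into 8 ohms:", round(watts(V_ALL_CHANNELS, 8)), "W")  # ~78 W
```

Same hardware, three very different numbers for the marketing department to choose from.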


It takes power to create wattage, and generally a quality power supply weighs. A lot. As a "not all watts are created equal" example, consider this: We used to have two amplifiers in our showroom; one was a Pioneer receiver rated at 5 x 150 watts and selling for around $399. The other was a Lexicon rated at 5 x 125 watts and selling for around $4,000. I would have people lift the two components – the Lexicon outweighed the Pioneer by MANY pounds – and it was clear that the wattage rating was a crude – and totally inaccurate – means of comparing the two.


My actual reply to my customer was that you are WAY more likely to damage a speaker by UNDER powering it than by OVER powering it; thus a 25-watt receiver is far more prone to blowing speakers than, say, a 200-watt one, regardless of the speakers it is connected to. The reason is that a low-powered amplifier, when driven to loud volumes, will start to distort much sooner, and it is this distortion that damages speakers.


In fact, a high-powered amp playing cleanly will rarely damage a speaker. You could likely connect a 1,000-watt amp to those 75-watt speakers and not have any trouble. (Unless you played it at deafening volume levels for an extended period of time, where you could literally blow the driver out of its surround...)
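
And if you want to see why that distortion is the tweeter-killer, here's a hedged little experiment (the 1 kHz tone and the 3x overdrive are arbitrary choices, purely for illustration) showing what happens when an amp runs out of voltage swing and flattens the tops of the waveform:

```python
import numpy as np

# Compare a clean 1 kHz tone with the same tone hard-clipped, the way an
# underpowered amp clips when it runs out of voltage swing and the waveform
# tops flatten into something square-ish.
fs = 48_000                                   # sample rate, Hz
t = np.arange(fs) / fs                        # one second of signal
clean = np.sin(2 * np.pi * 1000 * t)          # 1 kHz tone right at the amp's limit
clipped = np.clip(3.0 * clean, -1.0, 1.0)     # driven 3x too hard, then clipped

def hf_energy_fraction(signal, cutoff_hz=5000):
    """Fraction of a signal's energy that sits above cutoff_hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    return spectrum[freqs > cutoff_hz].sum() / spectrum.sum()

print(f"clean tone:   {hf_energy_fraction(clean):.2%} of its energy above 5 kHz")
print(f"clipped tone: {hf_energy_fraction(clipped):.2%} of its energy above 5 kHz")
# The clean tone has essentially nothing up there; the clipped one suddenly puts
# a real slice of its (now larger) total power right where only the tweeter lives.
```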


But then, what do I know? Perhaps there was a better answer? So, I reached out to three industry know-it-alls – a trio of Sound + Vision experts, Geoff Morrison, Brent Butterworth and Danny Kumin – all of whom have far more amplifier testing experience than I do. Here's what they had to say…


Geoff Morrison: “Unfortunately, using an amp with a higher wattage rating than recommended is a lot like dividing by zero. A quantum hole forms at the speaker terminals, spreading along the internal wiring, splitting in two at the crossover (or more, depending on the design), and growing to envelop the entire known universe.


That could happen, or more likely, nothing will.


A low-wattage amp driven at high volumes is way more likely to cause damage than the opposite. In fact, at most listening volumes you're only using a watt or two, at best. There are high-end audiophile amps rated at 1,000 watts, and you could drive $100 bookshelves with them if you wanted to. Though why you'd want to...”
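
Geoff's "watt or two" line sounds flip, but some back-of-the-envelope math backs him up. Here's a rough sketch (the 88 dB sensitivity and the 3-meter seat are my assumptions, not anything Geoff specified):

```python
import math

# Back-of-the-envelope check on "a watt or two." The sensitivity and distance
# below are assumptions for illustration only.
SENSITIVITY_DB = 88.0   # dB SPL from 1 watt, measured at 1 meter (typical bookshelf)
DISTANCE_M = 3.0        # listening distance in meters
TARGET_SPL_DB = 80.0    # a reasonably loud average in-room level

# Free-field approximation: SPL falls about 20*log10(distance) dB from the 1 m spec
# (a real room gives some of that back, so this errs on the power-hungry side).
db_above_one_watt = TARGET_SPL_DB + 20 * math.log10(DISTANCE_M) - SENSITIVITY_DB
watts_needed = 10 ** (db_above_one_watt / 10)   # +10 dB of SPL costs 10x the power

print(f"About {watts_needed:.1f} W average for {TARGET_SPL_DB} dB SPL at {DISTANCE_M} m")
# ~1.4 W -- the rest of a 100 W (or 1,000 W) amp is just headroom for the peaks.
```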


Brent Butterworth: “What Geoff said. Except for the 1st paragraph.”


Danny Kumin: “Short answer: What Brent and Geoff said.


Except that...most listening volumes you're only using a watt or two, at best.


...Well, kinda.


With a peak-to-average ratio of 20:1 (or more), dynamic music via a 100-watt amplifier at an "average" level of 1 watt will be hitting the clipping point regularly (in the absence of any headroom, i.e., power-supply "sag"). At a few-watts average level it will be clipping many times per second, and in fact this is what, in my worldview, makes amplifiers "sound different": how they clip, and what they do on entering and leaving clipping.


One possible solution -- the one most popular in audiophile circles -- is a megadollar, brutally regulated 1,000-watts-per-channel amplifier, but there are others.


You can play ANY speakers from ANY amplifier if you use common sense. If you run far into clipping for long durations it will a) sound bad, and b) probably fry your tweeters. Why? Clipping produces square-wave-like waveforms, which (for reasons we'd need Prof. Fourier to explain) dump huge amounts of high frequencies at, essentially, full power, to the outputs. Not good.


Shopper: Hey, how many watts will them speakers hold?


Salesman: 500, if you slice 'em real thin."
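
To put some rough numbers behind Danny's peak-to-average point (these are my illustrative assumptions, not his exact math): a 20:1 ratio read as a power ratio is about 13 dB of headroom, and read as a voltage ratio it's more like 400:1 in power, which is why even a few watts of average level can run into a 100-watt amp's ceiling.

```python
# Rough headroom arithmetic -- illustrative assumptions, not Danny's exact math.
AMP_POWER_W = 100.0       # the amp's rated clipping point
AVERAGE_LEVEL_W = 2.0     # "a watt or two" of average listening power
PEAK_TO_AVG_POWER = 20.0  # ~13 dB crest factor, reading the 20:1 as a power ratio
                          # (read as a VOLTAGE ratio it would be 400:1 in power)

peak_demand_w = AVERAGE_LEVEL_W * PEAK_TO_AVG_POWER
print(f"{AVERAGE_LEVEL_W:.0f} W average -> peaks want about {peak_demand_w:.0f} W")
print("Past the amp's limit?", peak_demand_w > AMP_POWER_W)
# 2 W average -> ~40 W peaks: still clean. Push the average to 5 or 6 W and the
# peaks already want more than the 100 W amp can deliver -- hello, clipping.
```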


Have you got a burning A/V question you've always wanted answered? Post it in the comments and I'll do my best to get you an expert reply!

Categories: January 2013, Electronics
