Quick question... seemed logical to me, but the clip light on an amp is pretty much just detecting DC in what should be an AC signal and lights up whenever it sees it, right?
If that's the case, then the clipping indicator should light whenever a signal is clipped, regardless of its origin or gain, right?
The question, boiled down to real-world terms: my amp has a clipping indicator, and I can turn up my head unit to the point just before it clips, but I can crank my onboard sound all the way to the top... and no clipping indication. So either I'm wrong in my assumption about how this works, or I can actually output a clean signal at full volume from my Mac.
Anyone want to sound off?
In your amp, I'm sure the clipping indicator is just a voltage comparator. When they designed the amp, they knew how much input voltage it could take, so they probably built a circuit to monitor the input level; if it ever goes over, the light turns on. It won't really know whether the AC is turning into DC. That's my best guess, because I doubt they spent any more time designing that feature; otherwise they'd have gone for a limiter/compressor circuit instead.
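A minimal sketch of that comparator idea (all names and voltage values here are illustrative assumptions, not anything from the actual amp): the "light" simply trips whenever the instantaneous input exceeds a fixed threshold, with no knowledge of the waveform's shape.

```python
import math

# Assumed maximum clean input voltage (hypothetical figure).
CLIP_THRESHOLD_V = 2.0

def clip_light(samples):
    """True (light on) if any sample's magnitude exceeds the threshold."""
    return any(abs(v) > CLIP_THRESHOLD_V for v in samples)

# A clean sine peaking at 1.5 V never trips it:
clean = [1.5 * math.sin(2 * math.pi * t / 100) for t in range(100)]
print(clip_light(clean))  # False

# A hotter signal peaking at 2.5 V does:
hot = [2.5 * math.sin(2 * math.pi * t / 100) for t in range(100)]
print(clip_light(hot))    # True
```

Note that this fires on level alone, which is why such a detector says nothing about whether the source signal was already distorted before it arrived.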
Originally Posted by GoHybrid
From what I have seen, without any boosting of bass/treble/EQ adjustments over 0db, then you can turn the volume all the way up on the soundcard without clipping.
maybe i'll see if can track down someone at Eclipse and see if i can get an answer for this.... just for *****s. The source is just my mac mini 3.5mm output, and not ever having adjusted it other than volume-wise i think i'll probably be okay. Thanks man.
That output's probably distorting a little. The only way to know is to put a scope on it.
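What a scope would show on a clipped output is flat tops on the waveform. A rough software analogue of that visual check (purely illustrative, with assumed parameter values) is to flag runs of consecutive samples stuck at the signal's extreme:

```python
import math

def looks_clipped(samples, run_length=5, tolerance=1e-6):
    """True if `run_length` consecutive samples sit pinned at the peak level."""
    peak = max(abs(v) for v in samples)
    run = 0
    for v in samples:
        if abs(abs(v) - peak) < tolerance:
            run += 1
            if run >= run_length:
                return True
        else:
            run = 0
    return False

sine = [math.sin(2 * math.pi * t / 1000) for t in range(1000)]
# Clamp the same sine at +/-0.7 to simulate hard clipping:
clipped = [max(-0.7, min(0.7, v)) for v in sine]
print(looks_clipped(sine))     # False
print(looks_clipped(clipped))  # True
```

Unlike a plain threshold comparator, this looks at waveform shape rather than absolute level, so it can catch a signal that was clipped upstream at a lower voltage.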
I got to thinking a little more, and now I'm wondering:
1. If you play a square-wave signal through a speaker, how do you determine how many volts/amps at a given frequency you can push without damaging the speaker? Obviously speakers can handle a very slight level of abuse, but how do you determine that?
2. Which is an easier/more reliable method of determining whether your speaker is receiving DC without an oscilloscope: measuring for constant line voltage, or measuring for constant speaker impedance above nominal?
3. would an audio-specific DC detection tool be a desirable/feasible tool for setting gains? Would this be any more effective than the voltage comparator that Durwood suggests?
4. Lastly, potentially revealing my ignorance on the matter: since certain effects in music (the most obvious being distortion on an electric guitar) can produce an effectively clipped signal, then given a maximally tuned gain on an amplifier, could more distorted genres of music (metal, punk, etc.) be potentially damaging to speakers/amps versus cleaner signals like classical or jazz?
Bear in mind I'm not passing judgment on one style over the other, just remarking on the differences in signal quality you might expect to see.
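On question 1, one back-of-the-envelope fact is relevant (numbers below are assumed for illustration): at the same peak voltage, a square wave delivers twice the average power of a sine into a resistive load, since mean(v²) is Vpk²/2 for a sine but Vpk² for a square wave. That's one reason a fully clipped signal stresses a speaker rated against sine-wave power:

```python
import math

def avg_power(samples, load_ohms):
    """Average power into a resistive load: mean(v^2) / R."""
    return sum(v * v for v in samples) / len(samples) / load_ohms

peak_v = 20.0  # assumed peak voltage
load = 4.0     # assumed 4-ohm speaker (treated as purely resistive)

sine = [peak_v * math.sin(2 * math.pi * t / 1000) for t in range(1000)]
square = [peak_v if v >= 0 else -peak_v for v in sine]

print(avg_power(sine, load))    # ~50 W  (Vpk^2 / 2R)
print(avg_power(square, load))  # 100 W  (Vpk^2 / R)
```

Real speaker impedance varies with frequency and voice-coil heating is cumulative, so this is only the idealized resistive case, but it shows why the same "volume" can be much harder on a driver once the tops flatten out.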