Six or seven years ago was before the CEA-2006 power rating standard, so it was common for amplifiers to be rated higher than their actual output. If the older amp is 15 years old, it would likely be louder than the new one, for related reasons.
Distortion, in technical terms, is the difference between the sound going into the amp and the sound coming out. In simple terms, it's when you turn it up loud and the sound goes fuzzy or breaks up.
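One common way to put a number on that difference is total harmonic distortion (THD): the level of the harmonics the amp adds, compared to the original tone. Here's a minimal Python sketch of that idea; the voltage figures are made-up example numbers, not measurements from any particular amp.

```python
import math

def thd_percent(fundamental_vrms: float, harmonic_vrms: list[float]) -> float:
    """THD: RMS sum of the harmonics the amp adds, relative to the
    fundamental tone, expressed as a percentage."""
    harmonics = math.sqrt(sum(v * v for v in harmonic_vrms))
    return 100.0 * harmonics / fundamental_vrms

# Hypothetical example: 10 Vrms fundamental with small 2nd and 3rd
# harmonics appearing as the amp starts to clip.
print(thd_percent(10.0, [0.5, 0.3]))  # ~5.8% THD -- audibly "fuzzy"
```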
MacMini in an Alfa? - Why not!
BTW, thanks everyone for your input.
I think the key is the input sensitivity! For example, you have a 2 Vrms line output from the sound card, the 50 W amp has 2 Vrms input sensitivity and the 200 W amp has 4 Vrms input sensitivity. The result is that you need twice the voltage gain on the 200 W amp to reach the same loudness! But if your source output matches the amp's sensitivity, this amp will beat the other one; of course the smaller one will distort because it's being driven past its output limit...
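To make the gain-matching point concrete, here's a small Python sketch computing how much extra voltage gain a given amp needs from a given source. The 2 Vrms / 4 Vrms sensitivity figures are just the example numbers from the post above, not specs of any real amp.

```python
import math

def gain_needed_db(source_vrms: float, amp_sensitivity_vrms: float) -> float:
    """Extra voltage gain (in dB) needed for the source to drive the amp
    to full output, given the amp's input sensitivity."""
    return 20 * math.log10(amp_sensitivity_vrms / source_vrms)

# 2 Vrms sound card output driving each amp:
print(gain_needed_db(2.0, 2.0))  # 50 W amp, 2 Vrms sensitivity -> 0 dB (already matched)
print(gain_needed_db(2.0, 4.0))  # 200 W amp, 4 Vrms sensitivity -> ~6 dB (2x voltage gain)
```

In other words, with the same 2 Vrms source the 200 W amp sits about 6 dB short of full output unless you turn its gain up, which is why it can sound quieter out of the box.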
"case": Skoda Superb Elegance 1.8T
Asrock G41MH-GE, E2180 2GHz dualcore, 1024MB DDR2/800, Samsung F1 750GB/7200RPM/32MB, M4-ATX, Lilliput FA1011 HDMI touchscreen, ASUS Xonar DG PCI, homemade Quectel L10 USB GPS