What gets me about the intermodulation effect is that some people want to have it both ways. They are so stuck in the 'there is no difference' camp that they fail to see the self-contradiction in the argument.

On one hand, frequencies above 20 kHz can't be heard at all, so there's no point having them! You can't hear them!

Then on the other hand, frequencies above 20 kHz affect the audible region of the sound (intermodulation distortion), so you can hear them after all, and you'd better make sure they aren't there!

What if the presence of higher frequencies that share a harmonic relation with the audible region causes intermodulation distortion that is pleasing and musical to the ear? And what if the complete absence of this high-frequency information, or alternatively a non-harmonic high-frequency signal (say, some kind of switching or power-supply noise), causes the audible region to be perceived in a less pleasant manner?
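
One way to make that speculation concrete is to look at where second-order difference products land. A toy calculation with made-up numbers (the 1 kHz fundamental, the choice of the 21st and 22nd harmonics, and the 23.7 kHz "switching noise" are all purely illustrative):

    # Second-order intermodulation puts energy at the difference of two frequencies.
    fundamental = 1_000.0                  # an audible 1 kHz note

    # Ultrasonic partials that are exact harmonics of that note (21st and 22nd):
    h21, h22 = 21 * fundamental, 22 * fundamental
    print(h22 - h21)                       # 1000.0 Hz: lands on the fundamental itself

    # An unrelated ultrasonic component, e.g. assumed switching noise at 23.7 kHz:
    noise = 23_700.0
    print(noise - h22)                     # 1700.0 Hz: lands nowhere in the harmonic series

So harmonically related ultrasonic content would fold back onto the note's own harmonic series, while unrelated content would not; whether that difference is actually audible or pleasant is exactly the open question.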



Intermodulation in this case is distortion that wasn't in the source material. It's a product of the playback system's failure to reproduce high frequencies without distorting the lower ones, and it will vary depending on which system the material is played on.
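
A minimal sketch of that mechanism, assuming a made-up, mildly nonlinear playback chain (the 0.1 and 0.05 coefficients are purely illustrative): two tones above 20 kHz go in, and the nonlinearity drops a 1 kHz difference tone, which was never in the source, squarely into the audible band.

    import numpy as np

    fs = 192_000                          # sample rate high enough to carry the ultrasonics
    t = np.arange(fs) / fs                # one second of audio
    f1, f2 = 23_000.0, 24_000.0           # two tones above the ~20 kHz hearing limit
    x = 0.5 * np.sin(2 * np.pi * f1 * t) + 0.5 * np.sin(2 * np.pi * f2 * t)

    # Stand-in for an imperfect amplifier/driver: small 2nd- and 3rd-order terms.
    # A perfectly linear chain (y = x) would add nothing new to the spectrum.
    y = x + 0.1 * x**2 + 0.05 * x**3

    spectrum = np.abs(np.fft.rfft(y)) / len(y)
    freqs = np.fft.rfftfreq(len(y), d=1 / fs)

    # The 2nd-order term creates f2 - f1 = 1 kHz (audible) and f1 + f2 = 47 kHz;
    # the 3rd-order term creates 2*f1 - f2 = 22 kHz and 2*f2 - f1 = 25 kHz.
    audible = (freqs > 20) & (freqs < 20_000)
    peak = freqs[audible][np.argmax(spectrum[audible])]
    print(f"strongest in-band product: {peak:.0f} Hz")   # ~1000 Hz, absent from the source

Feed the same two tones through a better-behaved chain (smaller nonlinear terms) and the in-band product shrinks accordingly, which is why the artefact depends on the playback system rather than the recording.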

Certainly some people find certain kinds of distortion pleasant, but the people arguing for 192 kHz claim increased fidelity, not pleasing distortion, when in fact it is just the opposite on any stereo that introduces these artefacts.



