In the world of video, there is an empirical difference between DVD, Blu-ray, and 4K: each step up is demonstrably visible to the human eye in a way the earlier format could not achieve. But eventually a level of video will be reached where the human eye cannot detect the improvement, because it exists on such a microscopic level; the format would be technically “better” without being effectively better.

With CD vs. SACD (and other high-resolution formats), there are people who argue that the latter are the Blu-rays and 4Ks of the audio world, offering a level of audible detail and richness that CD cannot attain. Then there are those who say CD was already the “4K” of the audio world, in that it allegedly reaches the highest level of detail the human ear can experience, and that newer formats go beyond what you can hear while marketing superior remaster quality as higher audio fidelity.

So the question is not “is SACD technically better, the way 4K is technically better than Blu-ray?”, because that’s tangibly true. The question is: can the human ear hear that difference, or are the superior mastering processes used for SACD releases making people think the format is better, when in fact a Red Book CD would sound just as good if mastered the same way?
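For concreteness, here’s a rough back-of-the-envelope sketch of the numbers involved (standard Red Book figures; the ~20 kHz hearing ceiling is a typical adult value, not a hard constant):

```python
# Back-of-the-envelope: Red Book CD specs vs. the limits of human hearing.
CD_SAMPLE_RATE_HZ = 44_100  # Red Book standard
CD_BIT_DEPTH = 16

# Nyquist: a sample rate can represent frequencies up to half its value.
nyquist_hz = CD_SAMPLE_RATE_HZ / 2  # 22,050 Hz

# Quantization: each bit adds ~6.02 dB of dynamic range (+1.76 dB constant).
dynamic_range_db = 6.02 * CD_BIT_DEPTH + 1.76  # ~98 dB

HEARING_CEILING_HZ = 20_000  # typical adult upper limit, often lower with age
print(f"CD Nyquist limit: {nyquist_hz:,.0f} Hz vs. ~{HEARING_CEILING_HZ:,} Hz hearing")
print(f"CD dynamic range: {dynamic_range_db:.1f} dB")
```

On paper, CD already covers the audible band, which is exactly why the mastering question above matters.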

Have experts weighed in on this? I hope I worded it right. Thanks for any polite answers.

  • Konstantine_13@alien.top

    If I remember my audio engineering classes correctly, I believe 44.1 kHz was chosen for technical reasons, like ease of transferring between popular equipment at the time. It was also Sony who came up with that standard, and there was some back and forth on exactly what rate they would use. 40 kHz is really all you need, since you only have to sample at twice the ~20 kHz ceiling of human hearing (per Nyquist); 44.1 kHz was chosen more for convenience than necessity.
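
    If it helps, the usual explanation writes out as simple arithmetic (the video-recorder figures below are the commonly cited ones; treat them as illustration rather than gospel):

    ```python
    # Why 44.1 kHz specifically: early digital audio was stored on video tape
    # via PCM adaptors, so the rate had to divide evenly into both TV standards:
    # fields per second x usable lines per field x samples per line.
    ntsc_rate = 60 * 245 * 3  # NTSC: 44,100 samples/s
    pal_rate = 50 * 294 * 3   # PAL:  44,100 samples/s
    assert ntsc_rate == pal_rate == 44_100

    # Nyquist says you need at least double the highest frequency you want
    # to capture, so ~20 kHz hearing needs >= 40 kHz; 44.1 kHz clears it.
    min_rate_hz = 2 * 20_000  # 40,000 Hz, the "40 kHz is all you need" figure
    print(ntsc_rate, min_rate_hz)
    ```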