In the world of video, there is an empirical difference between DVD, Blu-Ray, and 4K: each step up delivers a demonstrable improvement to the human eye that the earlier formats could not. But eventually a level of video will be reached where the human eye cannot detect the difference, because it exists at too fine a scale; the format would be technically “better” without being effectively better.

With CD vs. SACD (and other higher-resolution formats), some people argue that the latter are the Blu-Ray and 4K of the audio world, offering a level of audible detail and richness that CD cannot attain. Others say CD was already the “4K” of the audio world, in that it allegedly captures all the detail the human ear can perceive, and that the newer formats go beyond what you can hear while marketing superior remastering quality as higher audio fidelity.

So the question is not “is SACD technically better, the way 4K and Blu-Ray are?”, because on paper that is measurably true. The question is: can the human ear hear that difference, or are the superior mastering processes used for SACD releases making people think the format itself is better, when a Red Book CD would sound just as good if mastered the same way?
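One common way to test the “same master, different format” question is to take a high-resolution source, convert it down to Red Book spec (44.1 kHz / 16-bit), and blind-compare the two. A minimal sketch of that conversion in Python is below; the file names are hypothetical, and the use of soundfile/scipy and the simple TPDF dither are my assumptions, not a standard tool or the “official” way to do it.

```python
# Sketch: downsample a hi-res master (assumed 96 kHz here) to CD spec
# so it can be blind ABX-compared against the original.
import numpy as np
import soundfile as sf
from scipy.signal import resample_poly

audio, rate = sf.read("hires_master.flac")   # hypothetical 96 kHz / 24-bit file

# Rational resampling: 96000 * 147 / 320 = 44100
audio_cd = resample_poly(audio, 147, 320, axis=0)

# Simple TPDF dither (~1 LSB at 16-bit) before truncating the word length
lsb = 1.0 / 32768
dither = (np.random.uniform(-lsb, lsb, audio_cd.shape)
          + np.random.uniform(-lsb, lsb, audio_cd.shape)) / 2
audio_cd = np.clip(audio_cd + dither, -1.0, 1.0)

sf.write("cd_quality_version.flac", audio_cd, 44100, subtype="PCM_16")
```

If listeners can’t reliably tell the two files apart in a blind test, the audible advantage people report is more plausibly the mastering than the format.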

Have experts weighed in on this? I hope I worded it right. Thanks for any polite answers.

  • lollroller@alien.top

    We can argue this forever, but it comes down to: does the playback format equal what was actually recorded?

    If the two are not equal, then there will always exist the possibility that the playback is missing some elements of the recording, and that some people, with some equipment and some systems, could possibly hear the difference.

    But once the playback equals the recording, assuming the speakers are capable of the dynamic range, then there can be no further argument.
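    For a rough sense of the numbers behind that “dynamic range” caveat, here is a small sketch computing the ideal quantization SNR (about 6.02 dB per bit) and the Nyquist limit for two PCM formats. These are textbook theoretical figures only; SACD’s 1-bit DSD stream doesn’t fit the PCM formula, so it is left out.

    ```python
    # Theoretical dynamic range and bandwidth of PCM formats (ideal figures).
    formats = {
        "CD (Red Book)": (16, 44_100),
        "Hi-res PCM":    (24, 96_000),
    }

    for name, (bits, rate) in formats.items():
        dyn_range_db = 6.02 * bits + 1.76   # ideal quantization SNR for N-bit PCM
        nyquist_khz = rate / 2 / 1000       # highest representable frequency
        print(f"{name}: ~{dyn_range_db:.0f} dB dynamic range, "
              f"content up to ~{nyquist_khz:.1f} kHz")
    ```

    Whether any of the extra headroom beyond CD’s ~98 dB and 22.05 kHz is audible on real speakers, in real rooms, to real ears is exactly the argument above.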