How did we get to 192?


When playing high-resolution PCM files, the de facto industry standard seems to be 192kHz, 24-bit. Right? Given a choice between 192kHz and the next candidate down, 176.4kHz, we automatically choose 192kHz.

Why would we do this? Perhaps human nature. We assume higher is better.

But, in fact, there's reasonable evidence that in many cases 176.4kHz is preferred. Much depends on the original recording process.

At the dawn of digital recording, the pro machines were all based on the standard of 48kHz (and later its multiples, 96kHz and 192kHz). To produce usable digital releases on the (then) new CD format, everything had to be rejiggered through a complex compromise, driven largely by the video recorders used to store early digital masters, that resulted in 44.1kHz for CDs.
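
For the curious, here's a quick sketch of the commonly cited arithmetic behind that compromise, assuming the standard account of the PCM-adaptor era: 44.1kHz was the rate that packed a whole number of samples into the usable line structure of both NTSC and PAL video.

```python
# Early CD masters were stored on video recorders via PCM adaptors,
# so the sample rate had to fit the line structure of both TV systems.
ntsc = 3 * 245 * 60  # 3 samples/line x 245 usable lines/field x 60 fields/s
pal = 3 * 294 * 50   # 3 samples/line x 294 usable lines/field x 50 fields/s
assert ntsc == pal == 44_100
print(ntsc)  # 44100
```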

Life at the time would have been a lot simpler if CDs had just adopted 48kHz as the standard, but, alas, that was not to be.

Today, some recording engineers still reflexively choose 192kHz as their recording rate, then downsample to 44.1kHz, an uneven conversion that is always a bit of a compromise.
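
To see just how uneven that conversion is, reduce each resampling ratio to lowest terms. A minimal sketch in plain Python:

```python
from fractions import Fraction

# A simple ratio like 4/1 lets the resampler keep every 4th sample
# (after low-pass filtering); an awkward one like 640/147 forces a far
# more elaborate interpolate-then-decimate stage.
TARGET = 44_100  # CD sample rate in Hz

for source in (176_400, 192_000):
    ratio = Fraction(source, TARGET)
    print(f"{source} Hz -> {TARGET} Hz : "
          f"{ratio.numerator}/{ratio.denominator}")

# Output:
# 176400 Hz -> 44100 Hz : 4/1
# 192000 Hz -> 44100 Hz : 640/147
```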

If we thought about it just a little, we'd be asking for recordings and releases to be standardized at 44.1kHz and its multiples: 88.2kHz, 176.4kHz, 352.8kHz, etc.

It probably doesn't matter much anymore, and it's certainly nothing to lose sleep over.

But if I have a choice, it's always 176.4kHz.

Paul McGowan

Founder & CEO
