DanielPhoton / xld

Automatically exported from code.google.com/p/xld

Bit-depth transcoding truncates audio frequencies #258

Closed GoogleCodeExporter closed 8 years ago

GoogleCodeExporter commented 8 years ago
It should be possible to transcode a 24-bit audio source to 16-bit with little 
or no loss in quality.  Unfortunately, Core Audio does not implement this 
correctly, and the resulting degradation is quite audible.  The effect is 
similar to applying an 18 kHz low-pass filter, as seen in the attached images.

Until XLD implements better bit-depth transcoding than what's available in Core 
Audio, or Apple improves Core Audio, anyone transcoding from a higher to a 
lower bit depth might appreciate knowing that they risk impairing their audio.

The attached files show the frequency spectrum at the same point in three 
files, namely:
1. The original 88.2 kHz/24-bit source,
2. QuickTime's Best Quality (320 kbps, 44.1 kHz/16-bit) AAC transcoding,
3. A better 44.1 kHz/16-bit transcoding, with little or no audible difference 
from the original.
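
For anyone who wants to reproduce the comparison without the attachments, here 
is a minimal sketch (assuming numpy, matplotlib, and the soundfile/libsndfile 
bindings; file names are hypothetical) that overlays the magnitude spectrum of 
the same segment from each file:

```python
# Overlay the magnitude spectrum of the same segment from several WAV
# files; an encoder's low-pass cutoff shows up as a cliff around 18 kHz.
import numpy as np
import matplotlib.pyplot as plt
import soundfile as sf

def plot_spectrum(path, start_sec=10.0, dur_sec=1.0):
    data, rate = sf.read(path, always_2d=True)
    seg = data[int(start_sec * rate):int((start_sec + dur_sec) * rate), 0]
    seg = seg * np.hanning(len(seg))            # window to reduce leakage
    mag = 20 * np.log10(np.abs(np.fft.rfft(seg)) + 1e-12)
    freq_khz = np.fft.rfftfreq(len(seg), 1.0 / rate) / 1000.0
    plt.plot(freq_khz, mag, label=path)

for name in ("original_88k2_24bit.wav",        # hypothetical file names
             "quicktime_aac_decoded.wav",
             "dithered_44k1_16bit.wav"):
    plot_spectrum(name)
plt.xlabel("Frequency (kHz)"); plt.ylabel("Magnitude (dB)")
plt.legend(); plt.show()
```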

Original issue reported on code.google.com by slew...@gmail.com on 23 May 2014 at 12:52

Attachments: three frequency-spectrum images (as described above)

GoogleCodeExporter commented 8 years ago
Huh? Applying a low-pass filter is normal in psychoacoustic codecs. AAC 
shouldn't be used for bit-depth transcoding. You say the AAC is 16-bit, but 
there is no bit-depth definition for AAC (same as with MP3).

Original comment by tmkkmac on 23 May 2014 at 1:44

GoogleCodeExporter commented 8 years ago
It's a poor psychoacoustic codec that truncates audible frequencies.

Perhaps I'm not understanding you.  Say I have a 24-bit recording that I want 
to sync to an iPod.  The obvious choice is to open it in XLD, which encodes an 
AAC and adds it to my Library.  But this seemingly innocuous workflow has just 
compromised my audio.

If I knew what I was doing, I'd use a dithering/noise-shaping transcoder which 
produces the same size AAC with no appreciable loss of audio quality (on an 
iPod, at least).
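
To illustrate what such a transcoder does before truncating, here is a minimal 
numpy sketch of plain TPDF dithering from 24-bit to 16-bit (real tools add 
noise shaping on top of this):

```python
import numpy as np

def dither_24_to_16(pcm24: np.ndarray) -> np.ndarray:
    """Reduce 24-bit integer PCM to 16-bit with TPDF dither.

    Adding triangular (TPDF) noise of about +/- 1 LSB at the 16-bit
    level before rounding decorrelates the quantization error from
    the signal, turning audible distortion into benign hiss.
    """
    x = pcm24.astype(np.float64) / 256.0          # 2**(24-16) = 256
    tpdf = (np.random.uniform(-0.5, 0.5, x.shape) +
            np.random.uniform(-0.5, 0.5, x.shape))
    y = np.round(x + tpdf)
    return np.clip(y, -32768, 32767).astype(np.int16)
```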

But most people aren't going to know this, so what I'm proposing is that XLD at 
least warn that going straight from 24-bit to AAC is not recommended.

Original comment by slew...@gmail.com on 23 May 2014 at 2:23

GoogleCodeExporter commented 8 years ago
Perhaps this is the misunderstanding:  Yes, bit depth is not part of the AAC 
spec, but QuickTime only produces 16-bit AAC.  No?

Original comment by slew...@gmail.com on 23 May 2014 at 2:36

GoogleCodeExporter commented 8 years ago
>It's a poor psychoacoustic codec that truncates audible frequencies.
Not poor. Better psychoacoustic codecs use the low-pass filter wisely to 
preserve the lower frequencies, to which the human ear is more sensitive.

There is no 24-bit AAC, and no 16-bit AAC, period. AAC is a lossy codec. Just 
use lossless codecs like AIFF, ALAC, etc. for such purposes.

Original comment by tmkkmac on 23 May 2014 at 2:44

GoogleCodeExporter commented 8 years ago
Agreed.

You are correct.  And yet AAC itself is not to blame.  Apple presumably made a 
policy decision that no (Apple) AAC shall expand to an audio stream of depth 
greater than 16 bits.  To enforce that, they apply this heavy-handed low-pass 
filter, regardless of the input.

Thank you for setting me straight!!  And thank you for XLD!!
-AM

Original comment by slew...@gmail.com on 23 May 2014 at 3:39

GoogleCodeExporter commented 8 years ago
Again, bit depth has nothing to do with AAC, nor with its low-pass filter. You 
are misunderstanding something.

AAC is a frequency-domain codec. All input samples are converted to a 
floating-point representation (usually 32-bit) at the input stage, then 
translated to the frequency domain using the MDCT. No one can define a bit 
depth for AAC.
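
To illustrate the point with a schematic sketch (not Core Audio's actual code 
path): once the input stage normalizes samples to floating point, a 16-bit 
signal and the same signal at 24 bits become nearly identical float arrays, 
and the MDCT downstream never sees a "bit depth":

```python
import numpy as np

# The same full-scale 1 kHz sine, quantized at 16 and at 24 bits.
t = np.arange(1024) / 44100.0
sig = np.sin(2 * np.pi * 1000.0 * t)
pcm16 = np.round(sig * 32767).astype(np.int32)
pcm24 = np.round(sig * 8388607).astype(np.int32)

# What an encoder's input stage does: normalize to float in [-1, 1).
f16 = pcm16 / 32768.0
f24 = pcm24 / 8388608.0

# The two differ only by 16-bit quantization noise (~3e-5); everything
# downstream (windowing, MDCT) just sees floats of unknown origin.
print(np.max(np.abs(f16 - f24)))
```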

Original comment by tmkkmac on 23 May 2014 at 3:58

GoogleCodeExporter commented 8 years ago
My bold claim is that before encoding with AAC, QuickTime/Core Audio ensures 
that the PCM input is reduced to 16-bit depth.  How else to explain that when 
a 24-bit WAV file is encoded to AAC via QuickTime and then expanded, the 
result is a 16-bit WAV file?  This is not the fault of AAC, which is lossy, 
but not that lossy.  I am blaming QuickTime/Core Audio.  Remember, Apple 
distributed 128 kbps AAC audio for years, which is probably indistinguishable 
from 16-bit WAV.

No one can define bit depth for AAC - except Apple, by how they filter the 
input to AAC in Core Audio.

Original comment by slew...@gmail.com on 24 May 2014 at 3:52

GoogleCodeExporter commented 8 years ago
Again, AAC encoders convert input samples to a floating-point representation 
(usually 32-bit) before encoding. Why do you think the PCM input is reduced to 
16-bit depth?

>How else to explain that when a 24-bit WAV file is encoded to AAC via 
QuickTime and then expanded, the result is a 16-bit WAV file?

It depends on the decoder. It isn't a matter of the encoder. A decoder can 
always create 24-bit PCM from ANY AAC file, but I don't think that makes sense.

Original comment by tmkkmac on 24 May 2014 at 4:02

GoogleCodeExporter commented 8 years ago
It's possible that Apple's implementation of AAC is broken, but that seems 
unlikely to me.

To verify the problem is simple enough.  Encode a 24-bit WAV file to AAC with 
QuickTime (I've tried both version 7 and the current version), or with XLD, 
which uses Core Audio, I believe.  Decode that AAC with any decoder (I've 
tried QuickTime, Audacity and iZotope RX3), restoring it back to WAV format.  
The restored WAV file's effective bit depth is never greater than 16.

It would be easy to prove me wrong, but I have yet to find a counter example...
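
For the record, here is how one might measure "effective bit depth" 
programmatically: a sketch assuming the soundfile/libsndfile bindings 
(decode the AAC to WAV first; libsndfile left-justifies integer samples into 
the requested container):

```python
import numpy as np
import soundfile as sf   # pip install soundfile

def effective_bit_depth(path: str) -> int:
    """Count the bits that actually vary across all samples.

    Samples are read left-justified into int32, so a file carrying
    only 16 bits of real audio has 16 trailing zero bits everywhere.
    """
    data, _ = sf.read(path, dtype="int32")
    combined = int(np.bitwise_or.reduce(np.asarray(data).ravel()))
    if combined == 0:
        return 0
    trailing_zeros = (combined & -combined).bit_length() - 1
    return 32 - trailing_zeros

print(effective_bit_depth("decoded_from_aac.wav"))  # hypothetical file
```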

Original comment by slew...@gmail.com on 24 May 2014 at 4:53

GoogleCodeExporter commented 8 years ago
I'm not sure how you checked "effective bit depth", but just decoding an AAC 
file with QuickTime and forcing the output bit depth to 32 makes a file which 
has non-zero values in the least-significant 16 bits of the samples.

And the fact that a decoder generates 16-bit PCM doesn't prove that the 
encoder truncates the bit depth. AAC is not PCM. It applies a mathematical 
transformation while encoding, and stores frequency-domain samples in a 
floating-point-like format (base amplitude + dynamic range). The decoder needs 
to use floating-point maths to rebuild the PCM samples, and it is impossible 
to tell an "AAC bitstream encoded from 16-bit PCM" from an "AAC bitstream 
encoded from 24-bit PCM". It just decodes the bitstream in a floating-point 
format, and (usually) truncates to 16-bit PCM in the final stage.
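
Schematically (a sketch, not any real decoder's code), that final stage is 
just a float-to-PCM conversion whose width is up to the caller:

```python
import numpy as np

def float_to_pcm(samples: np.ndarray, bits: int = 16) -> np.ndarray:
    # Scale the reconstructed float samples to whatever PCM width the
    # caller requests; nothing in the AAC bitstream dictates this width.
    full_scale = 2 ** (bits - 1) - 1
    out = np.round(np.clip(samples, -1.0, 1.0) * full_scale)
    return out.astype(np.int16 if bits <= 16 else np.int32)

decoded = np.random.uniform(-1, 1, 8)   # stand-in for decoder output
print(float_to_pcm(decoded, bits=16))   # "16-bit" PCM
print(float_to_pcm(decoded, bits=24))   # "24-bit" PCM from the same data
```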

Original comment by tmkkmac on 24 May 2014 at 5:13

GoogleCodeExporter commented 8 years ago
I'm reading Apple's "Mastered for iTunes" document, which describes a 
command-line tool, afconvert, for generating iTunes Plus AAC files.  I'll play 
with that and let you know what I find.
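
A minimal sketch of driving afconvert from a script (hypothetical file names; 
-f m4af selects the MPEG-4 audio container, -d aac the AAC codec, -b the 
bitrate in bits per second):

```python
import subprocess

# Encode a WAV to AAC with Core Audio's afconvert command-line tool.
subprocess.run(
    ["afconvert", "-f", "m4af", "-d", "aac", "-b", "256000",
     "input_88k2_24bit.wav", "output.m4a"],   # hypothetical file names
    check=True,
)
```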

Original comment by slew...@gmail.com on 24 May 2014 at 5:18

GoogleCodeExporter commented 8 years ago
Okay, Core Audio does not reduce audio to 16-bit depth before encoding to AAC.  
However, I did notice that in transcoding an 88.2 kHz 24-bit file to AAC, XLD 
resamples at 48 kHz, rather than 44.1 kHz.  Is there a way to change that to 
44.1 kHz?  Thanks.

Original comment by slew...@gmail.com on 28 May 2014 at 3:44

GoogleCodeExporter commented 8 years ago
Change the output sample rate in the encoder options.

Your question is totally off-topic. Please do not use this issue for such 
questions.

Original comment by tmkkmac on 28 May 2014 at 3:50