Ok, so this is getting interesting: the codebase appears to use FloatAudioFormat (i.e. the frozen file format, getFrozenFileFormat) for rendering tracks in different circumstances... so I guess that bit can be ignored. Although I'm thinking it would still be invalid to add the float format to the readFormatManager.
As for a different observation: I could avoid getValidFileExtensions if I had access to readFormats (oddly, that's not exposed even though getWriteFormats is!), and string together calls to getFileExtensions for all of those formats... but then why wouldn't getValidFileExtensions just do that itself?
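For reference, here's roughly what that workaround would look like. Note that getReadFormats() is the hypothetical accessor that's missing; getFileExtensions() is the normal juce::AudioFormat call.

```cpp
// Rough sketch of the workaround, not current API. getReadFormats() is the
// accessor that's missing today (getWriteFormats() exists, but there's no
// read-side equivalent); getFileExtensions() is standard juce::AudioFormat.
// Assumes the juce_audio_formats and tracktion_engine headers are available.
static juce::StringArray collectReadExtensions (tracktion_engine::AudioFileFormatManager& affm)
{
    juce::StringArray extensions;

    for (auto* format : affm.getReadFormats()) // hypothetical accessor
        extensions.addArray (format->getFileExtensions());

    extensions.removeDuplicates (true); // case-insensitive de-dupe
    return extensions;
}
```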
Another finding: WaveInputDevice::getRecordFormatNames() should be updated to allow customisation by the user (and also, it just assumes the presence of FLAC, and weirdly ignores OGG, MP3, and every other format...).
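To illustrate the kind of customisation I mean (purely a sketch; the EngineBehaviour method shown here doesn't exist), something like:

```cpp
// Purely hypothetical sketch: EngineBehaviour is the engine's usual hook for
// app-level overrides, so a virtual along these lines would let the host
// decide which formats WaveInputDevice offers for recording.
struct MyEngineBehaviour  : public tracktion_engine::EngineBehaviour
{
    // hypothetical method that WaveInputDevice::getRecordFormatNames() could defer to
    juce::StringArray getSupportedRecordFormatNames()
    {
        return { "WAV", "AIFF" }; // only offer what this app actually ships
    }
};
```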
Ok, so the list goes on and on: lots of dependencies on OGG, FLAC, MP3/LAME, REX (e.g. TargetFileFormat, ProjectItem::getSourceFile()).
Looks like plenty of APIs require all of this, even though I won't use any of it explicitly. I'd be content with AIFF as a fallback overall... not sure why it's overlooked in the renderers and tests.
Also, I guess this means I can't render with my own codecs...
I noticed a bunch of things when wanting to add my own audio formats into an instance of this class:
- getValidFileExtensions still adds REX-related extensions.
- FloatAudioFormat is added for R/W... can this be made optional (maybe add a new platform config macro?)? A sketch of what I mean follows this list.
- What getValidFileExtensions returns depends on these macros: JUCE_USE_FLAC, JUCE_USE_OGGVORBIS, JUCE_USE_MP3AUDIOFORMAT, JUCE_USE_WINDOWS_MEDIA_FORMAT, JUCE_USE_LAME_AUDIO_FORMAT.
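Regarding the config macro idea above, something in the spirit of the existing JUCE_USE_* flags would do. TRACKTION_ENABLE_FLOAT_AUDIO_FORMAT is made up here, just to show the shape:

```cpp
// Made-up config macro, following the pattern of the JUCE_USE_* flags above;
// nothing called TRACKTION_ENABLE_FLOAT_AUDIO_FORMAT exists in the codebase.
#ifndef TRACKTION_ENABLE_FLOAT_AUDIO_FORMAT
 #define TRACKTION_ENABLE_FLOAT_AUDIO_FORMAT 1 // default keeps today's behaviour
#endif

#if TRACKTION_ENABLE_FLOAT_AUDIO_FORMAT
 // ... add FloatAudioFormat for read/write here, exactly as AudioFileFormatManager does today ...
#endif
```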
The annoying part is that this class is integrated into the Engine, albeit with limited impact... I could avoid using it altogether, but then why bother with adding an instance of AudioFileFormatManager to the Engine in the first place?