Closed: mackron closed this issue 5 years ago.
Hey David! Count on that! I'll test it as soon as possible! :)
Hi! These changes are exciting! Thanks a lot, I will test them ASAP :)
I'm testing it, but I get a compilation problem:
mingw32-make PLATFORM=PLATFORM_WEB
Process started (PID=9036) >>>
emcc -c external/mini_al.c -O1 -Wall -std=c99 -D_DEFAULT_SOURCE -Wno-missing-braces -Werror=pointer-arith -fno-strict-aliasing -s USE_GLFW=3 -s ASSERTIONS=1 --profiling -I. -Iexternal/glfw/include -Iexternal/glfw/deps/mingw -DPLATFORM_WEB
In file included from external/mini_al.c:8:
external/mini_al.h:19267:13: error: expected expression
return EM_ASM_INT({
^
C:\emsdk\emscripten\1.38.21\system\include\emscripten/em_asm.h:180:92: note: expanded from macro 'EM_ASM_INT'
#define EM_ASM_INT(code, ...) emscripten_asm_const_int(#code _EM_ASM_PREP_ARGS(__VA_ARGS__))
^
In file included from external/mini_al.c:8:
external/mini_al.h:19355:34: error: expected expression
pDeviceInfo->minSampleRate = EM_ASM_INT({
^
C:\emsdk\emscripten\1.38.21\system\include\emscripten/em_asm.h:180:92: note: expanded from macro 'EM_ASM_INT'
#define EM_ASM_INT(code, ...) emscripten_asm_const_int(#code _EM_ASM_PREP_ARGS(__VA_ARGS__))
^
In file included from external/mini_al.c:8:
external/mini_al.h:19650:24: error: expected expression
int resultFromJS = EM_ASM_INT({
^
C:\emsdk\emscripten\1.38.21\system\include\emscripten/em_asm.h:180:92: note: expanded from macro 'EM_ASM_INT'
#define EM_ASM_INT(code, ...) emscripten_asm_const_int(#code _EM_ASM_PREP_ARGS(__VA_ARGS__))
^
3 errors generated.
shared:ERROR: compiler frontend failed to generate LLVM bitcode, halting
mingw32-make: *** [Makefile:537: mini_al.o] Error 1
<<< Process finished (PID=9036). (Exit code 2)
Also note that the EM_ASM_INT() macro requires #include <emscripten/em_asm.h>.
About the Android AAudio backend, note that it requires Android 8.0 (Oreo) with API level 26. That's OK, but only around 21% of devices on the market run that Android version. Most developers still target older API levels, so making AAudio a priority backend for Android may not be the best idea just yet.
OK, that compiler error is due to the -std=c99 switch. I'm not sure why EM_ASM_INT is failing while EM_ASM works fine... I will look into it. Is #include <emscripten/em_asm.h> included automatically with #include <emscripten/emscripten.h>?
The Android build should seamlessly fall back to OpenSL|ES if AAudio is unavailable.
I've pushed a fix for those EM_ASM_INT errors. Note for future reference that if you use any of the EM_ASM macros with the -std=c99 switch, you need to pass at least one argument. I just pass a dummy value like this:
int result = EM_ASM_INT({
return someVar;
}, 0); // <-- Pass in a dummy argument to fix build with -std=c99.
OK, I tested the new backend. It seems to initialize successfully, but for some reason audio does not play. I tested sound playing and music streaming. Here is the browser console output:
...
INFO: OpenGL default states initialized successfully
INFO: [TEX ID 13] Texture created successfully (128x128 - 1 mipmaps)
INFO: [TEX ID 13] Default font loaded successfully
The AudioContext was not allowed to start. It must be resumed (or created) after a user gesture on the page. https://goo.gl/7K7WLu
The AudioContext was not allowed to start. It must be resumed (or created) after a user gesture on the page. https://goo.gl/7K7WLu
INFO: Audio device initialized successfully: Default Playback Device
INFO: Audio backend: mini_al / Web Audio
INFO: Audio format: 32-bit IEEE Floating Point -> 32-bit IEEE Floating Point
INFO: Audio channels: 2 -> 2
INFO: Audio sample rate: 44100 -> 48000
INFO: Audio buffer size: 2048
INFO: [resources/weird.wav] WAV file loaded successfully (11025 Hz, 8 bit, Mono)
INFO: Unloaded wave data from RAM
INFO: [resources/tanatana.ogg] OGG file loaded successfully (48000 Hz, 16 bit, Mono)
INFO: Unloaded wave data from RAM
...
The message about Google's new autoplay policy shouldn't affect this.
Sorry, my bad: it was a problem with file paths (and my USB-powered speaker)...
Web Audio seems to work great!
Good to hear! What is the file size like compared to the SDL2 implementation by the way?
Hi @mackron, sorry to bother you, but I have bad news: after some more in-depth testing, audio does not play in some situations:
falling back to ArrayBuffer instantiation
It seems related to WebAssembly, but I'm not sure whether it's related to audio; previously (1 month ago) there was no error.
About sizes, here are the new values (same project, Emscripten 1.38.20):
No Audio: 309KB (.js) + 127KB (.wasm) = 436KB
SDL1: 597KB (.js) + 288KB (.wasm) = 885KB
SDL2: 353KB (.js) + 792KB (.wasm) = 1145KB
Web Audio: 549KB (.js) + 385KB (.wasm) = 934KB
UPDATE: After some testing, I got past the second issue. It seems that using -s ALLOW_MEMORY_GROWTH=1 breaks something internally (asm black magic?) and audio does not play; maybe some memory misalignment? No idea... I just replaced it with a fixed heap size (-s TOTAL_MEMORY=67108864) and now audio works perfectly with the Web Audio backend.
But the first problem still persists: if no user gesture is performed before the audio context is created, audio does not play. More info here.
OK, regarding the first point about Google's autoplay thing, I don't think there's anything I can do about this because it's controlled by the browser itself. Is this also happening with the old SDL2 backend (you'll need to go back to the master branch to test that)?
I thought the intended workflow was that the audio system gets initialized with the device left in a stopped state, and actual playback starts only after a "Press any key..." prompt or something. The sequence would be something like this:
1) Initialize the audio device, but leave it in a stopped state
2) Wait for user input ("Press any key...", "Press Start", etc.)
3) Start the device (I think this will fail if no user input has been performed)
Having said all that, I see this on that link you provided:
An AudioContext will be resumed automatically when two conditions are met:
- The user has interacted with a page.
- The start() method of a source node is called.
Does the order matter here? Does this mean I can call start() before the user has interacted with the page, and then when the user interacts with it, it will automatically start playing? I doubt this is the case, but I might try experimenting with it. If it doesn't work I can't think of anything else I could do... I don't like what Google is doing with this 👎
Hi @mackron, I've just been testing the Web Audio backend this morning. It is working fine! This is our size report:
I will tell you later if we find issues, but so far it is great!
Thanks a lot!
PS: For Android, I don't have an Oreo device here, but I will try ASAP.
Good to hear. Thanks! A non-Android-8+ device is actually still a good test, because it will verify that the seamless fallback to OpenSL is working.
These backends will be added to version 0.9.
I've gone ahead and pushed an update to the dev branch which includes two new backends: Web Audio (Emscripten) and AAudio (the new Android audio API). These will be the new priority backends for web and Android respectively, so I was hoping to get some testing in before merging into master.
Calling @raysan5, @jesusdesantos and @digitalgust (in addition to anybody else who'd be able to donate some time) since I think you guys are using one or both of these platforms which will be affected. Doesn't matter if you can't get to it, and no rush or anything - just trying to get some more coverage before merging is all.
Thanks!