sys-b opened this issue 4 years ago
What iOS version is your iPad running on? Have you tried playing audio on an iPhone? Here they say that on iOS 10.1.1 the Web Audio API is not working: https://developer.apple.com/forums/thread/48815
> This was fixed in the second build of iOS 10.1.1. iPad now plays Web Audio API again, but didn't from 9.3.2 to the first build of 10.1.1. Thanks.
It is an iPad Pro running iOS 14.0.1. Same on an iPhone 11 Pro Max: also not working. On the current iMac it only works in Chrome, not in Safari. Maybe a WebKit issue?
Yep, smells like a compatibility issue with WebKit. I don't own any iDevices to fix this, so I've labelled this with "help wanted".
@sys-b @FredVDB Can you try the develop branch?
Or extract this file snapcontrol.js.zip and copy it to your `doc_root` (default should be `/usr/share/snapserver/snapweb/`).
Still no audio with the modified snapcontrol.js. Is there some way to get logging info?
Yes, this should be possible: https://stackoverflow.com/questions/4478271/remote-console-log-on-ios-devices
The modified snapcontrol.js introduced an explicit event listener for the play button click event, as suggested on Stack Overflow. Before, it was a link to a JavaScript function. I think without having an iOS device, I'm out for now :(
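For reference, the change is roughly along these lines (a sketch only; the element id and the `play()` helper are placeholders, not the actual snapcontrol.js names):

```ts
// Before: the play button was an <a href="javascript:play()"> style link.
// After: an explicit click listener, so starting audio runs directly in the
// user-gesture call stack that Safari/WebKit requires.
declare function play(): void; // placeholder for the existing play routine

const playButton = document.getElementById('play-button'); // id is an assumption
playButton?.addEventListener('click', () => {
  play();
});
```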
Did you restart the snapserver after copying the file? The server doesn't cache (refreshing the browser should be enough), but who knows.
Hi. I just pulled the develop branch, ran make, replaced the folder, and restarted the service.
It seems to behave like before. In Chrome on iPadOS 14:
Also interesting: it isn't working in Safari on a recent iMac with the current OS either:
Hopefully this will help.
Can you please try the develop branch again, or the attached snapstream.js.zip?
Snapweb is now falling back to `webkitAudioContext` if there is no `AudioContext`.
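The fallback is roughly this (a sketch, not the literal snapstream.js code):

```ts
// Use the prefixed constructor on WebKit builds that don't expose AudioContext.
const AudioCtor: typeof AudioContext =
  window.AudioContext || (window as any).webkitAudioContext;

if (!AudioCtor) {
  console.error('Web Audio API is not supported in this browser');
}
const ctx = new AudioCtor();
```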
No sound but different errors. On both devices:
In Chrome on the iMac the sound still works, but the "Failed to get chunk" error appears 3 times. I hadn't seen that before, I think...
The Chrome log looks as expected: the audio player tries to fill the buffer before having received the first audio chunk. The TypeError should be fixed now, please try again. Slowly but surely... snapstream.js.zip
I am happy that we are going to figure it out, but still no sound. This time the same log results on Safari macOS and Chrome iPad:
Whoever sits in a glass house falls in themselves.
So the error message disappeared. The `undefined` base latency is used to calculate the play-out time; the latency is now set to 0 if undefined: snapstream.js.zip
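The guard looks roughly like this (sketch; variable names are illustrative, not the actual snapstream code):

```ts
declare const ctx: AudioContext;        // the page's audio context
declare const bufferedSecs: number;     // duration of audio already queued

// Safari reports baseLatency as undefined, which previously poisoned the
// play-out time calculation; fall back to 0 in that case.
const baseLatency = ctx.baseLatency ?? 0;
const playoutTime = ctx.currentTime + baseLatency + bufferedSecs;
```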
Please also make sure that you're not muted: https://stackoverflow.com/questions/21413396/simple-oscillator-but-no-sound-with-web-audio-api-on-ios
> $10 says your mute switch is on. For whatever reason, Safari won't play sound when the mute switch is engaged, but Chrome will. Other than that, Chrome uses the same engine as Safari to render your page (minus the Nitro JS optimizations), so there's no other reason that your code should work in one but not the other. I just tried it via JSBin, and both worked for me when the mute switch was off.
The iPad doesn't have a mute switch. Other websites like YouTube work. On the iPhone, with mute off, there is also no sound.
I am not sure if it is clear that the last message on the console really is the last one. On a working browser, a lot of messages with `age:` keep coming:
and other messages.
I once had problems with HTML5 video elements: they had to be muted, otherwise the element wouldn't play automatically. I'm starting to think this is more a communication thing than an output thing. Is it possible to use a dummy output maybe? Anyway, I am not into any JS or TS at all.
Good night.
Snapweb is using a triple audio buffer. When a buffer finishes playing, it should fire an `ended` event, fetch new audio data, and get queued for the next playback. Here the `ended` event seems not to get fired and thus the whole logging stops.
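Schematically it works like this (illustrative names, not the actual snapstream.js code):

```ts
declare const ctx: AudioContext;
declare function nextChunk(): AudioBuffer;  // assumed: returns decoded PCM from the websocket

let playTime = ctx.currentTime;

function queueBuffer(): void {
  const chunk = nextChunk();
  const source = ctx.createBufferSource();
  source.buffer = chunk;
  source.connect(ctx.destination);
  source.onended = () => queueBuffer();     // if "ended" never fires, refilling stalls
  source.start(playTime);
  playTime += chunk.duration;
}

// Keep three buffers in flight.
for (let i = 0; i < 3; ++i) {
  queueBuffer();
}
```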
One more attempt with a different `ended` callback handling: snapstream.js.zip
I have tested it, with the same result and (almost) the same log. :(
When one of my programmers is back from holiday, I will give him the task of fixing this. He is really proficient in TypeScript.
Sounds good :) If he needs any support, don't hesitate to contact me.
I started poking at this a few days ago to see if I could get anywhere. From what I can tell the context is suspended and that's why the `ended` callback isn't working. We obviously click play to start the audio session, so maybe creating the context is just buried deep enough in other functions to make Safari suspend it, but I will continue to investigate.
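A quick way to check that theory from the remote console (the `ctx` variable name is an assumption for the page's AudioContext):

```ts
declare const ctx: AudioContext;

console.log('AudioContext state:', ctx.state);   // "suspended" | "running" | "closed"
if (ctx.state === 'suspended') {
  // Resuming from a user gesture should restart the "ended" chain if this is the cause.
  ctx.resume().then(() => console.log('resumed, state:', ctx.state));
}
```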
Related: https://stackoverflow.com/questions/46249361/cant-get-web-audio-api-to-work-with-ios-11-safari
@inickt @FredVDB @sys-b I could reproduce the issue that `onended` is not called and no audio is played in Epiphany, which is based on WebKitGTK 2.30.3.
I got it working by creating the `AudioContext` in the constructor of `SnapStream`, instead of in `onMessage`. The difference is that the constructor is directly called from the `onclick` event, while `onMessage` is called upon reception of some message over a websocket, i.e. from a different context/thread.
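Simplified, the change looks like this (a sketch, not the literal SnapStream code; the websocket endpoint path is an assumption):

```ts
// Create the AudioContext in the constructor, which runs synchronously inside the
// click handler, so WebKit treats it as user-initiated and leaves it "running".
class SnapStream {
  private ctx: AudioContext;
  private socket: WebSocket;

  constructor(baseUrl: string) {
    this.ctx = new AudioContext();                    // moved here from onMessage
    this.socket = new WebSocket(baseUrl + '/stream'); // path is an assumption
    this.socket.onmessage = (ev) => this.onMessage(ev);
  }

  private onMessage(_ev: MessageEvent): void {
    // previously the context was created here, i.e. inside the websocket
    // callback, outside the user gesture, and WebKit suspended it
  }
}

// e.g. playButton.addEventListener('click', () => new SnapStream('wss://host:1780'));
```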
I've pushed the change into the `webkit` branch. Now I'm curious if this will also work in Safari/iOS. Can any of you test this?
It works (kinda)! Plays fine on my iPhone on iOS 13.5. Safari on macOS Big Sur and my iPad on iPadOS 14.2 sound pitch shifted up and stutter, but it's something! Since it's fine on 13, I'm guessing Safari version 14 broke something. I can investigate and send logs if you want.
For my iPhone and iPad I had to make sure it wasn't muted; this seems to be a common issue people run into with how they set up the audio session. Not sure how the `AudioContext` interacts with HTML5 `audio` tags (which iOS supports at the system level, recognized as music instead of just noise), but this is likely something that can be fixed.
Someone made a package to fix this, but here are some pointers to implementing it manually: https://github.com/goldfire/howler.js/issues/1220 https://stackoverflow.com/questions/21122418/ios-webaudio-only-works-on-headphones/46839941#46839941
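The commonly cited manual unlock pattern in that direction is roughly this (a sketch under that assumption, not code taken from those links):

```ts
declare const ctx: AudioContext;  // the page's audio context

// Play one sample of silence from inside the user's tap so iOS treats the
// session as user-initiated media playback before the real stream starts.
function unlockAudio(): void {
  const silence = ctx.createBuffer(1, 1, 22050);
  const src = ctx.createBufferSource();
  src.buffer = silence;
  src.connect(ctx.destination);
  src.start(0);
  void ctx.resume();
}

document.addEventListener('touchend', unlockAudio, { once: true });
```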
Snapweb uses `BaseAudioContext.currentTime` to synchronize the playback. Maybe in Safari 14 the time precision is reduced; a log would help here.
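A quick probe like this in the console of an affected device would show whether the clock is coarse (the `ctx` variable name is an assumption):

```ts
declare const ctx: AudioContext;

// Log the scheduling clock a few times per second; on a working browser the
// value should advance smoothly with millisecond-or-better resolution.
setInterval(() => {
  console.log('currentTime:', ctx.currentTime, 'baseLatency:', ctx.baseLatency);
}, 250);
```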
I have an iPad and would like to help debug this. Let me know what I can do.
I can confirm that the latest commit from `master` (520b9a38588f0c8788c1be258056ae1849149b4e) works with Safari and Firefox (guess it's all WebKit under the hood).
Unfortunately the sound stops (with a nice fade out transition) whenever the app loses focus or the screen of the device is turned off. But I guess that's an out-of-scope issue for this project.
> I can confirm that the latest commit from `master` (520b9a3) works with Safari and Firefox (guess it's all WebKit under the hood). Unfortunately the sound stops (with a nice fade out transition) whenever the app loses focus or the screen of the device is turned off. But I guess that's an out-of-scope issue for this project.
That's actually something that prevents the app from being really useful on iOS. You can still use it as a remote control, but listening to music on the phone is not really working for me like that (who keeps the display on all the time?). What would also be nice: having controls in Control Center just like in other media-playing apps.
cheers bendsch
@bendschs Just in case you don't know about it, there is a native iOS app that is able to stream audio in the background: https://github.com/stijnvdb88/Snap.Net
Thanks for the hint. I tried the app, but its functions are really reduced: you cannot skip songs there, nor play or stop them. It's basically just for turning on the phone's (or other) speakers, so it's not really an alternative to Iris (nor is it a very valuable addition). Porting Iris to iOS as a real app might be a bigger project, but what about adjusting the PWA so you can background it on iOS? From my understanding, playing sound is something a PWA can do without restrictions, so that should be possible, right?
cheers bendsch
Just tested one more time with mopidy-iris and with the native web interface of snapserver: audio output of snapweb seems to be broken for iOS right now (tried various browsers, but that probably doesn't even make a difference because in the end they are all WebKit). As I also noticed: only the latest iOS versions 14.5 and 14.6 are affected. I have an older iPhone which is still on 14.2, and on that it works as expected. On my actual (daily) phone it stopped working (I think) with the update to version 14.5.
cheers bendsch
> I can confirm that the latest commit from `master` (520b9a3) works with Safari and Firefox (guess it's all WebKit under the hood). Unfortunately the sound stops (with a nice fade out transition) whenever the app loses focus or the screen of the device is turned off. But I guess that's an out-of-scope issue for this project.
Is this working for you guys? I am on iOS 14.7.1 and tried the dev branches and the modified snapstream.js, but it's not working; when I hit the play button I get a message with the content "error" (and nothing else).
Sad that nobody seems to be able or willing to solve this 😢
Why is this closed? It's definitely not working on Safari (neither on macOS nor on iOS). I am willing to provide logs if needed.
My browser is showing this issue as open :confused:
Same here. Must be yet another Safari issue.
Haha sorry, my bad. I was looking at the "closed" tag of the PuzzleScript reference by mistake. I wonder why that one is working (it's working indeed, just tested) and how it is related to the snapweb audio problem on Safari?
Hello! The issue still persists on my iPad Air 5G. I tried many different browsers and branches. I looked at snapstream.ts and tried to figure out how to make it work, but I don't really have TypeScript or Snapcast knowledge. I will try, but any help is appreciated.
This one does kinda work: https://github.com/badaix/snapweb/compare/webkit...bakaiadam:snapweb:webkit It needs some cleaning, and it is probably a little bit less precise than the original. TypeScript didn't like the `outputLatency` stuff, so it was removed, but that has no connection with the other change.
Seems like snapweb audio streaming support on iOS 15.x and some versions of iOS 14.x is broken? I've tried every combination (even the webkit branch) but ended up at the same place every time. I'm also willing to provide logs to whoever needs them. Running iOS 15.7.3.
As discussed in https://github.com/badaix/snapweb/issues/43, the changes suggested in https://github.com/badaix/snapweb/compare/webkit...bakaiadam:snapweb:webkit make it work in Safari and iOS.
Would it be possible to include these changes in the upcoming React version?
Neither the latest on the develop branch nor the react branch gives audio output on an iPhone 6S running iOS 15.7.5. Debugging now, and I see that iOS never dispatches an `ended` event for the AudioBufferSource (PlayBuffer.source) (which needs to be relayed to this event handler), whereas working clients do. Will try to dig further.
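For anyone else digging, a minimal probe along these lines (names assumed, not the actual PlayBuffer code) shows whether iOS fires the event at all:

```ts
declare const ctx: AudioContext;
declare const source: AudioBufferSourceNode;  // e.g. PlayBuffer.source

// Log each "ended" invocation alongside the context state; on the affected
// devices the callback never seems to run.
source.addEventListener('ended', () => {
  console.log('ended fired at', ctx.currentTime, 'state:', ctx.state);
});
```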
Installed the latest release 0.7.0 and iOS is still not working. How could I help to debug what's going on?
You could donate Apple hardware to @badaix
@arpena I'd suggest providing general bug report details: hardware, OS, and browser. You might also try v0.6.0, as a fix I contributed was included in that release prior to more changes; it's possible that iOS output has regressed since then. I'm still running my own build from this commit: https://github.com/curiousercreative/snapweb/commit/e025342ccc71997307679df2e9725a2965d5c9d5
For me it has been working again since 0.6.0, also on the latest iOS.
Hmmm, on 0.7.0, an iPhone SE and latest Chrome it didn't work for me...
Strange, on 0.7.0 with the latest Brave it is working for me. I just wish the newest snapweb would be implemented into mopidy-iris. :/
Under iOS the controls are working fine, just like on the desktop, but the client is not playing. It is displayed like on the desktop, but no audio is played. Tried in Chrome, Safari, Firefox and even Opera. The iPad Chrome console output after pressing play: