julianfairfax opened this issue 2 years ago
Not supported at this point, but it's on the mental wish list indeed! Thanks for materializing the wish into an issue.
I may be interested in working on this at some point.
I suggest leaving interfacing with audio I/O APIs outside the scope of libsignal-service and only providing audio buffers to the application, perhaps using the audio crate. Likewise for video: just provide the video data for the application to feed into whatever GUI library it's using, plus an API for the application to send video coming from camera APIs.
@Be-ing, you probably know: how async is the audio ecosystem? I imagine libsignal-service should expose some AsyncWrite/AsyncRead-style API that just dumps the raw Opus (?) streams.
I do not think async is an appropriate paradigm for audio programming. At least I have never seen it done. Audio interfaces require buffers of data provided at regular intervals, which may be low latency, otherwise you hear brief pops of silence. This should be handled by a dedicated thread, which would need to poll libsignal-service.
In theory, it should mostly be about figuring out how to integrate with https://github.com/signalapp/ringrtc and using whatever audio-stream support that library provides. I believe the connection and crypto are handled internally.
Oh yes, that'd be great. :D
This would also help get Signal calls to mobile Linux devices, as there's currently only Signal Desktop that does calls on Linux and that one is neither available for ARM, nor usable on small screens.
FYI: I'm figuring out the integration in here: https://gitlab.com/whisperfish/whisperfish/-/merge_requests/595
I can't seem to find any information on whether this is supported, but it would be a nice addition.