Closed keenborder786 closed 3 weeks ago
Support for this would be great.
I'm not very familiar with aiortc, but it looks like a WebRTC implementation. WebRTC doesn't use websockets for sending/receiving audio/video; it uses UDP instead, which is how it achieves low latency.
Websockets (in the context of WebRTC) are usually used for the signalling part. That is, when two participants want to send media to each other, they need to agree on where to send the media, what media is available, and so on. WebRTC doesn't define how this data is exchanged, so everyone implementing WebRTC has their own way of exchanging it.
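To illustrate the point that the signalling channel is application-defined, here is a schematic sketch (stdlib only, no real SDP involved): the offer/answer dicts stand in for WebRTC session descriptions, and an in-memory queue stands in for whatever transport (websocket, HTTP, anything) the application chooses.

```python
import asyncio

# Schematic only: real WebRTC exchanges SDP offers/answers, but the
# *channel* that carries them is left to the application. Here an
# in-memory asyncio.Queue plays the role of that channel.

async def caller(signaling: asyncio.Queue, replies: asyncio.Queue) -> str:
    # In real WebRTC this would be pc.createOffer(); a stand-in dict here.
    await signaling.put({"type": "offer", "sdp": "<caller media description>"})
    answer = await replies.get()
    return answer["type"]

async def callee(signaling: asyncio.Queue, replies: asyncio.Queue) -> str:
    offer = await signaling.get()
    # Respond with an "answer" describing our own media.
    await replies.put({"type": "answer", "sdp": "<callee media description>"})
    return offer["type"]

async def main() -> list:
    signaling, replies = asyncio.Queue(), asyncio.Queue()
    return await asyncio.gather(caller(signaling, replies),
                                callee(signaling, replies))

print(asyncio.run(main()))  # ['answer', 'offer']
```

Once the offer and answer have been exchanged over whatever channel you pick, the actual media flows peer-to-peer over UDP, outside that channel.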
Ah okay, I see. I guess then if we want to use WebRTC we will have to use Daily.
If you want low-latency and scalability built-in, yes, that's your best option.
aiortc package instead of plain websockets. Specifically, I'm interested in creating a transport that relays real-time audio using WebRTC within FastAPI. I'm not entirely sure how to achieve this, but I suspect it might be possible using aiortc.