Richterrettich opened this issue 4 years ago
One of the use cases I created this library for was the server context. Since the server story of Rust steadily gravitates towards async, it might be helpful to provide a future-based implementation for parsing/creating packages.
I think it would make sense to integrate async decoding in this library :+1:
1. Is it a good idea to implement such a feature, or is it out of scope for this library?
IIUC, using `nom` should already provide a basic concept of `NeedMoreBytes` to yield decoding of the byte stream until more data is available.
2. What should an async API look like?
I would imagine something along the lines of `Future<(Metadata, ContentChannel), RPMError>`, where `Metadata` is essentially provided once all header records are available, with an attached stream of the content area. I have no idea what parameter we should accept for this function though: the current sync version relies on the `BufRead` trait as a source for the RPM.
AFAIK there is no commonly agreed-upon `AsyncRead`, let alone `AsyncBufRead`, trait that we can rely on without forcing a specific async runtime upon library users.
I was thinking of something like https://lib.rs/crates/async-codec, which does not explicitly depend on the `tokio` runtime or `async-std` and would be `no_std`-compatible as well (not that it matters here, it's just a nice-to-have for the future).
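To illustrate the `NeedMoreBytes` idea from point 1, here is a std-only sketch of incremental decoding in the style of `nom`'s incomplete-input signalling (the parser and all names are illustrative, not this library's API; only the 96-byte size of the RPM lead is a real constant):

```rust
/// Outcome of one incremental parse attempt. Illustrative stand-in for
/// nom's `Err::Incomplete(Needed)` mechanism.
#[derive(Debug, PartialEq)]
enum ParseOutcome {
    /// Parsed successfully; carries the number of bytes consumed.
    Done(usize),
    /// Not enough input yet; caller should supply at least this many more bytes.
    NeedMoreBytes(usize),
}

/// The RPM lead is a fixed-size 96-byte structure at the start of the package.
const LEAD_SIZE: usize = 96;

fn parse_lead(input: &[u8]) -> ParseOutcome {
    if input.len() < LEAD_SIZE {
        // Instead of failing hard, yield back to the caller: an async driver
        // can await more data from the socket, append it, and retry.
        ParseOutcome::NeedMoreBytes(LEAD_SIZE - input.len())
    } else {
        ParseOutcome::Done(LEAD_SIZE)
    }
}
```

An async wrapper would simply loop: read more bytes whenever `NeedMoreBytes(n)` comes back, then re-invoke the parser on the grown buffer.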
Given the points above, this issue is really fourfold.
Now the question is rather how to represent this properly: as a set of futures, as a stream of futures, or as a single future with an error variant for each stage that could fail (which is closest to the sync API and probably the most ergonomic, too). I'm not sure if it makes sense to expose the inner futures for the points listed above.
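For the single-future option, a std-only sketch (every name here is hypothetical, not this library's API): the future resolves to a `Result` whose error enum has one variant per decode stage, and because it is a plain `std::future::Future`, no specific runtime is forced on users — the demo driver below polls it by hand with a no-op waker.

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll, RawWaker, RawWakerVTable, Waker};

// Hypothetical stand-in for the metadata produced once all headers are parsed.
#[derive(Debug, PartialEq)]
struct Metadata {
    header_records: usize,
}

// One error variant per stage that could fail, mirroring the sync API's
// failure points. All variants are illustrative.
#[derive(Debug, PartialEq)]
enum AsyncRpmError {
    Lead(String),      // the fixed 96-byte lead
    Signature(String), // the signature header
    Header(String),    // the main header
    Payload(String),   // streaming the content area
}

// The whole parse as a single future. This toy version has no `.await`
// points, so it completes on the very first poll.
async fn parse_metadata(input: Vec<u8>) -> Result<Metadata, AsyncRpmError> {
    if input.len() < 96 {
        return Err(AsyncRpmError::Lead("truncated lead".into()));
    }
    Ok(Metadata { header_records: 0 })
}

// Minimal hand-rolled driver: builds a no-op waker and polls once,
// demonstrating that nothing here depends on tokio or async-std.
fn poll_now<F: Future>(fut: F) -> Option<F::Output> {
    unsafe fn clone(p: *const ()) -> RawWaker {
        RawWaker::new(p, &VTABLE)
    }
    unsafe fn noop(_: *const ()) {}
    static VTABLE: RawWakerVTable = RawWakerVTable::new(clone, noop, noop, noop);
    let waker = unsafe { Waker::from_raw(RawWaker::new(std::ptr::null(), &VTABLE)) };
    let mut cx = Context::from_waker(&waker);
    let mut fut = Box::pin(fut);
    match fut.as_mut().poll(&mut cx) {
        Poll::Ready(out) => Some(out),
        Poll::Pending => None,
    }
}
```

The same error enum would also work for the "set of futures" variant: each stage's future just narrows its error type to the matching variant.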
This issue mainly serves as an anchor for discussion.