First of all, I realize this is not a .NET Core/5 project, but it was recommended here as a working solution to client-side parsing of an HTTP multipart response (and it works very well for that purpose in .NET Core). I wasn't able to find a version that's ported to .NET Core, so I thought I'd open a request here.
```
$ dotnet --version
5.0.100-preview.4.20258.7
```
My ultimate goal is to have a client program asynchronously receive a collection of "objects"/results from a server. In theory, this will be a finite number of results, but depending on how the server generates those results it may take a period of time (seconds to minutes) to generate each result. I'd like the client to receive each result in the collection as soon as it's generated, process the result, and then continue reading until the server signals that it's sent all the results. I'm sure there are many ways to do this, but I would like to keep the server implementation language-neutral and not too complicated so that others can write "producers" in their language of choice and have it work with my .NET "consumer".
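The consumption pattern I'm after looks roughly like this on the client side (a hypothetical sketch — `GetResultsAsync` is a placeholder for whatever streaming transport ends up working, not a real API):

```csharp
// Hypothetical shape of the consumer; GetResultsAsync is a placeholder, not a real API.
await foreach (var result in GetResultsAsync("http://localhost:8094"))
{
    ProcessResult(result); // handle each result as soon as the server emits it
}
```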
To do this, I've built a little proof-of-concept ASP.NET Core server that uses this implementation of a MultipartResult object to stream three XML files in sequence as a chunked, multipart HTTP response. To simulate a "slow" producing server, I edited the MultipartResult class to add an artificial delay between multipart entities (each chunk ending with the multipart boundary + CRLF).
Relevant MultipartResult server code block
https://github.com/nemec/dotnet-multipart-async/blob/master/Server/MultipartResult.cs#L70-L96
```csharp
context.Response.ContentType = content.Headers.ContentType.ToString();
var str = await content.ReadAsStringAsync();
var sw = new StreamWriter(context.Response.Body);
var multipartBoundary = this.content.Headers.ContentType.Parameters
    .First(p => p.Name == "boundary").Value.Trim('"');
var parts = str.Split(multipartBoundary + "\r\n");
for (var i = 0; i < parts.Length; i++)
{
    var part = parts[i];
    await sw.WriteAsync(part);
    if (i < parts.Length - 1)
    {
        await sw.WriteAsync(multipartBoundary + "\r\n");
    }
    await sw.FlushAsync();
    if (i < parts.Length - 1)
    {
        // Artificial delay to simulate asynchronous processing and response
        await Task.Delay(TimeSpan.FromSeconds(2));
    }
}
```
In a streaming HTTP client like cURL, this chunked response works perfectly. To test for yourself on my minimal reproduction, try the following:
```
git clone https://github.com/nemec/dotnet-multipart-async.git
cd dotnet-multipart-async/Server/
dotnet run
```
In a separate terminal:
```
curl -d '' http://localhost:8094
```
Observe that the response is sent in four chunks, with a ~2 second delay between chunks.
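For illustration, the multipart body on the wire looks roughly like this (the boundary string and part contents here are made up for the example; the real values come from the server):

```
--boundary123
Content-Type: application/xml

<result>...</result>
--boundary123
Content-Type: application/xml

<result>...</result>
--boundary123--
```

Each part, ending with its boundary line, arrives in its own chunk as soon as the server flushes it.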
However, if I try the same thing using this library's implementation of System.Net.Http.HttpContentMultipartExtensions.ReadAsMultipartAsync(), it buffers the entire response (~6 seconds total).
Sample Client Code
https://github.com/nemec/dotnet-multipart-async/blob/master/Client/Program.cs#L22-L27
```csharp
var path = "http://localhost:8094";
var httpClient = new HttpClient();
using (var response = await httpClient.PostAsync(path,
    new StringContent("", Encoding.UTF8, "text/plain")))
{
    if (response.IsSuccessStatusCode)
    {
        MultipartMemoryStreamProvider multipart =
            await response.Content.ReadAsMultipartAsync();
        foreach (var content in multipart.Contents)
        {
            var json = JObject.Parse(await content.ReadAsStringAsync());
            Console.WriteLine(json["Name"].Value<string>());
        }
    }
}
```
```
# Assumes server is still running in another terminal
cd dotnet-multipart-async/Client/
dotnet run
```
Prints after 7 seconds:

```
Arthur
Candace
Timothy
```
Is it possible to modify the implementation of the client-side multipart parser to release each HttpContent object as soon as the parser reaches a multipart boundary? Since this is a .NET Framework library, I guess we won't have access to IAsyncEnumerable, but perhaps there's another alternative?
Thanks for your time!
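For what it's worth, on the consuming side something along these lines appears to stream parts as they arrive, by asking HttpClient not to buffer the body and using `Microsoft.AspNetCore.WebUtilities.MultipartReader` in place of `ReadAsMultipartAsync()`. This is a sketch against my repro above (URL and boundary extraction mirror the server code), not a fix to this library:

```csharp
using System;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.AspNetCore.WebUtilities; // MultipartReader, from the package of the same name

class StreamingClient
{
    static async Task Main()
    {
        var httpClient = new HttpClient();
        var request = new HttpRequestMessage(HttpMethod.Post, "http://localhost:8094")
        {
            Content = new StringContent("", Encoding.UTF8, "text/plain")
        };
        // ResponseHeadersRead returns as soon as the headers arrive,
        // so the body is read incrementally rather than buffered in full.
        using var response = await httpClient.SendAsync(
            request, HttpCompletionOption.ResponseHeadersRead);
        response.EnsureSuccessStatusCode();

        var boundary = response.Content.Headers.ContentType.Parameters
            .First(p => p.Name == "boundary").Value.Trim('"');
        var reader = new MultipartReader(boundary,
            await response.Content.ReadAsStreamAsync());

        MultipartSection section;
        while ((section = await reader.ReadNextSectionAsync()) != null)
        {
            using var sr = new StreamReader(section.Body);
            // Each part is printed as soon as its closing boundary is parsed,
            // rather than after the whole response has been received.
            Console.WriteLine(await sr.ReadToEndAsync());
        }
    }
}
```

This doesn't help consumers of this library directly, but it may be a usable workaround for .NET Core clients.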
Thanks for contacting us.
We're not making any improvements in this area anymore, as this project is in maintenance mode. Only critical blocking issues with wide impact, and security issues, are considered.