sebastienros / fluid

Fluid is an open-source .NET template engine based on the Liquid template language.
MIT License

IAsyncEnumerable and Stream Support #478

Open gorillapower opened 2 years ago

gorillapower commented 2 years ago

I have some large data that I need to pass to a template. My thought was to use streams and/or IAsyncEnumerable to iterate over and access the data efficiently.

With the advent of #447 (Make Enumerate async), do you see support for things like streams or IAsyncEnumerable being possible in future versions of the library?

sebastienros commented 2 years ago

It would make sense to support this, but I need to think about the implications for the target frameworks.

gorillapower commented 2 years ago

Ok great. As well as IAsyncEnumerable, do you envision supporting streams for input and output data, i.e. having a readable output stream that can be saved to a file, for instance?

sebastienros commented 2 years ago

You mean because right now it's strings in and out? I believe you can already use a TextWriter as the output.
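For reference, a minimal sketch of rendering into a TextWriter (a StringWriter here, but a StreamWriter over a FileStream works the same way); the template source and anonymous model are illustrative, and this assumes a recent Fluid version where the TemplateContext constructor accepts an allowModelMembers flag:

```csharp
using System.IO;
using Fluid;

var parser = new FluidParser();
var source = "Hello {{ name }}!";

if (parser.TryParse(source, out var template, out var error))
{
    var options = new TemplateOptions();
    // allowModelMembers: true lets the template read members of the
    // anonymous model without registering the type explicitly.
    var context = new TemplateContext(new { name = "world" }, options, true);

    // RenderAsync accepts any TextWriter, so the rendered output never
    // has to materialize as a single in-memory string.
    var writer = new StringWriter();
    await template.RenderAsync(writer, NullEncoder.Default, context);
}
```

Because the writer is the only sink, swapping in a StreamWriter over a file or network stream changes nothing else in the calling code.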

gorillapower commented 2 years ago

Ah yes, I ran a test with a TextWriter saving to a file, which looks to be working great. However, I have an input parameter that is an IEnumerable pulling from a stream, and the entire collection is being iterated before the template execution completes. See the code below. Do you know if it's possible to stream the inputs too?

GetJsonObjectEnumerable

// Requires the Newtonsoft.Json package (JsonSerializer, JsonTextReader, JsonToken).
private static IEnumerable<object> GetJsonObjectEnumerable(Stream inputStream)
{
    JsonSerializer serializer = new JsonSerializer();

    using (StreamReader sr = new StreamReader(inputStream))
    using (JsonReader reader = new JsonTextReader(sr))
    {
        while (reader.Read())
        {
            if (reader.TokenType == JsonToken.StartObject)
            {
                yield return serializer.Deserialize(reader);
            }
        }
    }
}

Execution

var parser = new FluidParser();

var model = new
{
    response = new
    {
        body = GetJsonObjectEnumerable(streamInput),
        status = 200
    }
};

string source = @"{% for product in response.body %}
                    {{ product.id }}
                {% endfor %}";

using (var fileStream = new FileStream("c:/temp/fluidtest.json", FileMode.Create))
using (var textWriter = new StreamWriter(fileStream))
{
    var opts = new TemplateOptions();
    // Allow member access on the anonymous model without per-type registration.
    opts.MemberAccessStrategy = new UnsafeMemberAccessStrategy();
    opts.MemberAccessStrategy.Register(model.GetType());

    var context = new Fluid.TemplateContext(model, opts, true);
    if (parser.TryParse(source, out var template, out var error))
    {
        await template.RenderAsync(textWriter, NullEncoder.Default, context);
    }
    else
    {
        Console.WriteLine($"Error: {error}");
    }
}

I want to be able to stream the input and also stream the output rendering, i.e. end-to-end streaming, with the goal of handling large data without loading it into memory. But looking at it now, it seems this would only work for iterators (thinking of IAsyncEnumerable), unless the library could somehow support text-stream inputs?

sebastienros commented 2 years ago

You will need the IAsyncEnumerable support for your input example.
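Until Fluid's for loop can await elements of an IAsyncEnumerable<T>, a template cannot consume such a source, but the input side can already be expressed as a true async stream. A sketch using System.Text.Json's JsonSerializer.DeserializeAsyncEnumerable (available since .NET 6), which reads elements of a top-level JSON array incrementally; the method name is illustrative, not part of Fluid:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Text.Json;

// Yields objects from a top-level JSON array one element at a time.
// DeserializeAsyncEnumerable reads the stream incrementally, so the
// full payload is never held in memory at once.
static IAsyncEnumerable<JsonElement> GetJsonObjectsAsync(Stream inputStream)
{
    return JsonSerializer.DeserializeAsyncEnumerable<JsonElement>(inputStream);
}
```

Wiring this into a template would still require the library to await each element inside `{% for %}`, which is exactly the missing piece discussed above.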