ioki-mobility / summaraizer

Summarizes comments from a variety of sources, such as GitHub issues, Slack threads, Reddit posts, and more using AI models from different providers, such as Ollama, OpenAI, and more.
https://ioki-mobility.github.io/summaraizer/
MIT License

Support for local source file #20

Open StefMa opened 3 months ago

StefMa commented 3 months ago

Since we support different sources, we could also introduce a local source 🤔 Maybe either by using stdin and/or reading from a file...

It would not only be good for testing purposes, but would also give you some kind of "local AI". In combination with the custom prompt (#14) we could literally ask anything... Not sure if we want this, but it at least opens up the possibility 🙃

StefMa commented 3 months ago

Since #23 got merged, it is possible to read from stdin:

echo '[
    {
        "author": "Author1",
        "body": "Body1"
    },
    {
        "author": "Author2",
        "body": "Body2"
    }
]' | cli ollama

Or just run cli ollama and paste the JSON.

However, we may want to extend this behaviour so that reading from stdin can be overridden to read from "somewhere else", e.g. from a file 🙃