Is your feature request related to a problem? Please describe.
The application should be able to interface with an LLM of the user's choosing to parse an agenda.
Describe the solution you'd like
In the first instance, local models served by Ollama would be sufficient, but ideally the endpoint would be configurable so any compatible API can be used.
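One way to sketch this (purely illustrative; class and method names are hypothetical) is a small client whose base URL is configurable. Ollama exposes an OpenAI-compatible endpoint at `http://localhost:11434/v1`, so the same request shape would work against hosted APIs by swapping `base_url` and supplying an API key:

```python
import json
import urllib.request

class LLMClient:
    """Minimal client for an OpenAI-compatible chat endpoint.

    Defaults target Ollama's local compatibility layer; pointing
    base_url at a hosted provider covers the "any API" case.
    """

    def __init__(self, base_url="http://localhost:11434/v1",
                 model="llama3", api_key=None):
        self.base_url = base_url.rstrip("/")
        self.model = model
        self.api_key = api_key

    def build_request(self, prompt):
        # Build (url, headers, body) without sending, so the
        # configuration wiring can be tested offline.
        headers = {"Content-Type": "application/json"}
        if self.api_key:
            headers["Authorization"] = f"Bearer {self.api_key}"
        body = json.dumps({
            "model": self.model,
            "messages": [{"role": "user", "content": prompt}],
        }).encode()
        return f"{self.base_url}/chat/completions", headers, body

    def parse_agenda(self, agenda_text):
        # Ask the configured model to extract agenda items.
        url, headers, body = self.build_request(
            f"Extract the agenda items from:\n{agenda_text}")
        req = urllib.request.Request(url, data=body, headers=headers)
        with urllib.request.urlopen(req) as resp:
            data = json.load(resp)
        return data["choices"][0]["message"]["content"]
```

Keeping request construction separate from transport also makes it easy to add per-provider quirks later without touching the parsing logic.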
Describe alternatives you've considered
N/A

Additional context
N/A