Problem
SpeziLLM supports running LLMs locally, but the app currently offers no way to use one. Using a local LLM seems particularly important when dealing with health data.
Solution
I see two possibilities: either support on-device LLMs through SpeziLLM, or allow customizing the OpenAI API base URL so that the app can talk to an OpenAI-compatible local server such as Ollama (see the sketch below).
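To illustrate the second option: Ollama exposes an OpenAI-compatible endpoint at http://localhost:11434/v1 by default, so if the base URL were configurable, the app could point its existing OpenAI requests there unchanged. Below is a minimal Swift sketch using plain URLSession (not the SpeziLLM or app API); the model name "llama3" is an assumption and would need to match whatever model is pulled locally.

```swift
import Foundation

// Hypothetical sketch: send a chat completion request to a local Ollama server
// via its OpenAI-compatible endpoint. Base URL and model name are assumptions.
struct ChatMessage: Codable {
    let role: String
    let content: String
}

struct ChatRequest: Codable {
    let model: String
    let messages: [ChatMessage]
}

struct ChatResponse: Codable {
    struct Choice: Codable {
        let message: ChatMessage
    }
    let choices: [Choice]
}

func queryLocalLLM(prompt: String) async throws -> String {
    // Ollama serves an OpenAI-compatible API on localhost:11434 by default.
    let url = URL(string: "http://localhost:11434/v1/chat/completions")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(
        ChatRequest(model: "llama3", messages: [ChatMessage(role: "user", content: prompt)])
    )

    let (data, _) = try await URLSession.shared.data(for: request)
    let response = try JSONDecoder().decode(ChatResponse.self, from: data)
    return response.choices.first?.message.content ?? ""
}
```

The point of the sketch is that only the base URL (and possibly the model identifier) would need to change; the request and response shapes are the same as OpenAI's, so no health data would have to leave the device or local network.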
Additional context
No response
Code of Conduct
[X] I agree to follow this project's Code of Conduct and Contributing Guidelines