djr255 opened this issue 1 year ago
Hi @djr255, thank you for your interesting suggestion. While this idea certainly has potential, my current focus is on refining existing features, such as adding system prompt customization and improving model management. Ensuring these core functionalities are robust is my top priority.
However, I greatly appreciate your enthusiasm and your willingness to contribute. I always welcome Pull Requests (PRs), so if you decide to explore this idea and develop a prototype or implementation, I'd be more than happy to review it. 😃
What you're asking for doesn't seem like something that Ollamac is meant to offer. However, after playing around with the API, I came up with this shortcut.
Usage is simple: first make sure that `ollama serve` is running. Once the shortcut is installed, edit it and change the `model` to the model you want to use, as shown by `ollama list`. Then you can ask Siri, "Ask the Llama" (or whatever you rename the shortcut to). Siri will ask what text you want to use; reply by speaking, and that dictated text serves as your prompt. Siri will then "think" (which means Ollama is generating a response) and then respond with Ollama's answer.
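For anyone curious what the shortcut is doing under the hood, or who wants to test the setup before involving Siri, it boils down to a request like the one below against Ollama's local API (the model name `llama2` and the prompt are just examples; the port is Ollama's default):

```sh
# Ask the locally running Ollama server for a completion. By default
# `ollama serve` listens on port 11434. With "stream": false the whole
# answer comes back as a single JSON object with a "response" field.
curl http://localhost:11434/api/generate \
  -d '{
    "model": "llama2",
    "prompt": "Why is the sky blue?",
    "stream": false
  }'
```

The shortcut just sends the dictated text as the `prompt`, reads the `response` field out of the returned JSON, and hands that back to Siri to speak.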
I didn't include any error handling, so if Siri gives up or stalls, it could be because `ollama serve` isn't running, you have a typo in your `model`, or some other problem I didn't factor in.
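If it does stall, a quick sanity check from Terminal can narrow things down (again assuming the default port):

```sh
# Lists the models the local server knows about. The "name" values
# returned here are what the shortcut's model needs to match; if the
# request fails outright, `ollama serve` probably isn't running.
curl http://localhost:11434/api/tags
```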
Let me know if it works out for anyone!
I'd love the ability to accept input via speech-to-text and output responses through Siri or Shortcuts. Not sure how to do it myself or I'd just fork, but I'll look into learning the skills.