Hi Curtis @ctrueden,
I was working on LLM integration in the Script Editor again. The library we used before was archived by its developer and pulled in a lot of dependencies, so I removed it. I'm now accessing the LLMs directly via their REST APIs. As a result, we have minimal dependencies and no additional jar files that need to be shipped via the update site.
Furthermore, I used the opportunity to refactor the code and add support for Anthropic's Claude. Support for Gemini, Ollama, etc. seems trivial and could be added in a follow-up PR. I also renamed "OpenAI Options" to "LLM Service Options" and optimized the default prompt a bit.
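For context, here is a rough sketch of what a direct REST call to Anthropic's Messages API looks like with plain `java.net.http` and no extra client library. This is just an illustration of the approach, not the actual code in this PR; the model name, prompt, and response handling are placeholders.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Illustrative sketch of a direct REST call to Anthropic's Messages API.
// Not the PR's implementation; model, prompt, and JSON handling are simplified.
public class ClaudeRestSketch {
    public static void main(String[] args) throws Exception {
        String apiKey = System.getenv("ANTHROPIC_API_KEY"); // assumed to be set

        // Minimal request body; in the real code this would be built from
        // the user's prompt and the configured options.
        String body = """
            {
              "model": "claude-3-5-sonnet-latest",
              "max_tokens": 256,
              "messages": [
                {"role": "user", "content": "Explain this ImageJ macro in one sentence."}
              ]
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("https://api.anthropic.com/v1/messages"))
            .header("x-api-key", apiKey)
            .header("anthropic-version", "2023-06-01")
            .header("content-type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());

        // In practice the JSON response would be parsed to extract the assistant's text.
        System.out.println(response.body());
    }
}
```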
Let me know what you think!
Best, Robert