Open MoreColors123 opened 1 week ago
Judging from the videos and docs this is a great plugin which I will surely use in some way (I haven't actually used it yet). I'm a heavy InDesign user, and the possibilities for amending content seem incredible.
But I wonder how hard it would be to modify it so the user is able to connect the plugin to a locally running Ollama server, and thereby make it work with locally run LLM models like Meta's Llama 3.1, Mistral, etc. This would mean free use for the user.
Do you think that is doable? I don't have any experience with coding for Adobe (and actually not with any other programming language either :-)
In principle, that would be possible. But UXP, Adobe's scripting platform, doesn't really play along yet. So far I have only been able to get a simple text model to work. Let's see, maybe after the next UXP update I'll have time to try it again.
OK, I see. Thank you for the fast response. I'm looking forward to making this work, and I've also made a post about this idea on Reddit, just so you know: https://www.reddit.com/r/indesign/comments/1fdrsac/using_aillms_with_indesign_via_chatgpt_plugin
In the meantime I will test this great plugin in its current form! :-)
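For context, Ollama serves a small HTTP API on localhost (port 11434 by default), and UXP plugins make network requests through a fetch-style API, so the connection discussed above might look roughly like the sketch below. This is only an illustration, not code from the plugin: the endpoint and JSON shape follow Ollama's documented /api/generate interface, while the function names and the "llama3.1" model choice are assumptions made here.

```javascript
// Sketch only: assumes an Ollama server is running locally on its default
// port 11434 and that the "llama3.1" model has already been pulled.

// Build the JSON body for Ollama's /api/generate endpoint.
function buildGenerateRequest(model, prompt) {
  return {
    model: model,
    prompt: prompt,
    stream: false, // ask for a single JSON reply instead of a token stream
  };
}

// Send a prompt to the local Ollama server and return the generated text.
// UXP exposes a fetch-like API, so the same shape should carry over.
async function generateLocally(prompt) {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest("llama3.1", prompt)),
  });
  if (!res.ok) {
    throw new Error(`Ollama returned HTTP ${res.status}`);
  }
  const data = await res.json();
  return data.response; // Ollama puts the completion text in "response"
}
```

Because the server runs on the user's own machine, no API key or per-token cost is involved; the plugin would only need a setting for the base URL and model name.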