wagtail / wagtail-ai

Get help with your Wagtail content using AI superpowers.
https://wagtail-ai.readthedocs.io/latest
MIT License

Support Azure OpenAI integration #39

Open dawnwages opened 11 months ago

dawnwages commented 11 months ago

Could we support the integration of Azure OpenAI in this repo? Sounds like an awesome alignment!

https://github.com/Azure/azure-openai-samples

dawnwages commented 11 months ago

@PamelaFox

pamelafox commented 11 months ago

I work on many of our Python Azure OpenAI samples (like https://github.com/Azure-Samples/azure-search-openai-demo/) and would be happy to help with this. Let me know if this is a feature you're looking for in this repo.

tomusher commented 11 months ago

Thanks both - this sounds great and definitely something we'd be interested in integrating.

There are two main areas where I think Azure OpenAI could be integrated in the current state of our AI packages:

There are lots of other nice things that Azure OpenAI can do that we'd like to consider as the packages mature and integrate more AI-oriented features.

pamelafox commented 7 months ago

Update: We tried wagtail-ai in a live stream. I was hoping we'd be able to use it with Azure OpenAI, as llm does actually support Azure OpenAI, but wagtail-ai would need additional options to get it to work. I'm curious why you went with llm? I love it as a CLI tool, but it doesn't seem optimal for a deployed server tool, given all its reliance on file system paths. I would think something like langchain would be more configurable in a deployed app.

pamelafox commented 7 months ago

Re wagtail-vector-index: very cool! I've shared that with another advocate who's been working on improving the AI search support in other OSS packages, to see if they'd have time to add Azure AI search.

tomusher commented 7 months ago

> Update: We tried wagtail-ai in a live stream. I was hoping we'd be able to use it with Azure OpenAI, as llm does actually support Azure OpenAI, but wagtail-ai would need additional options to get it to work. I'm curious why you went with llm? I love it as a CLI tool, but it doesn't seem optimal for a deployed server tool, given all its reliance on file system paths. I would think something like langchain would be more configurable in a deployed app.

Thanks for giving it a go @pamelafox! I'd love to hear more about what additional options we're missing to get Azure OpenAI working, and what roadblocks you ran into with llm.

On the llm choice: we wanted something simple that was easily extensible by developers (through llm plugins) and that didn't come with all the extra baggage that LangChain brings. Simon was interested in feedback on it over on issue #56 if you wanted to add any additional comments.
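For context, wagtail-ai selects its backend through Django settings, so swapping llm out is a configuration change rather than a code change. The sketch below shows roughly what that configuration looks like; the exact dotted path and CONFIG keys may differ between releases, so treat it as illustrative rather than authoritative:

```python
# Illustrative Django settings sketch for wagtail-ai's backend selection.
# The dotted class path and CONFIG keys are assumptions based on the
# docs at the time of writing; check your installed release.
WAGTAIL_AI = {
    "BACKENDS": {
        "default": {
            # The llm-based backend. Swapping this dotted path is how a
            # future LangchainBackend (or an Azure-specific backend)
            # would be dropped in without touching application code.
            "CLASS": "wagtail_ai.ai.llm.LLMBackend",
            "CONFIG": {
                "MODEL_ID": "gpt-3.5-turbo",
            },
        }
    }
}
```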

That said, we're not tied to llm. The swappable backend architecture is designed so we can swap in a potential future LangchainBackend, or perhaps even a more specific AzureOpenAIBackend. You might have noticed we've recently added an OpenAIBackend as we experiment with multi-modal models and start running into some of the limitations of the llm Python API.
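As a rough illustration of what a hypothetical AzureOpenAIBackend would need beyond a plain OpenAIBackend, here is a stdlib-only sketch that builds (but does not send) an Azure OpenAI chat-completions request. The resource name, deployment name, and api-version value are placeholders; the point is that Azure routes requests to a named deployment in the URL path and authenticates with an `api-key` header rather than a bearer token, which is exactly the extra configuration such a backend would have to expose:

```python
import json
import urllib.request


def build_azure_chat_request(
    resource: str,       # Azure OpenAI resource name (placeholder)
    deployment: str,     # model deployment name (placeholder)
    api_key: str,
    messages: list,
    api_version: str = "2024-02-01",  # illustrative version string
) -> urllib.request.Request:
    """Build, but do not send, a chat-completions request for Azure OpenAI.

    Unlike api.openai.com, Azure addresses a specific deployment in the
    URL path and passes the key in an ``api-key`` header.
    """
    url = (
        f"https://{resource}.openai.azure.com/openai/deployments/"
        f"{deployment}/chat/completions?api-version={api_version}"
    )
    body = json.dumps({"messages": messages}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json", "api-key": api_key},
        method="POST",
    )


req = build_azure_chat_request(
    "my-resource", "my-deployment", "secret-key",
    [{"role": "user", "content": "Hello"}],
)
```

A backend wrapping this would surface `resource`, `deployment`, and `api_version` as CONFIG options, which is the kind of extra plumbing the default OpenAI path doesn't need.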

Really appreciate your feedback - we'd still love to make sure we fully support Azure OpenAI.