tomdavidson closed this issue 5 months ago
Prompts suffice, and yes, any tool would work. We plan to add support for more platforms; we just haven't had time yet.
Please prioritize support for open-source models 🙏 I've been playing around with Windmill AI today for the first time and racked up almost $1 of GPT-4 Turbo costs just from some random testing. I can imagine costs creeping up very high very fast if I enabled this for the team. At GPT-4 prices, Windmill AI is pretty much a non-starter for most small businesses.
You also might want to talk to the guys behind supermaven.com. I have no affiliation with them, but I've been using their paid service with VS Code to replace Copilot, and it works really well.
Perhaps a nice incremental, lower-cost enhancement in this direction, one that would still provide a learning/feedback opportunity, would be to expose the OpenAI endpoint in the configuration; users could then choose to host their own compatibility layer or use private OpenAI via Azure, etc.
Hi @tomdavidson, private OpenAI via Azure is available in EE.
For the rest, we agree, but we haven't had time to work on it yet.
Oh nice, I didn't realize that was in EE. My interest is in using libre models, not so much because of cost but because they are libre.
No demands or impatience on my end, just ideas. Since EE has the private OpenAI option, would a PR adding the option to configure the OpenAI host be welcome or appropriate in the FLOSS edition?
Can anyone more familiar with Windmill provide a map of sorts for a new contributor on implementing the feature?
I was just thinking: couldn't this be remedied simply by allowing the user to change the baseUrl in the OpenAI resource?
All local model servers and commercial providers offer OpenAI-compatible endpoints. Just add a notice that if the endpoint is changed, all output-quality issues are on the user, since the user chooses the model.
Edit: Pretty much what @tomdavidson said.
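For illustration, here is roughly what that baseUrl swap looks like from the client side. This is a hypothetical sketch using the openai npm package against a local Ollama server; the URL, key, and model name are assumptions, not Windmill code:

```ts
import OpenAI from "openai";

// Identical client code to the hosted-OpenAI case; only the base URL changes.
const client = new OpenAI({
  baseURL: "http://localhost:11434/v1", // assumed: Ollama's OpenAI-compatible endpoint
  apiKey: "ollama", // local servers typically ignore the key, but the SDK requires one
});

const completion = await client.chat.completions.create({
  model: "llama3", // whichever model the operator has pulled locally
  messages: [{ role: "user", content: "Write a hello-world TypeScript script." }],
});

console.log(completion.choices[0].message.content);
```

The same swap works against any server that implements the OpenAI chat-completions API, which is what would make a configurable URL sufficient.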
There are some libraries that abstract over providers and expose a common, OpenAI-compatible interface, but I'm primarily interested in just running Ollama or OpenLLM, so the configurable URL is perfect.
I totally get the EE value pitch for private OpenAI via Azure, but there seems to be significant CE value in a configurable URL to facilitate the use of open-source models. @rubenfiszel, would a PR for this be welcome?
The only way we would accept the configurable URL is if it didn't allow users to use a private Azure base URL, but it is impossible to make that distinction right now.
It's an important gate for our enterprise monetization, unfortunately.
So to be clear, what will be done on our side is to support more static providers than just OpenAI, but ones with fixed public URLs as well. Some of those providers, like Mistral, serve open-source models.
Private/self-hosted models are important for us to monetize: that lets us develop the AI features for everyone in the community edition while monetizing them for those with high privacy concerns, like enterprises.
> The only way we would accept the configurable URL is if it didn't allow users to use a private Azure base URL, but it is impossible to make that distinction right now.
> It's an important gate for our enterprise monetization, unfortunately.
Or you could just add a simple regex to prevent adding Azure URLs?
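For what it's worth, a naive version of that gate might look like the hypothetical sketch below (not Windmill code); as the reply further down points out, though, a reverse proxy on a custom domain sails straight through it:

```ts
// Hypothetical blocklist check, illustration only.
const AZURE_OPENAI_HOST = /\.openai\.azure\.com$/i;

function isAllowedBaseUrl(baseUrl: string): boolean {
  const host = new URL(baseUrl).hostname;
  return !AZURE_OPENAI_HOST.test(host);
}

isAllowedBaseUrl("https://my-corp.openai.azure.com"); // false: blocked
isAllowedBaseUrl("http://localhost:11434/v1");        // true: local server allowed
// proxy.example.com forwarding to Azure would pass the check,
// which is exactly the bypass described below.
```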
An important aspect of self-hosting is having more control over privacy, so we would prefer to use our internal fine-tuned coding models instead of the providers you decide to support/can monetize.
I totally get the EE value pitch for private OpenAI via Azure, but facilitating the use of open-source models seems strongly aligned with the CE edition. Open Core is tricky, and I typically avoid it altogether; this situation seems to risk an Open Core failure. I am not complaining about Windmill Labs, but advocating, with lots of empathy.
A less risky position would be to defend against disenfranchising CE users by requiring that EE-only features be exclusive to enterprise use. Some examples that seem to fit the exclusive case include:
Using open-source models should not be an exclusive enterprise feature, but the status quo is that open-source models cannot be used at all in order to preserve private OpenAI as an exclusive enterprise feature.
A self-hosting, open-source-only operator is not going to be a private-OpenAI-on-Azure customer, so there is no loss of revenue in that scenario, but there may be a loss of BYOC support and contributions if they cannot use open-source models. Perhaps these users could leverage open-source models through an operator/system config.
But if one has to self-host to use open-source models, then you do risk losing some cloud revenue. I like @mjp0's simple solution to remove the self-hosting pressure while still keeping private OpenAI for EE.
As a tech procurement decision maker, I always prefer "pure" open source consumed as SaaS first. Self-hosting the pure open source with vendor support is my second priority. With pure open source I am more open to making compromises on how perfect the feature alignment is.
When it comes to proprietary or closed source, I require complete functionality alignment, and absolutely none of it in my core subdomains. Open Core solutions can beat out proprietary ones, but they are often treated the same.
Yes, it's unfortunate that personal users who care a lot about privacy/self-hosting for ideological reasons are collateral damage of such feature gating; I would probably fall in that category myself. It would be nice if we could distinguish that use case from companies needing the same for compliance reasons, but there is no way to do so. A blacklist regex is not good enough, as you can easily put a reverse proxy on a custom domain.
However, I not only wrote most of the core of Windmill, I also take care of all the sales. Most of our revenue comes from self-hosting of the Pro and EE licenses. Procurement teams both:
Private AI is seen as an important feature by the customers that pay the most for Windmill, so not gating that feature would result in a direct, significant loss of revenue.
Being able to sell Windmill is what allows us to continue developing Windmill; you will find that there have been no, or only very minor, code contributions from people outside of Windmill. We are, however, extremely grateful to have such a vibrant community that shares precise feedback and the bugs they encounter so we can improve Windmill faster; it helps us a lot. We have a pretty generous open-source product: 98% of the instances that run Windmill are on the community edition. But without the ability for us to monetize Windmill, there is no Windmill.
It's pretty awesome!
I'm not a personal user; open source is a strategic business decision. This is the problem with Open Core: it's the worst of both worlds rather than the best. Feature gating seems a symptom of a bigger problem, a business model incentivized on royalty extraction rather than on service. Has Windmill always been Open Core? It's a pretty big deterrent to community contributions, especially after Kafka, MongoDB, Elasticsearch, Redis, HashiCorp, etc.
I had never entertained self-hosting Windmill until this evening, and now I'm looking at modifying it to allow my own OpenAI-compatible API. But I hope there is a way to enable bring-your-own OpenAI-compatible service and still have a sustainable business.
Our business model is not geared toward services, as it makes much less sense for companies to have us host Windmill than to host it themselves. The power of Windmill is that it can easily be self-hosted, letting you fully manage your own cluster for heavy compute and critical jobs. So yes, our business model is indeed geared toward having people pay for the thousands of R&D hours spent building the software they use in production, and the thousands of hours spent making it require no human intervention, because it never crashes and will recover gracefully from almost any situation that could arise in a distributed system. We do not want people to pay primarily for support, as that would incentivize having a product that is hard to deploy or buggy and requires support. To me, that is a much darker pattern than having people pay for additional features.
> Has Windmill always been Open Core? It's a pretty big deterrent to community contributions, especially after Kafka, MongoDB, Elasticsearch, Redis, HashiCorp, etc.
We are not open core; we are open source with some additional, optional proprietary features. You can compile Windmill from source and have a fully functional Windmill. We have always been open source. We are not reliant on community contributions for the source code.
> I had never entertained self-hosting Windmill until this evening, and now I'm looking at modifying it to allow my own OpenAI-compatible API. But I hope there is a way to enable bring-your-own OpenAI-compatible service and still have a sustainable business.
Forking to remove that gate is only easy while we do not package much more of the AI logic (such as the prompts and many other inner workings) in our private repo. We are already doing this for SSO, as we've received threats from customers of the kind "if you do not make it free, we will just fork". Once we made it private, those threats disappeared, as they realized they would need hundreds of hours to reproduce the feature.
I'm a core believer in open source, and we want to keep being that way. If you want to support us, the best way is to become a customer. If that's not possible for you, please understand that other people do, and if we made everything free, then we wouldn't exist. The AI feature is not necessary to have a functional Windmill; it's a comfort feature. The fact that it's available even in our community edition could be seen as generous. I do not want to spend time weaponizing Windmill against abuse (or what we consider abuse); I'd much rather we as an org spend time building features that benefit all.
Just to be clear, there are no fork threats coming from me; I would not redistribute in this scenario. Why? It was more about interest in the cloud version, and whining about having to self-host.
Which customer engagement would allow me to use my own open-source LLMs? Unless I completely misunderstand everything, the option doesn't exist, including in EE, and it certainly isn't in the cloud version that I have been using to evaluate.
It's a bummer; technically Windmill is the cat's meow, but this whole discussion leaves a really bad taste, and the blast radius is much bigger than a comfort feature.
Thank you for being willing to have a candid conversation and for being upfront with the community on this matter.
PS "The open-core model is a business model for the monetization of commercially produced open-source software. The open-core model primarily involves offering a "core" or feature-limited version of a software product as free and open-source software, while offering "commercial" versions or add-ons as proprietary software"
We already have this option in EE and have had it for a long time; it's available in the instance/superadmin settings -> Core. It says "Azure base path", but it works with any OpenAI-compatible endpoint.
You do need a Pro or EE version to benefit from it. It's also available in our cloud EE version.
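If anyone points that setting at a self-hosted server, a quick compatibility sanity check is possible with the openai npm package. This is an illustrative sketch against an assumed local URL, not Windmill code:

```ts
import OpenAI from "openai";

// Assumed local endpoint; substitute whatever you put in the "Azure base path" field.
const client = new OpenAI({ baseURL: "http://localhost:11434/v1", apiKey: "unused" });

// An OpenAI-compatible server must answer GET /models; if this prints
// model ids without throwing, the base path is likely usable.
const models = await client.models.list();
console.log(models.data.map((m) => m.id));
```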
P.S. If you look at the same Wikipedia article you took that definition from, "feature-limited" points to crippleware, which I believe hardly applies to Windmill, though that may be more subjective. We are, on the other hand, commercial open-source software for sure.
Leveraging open-source AI rather than proprietary AI is quite important, but for the sake of this feature request let's just say: more operator choice.
Does Windmill AI use fine-tuning, or do prompts suffice? Could a wrapper/tool such as Ollama be supported?