8-cm opened this issue 16 hours ago
What's your use case for running in Kubernetes? I'd struggle to think of one.
As for the licenses, you won't be able to reuse a license that you generated when running natively; you'd need to get a new license when running with Docker. Here are instructions on getting Docker set up and working with the AI Server: https://github.com/skier233/nsfw_ai_model_server/wiki/Installation-Instructions-(Docker)
My main use case is high availability (or at least quick migration, if HA is not supported) for all my self-hosted apps. It's also my learning lab, and working with Kubernetes is part of my job. I have multiple physical nodes in multiple locations.
Basically, I don't use Docker, because for me it would be a step back on my learning curve.
Also, Kubernetes today usually does not use Docker as its runtime (but that's another story).
Generally, for high availability on a single machine, Docker Compose will be a much better and lighter setup than trying to run Kubernetes. I'm not sure whether Kubernetes will work with the AI Server, though, due to how the licensing works.
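For the single-machine case, automatic restart on crash can be sketched in a minimal Compose file (the image name, port, and volume path here are placeholders, not the project's real values):

```yaml
# Hypothetical compose file; image name, port, and paths are assumptions.
services:
  ai-server:
    image: example/ai-model-server:latest  # placeholder image
    restart: unless-stopped                # restart the container if it crashes
    ports:
      - "8080:8080"                        # placeholder port
    volumes:
      - ./config:/app/config               # persist config between restarts
```

The `restart: unless-stopped` policy gives a simple form of availability without any orchestrator.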
I understand that Docker Compose can be a lighter setup for high availability on a single machine, but my primary use case is working across multiple physical nodes. Kubernetes offers significant advantages for this setup, especially in terms of high availability, workload migration, and resource management across nodes.
Also, I don't have just a single machine; I have several. Kubernetes offers many advantages over Docker, even when run on a single node. For instance, Kubernetes allows me to pin projects to specific nodes, create multiple pods, and load-balance them, even if the application doesn't natively support high availability or multiple replicas. This flexibility makes it an ideal choice for my setup.
Additionally, Kubernetes aligns well with my current learning goals and professional requirements, as I work with it regularly. While Docker Compose is valuable, Kubernetes gives me more flexibility and control in a multi-node environment.
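The node pinning and multi-pod setup mentioned above can be sketched with a `nodeSelector` in a Deployment (the names, label, and image below are placeholders I'm assuming for illustration):

```yaml
# Hypothetical Deployment; names, label, and image are assumptions.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ai-server
spec:
  replicas: 2                     # multiple pods behind one Service
  selector:
    matchLabels:
      app: ai-server
  template:
    metadata:
      labels:
        app: ai-server
    spec:
      nodeSelector:
        gpu: "true"               # pin pods to nodes labeled gpu=true
      containers:
        - name: ai-server
          image: example/ai-model-server:latest   # placeholder image
```

A Service in front of these pods would then spread requests across the replicas.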
BTW if you’re interested, I can help create a Docker image! I noticed from other issues that this has come up, and I’m experienced in building images, so feel free to reach out if that would be helpful.
> I'm not sure if kubernetes will work with the AI Server though due to how the licensing works.
Could you please explain how the licensing mechanism works? Specifically, I’m trying to understand what the "Source error" means and how the license verification process is handled. I suspect that the server might be checking host-specific details to validate the license, and if I can provide the necessary capabilities or host-based specifics to my Kubernetes pod, it might resolve the issue.
My background isn’t primarily in Python, so I’d appreciate any insights on what specifics or dependencies might be required for this setup. If I can understand more about what the AI Server is looking for, I can better configure my pod environment to meet those requirements.
Thanks again for your assistance!
Yea, I get the Kubernetes benefits in a multi-machine system (I also use Kubernetes at work). I'm not sure such a system lines up well with the design goals of this project, though (it's intended to be run locally for personal use). The licensing mechanism requests a license on the first run on a machine or container and then stores a license that will work in the future for that container/machine. The nature of Kubernetes, where it can constantly spin up new pods, is probably going to be a problem, as there wouldn't really be a mechanism to acquire licenses for each new pod.
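If the stored license is a file inside the container (an assumption on my part; the actual path is unknown to me), one mitigation to sketch would be mounting that location from a PersistentVolumeClaim so that new pods reuse the same license file. This may still fail if the license is also tied to host identity, so treat it only as an idea to test:

```yaml
# Hypothetical; the mount path /app/license is an assumption, not the real location.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: ai-server-license
spec:
  accessModes: ["ReadWriteOnce"]
  resources:
    requests:
      storage: 10Mi
---
# Then in the pod spec of the Deployment:
#   volumeMounts:
#     - name: license
#       mountPath: /app/license            # assumed license directory
#   volumes:
#     - name: license
#       persistentVolumeClaim:
#         claimName: ai-server-license
```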
Also, for this project I'm not sure high availability even makes that much sense. If the server were to go down, the only effect would be that you'd have to restart the server to process more items. And at this point, the only times I see the AI Server crash are GPU memory issues, and in those cases it would likely struggle even in Kubernetes.
As for building a Docker image, yea, if you want to build and publish one, that could be useful to simplify the Docker installation process.
Hi - I am getting an error (image attachment).
I have followed the guide for manual installation (both Free tier and Patreon tier), and it works without an issue. When I replicate the setup in Kubernetes, it works without issue only for the Free tier. The Patreon tier fails; it needs the license, which I download from the URL or "reuse" from the manual installation.
Basically my setup is this:
I would really appreciate your help in investigating the issue. Thank you kindly.
P.S.: I am getting the same error even in LXC containers (just to mention).