skier233 / nsfw_ai_model_server

Error: Invalid license - Kubernetes #15

Open 8-cm opened 16 hours ago

8-cm commented 16 hours ago

Hi - I am getting an error (image attachment).

I have followed the guide for manual installation (both free tier and Patreon tier) and it works without an issue. When I replicate the setup in Kubernetes, only the free tier works. The Patreon tier needs the license, which I either download from the URL or "reuse" from the manual installation, and that is where I get the error.

[screenshot: "Error: Invalid license" error message]

Basically my setup is this:

              apt update -y && \
              apt upgrade -y && \
              apt install -y git python3 python3-pip unzip nix wget && \
              nix-channel --add https://nixos.org/channels/nixpkgs-unstable nixpkgs && \
              nix-channel --update && \
              export PATH=$PATH:$HOME/.nix-profile/bin && \
              nix-env -iA nixpkgs.megacmd && \
              mkdir -p ~/miniconda3 && \
              wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ~/miniconda3/miniconda.sh && \
              bash ~/miniconda3/miniconda.sh -b -u -p ~/miniconda3 && \
              rm -rf ~/miniconda3/miniconda.sh && \
              source ~/miniconda3/bin/activate && \
              conda init --all && \
              mkdir -p /root/ai-data/ai-server && \
              mkdir -p /root/ai-data/ai-model && \
              wget https://github.com/skier233/nsfw_ai_model_server/releases/download/1.3.7/nsfw_ai_model_server.zip && \
              #free tier
              #mega-get https://mega.nz/file/BaRVEAQT#vseslzl7dAG-WQovgRteZsA02HteHJ3J-bR4MMlqJyE && \
              #patreon tier
              mega-get https://mega.nz/file/kXwUGY6Q#rFiuKQJMxmY6tSKecfBThM7F9uMcuemEnBy2VQFNFKc && \
              unzip ./nsfw_ai_model_server.zip -d /root/ai-data/ai-server && \
              unzip ./gentler_river_CPU.zip -d /root/ai-data/ai-model && \
              cp -r /root/ai-data/ai-model/config/* /root/ai-data/ai-server/config/ && \
              cp -r /root/ai-data/ai-model/models/* /root/ai-data/ai-server/models/ && \
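              # license file reused from the manual installation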
              cp /root/ai-data/licenseV1.0.lic /root/ai-data/ai-server/models/licenseV1.0.lic && \
              cd /root/ai-data/ai-server && source ./install.sh
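
For context, the chain above runs as the container command of a pod; a stripped-down sketch of that kind of pod spec (the image and claim names are placeholders, not my exact manifest) looks roughly like this:

    # Illustration only: the kind of pod that could run the install chain above.
    # Image name, PVC name, and resource names are placeholders.
    kubectl apply -f - <<'EOF'
    apiVersion: v1
    kind: Pod
    metadata:
      name: nsfw-ai-server
    spec:
      containers:
        - name: ai-server
          image: ubuntu:22.04              # placeholder base image
          command: ["bash", "-lc"]
          args:
            - |
              # ... apt/conda/mega-get/install.sh chain from above ...
              sleep infinity               # keep the pod around after install
          volumeMounts:
            - name: ai-data
              mountPath: /root/ai-data     # license, models and server live here
      volumes:
        - name: ai-data
          persistentVolumeClaim:
            claimName: ai-data-pvc         # placeholder claim holding licenseV1.0.lic
    EOF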

I would appreciate your help investigating this issue. Thank you kindly.

P.S.: I am getting the same error even in LXC containers (just to mention).

skier233 commented 16 hours ago

What's your use case for running in Kubernetes? I'd struggle to think of a use case for it.

As for the licenses, you won't be able to reuse a license that you generated when running natively; you'd need to get a new license when running with docker. Here are instructions on getting docker set up and working with the ai server: https://github.com/skier233/nsfw_ai_model_server/wiki/Installation-Instructions-(Docker)

8-cm commented 7 hours ago

> What's your use case for running in Kubernetes? I'd struggle to think of a use case for it.
>
> As for the licenses, you won't be able to reuse a license that you generated when running natively; you'd need to get a new license when running with docker. Here are instructions on getting docker set up and working with the ai server: https://github.com/skier233/nsfw_ai_model_server/wiki/Installation-Instructions-(Docker)

My main use case is high availability (or at least quick migration, if HA is not supported) for all my self-hosted apps. It is also my learning lab, and working with Kubernetes is my job, too. I have multiple physical nodes in multiple locations.

And basically I do not use Docker, because for me it would be a step back on my learning curve.

Also, Kubernetes today usually does not use Docker as its runtime (but that is another story).

skier233 commented 1 hour ago

Generally for high availability on a single machine, docker compose will be a much better and lighter setup than trying to run kubernetes. I'm not sure if kubernetes will work with the AI Server though due to how the licensing works.

8-cm commented 31 minutes ago

I understand that Docker Compose can be a lighter setup for high availability on a single machine, but my primary use case is working across multiple physical nodes. Kubernetes offers significant advantages for this setup, especially in terms of high availability, workload migration, and resource management across nodes.

Also, I don't have just a single machine; I have several. Kubernetes offers many advantages over Docker, even when run on a single node. For instance, Kubernetes allows me to pin projects to specific nodes, create multiple pods, and load balance them, even if the application doesn't natively support high availability or multiple replicas. This flexibility makes it an ideal choice for my setup.
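
The node pinning, for example, is just a label and a nodeSelector on the pod template; a quick sketch with placeholder node and deployment names:

    # Sketch with placeholder names: label a node, then pin the deployment's
    # pods to it via nodeSelector.
    kubectl label node worker-01 ai-server=allowed
    kubectl patch deployment ai-server --type merge \
      -p '{"spec":{"template":{"spec":{"nodeSelector":{"ai-server":"allowed"}}}}}'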

Additionally, Kubernetes aligns well with my current learning goals and professional requirements, as I work with it regularly. While Docker Compose is valuable, Kubernetes gives me more flexibility and control in a multi-node environment.

BTW if you’re interested, I can help create a Docker image! I noticed from other issues that this has come up, and I’m experienced in building images, so feel free to reach out if that would be helpful.

> I'm not sure if kubernetes will work with the AI Server though due to how the licensing works.

Could you please explain how the licensing mechanism works? Specifically, I’m trying to understand what the "Source error" means and how the license verification process is handled. I suspect that the server might be checking host-specific details to validate the license, and if I can provide the necessary capabilities or host-based specifics to my Kubernetes pod, it might resolve the issue.

My background isn’t primarily in Python, so I’d appreciate any insights on what specifics or dependencies might be required for this setup. If I can understand more about what the AI Server is looking for, I can better configure my pod environment to meet those requirements.

Thanks again for your assistance!

skier233 commented 10 minutes ago

Yea, I get the kubernetes benefits in a multi-machine system (I also use kubernetes at work). I'm not sure such a system lines up well with the design goals of this project though (it is intended to be run locally for personal use). The licensing mechanism requests a license on the first run on a machine or container and then stores a license which will keep working for that container/machine. The nature of kubernetes, where it can constantly spin up new pods, is probably going to be a problem, as there wouldn't really be a mechanism to acquire licenses for each new pod.
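
To make that concrete: even a single-replica setup that keeps the stored license on a persistent volume, roughly like the sketch below (placeholder names only), would only behave like "the same container" if the license check still accepts whatever the replacement pod looks like, and I can't promise that it will.

    # Sketch only, placeholder names: one replica, Recreate strategy, and the
    # license/model directory on a PVC so a replacement pod sees the same
    # stored license file. Whether the check accepts it is the open question.
    kubectl apply -f - <<'EOF'
    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: nsfw-ai-server
    spec:
      replicas: 1
      strategy:
        type: Recreate                  # never two pods against one license
      selector:
        matchLabels:
          app: nsfw-ai-server
      template:
        metadata:
          labels:
            app: nsfw-ai-server
        spec:
          containers:
            - name: ai-server
              image: nsfw-ai-server:local    # placeholder image
              volumeMounts:
                - name: ai-data
                  mountPath: /root/ai-data   # stored license persists here
          volumes:
            - name: ai-data
              persistentVolumeClaim:
                claimName: ai-data-pvc       # placeholder claim
    EOF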

Also, for this project I'm not sure high availability even makes that much sense. If the server were to go down, the only effect would be that you'd have to restart it to process more items. And at this point the only times I see the AI Server crash are GPU memory issues, in which case it would likely struggle even in kubernetes.

As for building a docker image, yea, if you want to build and publish one, that could be useful to simplify the docker installation process.