RetellAI / retell-custom-llm-python-demo


How to deploy to a VPS or Heroku? #17

Open kukovinetsyevhenii opened 1 month ago

kukovinetsyevhenii commented 1 month ago

I tried to do it on a VPS, but it doesn't work. Should I consider updating the source code? I appreciate your help.

eslamodeh commented 1 month ago

I have managed to deploy this to Heroku.

Make sure to set the required config vars / environment variables.

Use the heroku/python buildpack.

Create a Procfile with:

web: uvicorn app.server:app --host=0.0.0.0 --port=$PORT
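For the config vars step, the exact variable names depend on what app/server.py reads from the environment; assuming the usual Retell and OpenAI keys (check your local .env for the real names), they can be set from the Heroku CLI with something like:

heroku config:set RETELL_API_KEY=your-retell-key OPENAI_API_KEY=your-openai-key --app your-app-name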

kukovinetsyevhenii commented 1 month ago

I'm using a custom LLM model, so I need a lot of CPU and RAM, which is why I chose a VPS. I tried to set up nginx for the WSS connection but failed.

Please let me know how to tackle this issue.
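For reference, WSS behind nginx typically fails unless the proxy explicitly forwards the WebSocket upgrade. Below is a minimal sketch of an nginx server block, assuming the demo server (uvicorn) listens on port 8080 and the WebSocket path is /llm-websocket; the domain, certificate paths, port, and path are all placeholders to adjust to your setup and to the URL you registered with Retell:

server {
    listen 443 ssl;
    server_name example.com;

    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    # Proxy the WebSocket endpoint to the local uvicorn process
    location /llm-websocket/ {
        proxy_pass http://127.0.0.1:8080;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_read_timeout 300s;   # keep long-lived call sessions from timing out
    }
}

The proxy_http_version, Upgrade, and Connection directives are what allow the wss:// handshake to pass through nginx; without them the connection is downgraded to plain HTTP and dropped.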