run-llama / llama_deploy

Deploy your agentic workflows to production
https://docs.llamaindex.ai/en/stable/module_guides/llama_deploy/
MIT License

[Feature Request]: Commercial Cloud Deployment Example #339

Open brycecf opened 1 week ago

brycecf commented 1 week ago

Features like AWS SQS support and #154 suggest the possibility of using llama-deploy in commercial clouds.

For developers who rely heavily on those services, an example of running llama-deploy on AWS, Azure, or (to a lesser extent) GCP would help clarify the possible use cases.

At first glance, the Quick Start's configuration file looks like it could compete with IaC solutions like Terraform, which further muddies what a commercial cloud deployment would look like.
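
For context, the Quick Start file I'm referring to is a short YAML deployment definition along these lines (paraphrased from the docs, so the exact field names may differ):

```yaml
name: QuickStart

control-plane:
  port: 8000

default-service: echo_workflow

services:
  echo_workflow:
    name: Echo Workflow
    # Where the workflow code lives and which Python object to load.
    source:
      type: local
      name: ./src
    path: workflow:echo_workflow
```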

masci commented 1 week ago

Hey @brycecf thanks for the feedback, that's indeed something we should do, I'll set aside some time to work on it.

Regarding the Quick Start configuration, it's not intended to compete with IaC. While Terraform code dictates how to build the infrastructure powering the app, the deployment.yml file defines the workflows that will run on top of that infra. Moreover, Llama Deploy is designed so that you can completely bypass the higher level (we call it the API Server) and its configuration files, and operate at the lower level instead (see Manual Orchestration in the docs).
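
To give a rough idea, manual orchestration looks something like the sketch below: you start the control plane and message queue yourself, then register workflows as services against them. This is simplified from the docs; the config classes, ports, and the toy workflow are illustrative, not authoritative.

```python
import asyncio
import sys

from llama_deploy import (
    ControlPlaneConfig,
    SimpleMessageQueueConfig,
    WorkflowServiceConfig,
    deploy_core,
    deploy_workflow,
)
from llama_index.core.workflow import StartEvent, StopEvent, Workflow, step


class EchoWorkflow(Workflow):
    """Toy workflow standing in for a real agentic workflow."""

    @step()
    async def run_step(self, ev: StartEvent) -> StopEvent:
        message = str(ev.get("message", ""))
        return StopEvent(result=f"Message received: {message}")


async def run_core() -> None:
    # Start the control plane and message queue directly, without the
    # API Server or a deployment.yml.
    await deploy_core(
        control_plane_config=ControlPlaneConfig(),
        message_queue_config=SimpleMessageQueueConfig(),
    )


async def run_service() -> None:
    # Register the workflow as a service with the running control plane.
    await deploy_workflow(
        workflow=EchoWorkflow(),
        workflow_config=WorkflowServiceConfig(
            host="127.0.0.1", port=8002, service_name="echo_workflow"
        ),
        control_plane_config=ControlPlaneConfig(),
    )


if __name__ == "__main__":
    # Run each entry point in its own process, e.g.:
    #   python manual_orchestration.py core
    #   python manual_orchestration.py service
    asyncio.run(run_core() if sys.argv[-1] == "core" else run_service())
```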

Having a cloud deployment example will definitely make these concepts clearer, I'll figure something out!