OpenCSGs / llm-inference

llm-inference is a platform for publishing and managing LLM inference services, providing a wide range of out-of-the-box features for model deployment, such as a UI, a RESTful API, auto-scaling, computing resource management, monitoring, and more.
Apache License 2.0

fix llm-serve list #96

Closed depenglee1707 closed 6 months ago

depenglee1707 commented 6 months ago

The `llm-serve list serving` command is broken; it returns nothing useful, like this:

```json
{
  "default": {
    "status": {
      "default": {
        "application_status": "RUNNING",
        "deployments_status": {}
      }
    },
    "url": {}
  }
}
```
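For reference, the problem is easy to see by loading the payload above: the application reports `RUNNING`, yet the per-deployment status map and the URL map are both empty. A minimal sketch (the field layout is taken verbatim from the output above; nothing else is assumed):

```python
import json

# The JSON payload returned by the broken `llm-serve list serving` call.
raw = """
{
  "default": {
    "status": {
      "default": {
        "application_status": "RUNNING",
        "deployments_status": {}
      }
    },
    "url": {}
  }
}
"""

data = json.loads(raw)
app = data["default"]

# The application claims to be RUNNING...
print(app["status"]["default"]["application_status"])  # RUNNING

# ...but both the deployment-status map and the URL map come back empty,
# so the listing shows no usable serving information.
print(app["status"]["default"]["deployments_status"])  # {}
print(app["url"])                                      # {}
```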