lablup / backend.ai

Backend.AI is a streamlined, container-based computing cluster platform that hosts popular computing/ML frameworks and diverse programming languages, with pluggable heterogeneous accelerator support including CUDA GPU, ROCm GPU, TPU, IPU and other NPUs.
https://www.backend.ai
GNU Lesser General Public License v3.0

feat: Add GQL schema to query total resource slots #2849

Open fregataa opened 1 week ago

fregataa commented 1 week ago

This PR adds a new GQL schema that allows querying the total resource slots of compute sessions. The statuses, project_id, domain_name, resource_group_name, and filter arguments are available to narrow down which sessions are aggregated.

GQL Query

query TotalResource {
    total_resource_slot(
        domain_name: "default",
    ) {
        occupied_slots
        requested_slots
    }
}
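
The arguments can also be combined to narrow the scope further. The query below is only a sketch: it assumes that statuses accepts a list of session status names (e.g. RUNNING) and that resource_group_name takes a scaling group name; the exact value types are defined by the schema itself, not by this example.

query TotalResourceRunning {
    total_resource_slot(
        domain_name: "default",
        resource_group_name: "default",
        statuses: ["RUNNING"],
    ) {
        occupied_slots
        requested_slots
    }
}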

Response

{
    "data": {
        "total_resource_slot": {
            "occupied_slots": "{\"cpu\": \"19\", \"cuda.device\": \"0\", \"mem\": \"7113539584\"}",
            "requested_slots": "{\"cpu\": \"10\", \"cuda.device\": \"0\", \"mem\": \"4093640704\"}"
        }
    }
}
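
Note that occupied_slots and requested_slots are returned as JSON-encoded strings mapping each resource slot name (cpu, mem, cuda.device, ...) to its aggregated amount, so clients need to decode these fields once more after parsing the GraphQL response.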


📚 Documentation preview 📚: https://sorna--2849.org.readthedocs.build/en/2849/


📚 Documentation preview (Korean) 📚: https://sorna-ko--2849.org.readthedocs.build/ko/2849/