huchenlei / ComfyUI_omost

ComfyUI implementation of Omost
Apache License 2.0
422 stars 30 forks

Http Local server #61

Open Khampol opened 6 days ago

Khampol commented 6 days ago

I have my own local LLM server. How do I enter its IP? Right now it does not work.

[screenshot]

BTW, where are these models located? Are they loaded by ComfyUI itself, or ... ?

[screenshot]
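For context, local servers that expose an OpenAI-compatible API (vLLM, llama.cpp server, LM Studio, TGI's `/v1` route) serve chat completions at `<address>/v1/chat/completions`, so the address field presumably expects the base URL of the server, e.g. `http://127.0.0.1:8000`. A minimal sketch of such a request with the standard library (host and port are placeholders, not values from this repo):

```python
import json
import urllib.request

# Hypothetical local server address; adjust host/port to your own setup.
address = "http://127.0.0.1:8000"

payload = {
    "model": "",  # many local servers ignore this or auto-select the loaded model
    "messages": [{"role": "user", "content": "hello"}],
}
req = urllib.request.Request(
    address.rstrip("/") + "/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req) would send the request; it is not called here
# because it requires a running server.
```

If the request fails, checking that the same URL answers `curl` is a quick way to tell a node problem from a server problem.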

HiyungXu commented 3 days ago
    def init_client(self, address: str, api_type: str) -> Tuple[OmostLLMServer]:
        """Initialize LLM client with HTTP server address."""

        if api_type == "OpenAI":
            if address.endswith("v1"):
                server_address = address
            else:
                server_address = os.path.join(address, "v1")

            model_id = ""

        elif api_type == "TGI":
            if address.endswith("v1"):
                server_address = address
                server_info_url = address.replace("v1", "info")
            else:
                server_address = os.path.join(address, "v1")
                server_info_url = os.path.join(address, "info")
            # Get model_id from server info
            server_info = requests.get(server_info_url, timeout=5).json()
            model_id = server_info["model_id"]

        client = OpenAI(base_url=server_address, api_key="_")

        return (OmostLLMServer(client, model_id),)

I think the author has not finished this code yet.
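For what it's worth, the address handling above has two fragile spots: `os.path.join` builds filesystem paths (it would insert backslashes on Windows, and it is not meant for URLs), and `model_id` is never assigned when `api_type` is neither "OpenAI" nor "TGI", which raises `NameError` at the return. A sketch of URL helpers that could replace the joins (function names here are my own, not from the repo):

```python
def normalize_v1_url(address: str) -> str:
    """Return the OpenAI-compatible /v1 endpoint for a server address.

    Plain string operations are used instead of os.path.join, which is
    for filesystem paths and breaks URLs on Windows.
    """
    address = address.rstrip("/")
    if not address.endswith("/v1"):
        address += "/v1"
    return address


def info_url(address: str) -> str:
    """Return the TGI /info endpoint derived from the same base address."""
    base = normalize_v1_url(address)
    return base[: -len("v1")] + "info"  # .../v1 -> .../info
```

With these, both branches collapse to `server_address = normalize_v1_url(address)`, and initializing `model_id = ""` before the `if`/`elif` would keep the unrecognized-`api_type` case from crashing.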