OpenRouterTeam / openrouter-runner

Inference engine powering open source models on OpenRouter
https://openrouter.ai
MIT License

Logprobs not returning from OpenAI #97

Closed · 0xTomDaniel closed 3 months ago

0xTomDaniel commented 3 months ago

Describe the bug
According to this documentation, logprobs is supported, but the response doesn't contain them. To confirm I was sending a correct request, I simply swapped in OpenAI's endpoint and everything worked as expected: all logprobs were returned.

To Reproduce

import httpx

# Message, Model, Temperature, ResponseFormat, OpenRouterResponse,
# SETTINGS, SITE_URL, and APP_NAME come from elsewhere in my codebase.

async def get_openrouter_response(
    messages: list[Message],
    models: list[Model],
    temperature: Temperature | None = None,
    response_format: ResponseFormat | None = None,
    *,
    seed: int | None = None,
    logprobs: bool = False,
    top_logprobs: int | None = None,
) -> OpenRouterResponse:
    """
    Get a response from the OpenRouter API.
    """

    temperature = Temperature(value=0.7) if temperature is None else temperature

    async with httpx.AsyncClient(timeout=5.0) as client:
        response = await client.post(
            url="https://openrouter.ai/api/v1/chat/completions",
            # url="https://api.openai.com/v1/chat/completions",  # swap in to verify against OpenAI directly
            headers={
                "Authorization": f"Bearer {SETTINGS.openrouter_api_key}",
                # "Authorization": f"Bearer {SETTINGS.openai_api_key}",
                "HTTP-Referer": SITE_URL,  # Optional, for including your app on openrouter.ai rankings.
                "X-Title": APP_NAME,  # Optional. Shows in rankings on openrouter.ai.
            },
            json={
                # "models": models,
                # "model": "gpt-4o",
                "model": "openai/gpt-4o-2024-05-13",
                "messages": [message.model_dump() for message in messages],
                "temperature": temperature.value,
                "seed": seed,
                "response_format": (
                    None if response_format is None else {"type": response_format}
                ),
                "logprobs": logprobs,  # was hard-coded True in the original repro
                "top_logprobs": top_logprobs,
            },
        )
        response.raise_for_status()
        # Assumes OpenRouterResponse is a pydantic model (Message uses model_dump above).
        return OpenRouterResponse.model_validate(response.json())
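To drive the repro, a call like this is enough (Message is my own pydantic type; the logprobs access assumes OpenRouterResponse mirrors the chat-completions schema):

import asyncio

async def main() -> None:
    messages = [Message(role="user", content="Say hello.")]
    response = await get_openrouter_response(
        messages,
        models=[],
        logprobs=True,
        top_logprobs=3,
    )
    # Against api.openai.com this prints populated logprobs;
    # against openrouter.ai the field came back missing.
    print(response.choices[0].logprobs)

asyncio.run(main())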

Expected behavior
Return logprobs from supported providers and models.
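For reference, this is roughly the logprobs shape OpenAI returns in each choice of a chat completion (abridged, values illustrative); OpenRouter should pass it through unchanged:

"logprobs": {
  "content": [
    {
      "token": "Hello",
      "logprob": -0.019,
      "top_logprobs": [
        {"token": "Hello", "logprob": -0.019},
        {"token": "Hi", "logprob": -4.2}
      ]
    }
  ]
}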

louisgv commented 3 months ago

@0xTomDaniel thanks for the flag! Just pushed a fix to properly forward logprobs. Fix should be up ~10 mins after this msg
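The gist of a fix like this is just forwarding the optional fields to the upstream request. A rough sketch of the idea (not the actual openrouter-runner change; names here are hypothetical):

def build_upstream_payload(incoming: dict) -> dict:
    """Copy optional sampling params through to the provider verbatim."""
    payload = {
        "model": incoming["model"],
        "messages": incoming["messages"],
    }
    # The failure mode in this issue: optional fields the proxy never
    # copies are silently dropped, so the provider never sees logprobs.
    for key in ("temperature", "seed", "response_format", "logprobs", "top_logprobs"):
        if incoming.get(key) is not None:
            payload[key] = incoming[key]
    return payload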

0xTomDaniel commented 3 months ago

> @0xTomDaniel thanks for the flag! Just pushed a fix to properly forward logprobs. Fix should be up ~10 mins after this msg

Thanks a bunch!