lmg-anon / mikupad

LLM frontend in a single HTML file
https://lmg-anon.github.io/mikupad/mikupad.html
Creative Commons Zero v1.0 Universal

Together AI integration seems to be broken #55

Closed: tjohnman closed this issue 2 months ago

tjohnman commented 2 months ago

When I tried to use the Together AI text completion endpoint, it complained about logprobs being greater than 1 and wouldn't let me generate tokens unless it was set to 1.

According to the API docs, logprobs is indeed the "Number of top-k logprobs to return". But no matter how I passed the parameter, the API complained whenever it was larger than 1, despite the docs saying its maximum value is a full int32. I don't know if there's a way around this, but I couldn't find one.

The error looks like this:

{
    "message": "5 is greater than the maximum of 1 - 'logprobs'",
    "type": "invalid_request_error",
    "param": "logprobs",
    "code": null
}
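
Clamping the value to 1 before the request goes out does let generation work. This is roughly what I mean; the endpoint URL, model name, and payload fields below are my assumptions for illustration, not mikupad's actual request code:

// Rough sketch of the workaround: clamp logprobs to 1 before sending the
// request to Together AI, since the API rejects anything above 1.
// Endpoint URL, model name, and payload fields are assumptions.
async function requestCompletion(prompt, apiKey, logprobs) {
  const body = {
    model: "example/model-name", // placeholder, any Together AI model id
    prompt,
    max_tokens: 64,
    logprobs: Math.min(logprobs, 1), // Together AI caps this at 1
  };

  const response = await fetch("https://api.together.xyz/v1/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${apiKey}`,
    },
    body: JSON.stringify(body),
  });

  if (!response.ok) {
    throw new Error(`Together AI request failed: ${response.status}`);
  }
  return response.json();
}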

On top of that, models don't show up in the model dropdown because the list fetched from the API is not parsed correctly. The current code expects the list to be wrapped inside a data property, but the API returns the list directly at the top level of the response.

I opened pull request #54 to fix this.
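
Roughly, the parsing just needs to accept both shapes, something like the sketch below; the URL and the id field are assumptions on my part, and the actual change is in the PR:

// Sketch only: handle both a bare array (what Together AI returns) and an
// OpenAI-style { data: [...] } wrapper. URL and id field are assumptions.
async function fetchModelList(apiKey) {
  const response = await fetch("https://api.together.xyz/v1/models", {
    headers: { "Authorization": `Bearer ${apiKey}` },
  });
  const json = await response.json();

  // Unwrap { data: [...] } if present, otherwise use the response as-is.
  const models = Array.isArray(json) ? json : (json.data ?? []);
  return models.map((model) => model.id);
}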