togethercomputer / together-python

The Official Python Client for Together's API
https://pypi.org/project/together/
Apache License 2.0

Access to the information given by the `/models/info?=` endpoint #136

Closed yquemener closed 2 months ago

yquemener commented 2 months ago

When given an empty list as the `stop` argument, Llama3 generates an answer of the maximum length without stopping (interestingly enough, the other models don't do this).
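Until that's addressed, one caller-side workaround is to never let an empty stop list reach the API. This is entirely my own sketch: `DEFAULT_STOPS` is a hypothetical hand-maintained fallback table, not part of the library, and the Llama 3 entry uses its chat template's end-of-turn token.

```python
from typing import List, Optional

# Hypothetical per-model fallback table, maintained by hand;
# not part of the together library.
DEFAULT_STOPS = {
    "meta-llama/Llama-3-8b-chat-hf": ["<|eot_id|>"],
}

def normalize_stop(model: str, stop: Optional[List[str]]) -> Optional[List[str]]:
    """Return a usable stop list: fall back to a known default when the
    caller supplies an empty (or missing) list, which would otherwise let
    Llama3 generate up to the maximum length."""
    if stop:  # non-empty list supplied by the caller: use it as-is
        return stop
    return DEFAULT_STOPS.get(model)  # None when no default is known
```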

When we want to use custom stop tokens on an arbitrary model, we first need to know its default stop tokens, but there does not seem to be a way to retrieve them with this library.

Thanks to a now-obsolete piece of documentation (https://docs.together.ai/docs/examples#prompt-formatting) I was able to track down the `/models/info?=` endpoint, which still returns that information, but it would be great to have it available in the library.

I would expect this information to be present in `client.models.list()`, but it is not: the `config` field is missing. Maybe this is a bug rather than a feature request?

For now I'll access the REST API directly as a workaround.
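For anyone else in the same spot, here is roughly what that workaround looks like. The base URL and the `name`/`config`/`stop` nesting of the response are assumptions based on the old docs, and the endpoint is EOL, so this can break at any time:

```python
import json
import urllib.request

def fetch_models_info(api_key: str, base_url: str = "https://api.together.xyz"):
    """Fetch the raw model list from the legacy /models/info endpoint.
    The base URL is an assumption; the endpoint is EOL and may disappear."""
    req = urllib.request.Request(
        f"{base_url}/models/info",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)

def default_stops(models, model_name):
    """Pull the default stop tokens for one model out of the payload.
    The 'name' / 'config' / 'stop' nesting is assumed from the old docs."""
    for entry in models:
        if entry.get("name") == model_name:
            return entry.get("config", {}).get("stop", [])
    return []
```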

orangetin commented 2 months ago

Hi @yquemener, I would recommend moving away from `/models/info?=`. That API is EOL and is planned to be deprecated in the next month or so.

`/v1/models` is the recommended API and will continue to be supported. The stop argument fix will go out with the next release (possibly this week)!
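For reference, picking one entry out of a `/v1/models`-style response could be sketched like this. The `{"data": [{"id": ...}, ...]}` shape is my assumption (an OpenAI-style list); with the official client the listing itself would come from `client.models.list()`:

```python
# Sketch: select a single model entry from a /v1/models-style payload.
# The payload shape here is assumed, not confirmed by this thread.
def find_model(payload: dict, model_id: str):
    """Return the entry whose id matches model_id, or None if absent."""
    for entry in payload.get("data", []):
        if entry.get("id") == model_id:
            return entry
    return None
```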