mistralai / mistral-inference

Official inference library for Mistral models
https://mistral.ai/
Apache License 2.0

Parameter for returning `logprobs` #108

Closed · StatsGary closed this issue 5 months ago

StatsGary commented 6 months ago

Which parameter do I need to pass to the model so that it returns the generated tokens and their associated logprobs? I am comparing these against OpenAI's models.

Apologies if I have missed something obvious here, but I am using a vLLM deployment of `Mistral7B-V.01` in GCP's Model Garden.

StatsGary commented 5 months ago

I have solved this. See https://github.com/vllm-project/vllm/issues/2649.
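
For reference, a minimal sketch of requesting logprobs from a vLLM deployment through its OpenAI-compatible completions endpoint, which also makes the side-by-side comparison with OpenAI's models straightforward. The `base_url`, `api_key`, and served model name below are placeholders for whatever the actual deployment exposes:

```python
from openai import OpenAI

# Placeholder endpoint and credentials: point these at your vLLM server.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

resp = client.completions.create(
    model="mistralai/Mistral-7B-v0.1",  # model name as served by the deployment
    prompt="The capital of France is",
    max_tokens=16,
    logprobs=5,  # return the top-5 logprobs for each generated token
)

choice = resp.choices[0]
print(choice.text)                     # generated text
print(choice.logprobs.tokens)          # generated tokens
print(choice.logprobs.token_logprobs)  # logprob of each generated token
print(choice.logprobs.top_logprobs)    # top-5 alternatives at each position
```

The response follows the legacy OpenAI completions schema, so the same parsing code can be pointed at OpenAI's `base_url` for the comparison.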