Hi, great work!
I am currently optimizing LLM inference based on vLLM and need to test whether my optimizations affect the model's perplexity, so I want to obtain the model's cross-entropy loss. I have reviewed the issue "Can I directly obtain the logits here?" and understand that one way to get log probabilities is to set the `logprobs` parameter in `SamplingParams`.
However, this method is not very convenient. It only returns the top-n most likely log probabilities for each position, and the correct token may not be among them. Choosing n and then searching for the correct token's probability is quite cumbersome, and the cross-entropy still has to be computed manually, as in the sketch below.
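For context, this is roughly the kind of manual workaround I mean. It is only a sketch: it assumes that `prompt_logprobs` in `SamplingParams` also returns the log probability of each actual prompt token and that `RequestOutput.prompt_logprobs` is aligned with `prompt_token_ids`, which I have not verified across vLLM versions; the model name is just an example.

```python
import math
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")  # example model, not the one I am optimizing

# Assumption: prompt_logprobs=0 asks vLLM to return the log probability of
# each actual prompt token (with no extra top-k alternatives).
params = SamplingParams(max_tokens=1, prompt_logprobs=0)

text = "The quick brown fox jumps over the lazy dog."
out = llm.generate([text], params)[0]

# Assumption: out.prompt_logprobs is aligned with out.prompt_token_ids and
# its first entry is None (the first token has no preceding context).
nlls = []
for token_id, lp_dict in zip(out.prompt_token_ids, out.prompt_logprobs):
    if lp_dict is None:
        continue
    nlls.append(-lp_dict[token_id].logprob)

cross_entropy = sum(nlls) / len(nlls)  # mean negative log-likelihood (nats)
perplexity = math.exp(cross_entropy)
print(cross_entropy, perplexity)
```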
Therefore, I want to know whether vLLM has a way to obtain the cross-entropy directly, similar to what `transformers` provides.
Thank you sincerely for your help. :-)