vllm-project / vllm

A high-throughput and memory-efficient inference and serving engine for LLMs
https://docs.vllm.ai
Apache License 2.0

[Usage]: Can I get the loss of model directly? #9750

Open Ther-LF opened 3 weeks ago

Ther-LF commented 3 weeks ago

Hi, great work! I am currently optimizing an LLM based on vLLM and need to test whether my optimizations affect the model's perplexity, so I want to obtain the model's cross-entropy loss. I have reviewed the issue "Can I directly obtain the logits here?" and understand that one way to get log probabilities is by setting the `logprobs` parameter in `SamplingParams`.

However, this method is not very convenient. We can only obtain the top-n most likely log probabilities for each token, and the log probability of the correct token might not be among them. Choosing n and searching for the correct token's probability is cumbersome, and the cross-entropy still has to be calculated manually.

Therefore, I want to know whether vLLM has a way to obtain the cross-entropy directly, similar to transformers. Thank you sincerely for your help. :-)

Skytliang commented 2 weeks ago

+1, same need here