dataprofessor / llama2

This chatbot app is built using the Llama 2 open source LLM from Meta.
https://llama2.streamlit.app

The max value of the LLM temperature parameter should be 1 instead of 5 #9

Closed: jliu015 closed this issue 3 months ago

jliu015 commented 3 months ago

https://ai.stackexchange.com/questions/32477/what-is-the-temperature-in-the-gpt-models#

In sequence-generating models, for a vocabulary of size N (words, parts of words, or any other kind of token), the next token is predicted from a distribution of the form softmax(x_i / T), i = 1, …, N. Here T is the temperature. The output of the softmax is the probability that the next token will be the i-th word in the vocabulary. T can be any positive number, T ∈ (0, ∞). A reasonable choice depends on the particular model, since different models produce logits with different magnitudes.
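
For reference, a minimal sketch of the temperature-scaled softmax described above (written in Python with NumPy; the function name and example logits are illustrative, not taken from the app's source):

```python
import numpy as np

def softmax_with_temperature(logits, temperature=1.0):
    """Temperature-scaled softmax over a vector of logits.

    temperature can be any positive number: values < 1 sharpen the
    distribution, values > 1 flatten it toward uniform.
    """
    scaled = np.asarray(logits, dtype=float) / temperature
    scaled -= scaled.max()          # subtract the max for numerical stability
    exp = np.exp(scaled)
    return exp / exp.sum()

logits = [2.0, 1.0, 0.1]
print(softmax_with_temperature(logits, temperature=0.5))  # sharper distribution
print(softmax_with_temperature(logits, temperature=5.0))  # flatter distribution
```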

dataprofessor commented 3 months ago

Thanks for pointing this out; the value has been corrected.

jliu015 commented 3 months ago

I opened this ticket saying the max value should be 1 instead of 5 because I was misled by some online articles. I then closed it after reading the Stack Exchange answer above: the max value is actually unbounded (∞). Since you can't put ∞ on a slider, a maximum temperature of 5 on the slider is fine. Sorry for the confusion; you didn't have to change anything.
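
For anyone following along, here is a hypothetical sketch of how a temperature slider capped at 5 might look in a Streamlit app (the label, default value, and step below are illustrative and are not taken from this repo's source):

```python
import streamlit as st

# Temperature is unbounded in theory, but a UI slider needs a finite maximum,
# so 5.0 is used here as a practical upper limit rather than a hard model limit.
temperature = st.sidebar.slider(
    "temperature",
    min_value=0.01,
    max_value=5.0,
    value=0.7,
    step=0.01,
)
st.write(f"Selected temperature: {temperature}")
```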