huggingface / text-generation-inference

Large Language Model Text Generation Inference
http://hf.co/docs/text-generation-inference
Apache License 2.0

Documentation about default values of model parameters #2263

Open mohittalele opened 1 month ago

mohittalele commented 1 month ago

Feature request

In the documentation, there is not enough info about the default values TGI enforces when a client request does not contain parameters like temperature, top_p, presence/frequency penalty, etc. For example, what value does TGI use for temperature if the request does not specify one?

This would help users adjust their client code when migrating to or from other serving frameworks.

As far as I have looked through the codebase, I was unable to find the place where this is done.
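
To make the question concrete, here is a minimal, hypothetical sketch (not TGI's actual code; the struct and field names are illustrative) of how serde-style deserialization turns omitted request fields into `None`, which is exactly the point where the undocumented default kicks in:

```rust
use serde::Deserialize;

// Hypothetical request struct, only to illustrate the question.
#[derive(Deserialize, Debug)]
struct GenerationRequest {
    // With serde derive, a field missing from the JSON body becomes `None`.
    temperature: Option<f32>,
    top_p: Option<f32>,
}

fn main() {
    // The client sets top_p but omits temperature.
    let req: GenerationRequest =
        serde_json::from_str(r#"{ "top_p": 0.9 }"#).unwrap();
    // Prints: GenerationRequest { temperature: None, top_p: Some(0.9) }
    // What the server substitutes for `None` is the part I'd like documented.
    println!("{req:?}");
}
```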

Motivation

Documentation for default model parameters

Your contribution

I can create a PR if someone can point me to the right place in the codebase.

ErikKaum commented 1 month ago

Thank you @mohittalele for pointing this out!

You'll find the markdown of the docs in here: https://github.com/huggingface/text-generation-inference/tree/main/docs/source

Were you thinking of something similar to what's documented for the CLI options, but for the client options? Or were you thinking about more clearly documenting what the server does if certain client values are passed?

These links might be useful:

Would love it if you could take up getting a PR going. Thanks a lot for bringing this up 🙌

mohittalele commented 1 month ago

@ErikKaum I am looking more for docs around what the server does if certain client values are not passed. To that end, I tried to find the actual call where the parameters are set. So, for example, what value does TGI take for temperature if the user did not specify it in the request? I am looking to find this in the TGI codebase and then document it afterwards.

ErikKaum commented 1 month ago

Okay sounds reasonable 👍 I'd say this is a good place to start: https://github.com/huggingface/text-generation-inference/blob/0b95693fb8b9640283a0fcf40ac4dd2ab15187eb/router/src/lib.rs#L733
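
Roughly, the pattern to look for there is something like the following (a simplified, hypothetical sketch rather than the literal TGI source; the field names and fallback values are illustrative): omitted request fields arrive as `None`, and the effective default is whatever the later resolution step substitutes for `None`.

```rust
// Hypothetical sketch of the two places a default can live:
// 1) the deserialized request parameters (omitted fields become `None`),
// 2) the step that resolves `None` into a concrete sampling configuration.
#[derive(Debug, Clone)]
struct GenerateParameters {
    temperature: Option<f32>,
    top_p: Option<f32>,
    max_new_tokens: Option<u32>,
}

#[derive(Debug, Clone)]
struct ResolvedSampling {
    temperature: f32,
    top_p: f32,
    max_new_tokens: u32,
    do_sample: bool,
}

// Hypothetical resolution step: the choices made here (1.0? greedy?
// how many tokens?) are exactly what the docs should spell out.
fn resolve(params: &GenerateParameters) -> ResolvedSampling {
    ResolvedSampling {
        temperature: params.temperature.unwrap_or(1.0),
        top_p: params.top_p.unwrap_or(1.0),
        max_new_tokens: params.max_new_tokens.unwrap_or(20),
        // e.g. sampling might only be enabled when the client set something
        do_sample: params.temperature.is_some() || params.top_p.is_some(),
    }
}

fn main() {
    // A request that left temperature and max_new_tokens unset.
    let from_client = GenerateParameters {
        temperature: None,
        top_p: Some(0.95),
        max_new_tokens: None,
    };
    println!("{:?}", resolve(&from_client));
}
```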