triton-inference-server / tensorrtllm_backend

The Triton TensorRT-LLM Backend
Apache License 2.0

[BUG] Missing `tokenizer_type` parameter to config.pbtxt #364

Open esnvidia opened 8 months ago

esnvidia commented 8 months ago

https://github.com/triton-inference-server/tensorrtllm_backend/blob/49def341ca37e0db3dc8c80c99da824107a7a938/all_models/inflight_batcher_llm/preprocessing/config.pbtxt#L127

The `tokenizer_type` parameter is missing from config.pbtxt, yet the README describes it as a parameter to use. Please add `tokenizer_type` to the relevant config.pbtxt files by default.
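For illustration, a Triton model parameter of this kind would normally be declared in the preprocessing model's config.pbtxt with a `parameters` block like the one below. This is a hedged sketch of what the report is asking for, not the actual upstream config; the `"llama"` value is a placeholder assumption.

```
# Hypothetical addition to all_models/inflight_batcher_llm/preprocessing/config.pbtxt
# (sketch only; the value "llama" is an example placeholder)
parameters {
  key: "tokenizer_type"
  value: {
    string_value: "llama"
  }
}
```

In Triton's protobuf text format, each `parameters` entry is a key/value pair whose `string_value` the backend's Python model can read at initialization time from its model config.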

byshiue commented 8 months ago

Thank you for the report. `tokenizer_type` has been removed. We will update the documentation in the next update, next week.