GptManager does not support extending the set of input & output parameters, so we cannot add custom TensorRT-LLM inference parameters. Will this be supported in the future?
For example:
Our requirement is to pass new parameters through to the TensorRT-LLM engine. Parameter extension could be supported at the backend level, but it is currently blocked because GptManager is closed source.
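To make the request concrete, extending parameters at the backend level would mean declaring an extra tensor in the model's `config.pbtxt` and having the backend forward it to the engine. A minimal sketch of such a declaration (the parameter name `my_custom_param` is hypothetical, not an existing backend tensor):

```
# Hypothetical extra input tensor for the tensorrt_llm model in config.pbtxt.
# The backend would need to read this tensor and pass it to GptManager,
# which is what the closed source currently prevents.
input [
  {
    name: "my_custom_param"
    data_type: TYPE_FP32
    dims: [ 1 ]
    optional: true
  }
]
```

Declaring the tensor alone is not enough: without access to GptManager's request-handling code, there is no place to map it onto the engine's inference parameters.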
![image](https://github.com/triton-inference-server/tensorrtllm_backend/assets/44488216/24b095a1-80e0-43ed-bc74-908ca3c34256)