npuichigo / openai_trtllm

OpenAI compatible API for TensorRT LLM triton backend
MIT License

Missing spaces #46

Open · Mary-Sam opened this issue 3 months ago

Mary-Sam commented 3 months ago

I have converted Mixtral to TensorRT and I am trying to use your repository to expose it through the OpenAI-compatible API. I'm using the template history_template_llama3.liquid. When I run your example scripts for interacting with the model (openai_completion.py and openai_completion_stream.py), the generated text comes back with all the spaces stripped:

> prompt = "This is a story of a hero who went"
> result of openai_completion.py:
> againstthetideandfoughtagainsttheevilforces.JustlikeKanedadidinAkira,lifeissuffering,wrongdoingisboilingdeepwithinthesoul,manipulatedbythepowerofunknownforce.Humiliationispartofit,

If I query Triton directly over its HTTP protocol, I receive the following response to the same request:

"text_output":"to the moon and back.\n\nThe story begins with a young boy named Neil Armstrong who loved to explore and learn about the world around him. He was fascinated by the stars and the moon and dreamed of one day going to space"

How do I get the spaces back, as in the direct HTTP response?
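
For reference, the missing spaces are consistent with the detokenizer decoding one token at a time. Below is a minimal sketch (illustrative only, not this project's code) of how that happens with a SentencePiece-style tokenizer such as Mixtral's; the checkpoint name is just an example.

```python
# Illustration only: why per-token decoding drops spaces with
# SentencePiece-style (Llama/Mistral-family) tokenizers.
from transformers import AutoTokenizer

# Example checkpoint; any Llama/Mistral-family tokenizer behaves the same way.
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mixtral-8x7B-Instruct-v0.1")

text = "to the moon and back."
ids = tokenizer.encode(text, add_special_tokens=False)

# Decoding each id in isolation strips the leading-space marker ("▁") of
# every piece, so the fragments concatenate without spaces.
print("".join(tokenizer.decode([i]) for i in ids))  # "tothemoonandback."

# Decoding the whole id list in one call keeps the spaces, which is what
# Triton's own HTTP endpoint effectively returns.
print(tokenizer.decode(ids))                        # "to the moon and back."
```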

dongs0104 commented 2 months ago

This code works for me:

https://github.com/triton-inference-server/tensorrtllm_backend/issues/332#issuecomment-2063243340

https://github.com/npuichigo/openai_trtllm/issues/30#issuecomment-2139994778

Duplicate of #30.
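
The linked comments contain the actual fix; one common way to avoid the problem in general is incremental detokenization, i.e. decoding the accumulated ids and emitting only the new suffix instead of decoding each token on its own. A rough sketch of that idea (names and tokenizer are illustrative, not the code from the links):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("mistralai/Mixtral-8x7B-Instruct-v0.1")

def stream_detokenize(token_id_stream):
    """Yield text deltas that preserve inter-word spaces."""
    ids, emitted = [], ""
    for token_id in token_id_stream:
        ids.append(token_id)
        # Re-decode everything seen so far and emit only the new part.
        full = tokenizer.decode(ids, skip_special_tokens=True)
        delta, emitted = full[len(emitted):], full
        if delta:
            yield delta

ids = tokenizer.encode("to the moon and back.", add_special_tokens=False)
print("".join(stream_detokenize(ids)))  # "to the moon and back."
```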

Mary-Sam commented 2 months ago

@dongs0104 It really works, thank you very much! Do you also happen to know why special tokens show up in the generated text? And why the text never runs to the end of the sentence?

(screenshot attached: photo_2024-06-11 18 10 33)
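
Not this project's documented answer, but two knobs commonly explain these symptoms: special tokens usually appear when the detokenization step does not skip them (e.g. skip_special_tokens is not applied), and output stopping mid-sentence usually means the request hit its token budget. A hypothetical request tweak, with placeholder base_url and model name:

```python
from openai import OpenAI

# base_url, api_key and model name are placeholders for your deployment.
client = OpenAI(base_url="http://localhost:3000/v1", api_key="dummy")

completion = client.completions.create(
    model="tensorrt_llm_bls",
    prompt="This is a story of a hero who went",
    max_tokens=256,      # raise this if the output stops mid-sentence
    stop=["</s>"],       # treat the EOS text as a stop string instead of printing it
)
print(completion.choices[0].text)
```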