vllm-project / vllm

A high-throughput and memory-efficient inference and serving engine for LLMs
https://docs.vllm.ai
Apache License 2.0
31.2k stars · 4.74k forks

Using the vLLM engine for inference, why is the first generated character always a space? #3683

Open cy565025164 opened 8 months ago

cy565025164 commented 8 months ago

Your current environment

vLLM v0.3.3

🐛 Describe the bug

The first character of the generated text is always a space.
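A common explanation (not confirmed in this thread) is that SentencePiece-style tokenizers, as used by Llama- and Mistral-family models, mark word boundaries with a leading "▁" that decodes to a space, so the first decoded chunk starts with " ". A minimal, hypothetical post-processing workaround is to strip a single leading space from the first chunk of generated text; the function name below is an illustration, not part of the vLLM API:

```python
def strip_leading_space(first_chunk: str) -> str:
    """Remove at most one leading space from the first decoded chunk.

    Assumes the space is a detokenization artifact (SentencePiece '▁'
    word-boundary marker) rather than part of the model's actual output.
    Only the first chunk of a stream should be passed through this.
    """
    return first_chunk[1:] if first_chunk.startswith(" ") else first_chunk


# Example: a typical first chunk with the spurious leading space removed.
print(strip_leading_space(" Hello, world!"))  # -> Hello, world!
# Chunks without a leading space pass through unchanged.
print(strip_leading_space("Hello"))           # -> Hello
```

Note this deliberately strips at most one space, so intentional indentation (e.g. in code generation) loses only the single artifact character.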

Legion2 commented 8 months ago

We are seeing the same issue and already tried different sampling parameters.

ercanucan commented 7 months ago

We observe essentially the same issue which we had also reported via https://github.com/vllm-project/vllm/issues/3935

@cy565025164 were you able to identify the cause here?

cc @bufferoverflow

stephanecollot commented 4 months ago

Hello, I'm facing the same issue on vLLM version 0.5.3.post1 with mistralai/Mixtral-8X22B-Instruct-v0.1.

github-actions[bot] commented 1 month ago

This issue has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this issue should remain open. Thank you!

sinamoeini commented 1 week ago

I am having the same issue. Is there a resolution to this?