vllm-project / vllm

A high-throughput and memory-efficient inference and serving engine for LLMs
https://docs.vllm.ai
Apache License 2.0

[Feature]: support logging input and output #5859

Open NiuBlibing opened 5 months ago

NiuBlibing commented 5 months ago

🚀 The feature, motivation and pitch

Support logging the model input and output to a file for vllm.entrypoints.openai.api_server, which would be useful for collecting a corpus.
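Until this is supported natively, one client-side workaround is to wrap whatever call you make to the OpenAI-compatible server and append each prompt/output pair to a JSONL file. This is a minimal sketch, not part of vLLM: the `log_corpus` helper, the `corpus.jsonl` path, and the `client` callable are all hypothetical names chosen for illustration.

```python
import json
import time
from pathlib import Path

CORPUS_PATH = Path("corpus.jsonl")  # hypothetical log location


def log_corpus(prompt: str, output: str, path: Path = CORPUS_PATH) -> None:
    """Append one prompt/output pair as a JSON line (a sketch, not a vLLM API)."""
    record = {"ts": time.time(), "input": prompt, "output": output}
    with path.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record, ensure_ascii=False) + "\n")


def generate_and_log(client, prompt: str) -> str:
    # `client` is assumed to be any callable that sends the prompt to the
    # OpenAI-compatible endpoint and returns the completion text; adapt to
    # whichever SDK or HTTP client you actually use.
    output = client(prompt)
    log_corpus(prompt, output)
    return output
```

Because the records are one JSON object per line, the resulting file can be streamed back later for corpus processing without loading it all at once.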

Alternatives

No response

Additional context

No response

xianjhuang commented 3 months ago

Very useful feature. I need it, please help.

chadqiu commented 1 month ago

> very useful feature, i need it, please help

+1

chintanshrinath commented 1 month ago

Is there any update on this?