James-QiuHaoran / LLM-serving-with-proxy-models

Efficient Interactive LLM Serving with Proxy Model-based Sequence Length Prediction
Apache License 2.0

fixes the task type label and num tokens issue #3

Closed: saeid93 closed this 3 months ago

saeid93 commented 3 months ago

Possibly the fix for #2?

James-QiuHaoran commented 3 months ago

Thanks for the issue! It should have been fixed in the recent commit: https://github.com/James-QiuHaoran/LLM-serving-with-proxy-models/commit/5692cbff352a7bd580f92576cc9cac7c9d95bd20

The fix is applied at the data-generation stage so that the single-round and multi-round dataset columns stay consistent.
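
For illustration only, here is a minimal sketch of what keeping the two splits' columns consistent at data-generation time could look like; the column names ("prompt", "task_type", "num_new_tokens") are assumptions, not taken from the linked commit:

```python
# Sketch (not the repo's actual code): write the single-round and multi-round
# datasets with one shared schema so downstream code can read either split.
import pandas as pd

# Assumed column set; the real generator may use different names.
SHARED_COLUMNS = ["prompt", "task_type", "num_new_tokens"]

def normalize(records: list[dict]) -> pd.DataFrame:
    """Project per-conversation records onto the shared column set."""
    df = pd.DataFrame(records)
    # Add any missing columns so both splits expose an identical schema.
    for col in SHARED_COLUMNS:
        if col not in df.columns:
            df[col] = None
    return df[SHARED_COLUMNS]

single_round = normalize(
    [{"prompt": "Hi", "task_type": "chat", "num_new_tokens": 12}]
)
multi_round = normalize(
    [{"prompt": "Hi\nHello!\nHow can I help?", "task_type": "chat", "num_new_tokens": 34}]
)
assert list(single_round.columns) == list(multi_round.columns)
```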