James-QiuHaoran / LLM-serving-with-proxy-models
Efficient Interactive LLM Serving with Proxy Model-based Sequence Length Prediction
Apache License 2.0 · 14 stars · 5 forks
Issues
#8 Add LLM inference trace driven simulation (James-QiuHaoran, closed 1 month ago, 1 comment)
#7 Add support for importing new training dataset in addition to LMSYS-Chat-1M (James-QiuHaoran, closed 2 months ago, 1 comment)
#6 Add per-class prediction error analysis (James-QiuHaoran, closed 2 months ago, 1 comment)
#5 Add support for training per-LLM predictor (James-QiuHaoran, closed 2 months ago, 2 comments)
#4 It seems the training is stuck (Aston-zeal, closed 2 months ago, 4 comments)
#3 fixes the task type label and num tokens issue (saeid93, closed 2 months ago, 1 comment)
#2 task-type 0 seems not working? (saeid93, closed 2 months ago, 2 comments)
#1 Evaluation method for the scheduling side - trace driven simulation or real world? (saeid93, closed 2 months ago, 4 comments)