bracket-ai / elevaite-inference-rayservice
0 stars, 0 forks

Issues
#15 Docstrings (korinayo, opened 5 days ago, 0 comments)
#14 llama vision (korinayo, closed 2 weeks ago, 0 comments)
#13 decorate @serve.deployment with autoscaling options (dienhartd, opened 2 weeks ago, 0 comments)
#12 Don't bump version (boyleconnor, closed 1 month ago, 0 comments)
#11 Release and auto-bump version (boyleconnor, closed 1 month ago, 0 comments)
#10 Tokenizer (korinayo, closed 1 month ago, 0 comments)
#9 cpu inference, pass dtype to model creation, clear cache (korinayo, closed 1 month ago, 4 comments)
#8 Add docstrings for inference (boyleconnor, opened 2 months ago, 2 comments)
#7 Accommodate vanilla ray (boyleconnor, closed 2 months ago, 0 comments)
#6 Allow multiple images in MiniCPM-V (boyleconnor, closed 2 months ago, 0 comments)
#5 Fix deployment determination for MiniCPM-V 2.6 (boyleconnor, closed 2 months ago, 0 comments)
#4 Allow minicpm params (boyleconnor, closed 2 months ago, 1 comment)
#3 Allow arbitrary call-time inputs in MiniCPM-V (boyleconnor, closed 2 months ago, 0 comments)
#2 Enable inference MiniCPM-V models (boyleconnor, closed 2 months ago, 0 comments)
#1 Torch Dtype must be specified when loading pipeline into (GPU) memory (boyleconnor, closed 2 months ago, 2 comments)