awslabs/multi-model-server
Multi Model Server is a tool for serving neural net models for inference.
Apache License 2.0 · 994 stars · 231 forks
Issues (sorted newest first)
#978 set preload_model default as null for register model request (lxning, closed, 2 years ago, 0 comments)
#977 preload_model set in config.properties does not work for load model request (lxning, opened, 2 years ago, 0 comments)
#976 Custom plug-in (HuryanKliashchouTR, opened, 2 years ago, 0 comments)
#975 Change plugins logic (UsernameJava, opened, 2 years ago, 0 comments)
#974 memory utilization increment after every request, worker died, memory issue (n0thing233, opened, 2 years ago, 1 comment)
#973 Update README.md (vkorf, closed, 3 weeks ago, 0 comments)
#972 Fix channel closures in ModelServerTest (mbercin, closed, 3 years ago, 0 comments)
#971 Is there anyway to yeild from MMS asynchronously? (collinarnett, closed, 3 years ago, 1 comment)
#970 How to achieve autoscaling when running MMS on a fargate? (sunilkumarmohanty, opened, 3 years ago, 1 comment)
#969 issue_961 (lxning, closed, 3 years ago, 0 comments)
#968 Update version.py (maaquib, closed, 3 years ago, 0 comments)
#967 big file request will not release memory (yangjian1218, opened, 3 years ago, 0 comments)
#966 how to build several apps and functions (yangjian1218, opened, 3 years ago, 0 comments)
#965 MMS - Support for custom error codes in custom handlers (dhanainme, closed, 3 years ago, 0 comments)
#964 Correct pip install command to the correct package (nikhil-sk, closed, 3 years ago, 0 comments)
#963 SAGEMAKER_MULTI_MODE in Ping should be SAGEMAKER_MULTI_MODEL (fm1ch4, opened, 3 years ago, 0 comments)
#962 Wrong version of ONNX as requirement of model-archiver[onnx] (tbagrel1, opened, 3 years ago, 0 comments)
#961 Allow custom HTTP status in mms.service.Service (jcsaaddupuy, closed, 2 years ago, 3 comments)
#960 Process run on a single CPU core (ngoanpv, opened, 3 years ago, 2 comments)
#959 For multithreaded inferencing on GPU machine, with preload_model=True and default_workers_per_model=2 getting the following error (msameedkhan, opened, 3 years ago, 1 comment)
#958 Dependencies not installed in docker. (bahar3474, opened, 3 years ago, 0 comments)
#957 support protobuf (lxning, opened, 3 years ago, 0 comments)
#956 s3 path in tutorial is not available (HahTK, opened, 3 years ago, 0 comments)
#955 bad links in the Model Zoo page blocking our pipelines (junpuf, closed, 3 years ago, 0 comments)
#954 Slack link not working. (azmaktr, opened, 3 years ago, 0 comments)
#953 tensor_gpu-inl.h:35: Check failed: e == cudaSuccess: CUDA: initialization error (wdh234, opened, 3 years ago, 0 comments)
#952 add inference proto (lxning, closed, 3 years ago, 1 comment)
#951 define inference proto (lxning, opened, 3 years ago, 1 comment)
#950 Bump junit from 4.12 to 4.13.1 in /serving-sdk (dependabot[bot], closed, 3 years ago, 0 comments)
#949 Permission denied when loading model (akulk314, opened, 3 years ago, 1 comment)
#948 remove lock (lxning, opened, 3 years ago, 0 comments)
#947 Detect video (wdh234, opened, 3 years ago, 0 comments)
#946 Codebuils setup buildspec changes. (shivamshriwas, opened, 4 years ago, 0 comments)
#945 PRT Enhancements. (shivamshriwas, opened, 4 years ago, 0 comments)
#944 removed circleci (shivamshriwas, opened, 4 years ago, 0 comments)
#943 Add a TensorFlow Saved Model serving example (siddharthgee, opened, 4 years ago, 1 comment)
#942 Ensure workers get killed on unregister call (maheshambule, opened, 4 years ago, 3 comments)
#941 Unregister worker ensure kill of main and other worker processes (maheshambule, closed, 4 years ago, 0 comments)
#940 Tensorflow C handler Example (maheshambule, opened, 4 years ago, 0 comments)
#939 Is it possible to implement multi-stage inferences? (bedilbek, closed, 4 years ago, 1 comment)
#938 [Q] GPU support (oonisim, opened, 4 years ago, 3 comments)
#937 Support for Nightly, Smoke and PRT using AWS Codebuild (shivamshriwas, opened, 4 years ago, 2 comments)
#936 ONNX to .MAR converter test case fails. (quantum-fusion, opened, 4 years ago, 6 comments)
#935 Slack invite link is broken (tekumara, opened, 4 years ago, 0 comments)
#934 Codebuild - buildspec yaml for Nightly and Smoke test (shivamshriwas, closed, 4 years ago, 0 comments)
#933 Netty does not allow \r \n in the reasonPhrase and MMS fails to handle responses which have \r and \n in the reasonPhrase (ashishgupta023, opened, 4 years ago, 0 comments)
#932 Deploying as a Sagemaker Model (shawnhan108, opened, 4 years ago, 0 comments)
#931 Regression Performance Cut 2 (maheshambule, opened, 4 years ago, 0 comments)
#930 Regression suite cut2 (maheshambule, closed, 4 years ago, 0 comments)
#929 Issues with deploying inference model with sklearn Sagemaker container (sertaco, opened, 4 years ago, 0 comments)