-
http://www.markusweimer.com/files/pub/2018/2018-IEEEDataEngineering-MLNET.pdf
-
There are now several inference servers, such as TensorRT Inference Server, GraphPipe, and TensorFlow Serving. Different users may want to use different servers, so I think we should support different …
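A minimal sketch (all class names are hypothetical, not an existing API) of a backend-agnostic prediction interface that different servers could plug into:

import abc
import numpy as np

class PredictionClient(abc.ABC):
    """Backend-agnostic prediction interface; each concrete backend wraps one server."""

    @abc.abstractmethod
    def predict(self, inputs: np.ndarray) -> np.ndarray:
        ...

class GraphPipeClient(PredictionClient):
    """Example backend that forwards a tensor to a GraphPipe server."""

    def __init__(self, url: str):
        self.url = url

    def predict(self, inputs: np.ndarray) -> np.ndarray:
        from graphpipe import remote  # pip install graphpipe
        return remote.execute(self.url, inputs)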
-
GraphPipe allows easy serving and exporting of models (see https://github.com/oracle/graphpipe-tf-py/blob/master/examples/RemoteModelWithGraphPipe.ipynb). However, it does not address the preprocessing…
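As a hedged sketch of the gap: GraphPipe only moves tensors, so any preprocessing has to happen on the client before the remote call. The input size and dtype below are illustrative assumptions, not what a particular model requires.

import numpy as np
from PIL import Image
from graphpipe import remote  # pip install graphpipe

img = Image.open("cat.jpg").resize((227, 227))        # client-side resize (assumed input size)
data = np.asarray(img, dtype=np.float32)[np.newaxis]  # shape (1, H, W, C)
pred = remote.execute("http://127.0.0.1:9000", data)  # the server only runs the forward pass
print(pred.argmax())                                  # predicted class index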
-
Following the sklearn tutorial (https://github.com/oracle/graphpipe-py/blob/master/examples/sklearn_example/server.py), it returns the response, but I need to return the model response as well as the probability f…
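One possible approach (a sketch, not the graphpipe-py API itself) is to pack the predicted label and the class probabilities into a single array before returning it, since the response carries tensors rather than arbitrary JSON:

import numpy as np
from sklearn.linear_model import LogisticRegression

def predict_with_proba(clf: LogisticRegression, x: np.ndarray) -> np.ndarray:
    labels = clf.predict(x).reshape(-1, 1).astype(np.float64)  # (n, 1) predicted classes
    probas = clf.predict_proba(x)                              # (n, n_classes) probabilities
    # Column 0 is the predicted label; the remaining columns are the probabilities.
    return np.hstack([labels, probas])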
-
I ran this command: docker run -it --rm -e https_proxy=${https_proxy} -p 9000:9000 sleepsonthefloor/graphpipe-tf:cpu --model=https://oracle.github.io/graphpipe/models/squeezenet.pb --listen=0.0.0.0:900…
-
We should come up with a serving solution for PyTorch models.
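A minimal sketch of one possible path (the model and file names are placeholders): export the PyTorch model to ONNX, which servers such as TensorRT Inference Server can already load.

import torch
import torchvision

model = torchvision.models.resnet18(pretrained=True).eval()
dummy_input = torch.randn(1, 3, 224, 224)  # example input with the expected shape
torch.onnx.export(model, dummy_input, "resnet18.onnx",
                  input_names=["input"], output_names=["output"])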
-
I tried this:

from keras.layers import Input, Add, Dense
from keras.models import Model

def get_model():
    # Two scalar inputs are summed and passed through a single Dense unit.
    x1 = Input((1,), name='x1')
    x2 = Input((1,), name='x2')
    x = Add()([x1, x2])
    y = Dense(1)(x)
    model = Model([x1, x2], y)
    model.compile('adam', 'mse')
    return model
…
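A hedged usage sketch with toy data: each named Input is fed separately, e.g. as a dict keyed by the input names.

import numpy as np

model = get_model()
a = np.array([[1.0], [2.0]])
b = np.array([[3.0], [4.0]])
model.fit({'x1': a, 'x2': b}, np.array([[4.0], [6.0]]), epochs=1, verbose=0)
print(model.predict({'x1': a, 'x2': b}))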
-
Hi,
I see examples that use SqueezeNet, but I do not see URLs for other models. I want to use COCO or YOLO; is that possible?
Thanks in advance.