Open NonaryR opened 5 years ago
I can pass the model and value-inputs into Docker by mounting a volume:

```shell
docker run -it --rm -v $PWD:/root \
    -e https_proxy=${https_proxy} \
    -p 9000:9000 \
    sleepsonthefloor/graphpipe-onnx:cpu \
    --model=/root/pipeline.onnx \
    --value-inputs=/root/model.json \
    --listen=0.0.0.0:9000
```
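For context, the `--value-inputs` file is a JSON map from input name to `[row_count, shape]`, as in graphpipe's published example value-inputs files. A minimal sketch for a model with a single float input (the input name `float_input` here is a placeholder; use the actual input name from your ONNX graph):

```json
{
  "float_input": [1, [1, 4]]
}
```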
But when I run it, a `ValidationError` occurs:

```
Unrecognized type value case (value_info name: probabilities): 0
```
Logs:

```
INFO[0000] Setting MKL_NUM_THREADS=4. You can override this variable in your environment.
INFO[0000] Starting graphpipe-caffe2 version 1.0.0.4.0a1675f.dev (built from sha 0a1675f)
WARNING: Logging before InitGoogleLogging() is written to STDERR
E1213 12:32:57.650635 1 c2_api.cc:309] Binary compiled without cuda support. Using cpu backend.
INFO[0000] Loading file %!(EXTRA string=/root/model.json)
INFO[0000] Loading file %!(EXTRA string=/root/pipeline.onnx)
terminate called after throwing an instance of 'onnx_c2::checker::ValidationError'
  what(): Unrecognized type value case (value_info name: probabilities): 0
*** Aborted at 1544704377 (unix time) try "date -d @1544704377" if you are using GNU date ***
PC: @ 0x7f79c1b27428 gsignal
*** SIGABRT (@0x1) received by PID 1 (TID 0x7f79c3b04b40) from PID 1; stack trace: ***
```
In my ONNX model I have two outputs -- `label` and `probabilities`:

```
output {
  name: "label"
  type {
    tensor_type {
      elem_type: INT64
      shape {
        dim {
          dim_value: 1
        }
      }
    }
  }
}
output {
  name: "probabilities"
  type {
    sequence_type {
      elem_type {
        map_type {
          key_type: INT64
          value_type {
            tensor_type {
              elem_type: FLOAT
            }
          }
        }
      }
    }
  }
}
```
How can I resolve this?
Hello! I want to use graphpipe with a sklearn model, converted with this. I can save `model.onnx`, but what can I use for `model-inputs`? I'm trying to save my model as `pipeline.json` with this function and running the Docker container: But I get these error logs, and this message confuses me: This is not correct, since `pipeline.onnx` is in the same directory as `pipeline.json`. I also tried this command: Maybe I should provide a volume for Docker? But then why would the model be available while `model-inputs` is not? And what should I provide in `model-inputs`? Something like this in the caffe example?