Closed: IlyaMescheryakov1402 closed this issue 9 months ago
Hi @IlyaMescheryakov1402,
You have to provide the input/output layers when registering the model, otherwise the serving instance will not be able to properly encode the request. These cannot be taken from the pbtxt file, since we fail to parse it in too many cases. We have now pushed a fix for this issue: you will get an error if the input/output layers are not provided in the CLI when registering the model.
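For reference, a minimal registration sketch along these lines (the endpoint name, model id, layer names, and shapes below are placeholders based on the flag set shown in the clearml-serving README; adjust them to your model):

```shell
# Hypothetical example: registering a Triton model while explicitly
# declaring its input/output layers on the CLI (all values are placeholders).
clearml-serving --id <service_id> model add \
    --engine triton \
    --endpoint "my_model" \
    --model-id <model_id> \
    --input-name "dense_input" \
    --input-size 1 784 \
    --input-type float32 \
    --output-name "activation_2" \
    --output-size -1 10 \
    --output-type float32
```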
Hi @jkhenning,
Thanks for the answer!
So, I need to convert the input section from `config.pbtxt`, like this:

```
input: [
  {
    name: "length"
    data_type: TYPE_INT64
    reshape: {
      shape: []
    }
    dims: [
      1
    ]
  },
  {
    name: "audio_signal"
    data_type: TYPE_FP16
    dims: [
      80,
      -1
    ]
  }
]
```
How do I convert and set the `reshape` part (and the other parts besides name, shape, and type) while providing the input layers?
P.S. `--aux-config input.reshape...` doesn't work for me.

I solved this using `--aux-config input.0.reshape`
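For anyone hitting the same problem: since `input` is a repeated field in the Triton config, the aux-config key apparently needs a 0-based list index. A hedged sketch of what this could look like (the endpoint, layer flags, and the exact value encoding for nested fields are assumptions and may differ by version):

```shell
# Hypothetical sketch: passing the reshape block of the first input
# layer via --aux-config using a 0-based list index (input.0.*).
clearml-serving --id <service_id> model add \
    --engine triton \
    --endpoint "conformer_joint" \
    --input-name "length" \
    --input-size 1 \
    --input-type int64 \
    --aux-config 'input.0.reshape.shape=[]'
```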
@joachimhgg Thank you! It works!
I have created an endpoint like this:

`config.pbtxt` file:

`preprocess_joint.py` file:

The Triton container and the inference container show no errors, and I can find this Triton model with the right `config.pbtxt` in the folder `/models/conformer_joint`. But when I try to make a request to the model like this, I am getting an error:

Model endpoint in the serving task:
The error occurs in the `process` function of `TritonPreprocessRequest` (https://github.com/allegroai/clearml-serving/blob/main/clearml_serving/serving/preprocess_service.py#L358C9-L358C81) because the function uses endpoint parameters like `input_name`, `input_type`, and `input_size`. When we create an endpoint as above, these parameters are placed in the `auxiliary_cfg` attribute. Is there any chance to fix that error and create an endpoint like the above?
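To illustrate the failure mode, here is a simplified, hypothetical reconstruction (not the actual clearml-serving code): the preprocess step builds the Triton request from dedicated per-layer endpoint attributes, so layers described only in `auxiliary_cfg` are invisible to it.

```python
# Simplified, hypothetical model of the failure: request encoding reads
# layer metadata from dedicated endpoint attributes, not from
# auxiliary_cfg, so endpoints whose layers live only in aux-config fail.
from dataclasses import dataclass, field

@dataclass
class Endpoint:
    input_name: list = field(default_factory=list)
    input_type: list = field(default_factory=list)
    input_size: list = field(default_factory=list)
    auxiliary_cfg: dict = field(default_factory=dict)

def build_triton_request(endpoint: Endpoint) -> list:
    # Mirrors the check that trips: per-layer metadata must be present
    # on the endpoint itself for the request tensors to be encoded.
    if not (endpoint.input_name and endpoint.input_type and endpoint.input_size):
        raise ValueError("missing input_name/input_type/input_size on endpoint")
    return list(zip(endpoint.input_name, endpoint.input_type, endpoint.input_size))

# An endpoint whose layers live only in auxiliary_cfg triggers the error,
# while one registered with explicit layer flags does not:
aux_only = Endpoint(auxiliary_cfg={"input": [{"name": "length"}]})
explicit = Endpoint(input_name=["length"], input_type=["int64"], input_size=[[1]])
```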