jolibrain / deepdetect

Deep Learning API and Server in C++14, with support for PyTorch, TensorRT, Dlib, NCNN, Tensorflow, XGBoost and TSNE
https://www.deepdetect.com/

"Could not resolve host" on text model prediction when http/https url in request data #598

Closed. alx closed this issue 5 years ago.

alx commented 5 years ago

Configuration

Your question / the problem you're facing:

When posting a new predict request on a text model (sent_en, for example), the server returns an InternalError if the data contains a URL.

Error message (if any) / steps to reproduce the problem:

Create service:

curl -X PUT http://localhost:8080/services/sent_en -d '{
 "description": "English sentiment",
 "model": {
  "repository": "/opt/models/sent_en",
  "create_repository": true,
  "init":"https://deepdetect.com/models/init/desktop/text/sent_en_vdcnn.tar.gz"
 },
 "mllib": "caffe",
 "type": "supervised",
 "parameters": {
  "input": {
   "connector": "txt"
  }
 }
}'
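
The service creation can be verified before predicting (a minimal sanity check, assuming the server listens on localhost:8080 as above; GET /services/<name> returns the service description):

curl -X GET http://localhost:8080/services/sent_en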

Predict with data containing a URL:

curl -X POST 'http://localhost:8080/predict' -d '{
  "service": "sent_en",
  "parameters": {
    "input": {},
    "output": {
      "confidence_threshold": 0.3,
      "bbox": true
    },
    "mllib": {
      "gpu": true
    }
  },
  "data": [
    "good stuff! https://deepdetect.com"
  ]
}'

Returns:

{"status":{"code":500,"msg":"InternalError","dd_code":1007,"dd_msg":"Could not resolve host: good stuff! https"}}
[2019-07-01 12:31:27.257] [api] [info] 172.17.0.1 "POST /predict" sent_en 200 179
[2019-07-01 12:31:41.407] [sent_en] [error] other error: Could not resolve host: good stuff! https
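
From the error message, the txt input connector appears to treat the whole data string as a remote resource to fetch as soon as it contains http/https. As a hypothetical client-side workaround (a sketch only, not a fix in DeepDetect itself; the sed pattern and the TEXT variable are illustrative assumptions), URLs can be stripped from the text before posting, so the connector receives plain text:

# hypothetical workaround: strip URLs client-side before sending
TEXT=$(echo "good stuff! https://deepdetect.com" | sed -E 's#https?://[^ ]*##g')
curl -X POST 'http://localhost:8080/predict' -d '{
  "service": "sent_en",
  "parameters": {
    "input": {},
    "output": { "confidence_threshold": 0.3 }
  },
  "data": ["'"$TEXT"'"]
}'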
beniz commented 5 years ago

This should have been fixed some time ago.