The AzureML Inference Server is a Python package that allows users to easily expose machine learning models as HTTP endpoints. The server is included by default in AzureML's pre-built Docker images for inference.
Failed to Connect - Couldn't connect to Server #83
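For context, azmlinfsrv loads an entry script that defines init() and run(). A minimal sketch of such a script follows; the body is illustrative only, not the reporter's actual score.py:

import json

def init():
    # Runs once per worker at startup; load the model here.
    # When AZUREML_MODEL_DIR is set, it points at the model files.
    pass

def run(raw_data):
    # run() is not decorated, so the server passes the request
    # body in as a JSON string (matching the log line below).
    data = json.loads(raw_data)
    # Illustrative body only: echo the parsed payload back.
    return {"echo": data}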
azmlinfsrv --entry_script score.py
azmlinfsrv - Loaded logging config from /opt/miniconda/envs/amlenv/lib/python3.9/site-packages/azureml_inference_server_http/logging.json
The environment variable 'AZUREML_MODEL_DIR' has not been set.
Use the --model_dir command line argument to set it.
Azure ML Inferencing HTTP server v1.3.2
Server Settings
---------------
Entry Script Name: /app/score.py
Model Directory: None
Config File: None
Worker Count: 1
Worker Timeout (seconds): 300
Server Port: 5001
Health Port: 5001
Application Insights Enabled: false
Application Insights Key: None
Inferencing HTTP server version: azmlinfsrv/1.3.2
CORS for the specified origins: None
Create dedicated endpoint for health: None
Server Routes
---------------
Liveness Probe: GET 127.0.0.1:5001/
Score: POST 127.0.0.1:5001/score
gunicorn.error - Starting gunicorn 22.0.0
gunicorn.error - Listening at: http://0.0.0.0:5001 (16)
gunicorn.error - Using worker: sync
gunicorn.error - Booting worker with pid: 17
/opt/miniconda/envs/amlenv/lib/python3.9/site-packages/pydantic/_internal/_fields.py:160: UserWarning: Field "model_dc_storage_enabled" has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('settings_',)`.
warnings.warn(
azmlinfsrv - Found extra keys in the config file that are not supported by the server.
Extra keys = ['AZUREML_ENTRY_SCRIPT', 'HOSTNAME']
azmlinfsrv - AML_FLASK_ONE_COMPATIBILITY is set. Patched Flask to ensure compatibility with Flask 1.
Initializing logger
azmlinfsrv - Starting up app insights client
azmlinfsrv.user_script - Found user script at /app/score.py
azmlinfsrv.user_script - run() is not decorated. Server will invoke it with the input in JSON string.
azmlinfsrv.user_script - Invoking user's init function
azmlinfsrv.user_script - Users's init has completed successfully
azmlinfsrv.swagger - Swaggers are prepared for the following versions: [2, 3, 3.1].
azmlinfsrv - Scoring timeout is set to 3600000
azmlinfsrv - Worker with pid 17 ready for serving traffic
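As the warning near the top of the log notes, AZUREML_MODEL_DIR was not set, so the server started without a model directory. If one is needed, it can be supplied on the command line as the log itself suggests; a minimal sketch, assuming the model files live in a local ./model directory:

azmlinfsrv --entry_script score.py --model_dir ./model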
Call endpoint - Failed to connect error
curl -p 127.0.0.1:5001/score
curl: (7) Failed to connect to 127.0.0.1 port 5001 after 0 ms: Couldn't connect to server
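Note that gunicorn reports it is listening on 0.0.0.0:5001 inside the environment where the server was started. If the server runs inside a Docker container (see the sections below), a curl from the host can fail like this when the container port is not published; a minimal sketch, assuming a hypothetical image name my-inference-image:

docker run -p 5001:5001 my-inference-image

With the port published, the same curl against 127.0.0.1:5001/score should be able to reach the container.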
I'm getting an error when trying to run the inference server. Here is the configuration:
Docker Image
Build and Run Docker
Pip Freeze output
Logging Configuration
/opt/miniconda/envs/amlenv/lib/python3.9/site-packages/azureml_inference_server_http/logging.json
Environment Variables
Run Server - Throws gunicorn.error
Call endpoint - Failed to connect error