microsoft / azureml-inference-server

The AzureML Inference Server is a Python package that allows users to easily expose machine learning models as HTTP endpoints. The server is included by default in AzureML's pre-built Docker images for inference.
MIT License

Failed to Connect - Couldn't connect to Server #83

Closed briglx closed 1 week ago

briglx commented 1 week ago

Getting an error when trying to run the inference server. Here is the configuration:

Docker Image

FROM mcr.microsoft.com/azureml/minimal-ubuntu22.04-py39-cpu-inference:latest
WORKDIR /app
COPY artifacts/score.py /app
RUN python -m pip install azureml-inference-server-http
EXPOSE 5000
CMD ["bash"]

Build and Run Docker

docker build -t <docker_image>:<tag> .
docker run -it --rm <docker_image_id>
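Note that this `docker run` invocation publishes no ports, so a server listening inside the container is not reachable from the host. A hedged sketch of a run command that publishes the port (assuming the server will be started on its default port 5001 inside the container, as the logs below show):

```shell
# Publish container port 5001 to the host so curl from the host can reach the
# server. <docker_image>:<tag> is the image built above; the container drops
# into bash (per the Dockerfile CMD), from which azmlinfsrv can be started.
docker run -it --rm -p 5001:5001 <docker_image>:<tag>
```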

Pip Freeze output

pip freeze

anaconda-anon-usage @ file:///croot/anaconda-anon-usage_1710965072196/work
archspec @ file:///home/conda/feedstock_root/build_artifacts/archspec_1708969572489/work
boltons @ file:///home/conda/feedstock_root/build_artifacts/boltons_1711936407380/work
Brotli @ file:///home/conda/feedstock_root/build_artifacts/brotli-split_1725267488082/work
certifi @ file:///home/conda/feedstock_root/build_artifacts/certifi_1725278078093/work/certifi
cffi @ file:///home/conda/feedstock_root/build_artifacts/cffi_1725571112467/work
charset-normalizer @ file:///home/conda/feedstock_root/build_artifacts/charset-normalizer_1698833585322/work
colorama @ file:///home/conda/feedstock_root/build_artifacts/colorama_1666700638685/work
conda @ file:///home/conda/feedstock_root/build_artifacts/conda_1727791739307/work
conda-content-trust @ file:///home/conda/feedstock_root/build_artifacts/conda-content-trust_1693490762241/work
conda-libmamba-solver @ file:///home/conda/feedstock_root/build_artifacts/conda-libmamba-solver_1727359833193/work/src
conda-package-handling @ file:///home/conda/feedstock_root/build_artifacts/conda-package-handling_1717678605937/work
conda_package_streaming @ file:///home/conda/feedstock_root/build_artifacts/conda-package-streaming_1717678526951/work
cryptography @ file:///home/conda/feedstock_root/build_artifacts/cryptography-split_1725443044072/work
distro @ file:///home/conda/feedstock_root/build_artifacts/distro_1704321475663/work
frozendict @ file:///home/conda/feedstock_root/build_artifacts/frozendict_1726948674942/work
h2 @ file:///home/conda/feedstock_root/build_artifacts/h2_1634280454336/work
hpack==4.0.0
hyperframe @ file:///home/conda/feedstock_root/build_artifacts/hyperframe_1619110129307/work
idna @ file:///home/conda/feedstock_root/build_artifacts/idna_1726459485162/work
jsonpatch @ file:///home/conda/feedstock_root/build_artifacts/jsonpatch_1695536281965/work
jsonpointer @ file:///home/conda/feedstock_root/build_artifacts/jsonpointer_1725302957584/work
libmambapy @ file:///home/conda/feedstock_root/build_artifacts/mamba-split_1727077953944/work/libmambapy
menuinst @ file:///home/conda/feedstock_root/build_artifacts/menuinst_1725359050442/work
packaging @ file:///home/conda/feedstock_root/build_artifacts/packaging_1718189413536/work
platformdirs @ file:///home/conda/feedstock_root/build_artifacts/platformdirs_1726613481435/work
pluggy @ file:///home/conda/feedstock_root/build_artifacts/pluggy_1713667077545/work
pycosat @ file:///home/conda/feedstock_root/build_artifacts/pycosat_1696355771368/work
pycparser @ file:///home/conda/feedstock_root/build_artifacts/pycparser_1711811537435/work
PySocks @ file:///home/conda/feedstock_root/build_artifacts/pysocks_1661604839144/work
requests @ file:///home/conda/feedstock_root/build_artifacts/requests_1717057054362/work
ruamel.yaml @ file:///home/conda/feedstock_root/build_artifacts/ruamel.yaml_1707298118514/work
ruamel.yaml.clib @ file:///home/conda/feedstock_root/build_artifacts/ruamel.yaml.clib_1707314520663/work
tqdm @ file:///home/conda/feedstock_root/build_artifacts/tqdm_1722737464726/work
urllib3 @ file:///home/conda/feedstock_root/build_artifacts/urllib3_1726496430923/work
zstandard==0.23.0

Logging Configuration /opt/miniconda/envs/amlenv/lib/python3.9/site-packages/azureml_inference_server_http/logging.json

{
    "version": 1,
    "handlers": {
        "azmlinfsrv": {
            "class": "logging.StreamHandler",
            "level": "INFO",
            "stream": "ext://sys.stdout",
            "formatter": "azmlinfsrv"
        },
        "azmlinfsrv_stderr": {
            "class": "logging.StreamHandler",
            "level": "INFO",
            "stream": "ext://sys.stderr",
            "formatter": "azmlinfsrv"
        }
    },
    "formatters": {
        "azmlinfsrv": {
            "class": "azureml_inference_server_http.log_config.AMLLogFormatter",
            "format": "%(asctime)s %(levelname).1s [%(process)d] %(name)s - %(message)s",
            "style": "%"
        }
    },
    "loggers": {
        "azmlinfsrv": {
            "level": "INFO",
            "handlers": [
                "azmlinfsrv"
            ],
            "propagate": false
        },
        "gunicorn.access": {
            "level": "INFO",
            "handlers": [
                "azmlinfsrv"
            ],
            "filters": [
                "RootAccessFilter"
            ]
        },
        "gunicorn.error": {
            "level": "INFO",
            "handlers": [
                "azmlinfsrv_stderr"
            ]
        }
    },
    "filters": {
        "RootAccessFilter": {
            "()": "azureml_inference_server_http.log_config.RootAccessFilter"
        }
    }
}
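For reference, a configuration in this shape is applied with Python's standard `logging.config.dictConfig`. A minimal sketch of that mechanism, with the AzureML-specific formatter and filter classes replaced by stdlib equivalents for illustration:

```python
import logging
import logging.config

# Minimal sketch of applying a dictConfig like the server's logging.json.
# The azureml_inference_server_http formatter/filter classes are swapped for
# stdlib defaults here so the example is self-contained.
config = {
    "version": 1,
    "handlers": {
        "azmlinfsrv": {
            "class": "logging.StreamHandler",
            "level": "INFO",
            "stream": "ext://sys.stdout",
            "formatter": "azmlinfsrv",
        }
    },
    "formatters": {
        "azmlinfsrv": {
            "format": "%(asctime)s %(levelname).1s [%(process)d] %(name)s - %(message)s",
            "style": "%",
        }
    },
    "loggers": {
        "azmlinfsrv": {
            "level": "INFO",
            "handlers": ["azmlinfsrv"],
            "propagate": False,
        }
    },
}

logging.config.dictConfig(config)
logger = logging.getLogger("azmlinfsrv")
logger.info("server starting")  # emitted to stdout in the format above
```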

Environmental Variables

CONDA_EXE=/opt/miniconda/bin/conda
_CE_M=
HOSTNAME=42378d9cf4a7
XML_CATALOG_FILES=file:///opt/miniconda/etc/xml/catalog file:///etc/xml/catalog
WORKER_TIMEOUT=300
PWD=/app
AZUREML_CONDA_ENVIRONMENT_PATH=/opt/miniconda/envs/amlenv
CONDA_PREFIX=/opt/miniconda
AZUREML_INFERENCE_SERVER_HTTP_ENABLED=True
HOME=/home/dockeruser
LANG=C.UTF-8
LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=00:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.zst=01;31:*.tzst=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.wim=01;31:*.swm=01;31:*.dwm=01;31:*.esd=01;31:*.jpg=01;35:*.jpeg=01;35:*.mjpg=01;35:*.mjpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.webp=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.m4a=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.oga=00;36:*.opus=00;36:*.spx=00;36:*.xspf=00;36:
SVDIR=/var/runit
CONDA_PROMPT_MODIFIER=(base) 
TERM=xterm
_CE_CONDA=
CONDA_SHLVL=1
SHLVL=1
CONDA_PYTHON_EXE=/opt/miniconda/bin/python
CONDA_DEFAULT_ENV=base
LC_ALL=C.UTF-8
PATH=/opt/miniconda/bin:/opt/miniconda/condabin:/opt/miniconda/envs/amlenv/bin:/opt/miniconda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
DEBIAN_FRONTEND=noninteractive
_=/usr/bin/printenv

Run Server - Throws gunicorn.error

azmlinfsrv --entry_script score.py

azmlinfsrv - Loaded logging config from /opt/miniconda/envs/amlenv/lib/python3.9/site-packages/azureml_inference_server_http/logging.json
The environment variable 'AZUREML_MODEL_DIR' has not been set.
Use the --model_dir command line argument to set it.
Azure ML Inferencing HTTP server v1.3.2
Server Settings
---------------
Entry Script Name: /app/score.py
Model Directory: None
Config File: None
Worker Count: 1
Worker Timeout (seconds): 300
Server Port: 5001
Health Port: 5001
Application Insights Enabled: false
Application Insights Key: None
Inferencing HTTP server version: azmlinfsrv/1.3.2
CORS for the specified origins: None
Create dedicated endpoint for health: None
Server Routes
---------------
Liveness Probe: GET   127.0.0.1:5001/
Score:          POST  127.0.0.1:5001/score
gunicorn.error - Starting gunicorn 22.0.0
gunicorn.error - Listening at: http://0.0.0.0:5001 (16)
gunicorn.error - Using worker: sync
gunicorn.error - Booting worker with pid: 17
/opt/miniconda/envs/amlenv/lib/python3.9/site-packages/pydantic/_internal/_fields.py:160: UserWarning: Field "model_dc_storage_enabled" has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('settings_',)`.
  warnings.warn(
azmlinfsrv - Found extra keys in the config file that are not supported by the server.
Extra keys = ['AZUREML_ENTRY_SCRIPT', 'HOSTNAME']
azmlinfsrv - AML_FLASK_ONE_COMPATIBILITY is set. Patched Flask to ensure compatibility with Flask 1.
Initializing logger
azmlinfsrv - Starting up app insights client
azmlinfsrv.user_script - Found user script at /app/score.py
azmlinfsrv.user_script - run() is not decorated. Server will invoke it with the input in JSON string.
azmlinfsrv.user_script - Invoking user's init function
azmlinfsrv.user_script - Users's init has completed successfully
azmlinfsrv.swagger - Swaggers are prepared for the following versions: [2, 3, 3.1].
azmlinfsrv - Scoring timeout is set to 3600000
azmlinfsrv - Worker with pid 17 ready for serving traffic

Call endpoint - Failed to connect error

curl -p 127.0.0.1:5001/score
curl: (7) Failed to connect to 127.0.0.1 port 5001 after 0 ms: Couldn't connect to server
briglx commented 1 week ago

Ports were not configured correctly: the Dockerfile exposes port 5000, but the server listens on 5001, and `docker run` was invoked without publishing the port.
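A sketch of one possible fix, based on the logs above (the 5001 default port and the `azmlinfsrv` entrypoint are taken from the server output; adjust if a config file overrides them):

```shell
# In the Dockerfile, expose the port azmlinfsrv actually listens on and start
# the server directly instead of a bash shell:
#   EXPOSE 5001
#   CMD ["azmlinfsrv", "--entry_script", "/app/score.py"]

# Rebuild, then publish the port when running the container:
docker build -t <docker_image>:<tag> .
docker run --rm -p 5001:5001 <docker_image>:<tag>

# From the host, the liveness probe (GET /) should now respond:
curl http://127.0.0.1:5001/
```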