microsoft / azureml-inference-server

The AzureML Inference Server is a Python package that allows users to easily expose machine learning models as HTTP endpoints. The server is included by default in AzureML's pre-built Docker images for inference.

werkzeug >=3.0 is required, but raises error in patch_flask #90

Open eschibli opened 3 weeks ago

eschibli commented 3 weeks ago

werkzeug 3.0 removed the `__version__` attribute, which is accessed in the `patch_flask` function that runs on startup.

The version check on line 33 of `create_app.py` should no longer be necessary, since werkzeug>=3.0 is now required.
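For reference, here is a minimal sketch of one possible workaround, assuming the check is kept for older installs rather than removed outright. It reads the installed version through `importlib.metadata` (with the third-party `packaging` library for comparison) instead of the removed `werkzeug.__version__` attribute. The helper name is illustrative, not the package's actual code:

```python
# Sketch only: a version check that tolerates werkzeug >= 3.0,
# which no longer exposes werkzeug.__version__.
from importlib.metadata import version, PackageNotFoundError

from packaging.version import Version  # assumes `packaging` is available


def _werkzeug_version() -> Version:
    """Return the installed werkzeug version without touching __version__."""
    try:
        return Version(version("werkzeug"))
    except PackageNotFoundError:
        # If werkzeug is missing, treat it as old so the patch is skipped.
        return Version("0")


# Equivalent of the failing line from the traceback:
# patch_werkzeug = LooseVersion(werkzeug.__version__) >= LooseVersion("2.1")
patch_werkzeug = _werkzeug_version() >= Version("2.1")
```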

jayakrishnaMan commented 2 weeks ago

I am also facing the same issue and it's failing all my deployments. Is there any fix available?


```
2024-11-10 14:54:18,146 W [71] azmlinfsrv - AML_FLASK_ONE_COMPATIBILITY is set. However, compatibility patch for Flask 1 has failed. This is only a problem if you use @rawhttp and relies on deprecated methods such as has_key().
Traceback (most recent call last):
  File "/opt/miniconda/envs/azureml_env2e888c9a6d4b4128a74cf7c24c396821/lib/python3.10/site-packages/azureml_inference_server_http/server/create_app.py", line 58, in <module>
    patch_flask()
  File "/opt/miniconda/envs/azureml_env2e888c9a6d4b4128a74cf7c24c396821/lib/python3.10/site-packages/azureml_inference_server_http/server/create_app.py", line 33, in patch_flask
    patch_werkzeug = LooseVersion(werkzeug.__version__) >= LooseVersion("2.1")
AttributeError: module 'werkzeug' has no attribute '__version__'
```

riccardotrevisan commented 2 weeks ago

I am not sure the deployments fail because of that error. Please paste the full log, and also check issue #3415 in azureml-examples.

jayakrishnaMan commented 2 weeks ago

Sure, I will check it out. I am also seeing the error "Failed to test model: Network Error" once the deployment finishes.