microsoft / azureml-inference-server

The AzureML Inference Server is a Python package that allows users to easily expose machine learning models as HTTP endpoints. The server is included by default in AzureML's pre-built Docker images for inference.
MIT License

Logging traces with "azureml-inference-server-http >= 0.8.0" and "import mlflow" #50

Open mihayy opened 10 months ago

mihayy commented 10 months ago

Hi,

I am seeing strange behavior: logging traces to App Insights stops working when using azureml-inference-server-http 0.8.0 (or higher) and importing mlflow in score.py. The issue does not occur with azureml-inference-server-http 0.7.7, where everything works fine.

Environment specs

Use case

I am using the following notebook: https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/deployment/enable-app-insights-in-production-service/enable-app-insights-in-production-service.ipynb

Steps to reproduce the issue:

  1. Logging traces works fine as long as I never import mlflow in score.py.
  2. Logging traces stops working immediately after I import mlflow.

For example, in the following score.py, "model initialized" is logged to traces, but "second print", which runs right after the mlflow import, never makes it to the App Insights traces.

%%writefile score.py
import os
import pickle
import json
import numpy
import joblib
from sklearn.linear_model import Ridge
import time

def init():
    global model
    # This print shows up in App Insights traces as expected.
    print("model initialized " + time.strftime("%H:%M:%S"))
    import mlflow
    # This print is never logged to App Insights traces.
    print("second print")
    model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')
    model = joblib.load(model_path)

def run(raw_data):
...

I am not sure what happens at runtime that stops traces from being logged. Any help is much appreciated.
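A diagnostic sketch that may help narrow this down (an assumption on my part, not a confirmed root cause): if the mlflow import reconfigures or detaches the root logger's handlers that the server relies on to forward output to App Insights, snapshotting the root logger before the import and restoring it afterwards would both confirm the theory and work around it. The helper names below are hypothetical.

```python
import logging

def snapshot_root_logging():
    """Record the root logger's handlers and level before a suspect import."""
    root = logging.getLogger()
    return {"handlers": list(root.handlers), "level": root.level}

def restore_root_logging(snapshot):
    """Reattach any handlers that a third-party import removed,
    and restore the original level."""
    root = logging.getLogger()
    for handler in snapshot["handlers"]:
        if handler not in root.handlers:
            root.addHandler(handler)
    root.setLevel(snapshot["level"])
```

In init() this would look like: `snap = snapshot_root_logging()`, then `import mlflow`, then `restore_root_logging(snap)` before the second print. Comparing `snap["handlers"]` with `logging.getLogger().handlers` right after the import would also show whether mlflow changed anything at all.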