databricks-industry-solutions / fraud-orchestration

Preempt fraud with rule-based patterns and select ML algorithms for reliable fraud detection. Use anomaly detection and fraud prediction to respond to bad actors rapidly.
https://www.databricks.com/solutions/accelerators/fraud-detection

Cannot serve the first and second models on Databricks V2 serving #2

Open AnastasiaProkaieva opened 1 year ago

AnastasiaProkaieva commented 1 year ago

First Part

The first part of the solution accelerator contains dff_model; this notebook presents an XGBWrapper, the best model that was registered. While serving it, I encountered the following error: BAD_REQUEST: Encountered an unexpected error while evaluating the model. Verify that the serialized input Dataframe is compatible with the model for inference. Error ''XGBModel' object has no attribute 'callbacks''

The code used to serve the model was the following:

import os
import requests
import numpy as np
import pandas as pd
import json

# Authenticate with the workspace using the notebook's API token
token = dbutils.notebook.entry_point.getDbutils().notebook().getContext().apiToken().getOrElse(None)
os.environ["DATABRICKS_TOKEN"] = token

def create_tf_serving_json(data):
  # Wrap a dict of arrays (or a single array) in the TF-serving style 'inputs' payload
  return {'inputs': {name: data[name].tolist() for name in data.keys()} if isinstance(data, dict) else data.tolist()}

def score_model(dataset):
  url = 'https://e2-demo-field-eng.cloud.databricks.com/model-endpoint/fraud_xgb_model/137/invocations'
  headers = {'Authorization': f'Bearer {os.environ.get("DATABRICKS_TOKEN")}', 'Content-Type': 'application/json'}
  # A pandas DataFrame is sent in 'dataframe_split' orientation; anything else falls back to the 'inputs' format
  ds_dict = {'dataframe_split': dataset.to_dict(orient='split')} if isinstance(dataset, pd.DataFrame) else create_tf_serving_json(dataset)
  data_json = json.dumps(ds_dict, allow_nan=True)
  response = requests.request(method='POST', headers=headers, url=url, data=data_json)
  if response.status_code != 200:
    raise Exception(f'Request failed with status {response.status_code}, {response.text}')
  return response.json()

The input used for serving was the following:

{"dataframe_split":{"index": [0],
 "columns": ["ACCT_PROD_CD",
  "ACCT_AVL_CASH_BEFORE_AMT",
  "ACCT_AVL_MONEY_BEFORE_AMT",
  "ACCT_CL_AMT",
  "ACCT_CURR_BAL",
  "APPRD_AUTHZN_CNT",
  "APPRD_CASH_AUTHZN_CNT",
  "AUTHZN_AMT",
  "AUTHZN_OUTSTD_AMT",
  "AVG_DLY_AUTHZN_AMT",
  "AUTHZN_OUTSTD_CASH_AMT",
  "CDHLDR_PRES_CD",
  "HOTEL_STAY_CAR_RENTL_DUR",
  "LAST_ADR_CHNG_DUR",
  "HOME_PHN_NUM_CHNG_DUR",
  "PLSTC_ISU_DUR",
  "POS_COND_CD",
  "POS_ENTRY_MTHD_CD",
  "DISTANCE_FROM_HOME",
  "FRD_IND"],
 "data": [[6.815292252529644,
   100.0,
   4.897123473461607,
   10000.0,
   8.474660964404574,
   5.2516766208803585,
   2.131421679416424,
   30.0,
   4.169722613481124,
   25.0,
   40.0,
   0.0,
   1.099779462798646,
   301.0,
   5.632847254480254,
   9.010249880517254,
   9.538468313595006,
   1.4806417695062948,
   1000.0,
   2.404762786010208]]}}
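
For reference, the same request can be produced through the score_model helper above. The following is a minimal sketch that rebuilds that single sample row as a pandas DataFrame (column names and values are copied from the payload; the endpoint URL is the one hard-coded in score_model):

import pandas as pd

columns = ["ACCT_PROD_CD", "ACCT_AVL_CASH_BEFORE_AMT", "ACCT_AVL_MONEY_BEFORE_AMT", "ACCT_CL_AMT",
           "ACCT_CURR_BAL", "APPRD_AUTHZN_CNT", "APPRD_CASH_AUTHZN_CNT", "AUTHZN_AMT",
           "AUTHZN_OUTSTD_AMT", "AVG_DLY_AUTHZN_AMT", "AUTHZN_OUTSTD_CASH_AMT", "CDHLDR_PRES_CD",
           "HOTEL_STAY_CAR_RENTL_DUR", "LAST_ADR_CHNG_DUR", "HOME_PHN_NUM_CHNG_DUR", "PLSTC_ISU_DUR",
           "POS_COND_CD", "POS_ENTRY_MTHD_CD", "DISTANCE_FROM_HOME", "FRD_IND"]
row = [6.815292252529644, 100.0, 4.897123473461607, 10000.0, 8.474660964404574, 5.2516766208803585,
       2.131421679416424, 30.0, 4.169722613481124, 25.0, 40.0, 0.0, 1.099779462798646, 301.0,
       5.632847254480254, 9.010249880517254, 9.538468313595006, 1.4806417695062948, 1000.0,
       2.404762786010208]

# score_model serializes the DataFrame into the 'dataframe_split' payload shown above
sample_df = pd.DataFrame([row], columns=columns)
print(score_model(sample_df))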

I also tried a different input, but then another error was raised (which I believe is due to the bad input this time):

BAD_REQUEST: Encountered an unexpected error while evaluating the model. Verify that the serialized input Dataframe is compatible with the model for inference. Error 'only integers, slices (:), ellipsis (...), numpy.newaxis (None) and integer or boolean arrays are valid indices'

The input was the following:

{"inputs": [[10.0, 11500.0, 20153.38, 23000.0, 2647.55, 2.0, 0.0, 10.33, 209.4, 51.0, 0.0, 0.0, 0.0, 779.0, 0.0, 18.0, 0.0, 90.0, 10.48], [12.0, 2500.0, 2639.33, 5000.0, 2200.72, 3.0, 0.0, 12.44, 172.39, 22.0, 0.0, 0.0, 0.0, 999.0, 577.0, 781.0, 0.0, 90.0, 0.0]]}

Second part

The second part of the solution accelerator, dff_orchestrator, contains model serving, but it seems some libraries cannot be installed with the V2 serving option - it would be great to add a fix for this. The error says:

An error occured while loading the model. Model Registry features are not supported by the store with URI: 'file:///mlruns'. Stores with the following URI schemes are supported: ['databricks', 'http', 'https', 'postgresql', 'mysql', 'sqlite', 'mssql']..
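
For context, this error is typically raised when model code tries to resolve a Model Registry URI (e.g. 'models:/...') against the default local 'file:///mlruns' store. The sketch below only illustrates that pattern and one possible workaround; the registered model name and stage are hypothetical, and this is not the project's code:

import mlflow

# Hypothetical: resolving a 'models:/...' URI needs a Model Registry backend,
# which the local 'file:///mlruns' store does not provide, hence the error above.
# Pointing MLflow at a registry that supports it avoids the failure, e.g.:
mlflow.set_registry_uri("databricks")
model = mlflow.pyfunc.load_model("models:/fraud_orchestrator_model/Production")  # hypothetical name/stage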
dbbnicole commented 1 year ago

Hey @AnastasiaProkaieva - The issue with the first model was that the library versions were not logged in the conda env. conda_env['dependencies'][2]['pip'] += [f'xgboost=={xgboost.__version__}'] fixes the xgboost version mismatch.
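
For illustration, a minimal sketch of how that pin could be applied when logging the model (the artifact path and registered model name below are placeholders, and xgb_wrapper stands in for the XGBWrapper instance built in the notebook):

import mlflow
import xgboost

# Start from MLflow's default pyfunc conda environment; the pip section sits at
# dependencies[2] in current MLflow versions (the index may differ across releases)
conda_env = mlflow.pyfunc.get_default_conda_env()
conda_env['dependencies'][2]['pip'] += [f'xgboost=={xgboost.__version__}']  # pin the training-time xgboost

mlflow.pyfunc.log_model(
    artifact_path="model",                    # placeholder artifact path
    python_model=xgb_wrapper,                 # XGBWrapper instance from the notebook
    conda_env=conda_env,
    registered_model_name="fraud_xgb_model",  # name used by the endpoint above
)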

dbbnicole commented 1 year ago

For the second model, I am not sure why V2 cannot serve it. There aren't enough logs to determine the cause. The error message I am seeing is:

Back-off pulling image "harbor-modelserving-aws-us-west-2.cloud.databricks.com/servingv2-1444828305810485/images@sha256:5f373393d230823ce4732b3d3f7ccac54b63480bc9f78011e3c7a2e51444c7df".

The second model can be served with Serving V1, with a minor update to the input_json format to follow the MLflow 2.x style.
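
For reference, a sketch of what that MLflow 2.x style input_json looks like (the two columns here are just an illustrative subset of the full feature list above):

import json
import pandas as pd

# MLflow 2.x scoring servers expect the DataFrame wrapped under 'dataframe_split'
# (or 'dataframe_records') rather than the bare records payload used in MLflow 1.x
df = pd.DataFrame([[30.0, 25.0]], columns=["AUTHZN_AMT", "AVG_DLY_AUTHZN_AMT"])
input_json = json.dumps({"dataframe_split": df.to_dict(orient="split")})
print(input_json)  # {"dataframe_split": {"index": [0], "columns": [...], "data": [[30.0, 25.0]]}}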

Made a PR for these fixes: https://github.com/databricks-industry-solutions/fraud-orchestration/pull/4