Status: closed (tomaszek0 closed this issue 2 years ago)
[1] You should install "matplotlib" and "treelib" manually in order to import ModelDriftExplainer:
pip3 install matplotlib
pip3 install treelib
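A minimal sketch (not from the thread) for checking that both manually installed dependencies are importable before running the CinnaMon examples:

```python
# Check whether the manual dependencies are importable; find_spec returns
# None for a top-level package that is not installed.
import importlib.util

for pkg in ("matplotlib", "treelib"):
    found = importlib.util.find_spec(pkg) is not None
    status = "installed" if found else "missing, run: pip3 install " + pkg
    print(f"{pkg}: {status}")
```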
I will include them as installation dependencies in the next release so that this problem no longer occurs.
[2] This is a warning message from "xgboost": the default evaluation metric used with objective=binary:logistic changed in XGBoost 1.3.0, so there is no problem here.
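As the XGBoost message itself suggests, setting eval_metric explicitly silences the warning. A minimal sketch (the parameter values below are illustrative, not taken from the thread):

```python
# Since XGBoost 1.3.0 the default eval_metric for binary:logistic is
# 'logloss' instead of 'error'; passing it explicitly avoids the warning.
params = {
    "objective": "binary:logistic",
    "eval_metric": "logloss",  # or "error" to restore the pre-1.3.0 metric
}
# clf = xgboost.XGBClassifier(**params)  # requires xgboost to be installed
print(params["eval_metric"])
```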
[3] CinnaMon currently requires Python >= 3.9. You should upgrade to Python >= 3.9 in order to pip install cinnamon.
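A quick sketch for verifying the interpreter meets the stated requirement before attempting the install (a convenience check, not part of CinnaMon itself):

```python
# Verify the running interpreter satisfies CinnaMon's Python >= 3.9
# requirement before attempting `pip install cinnamon`.
import sys

meets_requirement = sys.version_info >= (3, 9)
print(f"Python {sys.version_info.major}.{sys.version_info.minor} "
      f"meets the >= 3.9 requirement: {meets_requirement}")
```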
Thanks. Now the code works properly. Just one minor additional point: in the example [iris_xgboost_multi_classif.ipynb], the dataset is incorrectly identified as Breast Cancer Data in a section title.
[1] I am getting the following error when trying to execute code from Quickstart or [breast_cancer_xgboost_binary_classif.ipynb] in the section containing "from cinnamon.drift import ModelDriftExplainer":
ModuleNotFoundError Traceback (most recent call last) ~\AppData\Local\Temp/ipykernel_10348/627594479.py in
1 # Initialize ModelDriftExplainer and fit it on train and validation data
----> 2 from cinnamon.drift import ModelDriftExplainer
3
4 # initialize a drift explainer with the built XGBClassifier and fit it on train
5 # and valid data
~\AppData\Roaming\Python\Python39\site-packages\cinnamon\drift\__init__.py in
1 from .adversarial_drift_explainer import AdversarialDriftExplainer
----> 2 from .model_drift_explainer import ModelDriftExplainer
~\AppData\Roaming\Python\Python39\site-packages\cinnamon\drift\model_drift_explainer.py in
7 from ..model_parser.i_model_parser import IModelParser
8 from .adversarial_drift_explainer import AdversarialDriftExplainer
----> 9 from ..model_parser.xgboost_parser import XGBoostParser
10
11 from .drift_utils import compute_drift_num, plot_drift_num
~\AppData\Roaming\Python\Python39\site-packages\cinnamon\model_parser\xgboost_parser.py in
2 import pandas as pd
3 from typing import Tuple
----> 4 from .single_tree import BinaryTree
5 import xgboost
6 from .abstract_tree_ensemble_parser import AbstractTreeEnsembleParser
~\AppData\Roaming\Python\Python39\site-packages\cinnamon\model_parser\single_tree.py in
1 import numpy as np
----> 2 from treelib import Tree
3 from ..common.constants import TreeBasedDriftValueType
4
5 class BinaryTree:
ModuleNotFoundError: No module named 'treelib'
[2] When I execute the code chunk "# fit an XGBClassifier on the training data" from "Quickstart", I get this warning:
[20:53:12] WARNING: C:/Users/Administrator/workspace/xgboost-win64_release_1.5.1/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior.
XGBClassifier(base_score=0.5, booster='gbtree', colsample_bylevel=1, colsample_bynode=1, colsample_bytree=1, enable_categorical=False, gamma=0, gpu_id=-1, importance_type=None, interaction_constraints='', learning_rate=0.300000012, max_delta_step=0, max_depth=6, min_child_weight=1, missing=nan, monotone_constraints='()', n_estimators=100, n_jobs=6, num_parallel_tree=1, predictor='auto', random_state=0, reg_alpha=0, reg_lambda=1, scale_pos_weight=1, subsample=1, tree_method='exact', use_label_encoder=False, validate_parameters=1, verbosity=None)
I use Python 3.8.8 on Windows 10, on an AMD Ryzen with integrated graphics (AMD). Environment: Anaconda.