Open lukethomrichardson opened 3 months ago
Hi, at the moment this is only possible in the 'pre-fitted' mode, e.g.:
```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier
from venn_abers import VennAbersCalibrator

X, y = make_classification(n_samples=1000, n_classes=3, n_informative=10, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
X_train, X_cal, y_train, y_cal = train_test_split(X_train, y_train, test_size=0.2)

clf = XGBClassifier(random_state=1)
clf.fit(X_train, y_train, eval_set=[(X_train, y_train), (X_cal, y_cal)], verbose=False)
p_cal = clf.predict_proba(X_cal)
p_test = clf.predict_proba(X_test)

va = VennAbersCalibrator()
p_prime = va.predict_proba(p_cal=p_cal, y_cal=y_cal, p_test=p_test)
```
I will aim to add this for the non-pre-fitted underlying classifier option too in the next release. I hope this helps.
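In the meantime, one possible interim workaround is to wrap the base estimator in a thin class whose `fit` injects preset keyword arguments before delegating. This is only a sketch under assumptions: the `FitKwargsWrapper` class is hypothetical and not part of `venn_abers`, and the demonstration uses a stand-in estimator so it runs without xgboost installed.

```python
# Hypothetical wrapper (not part of venn_abers) that injects preset
# fit kwargs into every fit() call on the wrapped estimator.
class FitKwargsWrapper:
    def __init__(self, estimator, **fit_kwargs):
        self.estimator = estimator
        self.fit_kwargs = fit_kwargs

    def fit(self, X, y):
        # Delegate to the wrapped estimator, merging in the preset kwargs.
        self.estimator.fit(X, y, **self.fit_kwargs)
        return self

    def predict_proba(self, X):
        return self.estimator.predict_proba(X)


# Stand-in estimator that records the kwargs it receives, so the sketch
# is self-contained and runnable without xgboost.
class RecordingEstimator:
    def fit(self, X, y, **kwargs):
        self.received = kwargs
        return self

    def predict_proba(self, X):
        return [[0.5, 0.5] for _ in X]


base = RecordingEstimator()
wrapped = FitKwargsWrapper(base, eval_set=[([0], [0])], verbose=False)
wrapped.fit([[0.0], [1.0]], [0, 1])
print(base.received)  # {'eval_set': [([0], [0])], 'verbose': False}
```

With a real `XGBClassifier` in place of the stand-in, the same wrapper would forward `eval_set` on each internal `fit` call; whether `VennAbersCalibrator` accepts such a duck-typed estimator in non-pre-fitted mode is an assumption to verify against the library.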
That would be much appreciated! And thanks for the usage example.
I would like to pass fit kwargs to the base estimator provided to `VennAbersCalibrator`. Is this possible in the current framework? For example, I would like to pass an `eval_set` to an XGBoost classifier's `fit` method.