**yanisvdc** opened 2 months ago
Interestingly, if I remove `discrete_treatment = True` and use a regressor for `model_y` (even though y is binary in my case), the code runs; but I am not sure the result is valid. It could be, since `model_y` would still estimate the probability P(y=1), which is the intended behavior when `discrete_treatment = True` with a classifier as `model_y`. Please let me know whether the result would be valid in that case, or whether you see how to modify the code so it works with `discrete_treatment = True` and a classifier as `model_y`.
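(For intuition on that point: under squared loss, a regressor fit on a 0/1 outcome estimates E[y | X] = P(y = 1 | X), so its fitted values target the same quantity a classifier's `predict_proba` would. A minimal NumPy sketch, independent of econml:)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.integers(0, 2, n)          # binary covariate
p = np.where(x == 1, 0.8, 0.3)     # true P(y=1 | x)
y = rng.binomial(1, p)             # binary outcome

# Least-squares regression of y on an intercept and x:
# the fitted values estimate E[y | x] = P(y=1 | x).
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
pred0 = beta[0]            # fitted value for x == 0, close to 0.3
pred1 = beta[0] + beta[1]  # fitted value for x == 1, close to 0.8
```

So the regressor recovers the conditional probabilities, up to sampling noise; the caveat is that nothing constrains its predictions to [0, 1] out of sample.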
Did you mean remove `discrete_outcome=True`? DRTester is only designed for discrete treatments, so you should certainly not change the `discrete_treatment` argument. It does look like a bug that you can't use `discrete_outcome=True`, but I think switching to a regressor should be fine in most cases.
On an unrelated note, I would not use the `DML` class directly. If you want a non-parametric final model, you should either use `NonParamDML` if you want an arbitrary final model of your choosing (but this only supports a single treatment and outcome), or use `CausalForestDML` if you have an arbitrary number of treatments and outcomes and want confidence intervals (but the final model is limited to being a CausalForest).
Yes, I meant removing `discrete_outcome = True`. Do you get the same bug when you try to run the code? Do we agree that it should work, and that something unexpected is happening inside the library that is not under my control, or should I dig further? Thanks!
Yes, this is a bug on our end, but using a continuous outcome (even if it's really discrete) should be fine as a workaround.
Hi, I wanted to flag that I am also running into an issue when `discrete_outcome = True`. My use case is tuning a CausalForestDML object. I can open a separate issue if appropriate, but it seems like the commonality is that `discrete_outcome` may not have been persisted everywhere it ought to have been. Thank you for maintaining a useful package!
Example below:
```python
import pandas as pd
import numpy as np
from econml.dml import CausalForestDML
from xgboost import XGBClassifier, XGBRegressor

# Number of samples
n = 10000

# Treat half of the samples
treatment = np.repeat([0, 1], n // 2)

# Create a covariate that defines a heterogeneous treatment effect
covariate = np.resize([0, 1], n)

# Define outcome based on treatment and covariate
# TE is 1 when covariate==1, 0 otherwise
outcome = ((treatment == 1) & (covariate == 1)).astype(int)

# Store in data frame
df = pd.DataFrame({'treatment': treatment,
                   'covariate': covariate,
                   'outcome': outcome})

# Instantiate
cf_classifier = CausalForestDML(model_y=XGBClassifier(),
                                model_t=XGBClassifier(),
                                discrete_outcome=True,
                                discrete_treatment=True)

# Executes as expected
cf_classifier\
    .fit(Y=df['outcome'],
         T=df['treatment'],
         X=df[['covariate']])

# Raises an AttributeError
cf_classifier\
    .tune(Y=df['outcome'],
          T=df['treatment'],
          X=df[['covariate']])\
    .fit(Y=df['outcome'],
         T=df['treatment'],
         X=df[['covariate']])
```
This returns `AttributeError: Cannot use a classifier as a first stage model when the target is continuous!`, but the target is a binary integer.
```python
cf_regressor = CausalForestDML(model_y=XGBRegressor(),
                               model_t=XGBClassifier(),
                               discrete_outcome=False,
                               discrete_treatment=True)

cf_regressor\
    .tune(Y=df['outcome'],
          T=df['treatment'],
          X=df[['covariate']])\
    .fit(Y=df['outcome'],
         T=df['treatment'],
         X=df[['covariate']])
```
Similar to the suggestion above, using a regressor for `model_y` and passing `discrete_outcome = False` during CausalForestDML instantiation allows tuning to succeed. As a temporary solution, using a tree-based model for the regressor should help keep predictions in the [0, 1] interval, since trees do not extrapolate beyond the observed outcome values.
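Another stopgap, if you would rather not rely on the choice of regressor, is a thin wrapper that clips an arbitrary regressor's predictions to [0, 1]. A sketch (the `ClippedRegressor` name and wrapper are illustrative, not part of econml):

```python
import numpy as np

class ClippedRegressor:
    """Wrap any sklearn-style regressor and clip its predictions to [lo, hi].

    Useful when a regressor stands in for a probability model on a
    binary outcome. (Illustrative wrapper, not part of econml.)
    """
    def __init__(self, base, lo=0.0, hi=1.0):
        self.base = base
        self.lo, self.hi = lo, hi

    def fit(self, X, y, **kwargs):
        self.base.fit(X, y, **kwargs)
        return self

    def predict(self, X):
        return np.clip(self.base.predict(X), self.lo, self.hi)

    # get_params/set_params so the wrapper can be cloned like an estimator
    def get_params(self, deep=True):
        return {"base": self.base, "lo": self.lo, "hi": self.hi}

    def set_params(self, **params):
        for k, v in params.items():
            setattr(self, k, v)
        return self
```

Usage would then look like `model_y=ClippedRegressor(XGBRegressor())` with `discrete_outcome=False`; I have not verified this against econml's internal cloning logic, so treat it as a starting point.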
Hi, here is the code to reproduce the error:
Error message:
It does work when the outcome y is continuous.