matigekunstintelligentie opened 4 months ago
It can be fixed by adding:

```python
from metrics.evaluation import simplicity, complexity
```

at line 23, and

```python
results['model_size'] = complexity(results['symbolic_model'], feature_names)
```

at line 223 in experiments/evaluate_model.py.
And, at line 68 in experiment/metrics/evaluation.py (assumes `import sympy as sp` and `get_symbolic_model` are already in scope in that module):

```python
def complexity(pred_model, feature_names):
    # Convert the predicted model into a sympy expression,
    # then count every node in its expression tree.
    local_dict = {f: sp.Symbol(f) for f in feature_names}
    sp_model = get_symbolic_model(pred_model, local_dict)
    num_components = 0
    for _ in sp.preorder_traversal(sp_model):
        num_components += 1
    return num_components
```
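The node count itself is easy to sanity-check with sympy alone; this sketch builds an illustrative expression directly instead of going through `get_symbolic_model`:

```python
import sympy as sp

# Count nodes in the expression tree of x**2 + sin(y).
# preorder_traversal yields the root plus every subexpression:
# Add, Pow, x, 2, sin, y -> 6 nodes.
x, y = sp.symbols("x y")
expr = x**2 + sp.sin(y)

num_components = sum(1 for _ in sp.preorder_traversal(expr))
print(num_components)  # 6
```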
What version are you running?
@lacava I believe I pulled the latest version. blackbox_results.ipynb hasn't changed, and I can't find 'model_size' anywhere when I search for it here on GitHub.
When I run the following command from the 'experiment' folder:

```shell
python evaluate_model.py ../../pmlb/datasets/503_wind/503_wind.tsv.gz -ml myalg -results_path ../results_blackbox/503_wind/ -seed 16850 -target_noise 0.0 -feature_noise 0.0
```

I expect a .json with a field called 'model_size', but instead I get a 'simplicity' field. After collating these results, they are incompatible with the blackbox_results.ipynb script because all values in the model_size column are NaN.
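A minimal sketch of the collation symptom, assuming the JSONs are loaded into a pandas DataFrame and the notebook selects a `model_size` column (the row here is a stand-in, not real output):

```python
import pandas as pd

# Stand-in for one collated results JSON: the run wrote 'simplicity',
# but the notebook expects 'model_size'.
rows = [{"dataset": "503_wind", "simplicity": -4.0}]
df = pd.DataFrame(rows)

# Selecting the expected columns fills the missing 'model_size' with NaN,
# so the entire column comes out NaN, as observed.
df = df.reindex(columns=["dataset", "model_size"])
print(df["model_size"].isna().all())  # True
```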