eddiebergman closed this issue 11 months ago
I am a bit hesitant to bump the openml python version, especially since the ARFF files possibly have different types than the parquet files. I think I would prefer a solution where the code works with either, and not bump the requirements.
E.g., through something like:
```python
import openml

try:
    # Use this name if the installed openml version provides it ...
    set_openml_cache = openml.config.set_cache_directory
except AttributeError:
    # ... otherwise fall back to the renamed function.
    set_openml_cache = openml.config.set_root_cache_directory
...
set_openml_cache(...)
```
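The same shim can also be written with `getattr`, which avoids the try/except and makes the preference order explicit. A minimal sketch, using stand-in modules instead of `openml` itself (the two function names come from the discussion above; everything else here is illustrative):

```python
import types


def resolve_cache_setter(config):
    """Return whichever cache-directory setter the config module exposes.

    Tries set_cache_directory first and falls back to
    set_root_cache_directory, mirroring the try/except above, so the
    same calling code works across openml versions.
    """
    for name in ("set_cache_directory", "set_root_cache_directory"):
        setter = getattr(config, name, None)
        if setter is not None:
            return setter
    raise AttributeError("no cache-directory setter found on config module")


# Hypothetical stand-ins for openml.config before and after the rename:
old_config = types.SimpleNamespace(set_cache_directory=lambda p: f"old:{p}")
new_config = types.SimpleNamespace(set_root_cache_directory=lambda p: f"new:{p}")

print(resolve_cache_setter(old_config)("/tmp/cache"))  # old:/tmp/cache
print(resolve_cache_setter(new_config)("/tmp/cache"))  # new:/tmp/cache
```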
That would allow individuals to bump their openml-python versions and have it work out of the box, but they would need to be explicit about this. We can pin your issue with an explanation about this.
To be clear, this would be temporary. We should have a discussion sometime about the right balance between bumping dependency versions and keeping the benchmark stable. Maybe the right call is to not promise any stability (people should use releases for that), but that breaks our policy so far, so I don't want to deviate "on a whim".
Thanks @eddiebergman , this worked to fix the problem on my end! I understand Pieter's reasoning for waiting on the merge though. That makes sense.
Closing this because I added the fix in #579 in a way that stays compatible with openml==0.13.1. The reasoning for not bumping openml is given above.
Please see #573
Not sure where else this might need to be updated or how to test it properly; this was just a local fix that got things working.