modestyachts / imagenet-testbed

ImageNet Testbed, associated with the paper "Measuring Robustness to Natural Distribution Shifts in Image Classification."
https://modestyachts.github.io/imagenet-testbed/
MIT License

Database down #9

Open nng555 opened 1 year ago

nng555 commented 1 year ago

Hello,

It seems the database endpoint for datasets and models is currently down. (e.g. https://vasa.millennium.berkeley.edu:9000/robustness-eval/datasets/DvAR9QjMvf_data.bytes)

ludwigschmidt commented 1 year ago

Sorry, we had to migrate to a new server! This should work again in a day or two, we're completing the migration right now.

nng555 commented 1 year ago

Great, please let me know when the migration is complete!

rtaori commented 1 year ago

Btw, the temporary solution in https://github.com/modestyachts/imagenet-testbed/issues/8 might work for you for now in case you only need the eval data (and not dataset/model bytes)

nng555 commented 1 year ago

Unfortunately I'm looking to do some analysis on the models and datasets themselves, so I need access to both the dataset and model bytes.

nng555 commented 1 year ago

Is it possible to get any kind of ETA on when the server migration might be finished? I'm trying to run these models for some additional rebuttal results and want to assuage my reviewers when asking for more time 😬

rtaori commented 1 year ago

Ah, really sorry about this, I'm not sure we can get a reliable ETA. Unfortunately it sounds like it might take some time :/ Most of the model checkpoint and dataset bytes are actually stored in Google Cloud, so with the local DB patch referenced above you should be able to download that data fine. But some objects are stored on the Berkeley server and won't be available until after the migration.

nng555 commented 1 year ago

I tried that patch and was still unable to get anything downloaded, since it's hitting the S3 cache and not Google Drive (is there any way to extract the cloud links directly?). I'm trying to use the following datasets:

nng555 commented 1 year ago

Ah, figured it out: I just had to change the client in s3_utils.py to the Google client on line 81. Not sure why it wasn't falling back to the Google client by default when the S3 client failed...

rtaori commented 1 year ago

Ah, I see! Thanks for discovering this bug. It sounds like it's in the key_exists function? I guess what we should do here is first try get_s3_client_vasa, and if that fails, fall back to get_s3_client_google. Do you mind submitting a PR for the fix?