eto-ai / rikai

Parquet-based ML data format optimized for working with unstructured data
https://rikai.readthedocs.io/en/latest/
Apache License 2.0

CI: cache official pretrained Torch models #623

Closed da-tubi closed 2 years ago

da-liii commented 2 years ago

Successfully cached! See https://github.com/eto-ai/rikai/runs/5969892239?check_suite_focus=true

Post job cleanup.
/usr/bin/tar --posix --use-compress-program zstd -T0 -cf cache.tzst -P -C /home/runner/work/rikai/rikai --files-from manifest.txt
Cache Size: ~2179 MB (2284683523 B)
Cache saved successfully
Cache saved with key: Linux-pt-c821a2af42b55ce30171bc656ad5655da069bfaa5be5775c661b1a8b615e8208
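For reference, a minimal sketch of such a cache step using actions/cache, caching the default Torch hub directory (~/.cache/torch). The hashed requirements file and the "pt" key prefix are illustrative assumptions, not necessarily the exact configuration used in this PR:

```yaml
# Sketch of a GitHub Actions step that keeps pretrained Torch models
# between CI runs. Assumes the default TORCH_HOME of ~/.cache/torch;
# the hashFiles() input and the "pt" prefix are placeholders.
- name: Cache official pretrained Torch models
  uses: actions/cache@v3
  with:
    path: ~/.cache/torch
    key: ${{ runner.os }}-pt-${{ hashFiles('python/requirements.txt') }}
    restore-keys: |
      ${{ runner.os }}-pt-
```

On a cache hit, the checkpoints are restored from the Actions cache instead of being downloaded again from the PyTorch servers.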
da-liii commented 2 years ago

Downloading from the GitHub Actions cache is faster than downloading from the official PyTorch servers; it saves more than 30 seconds.

eddyxu commented 2 years ago

Nice!