When following the instructions for cloning the repo, I get this error:

> batch response: This repository is over its data quota. Account responsible for LFS bandwidth should purchase more data packs to restore access.

As a result, two files, `cue/data/demo/inputs/chr21.small.bam` and `cue/data/models/cue.pt`, are not cloned properly, which causes the unpickling issue in the tutorial.
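A possible workaround sketch (not an official fix from the maintainers): clone the repo while skipping LFS downloads so the rest of the code is usable, then fetch the two affected files separately once LFS bandwidth is available. This relies on git-lfs's standard `GIT_LFS_SKIP_SMUDGE` variable and `git lfs pull --include` option.

```shell
# Skip LFS smudging during clone; LFS-tracked files become small pointer
# files instead of failing the whole clone on the quota error.
GIT_LFS_SKIP_SMUDGE=1 git clone https://github.com/PopicLab/cue.git
cd cue

# Later, attempt to pull only the two files the tutorial needs:
git lfs pull --include="data/demo/inputs/chr21.small.bam,data/models/cue.pt"
```

Note this only defers the bandwidth problem for those two files; if the quota is still exhausted, the `git lfs pull` will fail with the same error.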
Hi @williamphu, thanks for reporting this. To fix it and avoid these Git LFS bandwidth limitations, the Cue models and demo datasets have now been moved to Google Cloud Storage.
Related to this issue: https://github.com/PopicLab/cue/issues/3