UOB-AI / UOB-AI.github.io

A repository to host our documentation website.
https://UOB-AI.github.io

Failed to stage the template: No space left on device #14

[Closed] AlmahmoodAbdulla closed this issue 1 year ago

AlmahmoodAbdulla commented 1 year ago

Good afternoon,

I'm trying to open Jupyter but keep getting the error below:

[Screenshot: Jupyter error dialog reading "Failed to stage the template: No space left on device"]

Thank you in advance for your support

asubah commented 1 year ago

This means that your storage quota is full.

To check your quota, you can run:

df -kh ~

To check directory and file sizes:

du -hd1 ~
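To spot the biggest directories at a glance, you can sort that output by size (a small variant, assuming GNU sort, which supports human-readable sorting with -h):

du -hd1 ~ | sort -rh | head    # largest first-level directories first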

If you used pip to install packages recently, you can free some space by deleting the pip cache:

conda activate 
pip cache purge
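If you want to see how much space the cache occupies before purging it, pip can report this (assuming a reasonably recent pip, 20.1 or later, which added the pip cache subcommands):

conda activate
pip cache info    # prints the cache location and its current size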

If you installed big packages using pip, please remove them and we will create or modify a Conda environment for you.

asubah commented 1 year ago

If pip cache purge didn't work, try specifying the cache directory explicitly:

conda activate
pip cache purge --cache-dir=~/.cache/pip
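If the pip cache subcommands still fail, removing the cache directory by hand frees the same space; pip simply recreates it on the next install (this assumes the default cache location under ~/.cache/pip):

rm -rf ~/.cache/pip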

If the storage problem is still not fixed, check whether you have large Python packages installed in your user space. You can spot this in the output of du -hd1 ~: your .local directory will be unusually big (e.g. > 4 GB). In that case, to list the packages installed in your user space, run:

conda activate
pip list --user
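To confirm that user-site packages are what is consuming the space, you can measure them directly (a sketch; user installs normally land under ~/.local/lib, and the Python version in the path may differ on your system):

du -sh ~/.local/lib/python*/site-packages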

Please remove any large packages, such as torch, cuda, etc. There is no reason for you to install them, as we provide them for you through Conda environments. If you need a special version of these packages that is not available as an environment, please contact us to create a Conda environment for you.

To uninstall a package, run:

conda activate
pip uninstall package_name
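For example, to remove a user-installed copy of torch (a hypothetical package name; substitute whatever pip list --user actually reported):

conda activate
pip uninstall torch    # repeat for each large user-installed package
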
asubah commented 1 year ago

If you still have a large .cache directory even after clearing the pip cache, check what is taking up the space by running:

du -hd1 ~/.cache

If Hugging Face is causing the cache growth, see the Hugging Face Datasets documentation for how to clear the datasets cache and how to change the cache directory.

I recommend using our datasets directory to store the cache for now, by setting the environment variable HF_DATASETS_CACHE to something like:

export HF_DATASETS_CACHE="/data/datasets/mydataset/huggingfacecache"
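To make this setting persist across sessions, you could append it to your shell startup file (a sketch, assuming bash; the path is the same example path as above):

echo 'export HF_DATASETS_CACHE="/data/datasets/mydataset/huggingfacecache"' >> ~/.bashrc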

Or from inside a Jupyter Notebook:

from datasets import load_dataset

# cache_dir redirects the Hugging Face datasets cache away from your home quota
dataset = load_dataset('LOADING_SCRIPT', cache_dir="/data/datasets/mydataset/huggingfacecache")