togethercomputer / RedPajama-Data

The RedPajama-Data repository contains code for preparing large datasets for training large language models.
Apache License 2.0

`No module named 'datasets'` in `data_prep/book/` #21

Closed: danielpclark closed this issue 1 year ago

danielpclark commented 1 year ago

I'm going through and gathering the data from each of the data_prep folders, and aside from some inconsistency in where the data folder lives in each README, this is the only error I've come across.

cd data_prep
mkdir -p data/book
python3 book/download.py
Traceback (most recent call last):
  File "/mnt/AI/RedPajama-Data/data_prep/book/download.py", line 1, in <module>
    from datasets import load_dataset
ModuleNotFoundError: No module named 'datasets'
mauriceweber commented 1 year ago

Thanks for bringing this to our attention! The error you are observing is due to the datasets library not being installed. We will add this to the dependencies. In the meantime, you can fix the error by running

pip install datasets

in your Python environment.
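
For a quick sanity check that the install fixed it before re-running download.py, the failing import can be exercised directly; a minimal snippet:

# Verify the Hugging Face `datasets` package is now importable.
import datasets
print("datasets version:", datasets.__version__)

# The exact import that failed in download.py:
from datasets import load_dataset
print("load_dataset is available:", callable(load_dataset))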

danielpclark commented 1 year ago

Thanks! I must have mistyped it.

There have been a couple of times where I have had to add system dependencies and copy libs from the system Python to my pyenv Python to get things working in various project folders. For this one I had to do the following:

On Ubuntu:

sudo apt install lzma liblzma-dev libbz2-dev

cp /usr/lib/python3.10/lib-dynload/_bz2.cpython-310-x86_64-linux-gnu.so ~/.pyenv/versions/3.10.6/lib/python3.10/lib-dynload/
cp /usr/lib/python3.10/lib-dynload/_lzma.cpython-310-x86_64-linux-gnu.so ~/.pyenv/versions/3.10.6/lib/python3.10/lib-dynload/

I believe the bz2 was also needed in another dataset folder.
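
A quick way to confirm the copied extension modules actually load under the pyenv interpreter (rebuilding the pyenv Python after installing the dev headers should also pick them up natively):

# Run with the pyenv python3 to confirm the copied stdlib extension modules work.
import bz2
import lzma

print("bz2 round-trip:", bz2.decompress(bz2.compress(b"hello")) == b"hello")
print("lzma round-trip:", lzma.decompress(lzma.compress(b"hello")) == b"hello")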

For stack exchange I needed:

sudo apt install p7zip-full

For C4 dataset I needed:

sudo apt install git-lfs

These were not documented in the READMEs.

danielpclark commented 1 year ago

Just discovered that the data_prep/book/ download uses $HOME/.cache/huggingface/ for downloads. My system doesn't have enough free space on the home partition for that kind of work, so I've symlinked that directory to an external drive.
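
If symlinking ever becomes awkward, the Hugging Face cache location can also be redirected without touching $HOME. A minimal sketch, assuming the download goes through load_dataset; the paths are placeholders and the dataset name is only illustrative, not the one download.py actually loads:

# Point the Hugging Face cache at an external drive instead of symlinking.
# Option 1: set the cache location *before* importing `datasets`.
import os
os.environ["HF_HOME"] = "/mnt/external/huggingface"  # placeholder path

# Option 2: pass cache_dir explicitly when loading.
from datasets import load_dataset
ds = load_dataset(
    "ag_news",                                         # illustrative dataset only
    split="train",
    cache_dir="/mnt/external/huggingface/datasets",    # placeholder path
)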

My main AI machine isn't finished being set up yet, so I'm prepping data outside of it.

danielpclark commented 1 year ago

This is searchable now, so I'm closing.

danielpclark commented 1 year ago

I should also mention here that the Pile download would always fail at or before 40% completion, so to download the data I used wget:

wget -c https://the-eye.eu/public/AI/pile_preliminary_components/books3.tar.gz
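
To sanity-check that the resumed download actually finished, comparing the local file size against what the server reports is usually enough. A small sketch using the requests library, assuming the server sends a Content-Length header:

# Check the resumed wget download by comparing local size to Content-Length.
import os
import requests

url = "https://the-eye.eu/public/AI/pile_preliminary_components/books3.tar.gz"
local_path = "books3.tar.gz"

resp = requests.head(url, allow_redirects=True, timeout=30)
remote_size = resp.headers.get("Content-Length")
local_size = os.path.getsize(local_path)

if remote_size is None:
    print("server did not report Content-Length; local size is", local_size, "bytes")
elif int(remote_size) == local_size:
    print("download looks complete:", local_size, "bytes")
else:
    print("download incomplete:", local_size, "of", remote_size, "bytes")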