allenai / dolma

Data and tools for generating and inspecting OLMo pre-training data.
https://allenai.github.io/dolma/
Apache License 2.0

Workaround to fix memory leak in HuggingFace tokenizer #169

Closed · soldni closed this 2 months ago

soldni commented 2 months ago

Adds an option to refresh the tokenizer every few steps to work around the memory leak described here.
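A minimal sketch of the refresh idea, assuming a wrapper class of my own (`RefreshingTokenizer` and its `factory`/`refresh_every` parameters are hypothetical, not the names used in this PR): the tokenizer is rebuilt every N calls so that memory leaked by the old instance can be reclaimed.

```python
import gc
from typing import Any, Callable


class RefreshingTokenizer:
    """Periodically rebuilds the wrapped tokenizer to bound memory growth.

    `factory` is any zero-argument callable returning a fresh tokenizer,
    e.g. `lambda: AutoTokenizer.from_pretrained("gpt2")` (illustrative;
    the PR wires the refresh into dolma's own tokenization loop).
    """

    def __init__(self, factory: Callable[[], Any], refresh_every: int = 1000):
        self.factory = factory
        self.refresh_every = refresh_every
        self.calls = 0
        self.tokenizer = factory()

    def __call__(self, text: str):
        self.calls += 1
        if self.calls % self.refresh_every == 0:
            # Drop the leaky tokenizer and force a collection pass
            # before constructing a replacement instance.
            del self.tokenizer
            gc.collect()
            self.tokenizer = self.factory()
        return self.tokenizer(text)
```

The refresh interval trades reclaim frequency against the cost of re-instantiating the tokenizer, so it should be large enough that reloading stays cheap relative to tokenization itself.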

drschwenk commented 2 months ago

This looks good; it seems like it should work around the memory leak in the transformers library. One larger comment: it seems like most of the changes were needed to accommodate running with slow/fast tokenizers. If this is an orthogonal change to the memory leak issue, does it make sense to pull these changes out into a separate PR?

soldni commented 2 months ago

That's a valid concern, @drschwenk! However, the GC hack doesn't fully resolve the memory issues, so it is sometimes necessary to use the slow tokenizer instead 😭 hence, all in one PR.