Closed cannotseeme closed 1 month ago
Hi @cannotseeme,
I use Hugging Face Datasets for faster dataset processing in NLP tasks. You can install the necessary dependency with the following command:
pip install datasets
Hi @cannotseeme,
During the testing of the code on GitHub, I discovered and fixed several issues with the dataloader. I also optimized the code, resulting in a 5x increase in inference speed. Everything, from zero-shot to supervised training, has been thoroughly tested and should now run smoothly.
Hi there! I have successfully run the code following your instructions. I really appreciate your kind explanation and reply.
Hi! Thanks a lot for sharing! I'm running the evaluation on the LMTraj-SUP model, but an error occurs at line 31 of "./model/eval_accelerator.py", on the statement "from datasets import load_dataset". It seems that directory contains only data rather than any functional modules. I wondered whether a "datasets.py" existed in the original release and is missing from the online version?