Shivanandroy / simpleT5

simpleT5, built on top of PyTorch Lightning⚡️ and Transformers🤗, lets you quickly train your T5 models.
MIT License

Add possibility to configure num_workers #19

Closed dbalabka closed 2 years ago

Shivanandroy commented 2 years ago

@dbalabka Apologies, I didn't check the PR as I was not actively managing this project.

BTW, it's already integrated into the latest pre-release (pip install -U simplet5):

from simplet5 import SimpleT5

model = SimpleT5()
model.from_pretrained(model_type="t5", model_name="t5-base")
model.train(train_df=train_df,
            eval_df=test_df,
            source_max_token_len=50,
            target_max_token_len=50,
            batch_size=8,
            max_epochs=3,
            use_gpu=True,
            logger="default",
            dataloader_num_workers=2,
            save_only_last_epoch=True,
            outputdir="outputs")
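A common heuristic for picking a value for `dataloader_num_workers` is to base it on available CPU cores, keeping one core free for the main training process. A minimal sketch (the `suggest_num_workers` helper is hypothetical, not part of simpleT5):

```python
import os

def suggest_num_workers(reserve: int = 1) -> int:
    """Suggest a DataLoader worker count: all CPU cores minus a
    reserve for the main process, never below 0 (0 means data is
    loaded in the main process)."""
    cores = os.cpu_count() or 1
    return max(cores - reserve, 0)

# Could then be passed into the train() call, e.g.:
# model.train(..., dataloader_num_workers=suggest_num_workers())
```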

Closing it.

dbalabka commented 2 years ago

@Shivanandroy thanks for the reply! Your project gave us the ability to get started quickly. Right now we are using https://github.com/google-research/t5x because of its good TPU support. Thanks for your project and work, it is really useful!