Alexmac22347 closed this issue 5 years ago
Yeah, the problem is that batch_size is a parameter that's required by the data pipeline. The way to go about it would be to remove the parameter from this call and set it inside the train() call in tasks/sequence_tagging/main.py instead. I'll try to get an update out by the weekend, but feel free to make a PR!
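Roughly, the shape I have in mind is the following, as a minimal sketch; the names here are illustrative stand-ins, not the repo's actual API:

```python
from functools import partial

# Stand-in for the data-pipeline call that currently receives a hardcoded
# batch_size (these names are illustrative, not torchnlp's real API).
def build_iterators(dataset_path, batch_size):
    print(f"building iterators over {dataset_path} with batch_size={batch_size}")

# train() owns batch_size: it reads the value from hparams and passes it
# down when constructing the iterators, so a user override takes effect.
def train(hparams, make_iterators):
    make_iterators(batch_size=hparams["batch_size"])

# main.py hands train() a partially-applied pipeline instead of iterators
# that were prebuilt with a fixed batch size.
train({"batch_size": 16}, partial(build_iterators, "./conll2003"))
# -> building iterators over ./conll2003 with batch_size=16
```

That way train() is the single owner of batch_size, and whatever the user puts in the hyperparameters actually reaches the iterators.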
Same issue here :/
Hi, even when I try changing the hyperparameters like so:
The batch.batch_size is still 100 (line 167 of train.py; I added the print statement):
Edit: I can see where the batch size is set to 100 by default, on line 41 of torchnlp/ner.py.
However, I'm not sure where it's supposed to be updated to a custom value.
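For what it's worth, I think the reason the override never shows up is that the default is read at the moment the data pipeline is built, so changing the hyperparameters afterwards has no effect. A toy sketch of that failure mode (made-up names, not torchnlp's API):

```python
DEFAULTS = {"batch_size": 100}  # stands in for the default on line 41 of torchnlp/ner.py

def build_pipeline(hparams):
    # the batch size is captured here, at construction time
    return {"batch_size": hparams["batch_size"]}

hparams = dict(DEFAULTS)
pipeline = build_pipeline(hparams)  # built with batch_size=100

hparams["batch_size"] = 16          # overriding afterwards is too late
print(pipeline["batch_size"])       # -> 100

# the custom value has to be present before the pipeline is constructed:
pipeline = build_pipeline({**DEFAULTS, "batch_size": 16})
print(pipeline["batch_size"])       # -> 16
```

So until the train() change above lands, editing the value at its source (ner.py line 41) before the dataset/iterators are created should be a workable stopgap.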