Closed qgroshens closed 4 years ago
Hey @qgroshens, thanks for catching this bug. The problem comes from batch normalization in combination with batch size 1: a single sample cannot be normalized to have zero mean and unit variance.
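To illustrate the degenerate case, here is a minimal plain-Python sketch (not FARM code, and deliberately without the guard that PyTorch has): with a batch of one, the batch mean equals the sample itself, so the normalized output is all zeros no matter what the input was. This is why PyTorch refuses batch size 1 in training mode with an error along the lines of "Expected more than 1 value per channel when training".

```python
def batch_norm(batch, eps=1e-5):
    """Naive batch normalization over a batch of feature vectors."""
    n = len(batch)
    dim = len(batch[0])
    means = [sum(row[d] for row in batch) / n for d in range(dim)]
    vars_ = [sum((row[d] - means[d]) ** 2 for row in batch) / n for d in range(dim)]
    return [[(row[d] - means[d]) / (vars_[d] + eps) ** 0.5 for d in range(dim)]
            for row in batch]

# Two samples: the output still distinguishes the inputs.
print(batch_norm([[1.0, 2.0], [3.0, 6.0]]))

# One sample: mean == sample, variance == 0, so every feature
# normalizes to exactly 0.0 and all information is lost.
print(batch_norm([[5.0, -7.0]]))  # → [[0.0, 0.0]]
```

At inference time the usual fix is to run the model in eval mode, where batch norm uses its stored running statistics instead of per-batch statistics, so a batch of one is fine.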
This small commit fixes the problem. The changes are on current master but not yet in an official release (e.g. FARM version 0.4.3). Hope this solves the problem on your end.
Issue fixed, closing now
**Describe the bug**
When performing inference using `inference_from_dicts` with a dictionary batch of size 1, we get a torch error linked to batch normalization.
**Error message**
**Expected behavior**
`inference_from_dicts` should return the inference for the provided text even if the batch size is 1.
**Additional context**
Using a custom WordEmbedding_LM model.
**To Reproduce**
Running the inference with a dictionary batch of size 1 still returns the error (due to `batch_size=1`?), while it works fine otherwise.

**System**