Closed feifeibear closed 2 years ago
See MR #14 for more information.
Hi, thanks for taking the time to fix it!
It seems I haven't tested cross-entropy losses thoroughly enough.
However, by commenting out that line in your MR #14, you simply tell the toolkit to ignore this type of loss, which isn't the goal here. The real issue is that I forgot to update getting-started.py after I updated the API.
I really appreciate the effort, but apologies that I can't merge #14.
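For anyone hitting the same mismatch: conceptually, batched evaluation of a loss only works if the input and the target are sliced in lockstep, so that every mini-batch pairs matching sizes. A minimal, hypothetical sketch in plain Python (this is an illustration of the idea, not koila's actual implementation):

```python
def run_mini_batches(inputs, targets, mini_batch_size):
    # Hypothetical helper: slice input AND target together so each
    # mini-batch keeps matching batch sizes. Slicing only the input,
    # as in the traceback below, breaks this invariant.
    batches = []
    for start in range(0, len(inputs), mini_batch_size):
        end = start + mini_batch_size
        batch_in = inputs[start:end]
        batch_tgt = targets[start:end]
        assert len(batch_in) == len(batch_tgt)  # sizes stay in sync
        batches.append((batch_in, batch_tgt))
    return batches

# 20 samples processed in mini-batches of 16 -> chunks of 16 and 4,
# but input and target always agree within each chunk.
batches = run_mini_batches(list(range(20)), list(range(20)), 16)
print([len(b[0]) for b in batches])  # [16, 4]
```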
I ran the following code with the input batch size set to 20 (PyTorch 1.10.0):

python example/getting-started.py

The error:

Traceback (most recent call last):
  File "/home/user/codes/koila/examples/getting-started.py", line 97, in
    lazy_loss.backward()
  File "/home/user/anaconda3/envs/torch/lib/python3.9/site-packages/koila/tensors.py", line 439, in backward
    mini_batch = self.run((total, total + mini_batch_size))
  File "/home/user/anaconda3/envs/torch/lib/python3.9/site-packages/koila/tensors.py", line 187, in run
    return data.run(partial)
  File "/home/user/anaconda3/envs/torch/lib/python3.9/site-packages/koila/tensors.py", line 94, in _run
    result = self.func(*real_args, **real_kwargs)
  File "/home/user/anaconda3/envs/torch/lib/python3.9/site-packages/torch/nn/functional.py", line 2846, in cross_entropy
    return torch._C._nn.cross_entropy_loss(input, target, weight, _Reduction.get_enum(reduction), ignore_index, label_smoothing)
ValueError: Expected input batch_size (16) to match target batch_size (20).
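For context, the mismatch itself is easy to reproduce outside koila: the lazy evaluator sliced the input down to 16 rows while the target still held all 20 labels. A minimal sketch of the same batch-size check, using plain Python lists instead of torch tensors (the helper name is made up for illustration):

```python
def cross_entropy_check(inputs, targets):
    # Mimic the batch-size check that torch's cross_entropy performs:
    # the number of input rows must equal the number of target labels.
    if len(inputs) != len(targets):
        raise ValueError(
            f"Expected input batch_size ({len(inputs)}) "
            f"to match target batch_size ({len(targets)})."
        )
    return "ok"

full_batch = 20
mini_batch = 16  # lazy evaluation sliced the input, but not the target

inputs = [[0.0, 0.0]] * mini_batch   # sliced mini-batch of logits
targets = [0] * full_batch           # target still holds all 20 labels

try:
    cross_entropy_check(inputs, targets)
except ValueError as e:
    print(e)  # Expected input batch_size (16) to match target batch_size (20).
```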