lucidrains / soft-moe-pytorch
Implementation of Soft MoE, proposed by Brain's Vision team, in PyTorch
MIT License · 239 stars · 8 forks
Issues
#9 Here are some questions about soft MoE · opened by t5862755 4 months ago · 0 comments
#8 Early onset of Inf error on a certain self supervised learning method in distributed training mode · closed by swarajnanda2021 6 months ago · 1 comment
#7 Unable to Set num_slots Directly · closed by afiff2 6 months ago · 1 comment
#6 Still updating and maintainancing? · opened by Shengguang-Zhou 9 months ago · 0 comments
#5 have a question · closed by daixiangzi 10 months ago · 5 comments
#4 Not receiving grads with cpu_besides? · closed by conceptofmind 4 months ago · 26 comments
#3 error: NotImplementedError: Could not run 'aten::_amp_foreach_non_finite_check_and_unscale_' with arguments from the 'CPU' backend. · opened by daixiangzi 11 months ago · 24 comments
#2 Does the number of slots * the number of experts have to be the same as the length of the token? · opened by ZQpengyu 11 months ago · 1 comment
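Issue #2 above asks whether `num_experts * num_slots` must equal the token count. In the Soft MoE formulation, it does not: dispatch weights mix all tokens into each slot, and combine weights mix all slots back into each token, so the two sizes are independent. Below is a minimal NumPy sketch of that routing math (not this repo's actual API; the function and variable names here are illustrative only, and the experts are replaced by an identity for brevity):

```python
import numpy as np

def softmax(z, axis):
    # numerically stable softmax along the given axis
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def soft_moe_layer(x, phi):
    """x: (n, d) tokens; phi: (d, e, s) slot embeddings (e experts, s slots each)."""
    logits = np.einsum('nd,des->nes', x, phi)          # token-slot affinities
    dispatch = softmax(logits, axis=0)                 # per slot: weights over tokens
    combine = softmax(logits.reshape(len(x), -1), axis=1).reshape(logits.shape)
    slots = np.einsum('nd,nes->esd', x, dispatch)      # each slot is a convex combo of all tokens
    # each expert would transform its s slots here; identity keeps the sketch short
    return np.einsum('esd,nes->nd', slots, combine)    # back to (n, d), one row per token

# seq_len = 7 while num_experts * num_slots = 3 * 2 = 6: no equality required
x = np.random.randn(7, 16)
phi = np.random.randn(16, 3, 2)
out = soft_moe_layer(x, phi)
print(out.shape)  # (7, 16)
```

Because the softmaxes are taken over the token axis (dispatch) and the flattened slot axis (combine) separately, any sequence length works against any slot count; only the model dimension `d` has to match.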