Closed · Roboneet closed this issue 6 years ago
Would it work to just left-pad all dimensions instead? E.g. for `(5,) .+ (10, 5)`, in JS you'd just reshape the first operand to `(1, 5)` and then broadcast; that might be faster.
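To make the left-padding idea concrete, here is a minimal sketch (plain JS, not the project's actual code) of how NumPy/tf.js-style broadcasting aligns shapes: the shorter shape is padded with 1s on the left, so `(5,)` lines up as `(1, 5)` against `(10, 5)`.

```javascript
// Sketch: left-pad the shorter shape with 1s, then take the
// elementwise max, erroring on genuinely incompatible dimensions.
function broadcastShape(a, b) {
  const rank = Math.max(a.length, b.length);
  const pa = Array(rank - a.length).fill(1).concat(a);
  const pb = Array(rank - b.length).fill(1).concat(b);
  return pa.map((d, i) => {
    const e = pb[i];
    if (d !== e && d !== 1 && e !== 1) {
      throw new Error(`incompatible shapes (${a}) and (${b})`);
    }
    return Math.max(d, e);
  });
}

console.log(broadcastShape([5], [10, 5])); // [ 10, 5 ]
```

Julia instead matches dimensions from the left and pads trailing ones, which is why the two semantics disagree on this case.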
For softmax, I expect we can just reuse the `dim` argument (I guess it should just be `1`?).
Just realized yesterday that if this line is deleted, both broadcasts work fine without any modification. The weights, the input, and the output will be transposes of the corresponding Julia ones, but they should be by default, since JS is row-major. (`math.matMul(x, y)` needs to be changed to `math.matMul(y, x)` then.)
(Fixed in #23 )
broadcasting works differently in tf-js