```lisp
TEST> (!reshape (make-input `(N C H W) nil)
                (~ N C H W -> (* N C H) W))
{CPUTENSOR[float] :shape (LazyAxis: f(N C H) = (N*C*H) LazyAxis: W) :id TID2755206
  :vec-state [maybe-not-computed]
  <<Not allocated: size=(LazyAxis: f(N C H) = (N*C*H) LazyAxis: W)>>
  :facet :input
  :belongs-to :memory-pool
  :requires-grad NIL
  :backward <Node: RESHAPETENSORNODE-T (A[BEFORE] B[AFTER] -> B[AFTER])>}
```
The shape of an AbstractTensor is now given as a LazyAxis structure, which can contain an S-expression. This is advantageous in networks where the batch size changes frequently, as it allows operations to be performed without recompiling. For CNNs, complex shape transformations can be represented, as reported in #132.
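For illustration, here is a minimal sketch of the idea. Only `make-input`, `!reshape`, and the `~` subscript notation are taken from the transcript above; `build` is an assumption about the surrounding API and may differ in the actual library:

```lisp
;; Create an input whose axes (N C H W) are all symbolic; the reshape
;; below yields a tensor whose first axis is the LazyAxis f(N C H) = N*C*H.
(let* ((x   (make-input `(N C H W) nil))
       (out (!reshape x (~ N C H W -> (* N C H) W))))
  ;; Because the output shape stays symbolic in N, the compiled graph
  ;; can (assumed API) be reused when the batch size changes at runtime,
  ;; without recompilation.
  (build out))
```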
## Changes
Doing `!reshape` for DynamicShape is now a valid operation.