Closed vieting closed 2 years ago
One fix: change

```python
net_dict = {name: {"class": "subnetwork", "from": "data", "subnetwork": {"output": {"class": "copy"}}}}
```

to

```python
net_dict = {name: {"class": "subnetwork", "from": "data", "subnetwork": {"output": {"class": "copy", "from": "data"}}}}
```
Ah, we have to change `get_returnn_axis_description`. When you remove the `if dim_tag.dyn_size is not None:` check and just always use this code, does this already work?
Edit: There is also another heuristic we can use here: we can use the `"dim:..."` syntax when the static dim is unique.
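A minimal sketch of that heuristic (the helper name and its signature are assumptions for illustration, not the actual `get_returnn_axis_description` API): only return a `"dim:N"` axis description when exactly one axis has that static dim, and otherwise signal that the caller must fall back to some other description.

```python
from typing import Optional, Sequence


def axis_description_by_static_dim(shape: Sequence[Optional[int]], axis: int) -> Optional[str]:
    """Hypothetical helper: return a RETURNN "dim:N" axis description for `axis`
    if its static dim N occurs exactly once in `shape`, else None
    (the caller would then fall back to another kind of description)."""
    dim = shape[axis]
    if dim is None:  # dynamic dim, the "dim:..." syntax does not apply here
        return None
    if list(shape).count(dim) != 1:  # static dim not unique -> ambiguous
        return None
    return "dim:%i" % dim
```

For example, `axis_description_by_static_dim((3, None, 5), 2)` gives `"dim:5"`, while `axis_description_by_static_dim((5, None, 5), 2)` gives `None` because the static dim 5 is ambiguous.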
You also need to adapt BatchNorm1d.import_params_torch_to_returnn
for the new param version.
Maybe just:
```python
out_shape = [layer.output.dim]
```
That seems to work. So we don't need `_expand_dims` at all anymore, I guess.
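To illustrate the point (function and argument names here are assumptions, not the actual `BatchNorm1d.import_params_torch_to_returnn` code): with `out_shape = [layer.output.dim]`, the imported BatchNorm param stays the flat `[dim]` vector that PyTorch already provides, so there is nothing left for `_expand_dims` to do.

```python
import numpy


def import_bn_param(torch_param, returnn_dim):
    """Hypothetical sketch: reshape a torch BatchNorm1d param (shape [C])
    to the flat RETURNN param shape [dim], i.e. out_shape = [layer.output.dim].
    No expansion to something like [1, dim, 1] is needed anymore."""
    out_shape = [returnn_dim]  # stands in for [layer.output.dim]
    return numpy.asarray(torch_param).reshape(out_shape)


gamma = numpy.ones(4, dtype="float32")  # e.g. torch BatchNorm1d.weight, shape [C]
print(import_bn_param(gamma, 4).shape)  # → (4,)
```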
For #92, we need RETURNN's `behavior_version` to get the correct behavior of the `DotLayer`. There might be fixes necessary due to this change.
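In a RETURNN config, `behavior_version` is just a global setting, so opting in looks like the fragment below. The concrete version number required for the corrected `DotLayer` behavior is not stated in this thread; the value here is a placeholder only.

```python
# Minimal RETURNN config fragment (sketch): setting behavior_version opts in
# to newer, stricter layer semantics, which includes the DotLayer fix
# discussed above.
behavior_version = 12  # placeholder; use the version documented for the DotLayer change
```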