@cleeff @simonant I found a bug in the pooling of `hexconvolution.Conv` and `hexconvolution.Inception` which prevented creating new models for an even board size (converting an existing model from odd to even size still worked). As the `Inception` class was only a subclass of `Conv` with a fixed `reach` anyway, I took the liberty of removing all unused/unusable model code.
All models have to be converted with `hex.model.conversion_type` to run again! They can still be converted in size afterwards.
Background:
The `reach` parameter stands for the pooling size of the first and last convolutional layer. Since these layers change the channel count from 2 to `intermediate_channels` and from `intermediate_channels` to 1, they have fewer parameters than the intermediate layers, so we can afford a bigger convolutional filter there. This value is now always read from the config; our default in current models is `board_size//2`.
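As a minimal sketch of why `reach = board_size//2` is a safe default for any board size: assuming `reach` maps to a kernel of size `2*reach + 1` with padding `reach` (this mapping is an assumption for illustration, not the repository's actual code), the spatial size is preserved for even and odd boards alike.

```python
# Hypothetical sketch: how a `reach` parameter could determine the
# filter size of the first/last convolutional layers. The mapping
# reach -> kernel_size = 2*reach + 1 with padding = reach is an
# assumption for illustration only.

def conv_output_size(board_size: int, reach: int) -> int:
    """Spatial output size of a stride-1 convolution."""
    kernel_size = 2 * reach + 1
    padding = reach
    # Standard convolution arithmetic: out = in - k + 2*p + 1
    return board_size - kernel_size + 2 * padding + 1

# With the default reach = board_size // 2, the board size is
# preserved for both even and odd boards:
for board_size in (7, 8, 11, 12):
    reach = board_size // 2
    assert conv_output_size(board_size, reach) == board_size
```

Because the kernel is always odd (`2*reach + 1`) and the padding matches `reach`, the output size equals the input size regardless of board parity, which is exactly what the fixed pooling code needs.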