schrum2 / MM-NEATv2

MM-NEAT version 2.0 is no longer supported. Please get MM-NEAT 3+ from https://github.com/schrum2/MM-NEAT

Convolutional weight sharing #453

Closed: schrum2 closed this issue 6 years ago

schrum2 commented 7 years ago

The current convolutional substrates for HyperNEAT use x/y CPPN inputs for the position within the receptive field and x/y inputs for the target neuron's position in the processing substrate. However, in a standard convolutional network, the same receptive-field weights are shared/duplicated for every target neuron in the target layer. There should be an option to enable this kind of weight sharing, which could also save on CPPN inputs (see the sketch below).
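Just to illustrate the idea, here is a minimal sketch of the two query schemes. The `CPPN` interface here is hypothetical, not MM-NEAT's actual API:

```java
public class ConvWeightSharingSketch {

    /** Stand-in for a CPPN: maps an input vector to a single link weight. */
    interface CPPN {
        double query(double... inputs);
    }

    /**
     * Current scheme: the CPPN sees both the receptive-field offset (dx, dy)
     * and the target neuron position (tx, ty), so each target neuron can end
     * up with a different set of receptive-field weights.
     */
    static double weightPerTarget(CPPN cppn, double dx, double dy, double tx, double ty) {
        return cppn.query(dx, dy, tx, ty);
    }

    /**
     * Weight-sharing scheme: the target position is omitted, so the returned
     * weight depends only on the offset within the receptive field and is
     * reused for every target neuron, as in a standard convolutional layer.
     */
    static double weightShared(CPPN cppn, double dx, double dy) {
        return cppn.query(dx, dy);
    }
}
```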

Alternative option: instead of having a separate CPPN output for each pairing of layers, CPPN inputs indicating which target substrate is being queried could take on this role (rough sketch below) ... not sure of the details though.
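A rough sketch of that alternative, again with hypothetical interfaces rather than MM-NEAT's actual API, assuming the pairing can be encoded as a single extra input value:

```java
public class LayerPairingSketch {

    /** Stand-in CPPN with one output per layer pairing. */
    interface MultiOutputCPPN {
        double[] query(double... inputs);
    }

    /** Stand-in CPPN with a single shared weight output. */
    interface SingleOutputCPPN {
        double query(double... inputs);
    }

    /** Current scheme: read the output dedicated to this layer pairing. */
    static double weightPerPairingOutput(MultiOutputCPPN cppn, int pairingIndex,
                                         double dx, double dy) {
        return cppn.query(dx, dy)[pairingIndex];
    }

    /**
     * Alternative: encode the pairing (e.g., an index or coordinates for the
     * target substrate) as an extra input and read the single shared output.
     */
    static double weightWithPairingInput(SingleOutputCPPN cppn, double pairingId,
                                         double dx, double dy) {
        return cppn.query(dx, dy, pairingId);
    }
}
```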

schrum2 commented 6 years ago

Now supported in MM-NEAT 3+: https://github.com/schrum2/MM-NEAT/issues/15