Closed wbrickner closed 3 years ago
Hey, sorry I'm just seeing this.
I'm not totally sure I understand the topological constraints you're talking about. Does this just mean you would like to use a traditional dense layer whose topology does not evolve? If so, that is possible.
With radiate you can still stack layers. Say you want a network with three dense layers: you can stack a dense_pool layer, which will evolve its topology, then a normal dense layer, which will not evolve its topology and acts like a traditional feed-forward layer, then another dense_pool layer, which again will evolve its topology. This way your second layer maintains its dimensionality through evolution while still allowing the first and last layers to evolve.
The readme in the models folder has an example of stacking layers like this: https://github.com/pkalivas/radiate/tree/master/radiate/src/models
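For a concrete picture, the three-layer stack described above might look something like the builder-style sketch below. Treat this as pseudocode: the layer names (`dense_pool`, `dense`) come from this thread, but the exact radiate builder methods, signatures, sizes, and activation types are assumptions here, so check the linked README for the real API.

```rust
// Hypothetical sketch of the stack described above -- not the exact radiate API.
// dense_pool layers evolve their topology; the plain dense layer in the middle
// keeps a fixed, small dimensionality (the bottleneck).
let network = Neat::new()
    .input_size(32)                       // input dimension (example value)
    .dense_pool(16, Activation::Relu)     // topology evolves
    .dense(4, Activation::Relu)           // fixed bottleneck: topology does NOT evolve
    .dense_pool(16, Activation::Relu)     // topology evolves
    .dense(32, Activation::Sigmoid);      // output layer
```

The key point is only the ordering: evolving layers on the outside, a fixed low-dimensional dense layer in the middle.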
Yes, this works for my use-case! The key is that some hidden layer maintains its dimensionality throughout, and that this dimensionality is much less than the input and output dimensions.
Thank you, I'll take a look at the example!
Great! I'm going to close this issue. Go ahead and open another one if anything comes up.
Hello,
I'd like to evolve a network with a topological constraint for its hidden layers.
In a dense network, this would be some middle layer that forces a reduction in the dimensionality of the information.
In NEAT I'm not sure what this would mean, as the network topology has so much more freedom.
Is this possible in general? If so, can it be accomplished using radiate? I'm not very familiar with the library, as I'm just getting started. Thank you!