CodeReclaimers / neat-python

Python implementation of the NEAT neuroevolution algorithm
BSD 3-Clause "New" or "Revised" License
1.42k stars · 492 forks

RecurrentNetwork Implementation issue #126

Open HeshamMeneisi opened 6 years ago

HeshamMeneisi commented 6 years ago

I was skimming through the code and I noticed that the RecurrentNetwork activation is implemented in a rather strange way.

for i, v in zip(self.input_nodes, inputs):
    ivalues[i] = v
    ovalues[i] = v

for node, activation, aggregation, bias, response, links in self.node_evals:
    node_inputs = [ivalues[i] * w for i, w in links]
    s = aggregation(node_inputs)
    ovalues[node] = activation(bias + response * s)

This means that any edge whose source is not an input node is effectively recurrent: every non-input node reads `ivalues`, i.e. activations from the previous time step. In other words, all hidden nodes collapse into a single layer whose incoming edges (except those from the inputs) carry a one-step delay, so a multi-layer RNN cannot be expressed. Is this behavior intentional?
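To make the delay concrete, here is a toy reproduction of the double-buffered evaluation loop quoted above (this is an illustrative sketch, not neat-python's actual class; the node ids, identity activation, and the 0 → 1 → 2 chain are invented for the example). A signal fed into the input takes one `activate()` call per non-input hop to reach the output:

```python
# Toy model of the evaluation scheme quoted above, applied to a
# three-node chain: 0 (input) -> 1 (hidden) -> 2 (output).
class ToyRecurrent:
    def __init__(self, input_nodes, output_nodes, node_evals, node_ids):
        self.input_nodes = input_nodes
        self.output_nodes = output_nodes
        self.node_evals = node_evals
        # Double buffer: one dict of node values per time step.
        self.values = [{n: 0.0 for n in node_ids} for _ in range(2)]
        self.active = 0

    def activate(self, inputs):
        ivalues = self.values[self.active]       # previous step's values
        ovalues = self.values[1 - self.active]   # values computed this step
        self.active = 1 - self.active
        for i, v in zip(self.input_nodes, inputs):
            ivalues[i] = v  # inputs are written to BOTH buffers, so only
            ovalues[i] = v  # edges from inputs see current-step values
        for node, activation, aggregation, bias, response, links in self.node_evals:
            node_inputs = [ivalues[i] * w for i, w in links]
            s = aggregation(node_inputs)
            ovalues[node] = activation(bias + response * s)
        return [ovalues[i] for i in self.output_nodes]

identity = lambda x: x
# node_evals entries: (node, activation, aggregation, bias, response, links)
node_evals = [
    (1, identity, sum, 0.0, 1.0, [(0, 1.0)]),  # hidden node 1 <- input 0
    (2, identity, sum, 0.0, 1.0, [(1, 1.0)]),  # output node 2 <- hidden 1
]
net = ToyRecurrent([0], [2], node_evals, (0, 1, 2))
step1 = net.activate([1.0])  # [0.0] -- input has not reached the output yet
step2 = net.activate([1.0])  # [1.0] -- one extra call per hidden hop
print(step1, step2)
```

With a deeper chain of hidden nodes, each additional hop costs another `activate()` call, which is the single-layer-with-delays behavior described above.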

subtletech commented 6 years ago

@HeshamMeneisi Yes, indeed. From what I understand, in the present RecurrentNetwork implementation a node can get its input values either from the network inputs passed to net.activate() or from other nodes' activations from the previous step. In other words, a node can't access other nodes' activations from the current iteration, i.e. the current call to net.activate(). Which seems like a mistake, I guess.

From what I know, an RNN is like a normal feedforward network but with some recurrent connections added. In the current implementation, however, every connection is recurrent, except for those from the network inputs.
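What this comment describes could be sketched roughly as follows (purely hypothetical code, not from neat-python or any fork): evaluate nodes in topological order of the non-recurrent links, so feedforward links read current-step values while links explicitly flagged as recurrent read the previous step's values. The `activate_mixed` function, the `(src, weight, is_recurrent)` link tuples, and the example network are all invented for illustration:

```python
identity = lambda x: x

def activate_mixed(inputs, input_nodes, output_nodes, node_evals, prev, curr):
    """Hypothetical mixed evaluation: node_evals must be topologically
    sorted by the non-recurrent links; each link is (src, weight, is_recurrent).
    Non-recurrent links read curr (this step); recurrent links read prev."""
    for i, v in zip(input_nodes, inputs):
        curr[i] = v
    for node, activation, bias, links in node_evals:
        s = sum((prev[i] if rec else curr[i]) * w for i, w, rec in links)
        curr[node] = activation(bias + s)
    return [curr[i] for i in output_nodes]

# Chain 0 -> 1 -> 2 (feedforward) plus a recurrent self-link on node 1.
node_evals = [
    (1, identity, 0.0, [(0, 1.0, False), (1, 0.5, True)]),
    (2, identity, 0.0, [(1, 1.0, False)]),
]

prev = {n: 0.0 for n in (0, 1, 2)}
curr = {n: 0.0 for n in (0, 1, 2)}
out1 = activate_mixed([1.0], [0], [2], node_evals, prev, curr)
prev, curr = curr, {n: 0.0 for n in (0, 1, 2)}
out2 = activate_mixed([1.0], [0], [2], node_evals, prev, curr)
print(out1, out2)  # the input reaches the output in a single step;
                   # only the self-link carries state across steps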

Would appreciate any comment from @CodeReclaimers

CodeReclaimers commented 6 years ago

The RNN implementation is simplistic mainly because it was simple in the original version of the library, and I just never had the time to make it match the literature and other libraries, or to evaluate some of the pull requests that added new features to it.

You may want to check out some of the forks of this project, as they've made a significant number of improvements: