Closed: twoletters closed this issue 1 year ago
Hmm, interesting question. It would definitely be possible to apply these kinds of algorithms to spiking neural networks (SNNs), but not with this library's current setup: the underlying NEAT implementation would have to be replaced with one that exposes the relevant evolvable parameters. As you say, the algorithms should develop/evolve the SNNs rather than be used inside them, so the underlying NEAT must be substituted.

That would actually be very desirable in its own right. The output of this library's algorithms would benefit from using e.g. PyTorch instead, and the same applies when actually running ES-HyperNEAT and HyperNEAT: the implementation would gain a lot from swapping the NEAT backend for something like PyTorch. Being able to run this in parallel would really be something. You're most welcome to try, heh 🥇
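To make the parallelism point concrete, here is a minimal sketch of what a vectorized SNN evaluation could look like. This is plain NumPy as a stand-in for a tensor backend like PyTorch, and everything in it (the leaky integrate-and-fire model, the `lif_step` helper, the parameter names) is hypothetical illustration, not part of this library's API:

```python
import numpy as np

def lif_step(v, input_current, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """One leaky integrate-and-fire update over a whole batch of neurons.

    All neurons are updated in a single vectorized operation, which is the
    property that would let a tensor backend run many networks in parallel.
    """
    v = v + dt * (-v / tau + input_current)   # leaky integration
    spikes = v >= v_thresh                    # boolean spike mask
    v = np.where(spikes, v_reset, v)          # reset neurons that fired
    return v, spikes

# Simulate 3 neurons for 100 steps with different constant input drives.
v = np.zeros(3)
currents = np.array([0.0, 0.05, 0.2])
spike_counts = np.zeros(3, dtype=int)
for _ in range(100):
    v, spikes = lif_step(v, currents)
    spike_counts += spikes
```

The evolvable quantities here (e.g. `tau`, `v_thresh`, the connectivity feeding `input_current`) are exactly the kind of parameters the substituted NEAT implementation would need to expose to the evolutionary algorithms.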
Lately, there has been renewed interest in SNNs, such as here.
The HyperNEAT family of algorithms could be a great fit for developing SNNs. Is that something you would be interested in adding to the project?