Open aadimator opened 4 years ago
You need to provide an overriding `NonDifferentiable` constructor for the type of your individual, i.e. `Organism`. Basically, it's a wrapper type that keeps the fitness function together with its parameter and result in one place. Look how it is done for `BitVector`:
https://github.com/wildart/Evolutionary.jl/blob/30d677fcfc19d517f2a974a00bafda34656c468c/src/api/types.jl#L113-L120
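Modeled on the linked `BitVector` version, such a constructor might look like the sketch below. This is an assumption-laden sketch, not the package's own code: the `Organism` struct and its `genes` field are hypothetical, and the field layout `(f, F, x_f, f_calls)` and the `Real` value type follow my reading of NLSolversBase's `NonDifferentiable` struct for that era of the package.

```julia
using Evolutionary
import NLSolversBase: NonDifferentiable

# Hypothetical custom individual type
struct Organism
    genes::Vector{Float64}
end

# Overriding NonDifferentiable constructor for Organism, patterned
# after the BitVector version linked above. It evaluates f once to
# seed the cached fitness value, stores a copy of the individual,
# and starts the function-call counter at zero.
function NonDifferentiable(f, x::Organism)
    NonDifferentiable{Real,typeof(x)}(f, f(x), deepcopy(x), [0,])
end
```

With this in place, the GA can wrap your fitness function and an `Organism` instance the same way it wraps a `BitVector`.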
Thank you for replying. I've thought (and searched) about it long and hard, but I'm unable to figure out how I'm supposed to approach this, partly because of the lack of documentation for the JuliaNLSolvers packages and mostly because of my inexperience. I'm quite new to the Julia ecosystem and still trying to figure things out, although I'm hopeful and excited to learn more about it.
I added an example of a supervised learning problem solved by GA optimization of multi-layer perceptrons; see the notebook in examples/MLP.ipynb.
It shows how to fill in some missing functionality when the individuals are custom types (not arrays). For starters, you need to create a `NonDifferentiable` constructor and make sure that the `copy`/`copyto!` functions work on your individual object. Then, write genetic operations that support your object type. In my example, I reused existing operations by writing wrappers that accept an MLP object.
I've been trying to run GA on my custom structure, `Organism`, but it's giving the following error:

[error screenshot]

The `init_organism` function initializes an object of type `Organism` and returns it, while the `fitness` function takes an `Organism` as input and returns a `Number` as the fitness value. I haven't defined the crossover and mutation functions for now, as I only want to see if the GA initializes correctly.

After a thorough search, I'm unable to figure out how to solve this problem/error. If anyone could give me some helpful pointers on this error message, I'd be grateful. Thanks