coin-unknown / async-genetic

A blazing fast and fully async genetic algorithm

Is fitness regression normal? #1

Closed. LucaColombi closed this issue 2 years ago.

LucaColombi commented 2 years ago

Hello,

I ran the algorithm, but the fitness oscillates up and down between generations. For instance, the best values across generations were:

-9007199254740991
0.00000796592263697992 (OK)
0.000007729281618799878 (lower)
2.4606257793799084e-7 (lower again)
7.181040166226487e-11 (even lower!)
0 (higher; 0 is a placeholder for an undefined result, placed between positives and negatives)
0.0000037918488699102034 (higher)
7.170006884662858e-7 (lower, like 3 steps ago :( )

... and so on.

I also passed fittestNSurvives as 2, so I was expecting at least the 2 best to survive and the value not to regress.

Am I misunderstanding something? Do I need to do something, like increase the population size or change some option?

BusinessDuck commented 2 years ago

Hi! It looks like your fitness formula returns very small values; be careful with that. JavaScript has floating-point precision limits, and they can cause arithmetic mistakes when operating on very small numbers. Try to reproduce with somewhat more normalised values returned from the score function, in the 0...1 range (0 = worst, 1 = best); if it still reproduces, send a report here. Thanks!
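
For context, a minimal sketch (not part of async-genetic) of the precision pitfall being described, plus a hypothetical `normalize` helper for rescaling raw scores into the suggested 0...1 range; the `rawMin`/`rawMax` bounds are assumptions you would supply for your own score function:

```ts
// 64-bit floats keep roughly 15-17 significant digits, so tiny contributions
// can silently disappear and rounding errors creep into comparisons:
console.log(1 + 1e-17 === 1);          // true: the 1e-17 term is lost entirely
console.log(0.1 + 0.2 === 0.3);        // false: classic binary rounding error
console.log(Number.MIN_SAFE_INTEGER);  // -9007199254740991, the first value in the log above

// Hypothetical helper: rescale a raw score into [0, 1] (0 = worst, 1 = best).
// rawMin and rawMax are bounds you know (or estimate) for your own score function.
function normalize(raw: number, rawMin: number, rawMax: number): number {
  if (!Number.isFinite(raw)) return 0;                     // treat undefined/NaN results as worst
  const clamped = Math.min(Math.max(raw, rawMin), rawMax); // keep the value inside the known bounds
  return (clamped - rawMin) / (rawMax - rawMin);
}
```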

BusinessDuck commented 2 years ago

Also, lower values may regress in some cases; this is OK, because you have mutation in each generation, which randomises the results. The highest score, however, should stay the same or improve with each new generation; check that as well.

LucaColombi commented 2 years ago

It was not a case of random variation, because it happened too frequently.

As you suggested, I mapped the fitness value onto positive integers and it worked perfectly. It seems there was a problem handling small fractions; now I have the expected increasing fitness progression.
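
A hedged sketch of the kind of mapping described, assuming a scale factor chosen to keep the interesting digits of the raw scores (both the function name and the 1e9 factor are illustrative, not from the library):

```ts
// Hypothetical mapping of tiny fractional scores onto positive integers.
// The 1e9 scale factor is arbitrary; pick one that preserves the digits that
// matter for your own score range.
function toIntegerFitness(raw: number): number {
  if (!Number.isFinite(raw) || raw < 0) return 0; // undefined/negative results score worst
  return Math.round(raw * 1e9);                   // e.g. 0.00000796592263697992 -> 7966
}
```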

LucaColombi commented 2 years ago

Hello, I looked for some way to write to you. I have a question; I will delete this if it is the wrong place.

Currently, the algorithm tends to select one fit branch and make it dominate all future generations, slowly suppressing all the variance.

Is there a way to use the parameters to encourage exploration of radically different variants, so that the algorithm generates other branches that can, in the long term, reach a better fit than the currently dominant branch?

For example, eliminating duplicates (so the dominant individual can exist only once) and enlarging, or reducing, the survivalN between generations? Could it be useful to introduce a filter that blocks crossing if two branches are too far apart? Or what else? I suppose this is a common problem in genetic programming.

BusinessDuck commented 2 years ago

I suppose this is a common problem in genetic programming

Yes! You are absolutely right, this is a classic problem for genetic algorithms: the most successful phenotype displaces the others. To prevent that behaviour you can stop saving the fittest, increase the mutation probability, or change the selection algorithm. Selection is about building a pair (or tree) for breeding. If you change the selection method, the surviving (old) fittest phenotype may not get a partner for breeding and will not affect the other phenotypes in the group.
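
As one illustration of "change the selection algorithm", here is a generic tournament-selection sketch; it is library-agnostic and the names are illustrative, not async-genetic's API:

```ts
// Generic tournament selection: each parent is the best of a small random
// sample rather than the global fittest, so the current champion does not end
// up in every breeding pair. Assumes a non-empty population.
function tournamentSelect<T>(
  population: T[],
  fitness: (x: T) => number,
  tournamentSize = 3,
): T {
  let best = population[Math.floor(Math.random() * population.length)];
  for (let i = 1; i < tournamentSize; i++) {
    const candidate = population[Math.floor(Math.random() * population.length)];
    if (fitness(candidate) > fitness(best)) best = candidate;
  }
  return best;
}
```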

Also, since one genetic cycle gives you only one result, you can start 10 parallel populations, optimize them, and compare the overall results.
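
A minimal sketch of that idea, assuming you already have a single-population run wrapped in a function (`runOnce` here is a stand-in for your own loop, not a library call):

```ts
// Run several independent populations and keep the overall best result.
async function bestOfParallelRuns<T>(
  runOnce: () => Promise<{ best: T; score: number }>,
  runs = 10,
): Promise<{ best: T; score: number }> {
  const results = await Promise.all(Array.from({ length: runs }, () => runOnce()));
  return results.reduce((a, b) => (b.score > a.score ? b : a));
}
```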

In some cases something like a "laser killer" can be applied. Take a look at this walking AI: https://www.youtube.com/watch?v=K-wIZuAA3EY

LucaColombi commented 2 years ago

Hello, I'd like to contact you on Telegram, FB, or similar, because I feel this isn't really the place for a forum-style thread :D

I talked with a biologist and he was illuminating; he reminded me of what Darwin himself noticed about evolution, which I had read in his book... clustering!

Darwin himself noticed that, to evolve, a species often needs to become isolated for a while, protected from external interference and with numbers low enough that small mutations are not overridden.

Thinking about it, it seems absurd that the usual GAs do not have clustering at their core, when even Darwin required it for evolution!

Your idea of multiple runs, taking the best, in fact mimics a serialized clustering concept!

Maybe we could even think about putting a clustering concept into the core of the algorithm, but for now I would simply do a coarse run, cluster it (for instance 100 -> 10x10 populations), rerun each cluster, then re-mix so the champions of different branches hybridise and compete, and so on. This lets the subtrees evolve, but at some point mixes them and puts all of them back into competition.
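
A rough sketch of that loop, under the assumption that each sub-population can be evolved independently (`evolveCluster`, the cluster count, and the round count are hypothetical, just to make the remix idea concrete):

```ts
// Island-style loop: split the pool into clusters, evolve each cluster in
// isolation, then re-mix everything so branch champions hybridise and compete
// in the next round. `evolveCluster` is a stand-in for your own per-cluster run.
async function clusteredRun<T>(
  population: T[],
  evolveCluster: (cluster: T[]) => Promise<T[]>,
  clusters = 10,
  rounds = 5,
): Promise<T[]> {
  let pool = [...population];
  for (let round = 0; round < rounds; round++) {
    // Fisher-Yates shuffle so each split mixes members of different branches.
    for (let i = pool.length - 1; i > 0; i--) {
      const j = Math.floor(Math.random() * (i + 1));
      [pool[i], pool[j]] = [pool[j], pool[i]];
    }
    const size = Math.ceil(pool.length / clusters);
    const groups = Array.from({ length: clusters }, (_, i) =>
      pool.slice(i * size, (i + 1) * size),
    ).filter((g) => g.length > 0);
    // Evolve each cluster independently (insulation), then pool the results.
    const evolved = await Promise.all(groups.map((g) => evolveCluster(g)));
    pool = evolved.flat();
  }
  return pool;
}
```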

I also saw you have hands-on experience with TradingView; I came from there and am now writing my own platform. I am using this library to optimize trading, mixing GA strategies with other "classical" ones.

BusinessDuck commented 2 years ago

Check the Telegram link in my profile, it's public.