keurfonluu / evodcinv

Inversion of dispersion curves using Evolutionary Algorithms
BSD 3-Clause "New" or "Revised" License

Controlling size of inversion results #10

Closed ariellellouch closed 1 year ago

ariellellouch commented 1 year ago

When running large inversions (for example - 2000 iterations, 10000 particles, 20 layer model), the size of the output becomes difficult to manage on small machines, especially when running through Jupyter Notebook.

Is it possible to add a flag for a "diminished" output? Something like:

1) Only the best model per iteration
2) Random decimation (with a factor of choice)
3) Memory dump every N iterations

Thanks!
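For illustration, option (2) above could look like the following sketch. This is a generic, hypothetical helper (not part of evodcinv), just to show what random decimation of the stored models might mean:

```python
import numpy as np

def decimate_models(models, factor, seed=0):
    """Randomly keep roughly 1/factor of the stored models.

    Hypothetical helper for illustration only; `models` is any
    sequence of per-particle results accumulated during inversion.
    """
    rng = np.random.default_rng(seed)
    n = len(models)
    # Draw unique indices, then return them in their original order
    keep = rng.choice(n, size=max(1, n // factor), replace=False)
    return [models[i] for i in sorted(keep)]

# Example: 10000 stored models decimated by a factor of 100
models = list(range(10000))
kept = decimate_models(models, factor=100)
```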

keurfonluu commented 1 year ago

Hi @ariellellouch,

That's a good idea, I will see what I can do. This seems to require some changes in stochopy, though.

Also, I am not sure if it is just for the example, but 10000 particles sounds like a lot for 39 parameters to invert. Even the conservative 10-per-dimension rule of thumb (i.e. 390 particles) would usually be too much. For 20 layers, 100 particles should be more than enough; if not, I would suggest constraining the search with more restrictive boundaries.
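The sizing argument above can be written out as quick arithmetic. The parameter count is my inference from the numbers in the comment (one velocity per layer plus one thickness for all but the half-space, so 2n - 1 parameters for n layers):

```python
def suggested_popsize(n_layers):
    """Return (n_params, 10D upper bound) for an n-layer model.

    Assumed parameterization: 2 * n_layers - 1 parameters, i.e.
    a velocity per layer plus a thickness for every layer except
    the bottom half-space. The 10-per-dimension rule is a common,
    conservative heuristic for population sizes.
    """
    n_params = 2 * n_layers - 1
    return n_params, 10 * n_params

n_params, upper = suggested_popsize(20)
# For 20 layers: 39 parameters, so the 10D rule gives 390 particles;
# the comment suggests ~100 is usually plenty in practice.
```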

keurfonluu commented 1 year ago

Hi @ariellellouch,

I finally got some time to work on it. Please update to version 2.1.0: `pip install evodcinv -U` (this should also update stochopy to version 2.3.0; if not, update it manually).

This version adds a new option that lets you control the verbosity as a fraction of the population size (or return only the best model of each iteration if set to 0), for instance:

```python
model.configure(
    optimizer="cpso",  # Evolutionary algorithm
    misfit="rmse",  # Misfit function type
    optimizer_args={
        "popsize": 10,  # Population size
        "maxiter": 100,  # Number of iterations
        "workers": -1,  # Number of cores
        "seed": 0,
        "verbosity": 0.0,  # Only return best model for each iteration
    },
)
```
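To make the effect of the fraction concrete, here is a sketch of the *assumed* semantics of the option (this is my reading of the description above, not evodcinv's actual implementation): a fraction of the population is kept per iteration, with 0 falling back to the single best model.

```python
def models_stored_per_iteration(popsize, verbosity):
    """Number of models kept each iteration under the assumed
    semantics: `verbosity` is a fraction of `popsize`, and 0.0
    keeps only the best model.
    """
    return max(1, int(verbosity * popsize))

# With popsize=10: verbosity=0.0 keeps 1 model, verbosity=0.5 keeps 5
few = models_stored_per_iteration(10, 0.0)
half = models_stored_per_iteration(10, 0.5)
```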

Hope that works for you!

ariellellouch commented 1 year ago

Yes, works great (both updating and results)!