SciML / ReservoirComputing.jl

Reservoir computing utilities for scientific machine learning (SciML)
https://docs.sciml.ai/ReservoirComputing/stable/
MIT License

What's a good way to deal with matrices containing Infs or NANs? #180

Open AraujoH opened 10 months ago

AraujoH commented 10 months ago

I have been using ReservoirComputing.jl for a while now, and one error I sometimes get is about a matrix containing Infs and NaNs. As far as I know, this happens when the ESN's random reservoir matrix can't be decomposed/factored. I have run an ESN successfully, only to have it crash when I increased or decreased the reservoir size (without changing anything else).

Is there a way to deal with this issue? Is there a known/common workaround?
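For reference, here is a minimal plain-Julia sketch (not ReservoirComputing.jl internals) of one way this kind of failure can arise: the reservoir is rescaled to a target spectral radius, and if the raw random sparse matrix happens to have a (numerically) zero spectral radius, that rescaling divides by zero and fills the matrix with Infs/NaNs, which any later factorization then rejects. The sizes and parameters below are only illustrative.

    using LinearAlgebra, SparseArrays

    res_size = 100
    sparsity = 6 / 300                       # connection probability
    radius   = 1.2                           # target spectral radius

    W   = sprand(res_size, res_size, sparsity)
    rho = maximum(abs, eigvals(Matrix(W)))   # spectral radius of the raw reservoir

    if iszero(rho) || !isfinite(rho)
        @warn "Degenerate reservoir: rescaling by radius / rho would give Inf/NaN" rho
    else
        W_scaled = W .* (radius / rho)       # the step that blows up when rho == 0
        @show any(x -> !isfinite(x), W_scaled)   # false for a healthy reservoir
    end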

MartinuzziFrancesco commented 10 months ago

In what reservoir initializer does this happen? I have a (somewhat primitive) workaround here, because this usually happens in specific corner cases like low reservoir size or low sparsity.
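For illustration only (this is a sketch, not the workaround referenced above), one shape such a guard can take is redrawing the sparse matrix until its spectral radius is finite and nonzero before rescaling:

    using LinearAlgebra, Random, SparseArrays

    # One possible guard: redraw the sparse matrix with fresh randomness until
    # its spectral radius is usable, then rescale to the target radius.
    function safe_sparse_reservoir(res_size, sparsity, radius; max_tries=10,
                                   rng=Random.default_rng())
        for _ in 1:max_tries
            W   = sprand(rng, res_size, res_size, sparsity)
            rho = maximum(abs, eigvals(Matrix(W)))
            isfinite(rho) && !iszero(rho) && return W .* (radius / rho)
        end
        error("could not draw a usable reservoir after $max_tries tries")
    end

    W = safe_sparse_reservoir(100, 6 / 300, 1.2)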

AraujoH commented 10 months ago

Hello, @MartinuzziFrancesco. Thank you for such a prompt reply. At the moment, I'm building off a reservoir I get from one of the examples on the ReservoirComputing.jl documentation page. The specs are:

    using ReservoirComputing

    ### Reservoir hyperparameters
    res_size = reservoirsize     # swept from 100 to 1000, see below
    res_radius = 1.2             # target spectral radius
    res_sparsity = 6 / 300       # connection probability
    input_scaling = 0.1

    ### Build ESN structure
    esn = ESN(
        input_data;
        variation=Default(),
        reservoir=RandSparseReservoir(res_size, radius=res_radius, sparsity=res_sparsity),
        input_layer=WeightedLayer(scaling=input_scaling),
        reservoir_driver=RNN(),
        nla_type=NLADefault(),
        states_type=StandardStates()
    )

I'm trying to vary the `reservoirsize` variable from 100 to 1000 in increments of 100.
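A sketch of one way to keep such a sweep running past the failing sizes, using the same (older) API as the snippet above; `input_data`, `target_data`, and the training call are placeholders, not values from this issue:

    using ReservoirComputing

    # Sweep reservoir sizes, skipping (and reporting) the ones where the
    # Inf/NaN error shows up instead of letting the whole run crash.
    results = Dict{Int,Any}()
    for reservoirsize in 100:100:1000
        try
            esn = ESN(
                input_data;
                variation=Default(),
                reservoir=RandSparseReservoir(reservoirsize, radius=1.2, sparsity=6 / 300),
                input_layer=WeightedLayer(scaling=0.1),
                reservoir_driver=RNN(),
                nla_type=NLADefault(),
                states_type=StandardStates()
            )
            results[reservoirsize] = train(esn, target_data)  # placeholder for the actual training call
        catch err
            @warn "ESN failed for this reservoir size" reservoirsize err
        end
    end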

MartinuzziFrancesco commented 10 months ago

OK, I see. The matrices probably get too sparse towards the end of the range. I'll play around a little and try to push a small fix.

AraujoH commented 10 months ago

Thanks a lot, @MartinuzziFrancesco

MartinuzziFrancesco commented 10 months ago

Anytime! Thank you for pointing this out!