NeuroDiffGym / neurodiffeq

A library for solving differential equations using neural networks based on PyTorch, used by multiple research groups around the world, including at Harvard IACS.
http://pypi.org/project/neurodiffeq/
MIT License

Finding value of weights #166

Closed: Arup-nit closed this issue 2 years ago

Arup-nit commented 2 years ago

Hi @shuheng-liu

How can I access the initial values of the weights, as well as their final values after training for a fixed max_epochs=1000, when using the Solver1D class?

```python
from neurodiffeq import diff
from neurodiffeq.conditions import IVP
from neurodiffeq.generators import Generator1D
from neurodiffeq.solvers import Solver1D
from neurodiffeq.monitors import Monitor1D
from neurodiffeq.callbacks import MonitorCallback

solver_system = Solver1D(
    ode_system=lambda u, t: [diff(u, t, order=2) + u],
    conditions=[IVP(0, 1, 0)],
    t_min=0.0,
    t_max=2.0,
    train_generator=Generator1D(100, 0.0, 2),
    valid_generator=Generator1D(100, 0.0, 2),
)

monitor = Monitor1D(t_min=0.0, t_max=3.0, check_every=100)
monitor_callback = MonitorCallback(monitor)

solver_system.fit(max_epochs=1000, callbacks=[monitor_callback])
solution_system = solver_system.get_solution()
```
Arup-nit commented 2 years ago

Hi @shuheng-liu, is there any way to find the values of the weights?

sathvikbhagavan commented 2 years ago

Hello,

You can find the weights by accessing the `nets` attribute of the solver object and then the `NN` field of each network (for an `FCNN` object, see https://github.com/NeuroDiffGym/neurodiffeq/blob/master/neurodiffeq/networks.py#L6). `NN` is a `torch.nn.Sequential`, so the weights of any layer can be read by indexing into it and accessing its `weight` field. For your example:

```python
solver_system.nets[0].NN[0].weight
```

The output I got when I ran the code:

```
Parameter containing:
tensor([[ 0.8768],
        [-0.6987],
        [-0.5978],
        [ 0.3867],
        [-0.3607],
        [-0.6576],
        [-0.4753],
        [-0.1397],
        [-0.6517],
        [ 0.4632],
        [-0.5898],
        [-0.3324],
        [ 0.8168],
        [-0.8792],
        [-0.7331],
        [-0.1540],
        [ 0.1956],
        [ 0.7815],
        [ 0.2859],
        [-0.6865],
        [ 0.8375],
        [ 0.8586],
        [-0.8746],
        [-0.5371],
        [-0.7475],
        [ 0.4840],
        [ 0.5717],
        [-0.1432],
        [ 0.2203],
        [-0.3002],
        [ 0.2296],
        [-0.3601]], requires_grad=True)
```
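To capture the *initial* weights as well (the first part of the original question), note that each entry of `solver.nets` is a plain `torch.nn.Module`, so you can snapshot `state_dict()` before calling `fit` and read it again afterwards. A minimal sketch using a stand-in `torch.nn.Sequential` in place of the solver's network, with a hypothetical training loop standing in for `solver.fit(...)`:

```python
import copy
import torch

# Stand-in for solver_system.nets[0].NN: a small fully-connected network.
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32),
    torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

# Snapshot the initial weights BEFORE training.
# deepcopy detaches the snapshot from later in-place updates.
initial_weights = copy.deepcopy(net.state_dict())

# Placeholder for training; with neurodiffeq this would be solver.fit(...).
optimizer = torch.optim.Adam(net.parameters(), lr=1e-2)
for _ in range(10):
    t = torch.rand(100, 1)
    loss = (net(t) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Final weights AFTER training.
final_weights = net.state_dict()

# First layer's weight matrix before and after the updates.
print(initial_weights["0.weight"][:3])
print(final_weights["0.weight"][:3])
```

The same pattern applies directly to the solver: `copy.deepcopy(solver_system.nets[0].state_dict())` before `fit`, and `solver_system.nets[0].state_dict()` after.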