UW-ERSL / TOuNN

Clarification on Network Parameter Update in TOuNN Method #3

Open nvnkrus opened 5 months ago

nvnkrus commented 5 months ago

Dear @aadityacs,

It is nice to meet you here. I have been studying your paper "TOuNN: Topology Optimization using Neural Networks," and I have a question regarding how the network parameters are updated in the TOuNN method.

It seems that the sensitivity of the loss function with respect to the network parameters (Eq. 9 in the paper) is not explicitly computed in the code, and the network parameters are updated only through backpropagation. Could you confirm whether Eq. 9 is indeed not used explicitly in the TOuNN method, and whether the parameter update relies solely on backpropagation?

Thank you for your time and assistance.

aadityacs commented 5 months ago

Hi @nvnkrus,

You are right in your observation that Eq. 9 is not explicitly reproduced in this implementation; PyTorch's autograd machinery takes care of computing the sensitivities.
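
For anyone reading along, here is a minimal sketch of that point (the network and loss below are placeholder stand-ins, not the actual TOuNN classes): the chain rule of Eq. 9, dL/dtheta = (dL/drho)(drho/dtheta), is exactly what a single `loss.backward()` call evaluates.

```python
import torch

# Placeholder stand-in for the TOuNN density network: rho = NN(x, y; theta).
net = torch.nn.Sequential(
    torch.nn.Linear(2, 20), torch.nn.ReLU(),
    torch.nn.Linear(20, 1), torch.nn.Sigmoid(),
)
xy = torch.rand(100, 2)      # element-center coordinates (toy mesh)
rho = net(xy).squeeze()      # densities predicted by the network

# Placeholder objective; in TOuNN this would be compliance plus the
# volume-constraint penalty.
loss = rho.mean()

# backward() evaluates the chain rule of Eq. 9 implicitly:
# dL/dtheta = (dL/drho) * (drho/dtheta)
loss.backward()
print(net[0].weight.grad.shape)  # sensitivities w.r.t. the first layer's weights
```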

Let me add a few more points here.

At the time this particular code was written, we did not have an FE solver implemented in PyTorch. As a result, the library's autograd could not include the solver's computations in its graph.

We hence had to find a workaround. The implementation here works only for the compliance minimization problem (i.e., a self-adjoint problem).
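
To make the self-adjoint workaround concrete, here is a rough sketch of the idea (names like `fe_solve` are hypothetical, and this is a simplified illustration rather than the exact code in this repo): the FE solve happens outside the autograd graph, and the frozen element strain energies are divided by rho^penal so that differentiating the resulting expression reproduces the analytical compliance sensitivity dc/drho_e = -penal * rho_e^(penal-1) * u_e^T k0 u_e at the current design.

```python
import numpy as np
import torch

def fe_solve(rho_np):
    """Hypothetical external FE solver living outside the autograd graph
    (e.g., NumPy/SciPy). Returns per-element strain energies u_e^T k0 u_e.
    Stubbed with random values purely for illustration."""
    return np.random.rand(rho_np.size)

penal = 3.0
# Stand-in for the network output; in TOuNN the gradient would flow on to theta.
rho = torch.rand(100, requires_grad=True)

# FE solve and element energies are frozen at the current design.
with torch.no_grad():
    uku = torch.from_numpy(fe_solve(rho.detach().numpy())).float()
    Jelem = rho.detach() ** (2 * penal) * uku

# Differentiating Jelem / rho^penal at the current design gives
# -penal * rho^(penal-1) * u^T k0 u, the correct compliance sensitivity,
# even though the solver itself was never differentiated. This only works
# because compliance minimization is self-adjoint (no separate adjoint solve).
compliance = torch.sum(Jelem / rho ** penal)
compliance.backward()
print(rho.grad[:5])
```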

We subsequently released an implementation, https://github.com/UW-ERSL/JAXTOuNN, in which the solver is also written within an autograd-capable framework (JAX in this instance). I would recommend following that code instead of this one if you are looking to extend the method in other directions.
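
For contrast, a toy sketch of why having the solver inside the autodiff framework removes the need for the workaround above (the dense assembly below is purely illustrative, not JAXTOuNN's actual FE code): once K u = f is solved with `jnp.linalg.solve` inside the graph, `jax.grad` differentiates through the solve, so any objective, self-adjoint or not, gets exact sensitivities.

```python
import jax
import jax.numpy as jnp

def compliance(rho, k0, f):
    # Toy SIMP-style assembly: K = sum_e rho_e^3 * k0_e (dense, illustrative).
    K = jnp.einsum('e,eij->ij', rho ** 3.0, k0)
    u = jnp.linalg.solve(K, f)   # the linear solve is inside the autodiff graph
    return f @ u                 # compliance f^T u

n_dof, n_elem = 8, 5
key = jax.random.PRNGKey(0)
A = jax.random.normal(key, (n_elem, n_dof, n_dof))
k0 = jnp.einsum('eij,ekj->eik', A, A) + 1e-3 * jnp.eye(n_dof)  # SPD element matrices
f = jnp.ones(n_dof)
rho = jnp.full(n_elem, 0.5)

# Exact sensitivities dc/drho via autodiff, no manual adjoint derivation needed.
dc_drho = jax.grad(compliance)(rho, k0, f)
print(dc_drho)
```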

Cheers, Aadi