dodaltuin / soft-tissue-pignn

Physics-informed graph neural network (GNN) emulation of soft-tissue mechanics
MIT License

Data Preprocessing for Custom Soft Tissue Models? #1

Closed AlanConnorChen closed 4 months ago

AlanConnorChen commented 4 months ago

Hello, thank you for providing the open-source code!

I now want to apply custom soft tissue data (such as kidney.stl and kidney.vtk) to the network, but I have some questions about the data preprocessing steps.

I noticed the TwistingBeamUnprocessed folder, which already contains files like elements.npy and interior-points.npy. The DATA_FORMAT_REQUIREMENTS.md also details the contents of each .npy file, but what preprocessing steps are needed to obtain these .npy files? Also, is the FE mesh mentioned in DATA_FORMAT_REQUIREMENTS.md generated by FEniCS as mentioned in the paper or obtained using other tools?

Could you please provide more detailed information regarding these aspects? Thank you in advance!

dodaltuin commented 4 months ago

Hi there,

Thanks for your message. You are absolutely correct: some pre-processing is required to obtain these .npy files. And yes, I used FEniCS in all cases.

To show how this pre-processing works, I have now uploaded a new Jupyter notebook FenicsMeshProcessingLV.ipynb. This notebook shows how to extract the coordinates, elements, fibre field, interior points and finally surface facets where the traction force is applied.

Unfortunately, the exact pre-processing steps will vary from mesh to mesh, so you will have to slightly adjust what I have done to get things working for your mesh.
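To give a rough idea of the kind of arrays involved, here is a minimal NumPy sketch (not code from the repo). The toy mesh, node numbering, and boundary markers here are illustrative assumptions; in practice the coordinates, connectivity, and boundary information come from your FEniCS mesh (e.g. `mesh.coordinates()` and `mesh.cells()` in legacy dolfin):

```python
import numpy as np

# Hypothetical toy mesh: 5 nodes, 2 tetrahedral elements.
# In practice these arrays would be extracted from your FEniCS mesh.
coordinates = np.array([
    [0.0,  0.0,  0.0],
    [1.0,  0.0,  0.0],
    [0.0,  1.0,  0.0],
    [0.0,  0.0,  1.0],
    [0.25, 0.25, 0.25],  # a node in the interior
])
elements = np.array([
    [0, 1, 2, 4],
    [0, 1, 3, 4],
])

# Node indices lying on the boundary (here: all nodes except node 4);
# with a real mesh these would come from the boundary facet markers.
boundary_node_ids = np.array([0, 1, 2, 3])

# Interior points = coordinates of nodes that are not on the boundary.
interior_mask = np.ones(len(coordinates), dtype=bool)
interior_mask[boundary_node_ids] = False
interior_points = coordinates[interior_mask]

np.save("elements.npy", elements)
np.save("interior-points.npy", interior_points)
```

The notebook does the equivalent extraction for the real LV mesh, including the fibre field and the surface facets where the traction is applied.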

If you want any more details, please feel free to comment below.

AlanConnorChen commented 4 months ago

Thank you for your reply! I have seen the files you uploaded and understand that the data preprocessing steps will vary depending on the original data. I would appreciate your ideas on two additional points:

1. Based on the file names, my current guess is that fibre_dir.xml contains the fibre direction information, lvgmsh_fmaker.xml provides the node force information for the FE mesh, and lvgmsh_mesh.xml provides the tetrahedral element mesh. I assume these files are also generated using FEniCS. Is there any related processing code for these files?

2. How is the real-node-features.npy file generated? I checked FenicsMeshProcessingLV.ipynb and data_process_utils.py, and neither of them shows how this file is produced.

Sorry for the late reply, and thank you again for taking the time to respond!

dodaltuin commented 4 months ago

I didn't actually write the processing code for the .xml files; this was done by Hao Gao, one of my co-authors. I can try to find this code. Don't you already have similar files for your own problem, though?

Regarding the generation of the node feature vectors, I have now included details in Section 6 of FenicsMeshProcessingLV.ipynb. The feature values are essentially the concatenation of the arrays interior_points and fibre_field (see Eq. (24) of the paper).
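In NumPy terms that construction is just a column-wise concatenation. The shapes below are illustrative assumptions (N interior nodes, 3 coordinate components, 3 fibre components); the exact feature layout should be checked against Section 6 of the notebook:

```python
import numpy as np

N = 4  # illustrative number of interior nodes
interior_points = np.random.rand(N, 3)  # nodal coordinates
fibre_field = np.random.rand(N, 3)      # fibre direction at each node

# Each node's feature vector is its coordinates followed by its
# fibre direction, per Eq. (24) of the paper.
real_node_features = np.concatenate([interior_points, fibre_field], axis=1)

np.save("real-node-features.npy", real_node_features)
```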

Note that you can also include global variables such as material parameters in the node features. This will likely yield better accuracy - the reason I didn't do this was purely for computational efficiency at prediction time, as outlined in the last paragraph of Section 2.3 of the paper.
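If you did want to include global variables, one simple way (a sketch, not code from the repo; the parameter values are hypothetical) is to broadcast each scalar parameter to a constant column and append it to the per-node features:

```python
import numpy as np

node_features = np.random.rand(5, 6)  # illustrative per-node features

# Hypothetical global material parameters (e.g. stiffness constants).
material_params = np.array([1.5, 0.3])

# Tile the global parameters into one constant column per parameter,
# then append those columns to every node's feature vector.
tiled = np.tile(material_params, (node_features.shape[0], 1))
augmented_features = np.concatenate([node_features, tiled], axis=1)
```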

AlanConnorChen commented 4 months ago

Thanks for the detailed response! I really appreciate you taking the time to point me to the relevant sections. I'll be sure to read through them carefully and see if I can adapt the approach for my own problem. Cheers!

dodaltuin commented 4 months ago

No problem. I'll close this issue for now, but if you can't get things working for your own problem, I'm happy to be of further help.