Open liuxinxjtu opened 2 years ago
Hi @liuxinxjtu, thanks for your interest.
The PSF is by default approximated. See the function PsfOtf(w,scale)
in "MLSIM_datagen/SIMulator_functions.py".
If you wanted to fine-tune the model using a PSF more closely resembling your system, you could replace this function call and load an external file.
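For concreteness, here is a minimal sketch of what such a replacement could look like. The function name, the use of `tifffile`, and the exact return values are assumptions on my part, not the actual ML-SIM API; you would need to match whatever `PsfOtf(w, scale)` returns in "MLSIM_datagen/SIMulator_functions.py" (e.g. whether the OTF is complex or magnitude-only).

```python
import numpy as np
import tifffile  # assumed dependency for reading a measured PSF


def psf_otf_from_file(psf_path, w):
    """Hypothetical drop-in replacement for PsfOtf(w, scale):
    load a measured 2D PSF from a TIFF file instead of using the
    analytic approximation."""
    psf = tifffile.imread(psf_path).astype(np.float64)

    # Centre-crop the measured PSF to the simulation grid size w x w
    # (padding would be needed instead if the measured PSF is smaller than w).
    cy, cx = np.array(psf.shape) // 2
    half = w // 2
    psf = psf[cy - half:cy + half, cx - half:cx + half]

    # Normalise the PSF so it integrates to 1.
    psf /= psf.sum()

    # The OTF is the centred Fourier transform of the PSF; here normalised
    # to unit peak magnitude.
    otf = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(psf)))
    otf = np.abs(otf) / np.abs(otf).max()

    return psf, otf
```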
Whether to use real data to fine-tune the model comes down to whether you have good reconstruction output from other methods to use as targets. If you have a low signal-to-noise ratio and you train with e.g. FairSIM as target, you will end up reproducing a lot of the artefacts that FairSIM generates.
Thanks. A question about the parameter opt.nrep used when generating the SIM data in your code: is its purpose to generate multiple sets of images with different orientations and phases, in order to demonstrate the robustness of the network with respect to orientation and phase?
@liuxinxjtu It seems that this setting actually augments the dataset and enables the network to learn a many-to-one mapping (to the GT). I guess, to learn this mapping, opt.nrep should be >= 2.
@Quma233 is correct that opt.nrep is intended for data augmentation. The robustness of the network comes from the large parameter space used for data synthesis. This in turn requires a large source image set or, alternatively, opt.nrep >= 2. For fine-tuning you can probably ignore this option.
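To make the augmentation idea concrete, here is a minimal sketch of what opt.nrep amounts to. The function names, the 3x3 orientation/phase layout, and the simplified forward model are illustrative assumptions, not the actual ML-SIM data generation code: each ground-truth image is turned into nrep raw SIM stacks, each with its own randomly drawn orientations and phases, all paired with the same GT.

```python
import numpy as np


def apply_sim_illumination(img, theta, phi, k=0.2):
    # Hypothetical stand-in for the real forward model: sinusoidal
    # illumination along direction theta with phase phi
    # (PSF blurring and noise are omitted for brevity).
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    pattern = 0.5 * (1 + np.cos(
        2 * np.pi * k * (xx * np.cos(theta) + yy * np.sin(theta)) + phi))
    return img * pattern


def synthesise_sim_stacks(gt_image, nrep, n_orientations=3, n_phases=3):
    """Generate nrep synthetic raw SIM stacks from one ground-truth image,
    each with randomly offset orientations and phases (many-to-one mapping)."""
    stacks = []
    for _ in range(nrep):
        angle_offset = np.random.uniform(0, np.pi)
        phase_offset = np.random.uniform(0, 2 * np.pi)
        frames = []
        for i in range(n_orientations):
            theta = angle_offset + i * np.pi / n_orientations
            for j in range(n_phases):
                phi = phase_offset + j * 2 * np.pi / n_phases
                frames.append(apply_sim_illumination(gt_image, theta, phi))
        stacks.append(np.stack(frames))  # one (9, H, W) raw SIM stack
    return stacks  # nrep stacks, all paired with the same gt_image
```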
What is the pixel size (in nanometers) mentioned in the paper?
Hi, I'm very interested in your work. Could you describe the PSF simulation process? Also, after training with simulated data, do you fine-tune the model with real data and then validate it?