Previously, I obtained the fixed version from issue #22, and it works like a charm for broadband wavelengths. To isolate wavelength-specific depth traces, I attempted to simulate monochromatic light. However, changing the wavelength from an array to a scalar value caused the simulation to freeze at 0%. Reverting to broadband wavelengths restored normal operation.
Memory trace for the broadband simulation: memory usage increases as progress increases.
model.MC.wavelength=350:850;
Memory trace for the monochromatic simulation: progress is stuck at 0%, with no memory increase and no GPU load, yet the CPU shows some load (CPU is below 10% when idle).
model.MC.wavelength=550;
This is what I got from the broadband simulation (refined NormalizedFluenceRate data):
I want to produce the same data for a monochromatic light source. What are the possible causes and solutions?
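In case it helps others hit by the same freeze, a possible interim workaround is to keep the wavelength input an array while making it effectively monochromatic. This is only a sketch based on the observation that array inputs run while scalar inputs hang; the degenerate-array form is an assumption, not a confirmed fix:

```matlab
% Assumption: the solver only hangs on scalar wavelength input, so we
% approximate monochromatic light with a degenerate (or very narrow) array.
model.MC.wavelength = [550 550];   % both entries at the target wavelength (nm)

% Alternatively, a very narrow band around 550 nm:
% model.MC.wavelength = 549:551;
```

If either form runs, that would also help narrow the bug down to the scalar-input code path rather than the monochromatic physics itself.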