HomerReid / scuff-em

A comprehensive and full-featured computational physics suite for boundary-element analysis of electromagnetic scattering, fluctuation-induced phenomena (Casimir forces and radiative heat transfer), nanophotonics, RF device engineering, electrostatics, and more. Includes a core library with C++ and python APIs as well as many command-line applications.
http://www.homerreid.com/scuff-em
GNU General Public License v2.0

Nanodisk array, local field evaluates to NaN #65

Closed — gevero closed this issue 8 years ago

gevero commented 8 years ago

Hi Homer

This issue is strongly related to #64: the structure, setup, and meshes are exactly the same. I am trying to use scuff-scatter to evaluate the local field via an EPFile at a wavelength of 12000 nm, but all I get in the output is NaN for every field component. If I leave everything exactly as is but remove the periodic part

LATTICE
  VECTOR 12.0  0.0
  VECTOR 0.0  12.0
ENDLATTICE

from my .scuffgeo file, the computation works beautifully, as you can see below, so the faulty behavior must be attributed to the periodic computation.

Best

Giovanni

[attached image: field map for the isolated (non-periodic) nanodisk]

HomerReid commented 8 years ago

I think I know the reason for this and am considering possible fixes, but for the time being a workaround that has always worked for me is to displace the wavelength by a tiny amount so that it is no longer an integer divisor of the lattice period: for example, try a wavelength of 12.001 or 11.999 microns. Does that work better for you?

Another possibility is to try adding a small positive imaginary component to the frequency, corresponding to a small negative imaginary component to the wavelength: try --omega 0.52359+0.01i or --lambda 12-0.01i.
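For what it's worth, in scuff-em's natural units (lengths in microns, angular frequencies in units of 3e14 rad/s) the --Omega value is just 2*pi divided by the wavelength in microns, which is where 0.52359 comes from for a 12-micron wavelength. A quick sanity check of both suggested workarounds (the helper function here is my own, not part of scuff-em):

```python
import cmath

def lam_to_omega(lam_um):
    """Convert a (possibly complex) wavelength in microns to scuff-em's
    dimensionless --Omega (angular frequency in units of 3e14 rad/s)."""
    return 2 * cmath.pi / lam_um

# On-resonance value for lambda = 12 microns (Homer's 0.52359...)
print(f"{lam_to_omega(12.0).real:.5f}")

# A small negative imaginary part of lambda gives a small positive
# imaginary part of omega, as suggested above.
print(lam_to_omega(12.0 - 0.01j))
```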

gevero commented 8 years ago

Thanks Homer, that almost did the trick: I proceeded as you suggested.

Now I can get the field, and as expected the lattice mode exhibits a much stronger field than the isolated nanodisk. The only remaining problem is that the field inside the disk, as you can see below, is not computed correctly. It seems like a silly bug, but I would not be able to spot it myself; for you I guess it will be straightforward.

Thanks a lot

Giovanni

PS I would be happy to test hexagonal lattice properties, just let me know.

EDIT: Regarding the bug, it seems that inside the compact object only the positive quadrant is considered.

[attached image: field map with only the positive quadrant computed inside the disk]

gevero commented 8 years ago

One more update:

The plot below is exactly as above, except for a t = (1.0, 1.0) translation in the positive (x, y) direction. Now the cylinder resides entirely in the positive quadrant and the field is plotted correctly.

Best

Giovanni

[attached image: field map after the (1.0, 1.0) translation]

HomerReid commented 8 years ago

These are nice pictures! Note that you can use the --FVMesh option to SCUFF-SCATTER to produce figures like this automatically; see, for example, here:

http://homerreid.github.io/scuff-em-documentation/examples/DiffractionPatterns/DiffractionPatterns/

As for the (1,1) translation: Yes, there is a constraint that any compact objects in a periodic geometry must lie entirely within the unit cell, with the mesh not straddling the unit-cell boundary. (This can always be arranged simply by defining the unit cell and the lattice vectors appropriately.) I need to update the documentation to state this. (For non-compact objects the meshes may of course straddle the unit-cell boundary.)
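To make the constraint concrete, here is a hedged sketch of a .scuffgeo fragment that keeps the disk inside the unit cell by shifting it at geometry-definition time rather than re-meshing; the mesh file name and material are hypothetical, and I believe (but have not verified against this exact geometry) that the DISPLACED keyword is the intended mechanism:

```
# Lattice as in the original geometry. The disk mesh is assumed to be
# centered at the origin, so displace it by (1, 1, 0) to keep it
# entirely inside the 12 x 12 unit cell.
LATTICE
  VECTOR 12.0  0.0
  VECTOR  0.0 12.0
ENDLATTICE

OBJECT Nanodisk
  MESHFILE Nanodisk.msh
  MATERIAL Gold
  DISPLACED 1.0 1.0 0.0
ENDOBJECT
```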

gevero commented 8 years ago

Hi Homer

Thanks. I will be happy to set up a pull request for a new example including these plots. I am aware of the --FVMesh example, but I didn't use it for two main reasons: I was not sure it would work when the field-visualization mesh cuts through objects, and I prefer to keep everything within python.

Best

HomerReid commented 8 years ago

I understand. The --FVMesh option should actually work fine even if the field-visualization mesh cuts through objects, but I see your point about wanting to keep everything within python.

Actually, since your example is an illustration of the use of the python interface, it would be very nice if you would be willing to contribute some documentation, because I don't use the python interface and am not qualified to document it. Whenever you get a chance, just create a new directory for your example in the doc/docs/examples subdirectory, mimicking the style of the other examples in that directory, and submit a pull request. I'll do the work of incorporating your example into the documentation tree.

gevero commented 8 years ago

Hi Homer

Just one more question before closing this issue. Whenever I try to plot the local field of a periodic structure on a plotting plane close to, or crossing, a physical surface, I always run into the infamous out-of-memory error. No matter how coarsely I set the SCUFF_INTERPOLATION_TOLERANCE environment variable, the memory requirements eventually explode. If the plotting plane is far from any physical surface, everything is fine. Do you have any suggestion to cure this behavior?

Best

Giovanni

HomerReid commented 8 years ago

The size of the interpolation grid scales with the bounding box encompassing the evaluation points, so in some cases it can be much more efficient to break up your list of evaluation points into small clumps.

For example, if you have two evaluation points at coordinates (+10, +10, +10) and (-10, -10, -10), then the interpolation grid will need to span the entire volume of a cube of side length 20. That would be a large grid. On the other hand, if you did the same calculation with two separate EPFiles then the interpolation grid for each one would consist of just a single point.
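As an illustration of the clumping idea, here is a small sketch (the helper function is my own, not part of scuff-em; I'm assuming the plain EPFile layout of one "x y z" triple per line) that writes one EPFile per clump of points:

```python
# Sketch: split a long list of evaluation points into sequential clumps
# and write one EPFile per clump, so each scuff-scatter run only needs
# an interpolation grid spanning a small bounding box.
import math

def write_clumped_epfiles(points, clump_size, basename="clump"):
    """points: iterable of (x, y, z) tuples; returns the list of file names."""
    points = list(points)
    nfiles = math.ceil(len(points) / clump_size)
    names = []
    for i in range(nfiles):
        name = f"{basename}_{i}.EPFile"
        with open(name, "w") as f:
            for x, y, z in points[i * clump_size : (i + 1) * clump_size]:
                f.write(f"{x:g} {y:g} {z:g}\n")  # one evaluation point per line
        names.append(name)
    return names
```

Each file could then be passed to a separate scuff-scatter run via --EPFile and the field outputs concatenated afterwards. For a raster scan, sequential chunks are already spatially local (whole rows), which is what keeps each bounding box small.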

Could this be the explanation for what you are seeing? If not, I could take a look if you wanted to post an example in which memory is a problem.