@dpgrote I think you would be interested in this. The example above is for a 1d simulation and so I believe it uses the tridiagonal solver. Have you seen this same kind of thing with periodic boundary conditions before?
I can take a quick look at it this afternoon. Can you post the input file?
@dpgrote The attached file gives a more or less minimal failing example, with the SuperLU solver also included for reference: PICMI_input.txt
The solver itself seems to be working OK; the problem is happening in the ParallelCopy afterward. The tridiagonal solver copies the density into a temporary array, does the solve there, and then copies the result back into the phi MultiFab. The results in the temporary array are good, but something is going wrong in the copy. @WeiqunZhang is there something wrong with the ParallelCopy at line 868 in Source/FieldSolver/ElectrostaticSolver.cpp? It works fine with PEC and Neumann boundaries, but for periodic boundaries the first three values are being set to zero instead of being copied.
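To pin down where the zeros appear, something like the following rough debugging sketch can print the cells near the lower boundary before and after the copy (this is not WarpX code; it assumes the MultiFab names phi1d_mf and phi[lev] used above, and that phi1d_mf has one ghost cell in x):
for (amrex::MFIter mfi(phi1d_mf); mfi.isValid(); ++mfi) {
    const amrex::Box& bx = mfi.validbox();
    auto const& src = phi1d_mf.const_array(mfi);
    if (bx.smallEnd(0) == 0) {
        // temporary 1D solution near the lower domain boundary, including its ghost cell
        for (int i = -1; i < 4; ++i) {
            amrex::AllPrint() << "phi1d_mf(" << i << ") = " << src(i,0,0) << "\n";
        }
    }
}
// ... the ParallelCopy at line 868 happens here ...
for (amrex::MFIter mfi(*phi[lev]); mfi.isValid(); ++mfi) {
    const amrex::Box& bx = mfi.validbox();
    auto const& dst = phi[lev]->const_array(mfi);
    if (bx.smallEnd(0) == 0) {
        // destination phi after the copy; with periodic BC the first few values come back as zero
        for (int i = 0; i < 4; ++i) {
            amrex::AllPrint() << "phi(" << i << ") = " << dst(i,0,0) << "\n";
        }
    }
}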
@roelof-groenewald Could you try
diff --git a/Source/FieldSolver/ElectrostaticSolver.cpp b/Source/FieldSolver/ElectrostaticSolver.cpp
index c227506f6..aaf561bd2 100644
--- a/Source/FieldSolver/ElectrostaticSolver.cpp
+++ b/Source/FieldSolver/ElectrostaticSolver.cpp
@@ -865,7 +865,7 @@ WarpX::computePhiTriDiagonal (const amrex::Vector<std::unique_ptr<amrex::MultiFa
// Copy phi1d to phi, including the x guard cell
const IntVect xghost(AMREX_D_DECL(1,0,0));
- phi[lev]->ParallelCopy(phi1d_mf, 0, 0, 1, xghost, xghost, Geom(lev).periodicity());
+ phi[lev]->ParallelCopy(phi1d_mf, 0, 0, 1, IntVect(0), xghost, Geom(lev).periodicity());
}
phi1d_mf's ghost cells probably don't have valid data, so they should not be used as the source of the ParallelCopy.
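For context, the two IntVect arguments in that call are, as I understand the signature, the number of ghost cells read from the source and the number filled in the destination, so the change above stops reading phi1d_mf's ghost cells while still filling phi's x guard cell. An annotated sketch of the proposed call:
phi[lev]->ParallelCopy(phi1d_mf,                 // source MultiFab (the 1D temporary solution)
                       0, 0, 1,                  // source component, destination component, number of components
                       IntVect(0),               // ghost cells to read from the source: none
                       xghost,                   // ghost cells to fill in the destination (the x guard cell)
                       Geom(lev).periodicity()); // periodicity taken into account for the copy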
@WeiqunZhang I tried that and it didn't help, but it did change the result a little: now only the first two cells are being set to zero. This is strange, since the cells being set to zero are not ghost cells but are within the domain. Also, the ghost cells in phi1d_mf are set and do have valid data (see the lines just above the ParallelCopy).
@roelof-groenewald Could you provide an inputs file I can use without python?
Will do. I'll get that for you a bit later today.
@WeiqunZhang Here is the non-Python input, along with a Python script to plot the resulting Ez field values.
max_step = 5
warpx.verbose = 1
warpx.const_dt = 1.7e-08
warpx.use_filter = 0
warpx.serialize_initial_conditions = 1
warpx.do_dynamic_scheduling = 0
warpx.do_electrostatic = labframe
warpx.self_fields_required_precision = 1e-10
amr.n_cell = 96
amr.max_grid_size = 32
amr.max_level = 0
geometry.dims = 1
geometry.prob_lo = 0.0
geometry.prob_hi = 564.0
boundary.field_lo = periodic
boundary.field_hi = periodic
boundary.particle_lo = periodic
boundary.particle_hi = periodic
algo.particle_shape = 1
particles.species_names = elec ions
elec.mass = m_e
elec.charge = -q_e
elec.injection_style = nuniformpercell
elec.num_particles_per_cell_each_dim = 100
elec.momentum_distribution_type = constant
elec.ux = 0.0
elec.uy = 0.0
elec.uz = 0.0
elec.profile = parse_density_function
elec.density_function(x,y,z) = 1.0e5*(1+0.5*cos(0.01114039948081487*z))
ions.mass = 1.0
ions.charge = q_e
ions.injection_style = nuniformpercell
ions.num_particles_per_cell_each_dim = 100
ions.momentum_distribution_type = constant
ions.ux = 0.0
ions.uy = 0.0
ions.uz = 0.0
ions.profile = parse_density_function
ions.density_function(x,y,z) = 1.0e5
particles.do_tiling = 1
amrex.verbose = 1
geometry.coord_sys = 0
diagnostics.diags_names = diag1
diag1.diag_type = Full
diag1.intervals = 5
diag1.fields_to_plot = Ez
The current ES solver generates spurious fields close to the left boundary when run with periodic boundary conditions; see the plot of E_z below. Roelof already found a solution using a SuperLU solve, as shown in the bottom plot.