Closed gopikrishnangs44 closed 2 years ago
Thanks for writing @gopikrishnangs44. For the 4x5 simulation, you should not specify:
Nested grid simulation? : T
Buffer zone (N S E W ) : 40 02 60 105
This should only be turned on for nested-grid simulations at 0.25 x 0.3125 resolution with GEOS-FP.
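For a global 4x5 run, the corresponding input.geos menu entries would normally look like the following (an illustrative sketch; exact spacing and defaults vary by GEOS-Chem version):

```
Nested grid simulation? : F
Buffer zone (N S E W )  : 0  0  0  0
```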
Dear @yantosca,
Thank you.
I generated the boundary conditions with a standard geosfp 4x5 simulation.
CopyRunDirs.input geosfp 4x5 - standard 2016070100 2016070200 -
and the nested run directory using: geosfp 025x03125 as tropchem 2016070100 2016070200 -
Can I run a tropchem nested simulation using BC files from a standard geosfp 4x5 run? Or do I need to create the BC files from a 4x5 tropchem run instead?
I tried running the nested grid, but the run failed with a Segmentation fault (core dumped). Output:
---> DATE: 2016/07/01 UTC: 00:00 X-HRS: 0.000000
HEMCO already called for this timestep. Returning.
NASA-GSFC Tracer Transport Module successfully initialized
HEMCO (VOLCANO): Opening /home/geoschem/ExtData/HEMCO/VOLCANO/v2019-08/2016/07/so2_volcanic_emissions_Carns.20160701.rc
--- Initialize surface boundary conditions from input file ---
--- Finished initializing surface boundary conditions ---
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%% USING O3 COLUMNS FROM THE MET FIELDS! %%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
###############################################################################
Program received signal SIGSEGV: Segmentation fault - invalid memory reference.
Backtrace for this error:
(the SIGSEGV message and backtrace above repeat once per OpenMP thread)
at /home/geoschem/GC/Code.12.9.3/GeosCore/strat_chem_mod.F90:676
at /tmp/geoschem/spack-stage/spack-stage-gcc-8.3.0-c73ig2rahj2nogs5oiuifwzjbhjhm2zy/spack-src/libgomp/team.c:120
Segmentation fault (core dumped)
This issue has been automatically marked as stale because it has not had recent activity. If there are no updates within 7 days it will be closed. You can add the "never stale" tag to prevent the Stale bot from closing this issue.
@gopikrishnangs44 Hello, I ran into the same problem when setting up a nested simulation over the Asia region. How can I fix it?
@SLF2265127725 I haven't fixed it yet.
@gopikrishnangs44 I tried it, and I found that an error is reported whenever the 0.5 x 0.625 resolution is selected. I also looked through the official website; it says the amount of data is too large and the array (stack) settings need to be increased, but I don't know how to change them.
http://wiki.seas.harvard.edu/geos-chem/index.php/Segmentation_faults#Invalid_memory_access
Thanks for writing @SLF2265127725. Have you set OMP_STACKSIZE and raised the stack-size limit to unlimited in your login environment? See https://geos-chem.readthedocs.io/en/latest/gcc-guide/01-startup/login-env-parallel.html
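As a concrete sketch, the settings described in that guide are usually added to the shell startup file (e.g. ~/.bashrc). The 500m value and the thread count below are illustrative defaults, not values taken from this thread; adjust them for your system:

```shell
# Remove the limit on the main program's stack
# (too small a stack is a common cause of SIGSEGV in GEOS-Chem)
ulimit -s unlimited

# Give each OpenMP thread a generous private stack
export OMP_STACKSIZE=500m

# Set the number of OpenMP threads explicitly
export OMP_NUM_THREADS=8
```

These must be in effect in the shell that launches the executable, not only in the session where the code was compiled.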
Closing due to inactivity
I am trying to do a nested-grid simulation in GEOS-Chem v12.9.3.
I am running a global 4x5 geosfp standard simulation to save the BC files.
The 4x5 simulation works quite well for me, and I am getting outputs as well.
Unfortunately, when I turn on
Nested grid simulation? : T
Buffer zone (N S E W ) : 40 02 60 105
I get a core-dump (segmentation fault). Attaching the log file here.
test_wo_BC.txt
Requesting help from the team.
Regards, Gopikrishnan
Software versions (loaded via spack):
spack load gcc@8.3.0
spack load openmpi
spack load hdf5
spack load netcdf-c
spack load netcdf-fortran
spack load esmf