HasanAwad-Polito opened 10 months ago
This sounds like an instability issue. You can try decreasing the value of DT. If you look at OUTPUT_FILES/output_solver.txt, you will find the "Maximum suggested time step." If you have a complex model, some small elements may require even smaller time steps.
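To make that check concrete, a small script could scan the solver log for that line and compare it against the DT in the Par_file. This is only a sketch: the exact wording and number format of the "Maximum suggested time step" line may differ between SPECFEM3D versions, and the sample excerpt below is hypothetical.

```python
import re

def suggested_dt(solver_log: str):
    """Scan output_solver.txt text for the 'Maximum suggested time step' value.
    The exact line format is an assumption and may vary between versions."""
    m = re.search(r"Maximum suggested time step\s*=?\s*([0-9.eEdD+-]+)", solver_log)
    if m is None:
        return None
    # Fortran logs sometimes print d/D exponents; normalize before parsing.
    return float(m.group(1).replace("d", "e").replace("D", "E"))

# Hypothetical excerpt from OUTPUT_FILES/output_solver.txt:
sample = "*** Maximum suggested time step = 7.4E-03\n"
dt_max = suggested_dt(sample)

DT = 0.01  # value from the Par_file in this thread
if dt_max is not None and DT > dt_max:
    print(f"DT={DT} exceeds suggested maximum {dt_max}; reduce DT to avoid instability")
```

If DT exceeds the suggested maximum, the explicit time scheme can blow up, which typically surfaces as exactly this kind of floating-point exception.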
Best, Hom Nath
Hom Nath Gharti, PhD Assistant Professor | Digital Earth Scientist https://www.digitalearthscience.com/ Department of Geological Sciences and Geological Engineering Miller Hall 314, 36 Union St Queen’s University Kingston, ON K7L 3N6, Canada
Queen’s University is situated on traditional Anishinaabe and Haudenosaunee Territory.https://www.queensu.ca/encyclopedia/t/traditional-territories
From: HasanAwad-Polito
Sent: Wednesday, November 22, 2023 5:02 AM
To: SPECFEM/specfem3d
Subject: [SPECFEM/specfem3d] FLOATING POINT EXCEPTION (signal 8) when solving xspecfem3D (Issue #1654)
Program received signal SIGFPE: Floating-point exception - erroneous arithmetic operation.
Backtrace for this error:
at src/specfem3D/compute_forces_viscoelastic.F90:692
at src/specfem3D/compute_forces_viscoelastic_calling_routine.F90:125
at src/specfem3D/iterate_time.F90:279
at src/specfem3D/specfem3D.F90:405
at src/specfem3D/specfem3D.F90:356
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= PID 1134 RUNNING AT DESKTOP-F049AN8
= EXIT CODE: 8
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
YOUR APPLICATION TERMINATED WITH THE EXIT STRING: Floating point exception (signal 8)
This typically refers to a problem with your application. Please see the FAQ page for debugging suggestions.
COULD ANYONE ASSIST?
Hi Homnath,
Thank you, yes, this now works perfectly fine! As I am new to SPECFEM3D Cartesian, would you kindly assist with another issue? I am trying to generate displacement waveforms for a real earthquake, but when plotting the .semd files I get straight lines rather than normal seismic waves. Here is my configuration.
MESH PAR FILE>
#-----------------------------------------------------------
#
# Meshing input parameters
#
#-----------------------------------------------------------
# coordinates of mesh block in latitude/longitude and depth in km
LATITUDE_MIN = -43.00
LATITUDE_MAX = 45.00
LONGITUDE_MIN = -11.59
LONGITUDE_MAX = 12.50
DEPTH_BLOCK_KM = 80.d0
UTM_PROJECTION_ZONE = 32
SUPPRESS_UTM_PROJECTION = .false.
# long max should be 11.61, but because of the station we put 12.50
# file that contains the interfaces of the model / mesh
INTERFACES_FILE = interfaces.txt
# file that contains the cavity
CAVITY_FILE = no_cavity.dat
# number of elements at the surface along edges of the mesh at the surface
# (must be 8 * multiple of NPROC below if mesh is not regular and contains mesh doublings)
# (must be multiple of NPROC below if mesh is regular)
NEX_XI = 36
NEX_ETA = 36
# number of MPI processors along xi and eta (can be different)
NPROC_XI = 1
NPROC_ETA = 1
#-----------------------------------------------------------
#
# Doubling layers
#
#-----------------------------------------------------------
# Regular/irregular mesh
USE_REGULAR_MESH = .true.
# Only for irregular meshes, number of doubling layers and their position
NDOUBLINGS = 0
# NZ_DOUBLING_1 is the parameter to set up if there is only one doubling layer
# (more doubling entries can be added if needed to match NDOUBLINGS value)
NZ_DOUBLING_1 = 40
NZ_DOUBLING_2 = 48
#-----------------------------------------------------------
#
# Visualization
#
#-----------------------------------------------------------
# create mesh files for visualisation or further checking
CREATE_ABAQUS_FILES = .false.
CREATE_DX_FILES = .false.
CREATE_VTK_FILES = .true.
# stores mesh files as cubit-exported files into directory MESH/ (for single process run)
SAVE_MESH_AS_CUBIT = .false.
# path to store the databases files
LOCAL_PATH = ./OUTPUT_FILES/DATABASES_MPI
#-----------------------------------------------------------
#
# CPML
#
#-----------------------------------------------------------
# CPML perfectly matched absorbing layers
THICKNESS_OF_X_PML = 12.3d0
THICKNESS_OF_Y_PML = 12.3d0
THICKNESS_OF_Z_PML = 12.3d0
#-----------------------------------------------------------
#
# Domain materials
#
#-----------------------------------------------------------
# number of materials
NMATERIALS = 1
# define the different materials in the model as:
# #material_id #rho #vp #vs #Q_Kappa #Q_mu #anisotropy_flag #domain_id
# Q_Kappa : Q_Kappa attenuation quality factor
# Q_mu : Q_mu attenuation quality factor
# anisotropy_flag : 0 = no anisotropy / 1,2,... check the implementation in file aniso_model.f90
# domain_id : 1 = acoustic / 2 = elastic
1 3012.0 7100.0 3990.0 2444.4 300.0 0 2
#-----------------------------------------------------------
#
# Domain regions
#
#-----------------------------------------------------------
# number of regions
NREGIONS = 1
# define the different regions of the model as :
#NEX_XI_BEGIN #NEX_XI_END #NEX_ETA_BEGIN #NEX_ETA_END #NZ_BEGIN #NZ_END #material_id
1 36 1 36 1 16 1
CMTSOLUTION FILE>
PDE 2023 09 18 03 10 14.00 44.05 11.59 8.0 4.9 4.9 Marradi_2023
event name: Marradi_2023
time shift: 0.0000
half duration: 0.0000
latorUTM: 44.05
longorUTM: 11.59
depth: 6.0
Mrr: -3.001497E+23
Mtt: 3.041725E+23
Mpp: -4.0229E+21
Mrt: -4.7888E+21
Mrp: -4.03642E+22
Mtp: -9.30516E+22
STATIONS FILE
CPGN IV 43.8011 12.3205 0.0 0.0
X30 IV -44.05 -11.59 0.0 0.0
X30 is the epicenter. Do you have any idea?
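For reference, SPECFEM3D's ASCII `.semd` seismograms are plain two-column text (time in seconds, amplitude), so a flat trace can be checked numerically rather than by eye. The sketch below uses a synthetic stand-in; the real file name (e.g. something like `OUTPUT_FILES/IV.CPGN.BXZ.semd`) is an assumption and depends on your network/station/channel naming.

```python
import numpy as np

def trace_stats(path):
    """Load a SPECFEM3D ASCII seismogram (two columns: time, amplitude)
    and return (record length in s, peak absolute amplitude)."""
    data = np.loadtxt(path)
    t, amp = data[:, 0], data[:, 1]
    return float(t[-1] - t[0]), float(np.abs(amp).max())

# Synthetic stand-in written in the same two-column format:
t = np.arange(0.0, 50.0, 0.01)
np.savetxt("demo.semd", np.column_stack([t, np.sin(2 * np.pi * 0.5 * t)]))
length, peak = trace_stats("demo.semd")
# A peak of ~0 would mean a genuinely flat trace: either the wavefield never
# reached the station within the simulated window, or the amplitudes are
# negligible (e.g. source or station placed outside the mesh).
```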
What does your Par_file look like? You may have set a very short duration so that the wave hasn't reached the seismogram yet.
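As a rough check of that point, using only values quoted in this thread (NSTEP = 5000, DT = 0.01 s, vs = 3990 m/s, event at 44.05/11.59 at 6 km depth, station CPGN at 43.8011/12.3205): the record is 50 s long, and a straight-ray S arrival in a uniform half-space should land well inside it. This is back-of-the-envelope arithmetic, not SPECFEM output.

```python
from math import radians, sin, cos, asin, sqrt, hypot

def epicentral_km(lat1, lon1, lat2, lon2):
    """Great-circle distance via the haversine formula (km)."""
    p1, p2 = radians(lat1), radians(lat2)
    dlat, dlon = p2 - p1, radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(p1) * cos(p2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

d_epi = epicentral_km(44.05, 11.59, 43.8011, 12.3205)  # CMT -> station CPGN
d = hypot(d_epi, 6.0)          # include the 6 km source depth
t_s = d / 3.99                 # vs = 3990 m/s from the material line, in km/s
window = 5000 * 0.01           # NSTEP * DT = 50 s of simulated record
print(f"S arrival ~{t_s:.1f} s vs {window:.0f} s record")
```

So for CPGN the window looks long enough (~16 s arrival vs 50 s record); a flat trace there would point to something else, such as station or source coordinates falling outside the mesh.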
Here is my Par_file:
#-----------------------------------------------------------
#
# Simulation input parameters
#
#-----------------------------------------------------------
# forward or adjoint simulation
# 1 = forward, 2 = adjoint, 3 = both simultaneously
SIMULATION_TYPE = 1
# 0 = earthquake simulation, 1/2/3 = three steps in noise simulation
NOISE_TOMOGRAPHY = 0
SAVE_FORWARD = .false.
# solve a full FWI inverse problem from a single calling program with no I/Os, storing everything in memory,
# or run a classical forward or adjoint problem only and save the seismograms and/or sensitivity kernels to disk (with costlier I/Os)
INVERSE_FWI_FULL_PROBLEM = .false.
# UTM projection parameters
# Use a negative zone number for the Southern hemisphere:
# The Northern hemisphere corresponds to zones +1 to +60,
# The Southern hemisphere corresponds to zones -1 to -60.
UTM_PROJECTION_ZONE = 32
SUPPRESS_UTM_PROJECTION = .false.
# number of MPI processors
NPROC = 1
# time step parameters
NSTEP = 5000
DT = 0.01
# set to true to use local-time stepping (LTS)
LTS_MODE = .false.
# Partitioning algorithm for decompose_mesh
# choose partitioner: 1==SCOTCH (default), 2==METIS, 3==PATOH, 4==ROWS_PART
PARTITIONING_TYPE = 1
#-----------------------------------------------------------
#
# LDDRK time scheme
#
#-----------------------------------------------------------
USE_LDDRK = .false.
INCREASE_CFL_FOR_LDDRK = .false.
RATIO_BY_WHICH_TO_INCREASE_IT = 1.4
#-----------------------------------------------------------
#
# Mesh
#
#-----------------------------------------------------------
# Number of nodes for 2D and 3D shape functions for hexahedra.
# We use either 8-node mesh elements (bricks) or 27-node elements.
# If you use our internal mesher, the only option is 8-node bricks (27-node elements are not supported).
NGNOD = 8
# models:
# available options are:
# default (model parameters described by mesh properties)
# 1D models available are:
# 1d_prem,1d_socal,1d_cascadia
# 3D models available are:
# aniso,external,gll,salton_trough,tomo,SEP,coupled,...
MODEL = default
# path for external tomographic models files
TOMOGRAPHY_PATH = DATA/tomo_files/
# if you are using a SEP model (oil-industry format)
SEP_MODEL_DIRECTORY = DATA/my_SEP_model/
#-----------------------------------------------------------
# parameters describing the model
APPROXIMATE_OCEAN_LOAD = .false.
TOPOGRAPHY = .false.
ATTENUATION = .false.
ANISOTROPY = .false.
GRAVITY = .false.
# in case of attenuation, reference frequency in Hz at which the velocity values in the velocity model are given (unused otherwise)
ATTENUATION_f0_REFERENCE = 18.d0
# attenuation period range over which we try to mimic a constant Q factor
MIN_ATTENUATION_PERIOD = 999999998.d0
MAX_ATTENUATION_PERIOD = 999999999.d0
# ignore this range and ask the code to compute it automatically instead based on the estimated resolution of the mesh (use this unless you know what you are doing)
COMPUTE_FREQ_BAND_AUTOMATIC = .true.
# Olsen's constant for Q_mu = constant * V_s attenuation rule
USE_OLSEN_ATTENUATION = .false.
OLSEN_ATTENUATION_RATIO = 0.05
#-----------------------------------------------------------
#
# Absorbing boundary conditions
#
#-----------------------------------------------------------
# C-PML boundary conditions for a regional simulation
# (if set to .false., and STACEY_ABSORBING_CONDITIONS is also set to .false., you get a free surface instead
# in the case of elastic or viscoelastic mesh elements, and a rigid surface in the case of acoustic (fluid) elements
PML_CONDITIONS = .false.
# C-PML top surface
PML_INSTEAD_OF_FREE_SURFACE = .false.
# C-PML dominant frequency
f0_FOR_PML = 0.05555
# parameters used to rotate C-PML boundary conditions by a given angle (not completed yet)
# ROTATE_PML_ACTIVATE = .false.
# ROTATE_PML_ANGLE = 0.
# absorbing boundary conditions for a regional simulation
# (if set to .false., and PML_CONDITIONS is also set to .false., you get a free surface instead
# in the case of elastic or viscoelastic mesh elements, and a rigid surface in the case of acoustic (fluid) elements
STACEY_ABSORBING_CONDITIONS = .true.
# absorbing top surface (defined in mesh as 'free_surface_file')
STACEY_INSTEAD_OF_FREE_SURFACE = .false.
# When STACEY_ABSORBING_CONDITIONS is set to .true. :
# absorbing conditions are defined in xmin, xmax, ymin, ymax and zmin
# this option BOTTOM_FREE_SURFACE can be set to .true. to
# make zmin free surface instead of absorbing condition
BOTTOM_FREE_SURFACE = .false.
#-----------------------------------------------------------
#
# undoing attenuation and/or PMLs for sensitivity kernel calculations
#
#-----------------------------------------------------------
# to undo attenuation and/or PMLs for sensitivity kernel calculations or forward runs with SAVE_FORWARD
# use the flag below. It performs undoing of attenuation and/or of PMLs in an exact way for sensitivity kernel calculations
# but requires disk space for temporary storage, and uses a significant amount of memory used as buffers for temporary storage.
# When that option is on the second parameter indicates how often the code dumps restart files to disk (if in doubt, use something between 100 and 1000).
UNDO_ATTENUATION_AND_OR_PML = .false.
NT_DUMP_ATTENUATION = 500
#-----------------------------------------------------------
#
# Visualization
#
#-----------------------------------------------------------
# save AVS or OpenDX movies
# MOVIE_TYPE = 1 to show the top surface
# MOVIE_TYPE = 2 to show all the external faces of the mesh
CREATE_SHAKEMAP = .false.
MOVIE_SURFACE = .false.
MOVIE_TYPE = 1
MOVIE_VOLUME = .false.
SAVE_DISPLACEMENT = .false.
MOVIE_VOLUME_STRESS = .false.
USE_HIGHRES_FOR_MOVIES = .false.
NTSTEP_BETWEEN_FRAMES = 200
HDUR_MOVIE = 0.0
# save AVS or OpenDX mesh files to check the mesh
SAVE_MESH_FILES = .true.
# path to store the local database file on each node
LOCAL_PATH = OUTPUT_FILES/DATABASES_MPI
# interval at which we output time step info and max of norm of displacement
NTSTEP_BETWEEN_OUTPUT_INFO = 500
#-----------------------------------------------------------
#
# Sources
#
#-----------------------------------------------------------
# sources and receivers Z coordinates given directly (i.e. as their true position) instead of as their depth
USE_SOURCES_RECEIVERS_Z = .false.
# use a (tilted) FORCESOLUTION force point source (or several) instead of a CMTSOLUTION moment-tensor source.
# This can be useful e.g. for oil industry foothills simulations or asteroid simulations
# in which the source is a vertical force, normal force, tilted force, impact etc.
# If this flag is turned on, the FORCESOLUTION file must be edited by giving:
# - the corresponding time-shift parameter,
# - the half duration (hdur, in s) for Gaussian/Step function, dominant frequency (f0, in Hz) for Ricker,
# - the coordinates of the source,
# - the source time function type (0=Gaussian function, 1=Ricker wavelet, 2=Step function),
# - the magnitude of the force source,
# - the components of a (non necessarily unitary) direction vector for the force source in the E/N/Z_UP basis.
# The direction vector is made unitary internally in the code and thus only its direction matters here;
# its norm is ignored and the norm of the force used is the factor force source times the source time function.
USE_FORCE_POINT_SOURCE = .false.
# set to true to use a Ricker source time function instead of the source time functions set by default
# to represent a (tilted) FORCESOLUTION force point source or a CMTSOLUTION moment-tensor source.
USE_RICKER_TIME_FUNCTION = .false.
# use an external source time function
# you must add a file with your source time function and the file name path
# relative to working directory at the end of CMTSOLUTION or FORCESOLUTION file
# (with multiple sources, one file per source is required).
# This file must have a single column containing the amplitudes of the source at all time steps;
# time step size used must be equal to DT as defined at the beginning of this Par_file.
# Be sure when this option is .false. to remove the name of stf file in CMTSOLUTION or FORCESOLUTION
USE_EXTERNAL_SOURCE_FILE = .false.
# print source time function
PRINT_SOURCE_TIME_FUNCTION = .false.
# source encoding
# (for acoustic simulations only for now) determines source encoding factor +/-1 depending on sign of moment tensor
# (see e.g. Krebs et al., 2009. Fast full-wavefield seismic inversion using encoded sources, Geophysics, 74 (6), WCC177-WCC188.)
USE_SOURCE_ENCODING = .false.
#-----------------------------------------------------------
#
# Seismograms
#
#-----------------------------------------------------------
# interval in time steps for writing of seismograms
NTSTEP_BETWEEN_OUTPUT_SEISMOS = 10000
# set to n to reduce the sampling rate of output seismograms by a factor of n
# defaults to 1, which means no down-sampling
NTSTEP_BETWEEN_OUTPUT_SAMPLE = 1
# decide if we save displacement, velocity, acceleration and/or pressure in forward runs (they can be set to true simultaneously)
# currently pressure seismograms are implemented in acoustic (i.e. fluid) elements only
SAVE_SEISMOGRAMS_DISPLACEMENT = .true.
SAVE_SEISMOGRAMS_VELOCITY = .false.
SAVE_SEISMOGRAMS_ACCELERATION = .false.
SAVE_SEISMOGRAMS_PRESSURE = .false. # currently implemented in acoustic (i.e. fluid) elements only
# option to save strain seismograms
# this option is useful for strain Green's tensor
SAVE_SEISMOGRAMS_STRAIN = .false.
# save seismograms also when running the adjoint runs for an inverse problem
# (usually they are unused and not very meaningful, leave this off in almost all cases)
SAVE_SEISMOGRAMS_IN_ADJOINT_RUN = .true.
# save seismograms in binary or ASCII format (binary is smaller but may not be portable between machines)
USE_BINARY_FOR_SEISMOGRAMS = .false.
# output seismograms in Seismic Unix format (binary with 240-byte-headers)
SU_FORMAT = .false.
# output seismograms in ASDF (requires asdf-library)
ASDF_FORMAT = .false.
# output seismograms in HDF5 (requires hdf5-library and WRITE_SEISMOGRAMS_BY_MAIN)
HDF5_FORMAT = .false.
# decide if main process writes all the seismograms or if all processes do it in parallel
WRITE_SEISMOGRAMS_BY_MAIN = .false.
# save all seismograms in one large combined file instead of one file per seismogram
# to avoid overloading shared non-local file systems such as LUSTRE or GPFS for instance
SAVE_ALL_SEISMOS_IN_ONE_FILE = .false.
# use a trick to increase accuracy of pressure seismograms in fluid (acoustic) elements:
# use the second derivative of the source for the source time function instead of the source itself,
# and then record -potential_acoustic() as pressure seismograms instead of -potential_dot_dot_acoustic();
# this is mathematically equivalent, but numerically significantly more accurate because in the explicit
# Newmark time scheme acceleration is accurate at zeroth order while displacement is accurate at second order,
# thus in fluid elements potential_dot_dot_acoustic() is accurate at zeroth order while potential_acoustic()
# is accurate at second order and thus contains significantly less numerical noise.
USE_TRICK_FOR_BETTER_PRESSURE = .false.
#-----------------------------------------------------------
#
# Energy calculation
#
#-----------------------------------------------------------
# to plot energy curves, for instance to monitor how CPML absorbing layers behave;
# should be turned OFF in most cases because a bit expensive
OUTPUT_ENERGY = .false.
# every how many time steps we compute energy (which is a bit expensive to compute)
NTSTEP_BETWEEN_OUTPUT_ENERGY = 10
#-----------------------------------------------------------
#
# Adjoint kernel outputs
#
#-----------------------------------------------------------
# interval in time steps for reading adjoint traces
# 0 = read the whole adjoint sources at start time
NTSTEP_BETWEEN_READ_ADJSRC = 0
# read adjoint sources using ASDF (requires asdf-library)
READ_ADJSRC_ASDF = .false.
# this parameter must be set to .true. to compute anisotropic kernels
# in crust and mantle (related to the 21 Cij in geographical coordinates)
# default is .false. to compute isotropic kernels (related to alpha and beta)
ANISOTROPIC_KL = .false.
# compute transverse isotropic kernels (alpha_v,alpha_h,beta_v,beta_h,eta,rho)
# rather than fully anisotropic kernels in case ANISOTROPIC_KL is set to .true.
SAVE_TRANSVERSE_KL = .false.
# this parameter must be set to .true. to compute anisotropic kernels for
# cost function using velocity observable rather than displacement
ANISOTROPIC_VELOCITY_KL = .false.
# outputs approximate Hessian for preconditioning
APPROXIMATE_HESS_KL = .false.
# save Moho mesh and compute Moho boundary kernels
SAVE_MOHO_MESH = .false.
#-----------------------------------------------------------
#
# Coupling with an injection technique (DSM, AxiSEM, or FK)
#
#-----------------------------------------------------------
COUPLE_WITH_INJECTION_TECHNIQUE = .false.
INJECTION_TECHNIQUE_TYPE = 3 # 1 = DSM, 2 = AxiSEM, 3 = FK
MESH_A_CHUNK_OF_THE_EARTH = .false.
TRACTION_PATH = DATA/AxiSEM_tractions/3/
FKMODEL_FILE = FKmodel
RECIPROCITY_AND_KH_INTEGRAL = .false. # does not work yet
#-----------------------------------------------------------
#
# Run modes
#
#-----------------------------------------------------------
# Simultaneous runs
# added the ability to run several calculations (several earthquakes)
# in an embarrassingly-parallel fashion from within the same run;
# this can be useful when using a very large supercomputer to compute
# many earthquakes in a catalog, in which case it can be better from
# a batch job submission point of view to start fewer and much larger jobs,
# each of them computing several earthquakes in parallel.
# To turn that option on, set parameter NUMBER_OF_SIMULTANEOUS_RUNS to a value greater than 1.
# To implement that, we create NUMBER_OF_SIMULTANEOUS_RUNS MPI sub-communicators,
# each of them being labeled "my_local_mpi_comm_world", and we use them
# in all the routines in "src/shared/parallel.f90", except in MPI_ABORT() because in that case
# we need to kill the entire run.
# When that option is on, of course the number of processor cores used to start
# the code in the batch system must be a multiple of NUMBER_OF_SIMULTANEOUS_RUNS,
# all the individual runs must use the same number of processor cores,
# which as usual is NPROC in the Par_file,
# and thus the total number of processor cores to request from the batch system
# should be NUMBER_OF_SIMULTANEOUS_RUNS * NPROC.
# All the runs to perform must be placed in directories called run0001, run0002, run0003 and so on
# (with exactly four digits).
#
# Imagine you have 10 independent calculations to do, each of them on 100 cores; you have three options:
#
# 1/ submit 10 jobs to the batch system
#
# 2/ submit a single job on 1000 cores to the batch, and in that script create a sub-array of jobs to start 10 jobs,
# each running on 100 cores (see e.g. http://www.schedmd.com/slurmdocs/job_array.html )
#
# 3/ submit a single job on 1000 cores to the batch, start SPECFEM3D on 1000 cores, create 10 sub-communicators,
# cd into one of 10 subdirectories (called e.g. run0001, run0002,... run0010) depending on the sub-communicator
# your MPI rank belongs to, and run normally on 100 cores using that sub-communicator.
#
# The option below implements 3/.
#
NUMBER_OF_SIMULTANEOUS_RUNS = 1
# if we perform simultaneous runs in parallel, if only the source and receivers vary between these runs
# but not the mesh nor the model (velocity and density) then we can also read the mesh and model files
# from a single run in the beginning and broadcast them to all the others; for a large number of simultaneous
# runs for instance when solving inverse problems iteratively this can DRASTICALLY reduce I/Os to disk in the solver
# (by a factor equal to NUMBER_OF_SIMULTANEOUS_RUNS), and reducing I/Os is crucial in the case of huge runs.
# Thus, always set this option to .true. if the mesh and the model are the same for all simultaneous runs.
# In that case there is no need to duplicate the mesh and model file database (the content of the DATABASES_MPI
# directories) in each of the run0001, run0002,... directories, it is sufficient to have one in run0001
# and the code will broadcast it to the others)
BROADCAST_SAME_MESH_AND_MODEL = .true.
#-----------------------------------------------------------
# set to true to use GPUs
GPU_MODE = .false.
# ADIOS Options for I/Os
ADIOS_ENABLED = .false.
ADIOS_FOR_DATABASES = .false.
ADIOS_FOR_MESH = .false.
ADIOS_FOR_FORWARD_ARRAYS = .false.
ADIOS_FOR_KERNELS = .false.
ADIOS_FOR_UNDO_ATTENUATION = .false.
# HDF5 Database I/O
# (note the flags for HDF5 and ADIOS are mutually exclusive, only one can be used)
HDF5_ENABLED = .false.
HDF5_FOR_MOVIES = .false.
HDF5_IO_NODES = 0 # HDF5 IO server with number of IO dedicated procs
Can you send me the OUTPUT_FILES/output_solver.txt file?
Hom Nath Gharti, PhD Assistant Professor | Digital Earth Scientist https://www.digitalearthscience.com/ Department of Geological Sciences and Geological Engineering Miller Hall 314, 36 Union St Queen’s University Kingston, ON K7L 3N6, Canada
Queen’s University is situated on traditional Anishinaabe and Haudenosaunee Territory.https://www.queensu.ca/encyclopedia/t/traditional-territories
From: HasanAwad-Polito @.> Sent: Wednesday, November 22, 2023 11:01 AM To: SPECFEM/specfem3d @.> Cc: Hom Nath Gharti @.>; Comment @.> Subject: Re: [SPECFEM/specfem3d] FLOATING POINT EXCEPTION (signal 8) when solving xspecfem3D (Issue #1654)
Here is my par file
Simulation input parameters
forward or adjoint simulation 1 = forward, 2 = adjoint, 3 = both simultaneously
SIMULATION_TYPE = 1
0 = earthquake simulation, 1/2/3 = three steps in noise simulation
NOISE_TOMOGRAPHY = 0 SAVE_FORWARD = .false.
solve a full FWI inverse problem from a single calling program with no I/Os, storing everything in memory, or run a classical forward or adjoint problem only and save the seismograms and/or sensitivity kernels to disk (with costlier I/Os)
INVERSE_FWI_FULL_PROBLEM = .false.
UTM projection parameters Use a negative zone number for the Southern hemisphere: The Northern hemisphere corresponds to zones +1 to +60, The Southern hemisphere corresponds to zones -1 to -60.
UTM_PROJECTION_ZONE = 32 SUPPRESS_UTM_PROJECTION = .false.
number of MPI processors
NPROC = 1
time step parameters
NSTEP = 5000 DT = 0.01
set to true to use local-time stepping (LTS)
LTS_MODE = .false.
Partitioning algorithm for decompose_mesh choose partitioner: 1==SCOTCH (default), 2==METIS, 3==PATOH, 4==ROWS_PART
PARTITIONING_TYPE = 1
LDDRK time scheme
USE_LDDRK = .false. INCREASE_CFL_FOR_LDDRK = .false. RATIO_BY_WHICH_TO_INCREASE_IT = 1.4
Mesh
Number of nodes for 2D and 3D shape functions for hexahedra. We use either 8-node mesh elements (bricks) or 27-node elements. If you use our internal mesher, the only option is 8-node bricks (27-node elements are not supported).
NGNOD = 8
models: available options are: default (model parameters described by mesh properties) 1D models available are: 1d_prem,1d_socal,1d_cascadia 3D models available are: aniso,external,gll,salton_trough,tomo,SEP,coupled,...
MODEL = default
path for external tomographic models files
TOMOGRAPHY_PATH = DATA/tomo_files/
if you are using a SEP model (oil-industry format)
SEP_MODEL_DIRECTORY = DATA/my_SEP_model/
parameters describing the model
APPROXIMATE_OCEAN_LOAD = .false. TOPOGRAPHY = .false. ATTENUATION = .false. ANISOTROPY = .false. GRAVITY = .false.
in case of attenuation, reference frequency in Hz at which the velocity values in the velocity model are given (unused otherwise)
ATTENUATION_f0_REFERENCE = 18.d0
attenuation period range over which we try to mimic a constant Q factor
MIN_ATTENUATION_PERIOD = 999999998.d0 MAX_ATTENUATION_PERIOD = 999999999.d0
ignore this range and ask the code to compute it automatically instead based on the estimated resolution of the mesh (use this unless you know what you are doing)
COMPUTE_FREQ_BAND_AUTOMATIC = .true.
Olsen's constant for Q_mu = constant * V_s attenuation rule
USE_OLSEN_ATTENUATION = .false. OLSEN_ATTENUATION_RATIO = 0.05
Absorbing boundary conditions
C-PML boundary conditions for a regional simulation (if set to .false., and STACEY_ABSORBING_CONDITIONS is also set to .false., you get a free surface instead in the case of elastic or viscoelastic mesh elements, and a rigid surface in the case of acoustic (fluid) elements
PML_CONDITIONS = .false.
C-PML top surface
PML_INSTEAD_OF_FREE_SURFACE = .false.
C-PML dominant frequency
f0_FOR_PML = 0.05555
parameters used to rotate C-PML boundary conditions by a given angle (not completed yet) ROTATE_PML_ACTIVATE = .false. ROTATE_PML_ANGLE = 0. absorbing boundary conditions for a regional simulation (if set to .false., and PML_CONDITIONS is also set to .false., you get a free surface instead in the case of elastic or viscoelastic mesh elements, and a rigid surface in the case of acoustic (fluid) elements
STACEY_ABSORBING_CONDITIONS = .true.
absorbing top surface (defined in mesh as 'free_surface_file')
STACEY_INSTEAD_OF_FREE_SURFACE = .false.
When STACEY_ABSORBING_CONDITIONS is set to .true. : absorbing conditions are defined in xmin, xmax, ymin, ymax and zmin this option BOTTOM_FREE_SURFACE can be set to .true. to make zmin free surface instead of absorbing condition
BOTTOM_FREE_SURFACE = .false.
undoing attenuation and/or PMLs for sensitivity kernel calculations
to undo attenuation and/or PMLs for sensitivity kernel calculations or forward runs with SAVE_FORWARD use the flag below. It performs undoing of attenuation and/or of PMLs in an exact way for sensitivity kernel calculations but requires disk space for temporary storage, and uses a significant amount of memory used as buffers for temporary storage. When that option is on the second parameter indicates how often the code dumps restart files to disk (if in doubt, use something between 100 and 1000).
UNDO_ATTENUATION_AND_OR_PML = .false. NT_DUMP_ATTENUATION = 500
Visualization
save AVS or OpenDX movies MOVIE_TYPE = 1 to show the top surface MOVIE_TYPE = 2 to show all the external faces of the mesh
CREATE_SHAKEMAP = .false. MOVIE_SURFACE = .false. MOVIE_TYPE = 1 MOVIE_VOLUME = .false. SAVE_DISPLACEMENT = .false. MOVIE_VOLUME_STRESS = .false. USE_HIGHRES_FOR_MOVIES = .false. NTSTEP_BETWEEN_FRAMES = 200 HDUR_MOVIE = 0.0
save AVS or OpenDX mesh files to check the mesh
SAVE_MESH_FILES = .true.
path to store the local database file on each node
LOCAL_PATH = OUTPUT_FILES/DATABASES_MPI
interval at which we output time step info and max of norm of displacement
NTSTEP_BETWEEN_OUTPUT_INFO = 500
Sources
sources and receivers Z coordinates given directly (i.e. as their true position) instead of as their depth
USE_SOURCES_RECEIVERS_Z = .false.
use a (tilted) FORCESOLUTION force point source (or several) instead of a CMTSOLUTION moment-tensor source. This can be useful e.g. for oil industry foothills simulations or asteroid simulations in which the source is a vertical force, normal force, tilted force, impact etc. If this flag is turned on, the FORCESOLUTION file must be edited by giving:
USE_FORCE_POINT_SOURCE = .false.
set to true to use a Ricker source time function instead of the source time functions set by default to represent a (tilted) FORCESOLUTION force point source or a CMTSOLUTION moment-tensor source.
USE_RICKER_TIME_FUNCTION = .false.
use an external source time function: you must add a file with your source time function and the file name path relative to the working directory at the end of the CMTSOLUTION or FORCESOLUTION file (with multiple sources, one file per source is required). This file must have a single column containing the amplitudes of the source at all time steps; the time step size used must be equal to DT as defined at the beginning of this Par_file. When this option is .false., be sure to remove the name of the stf file in CMTSOLUTION or FORCESOLUTION
USE_EXTERNAL_SOURCE_FILE = .false.
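The comment above fully specifies the external STF file format: one amplitude per line, one line per time step, sampled at the same DT as the Par_file. A minimal sketch of writing such a file is shown below; the Ricker wavelet and the f0/t0 values are illustrative choices of mine, not SPECFEM defaults.

```python
import math

def write_external_stf(path, dt, nstep, f0=1.0, t0=1.2):
    """Write a single-column source time function file: one amplitude per
    time step, sampled at the solver's DT. The Ricker wavelet and the
    f0/t0 values here are illustrative only, not SPECFEM defaults."""
    with open(path, "w") as f:
        for i in range(nstep):
            a = (math.pi * f0 * (i * dt - t0)) ** 2
            f.write(f"{(1.0 - 2.0 * a) * math.exp(-a):.6e}\n")

# matching this run: DT = 0.01 s, NSTEP = 5000
write_external_stf("stf.txt", dt=0.01, nstep=5000)
```

The resulting file name would then be appended to the CMTSOLUTION or FORCESOLUTION file as described above.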
print source time function
PRINT_SOURCE_TIME_FUNCTION = .false.
source encoding (for acoustic simulations only for now) determines source encoding factor +/-1 depending on sign of moment tensor (see e.g. Krebs et al., 2009. Fast full-wavefield seismic inversion using encoded sources, Geophysics, 74 (6), WCC177-WCC188.)
USE_SOURCE_ENCODING = .false.
Seismograms
interval in time steps for writing of seismograms
NTSTEP_BETWEEN_OUTPUT_SEISMOS = 10000
set to n to reduce the sampling rate of output seismograms by a factor of n; defaults to 1, which means no down-sampling
NTSTEP_BETWEEN_OUTPUT_SAMPLE = 1
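Down-sampling by a factor n makes the effective seismogram sample interval n * DT. A quick sketch of how large n could safely be, assuming a rule of thumb of at least 4 samples per shortest resolved period (my assumption, not a SPECFEM requirement):

```python
def max_downsample_factor(dt, shortest_period, samples_per_period=4):
    """Largest NTSTEP_BETWEEN_OUTPUT_SAMPLE that still keeps at least
    `samples_per_period` samples per shortest resolved period.
    The 4-samples-per-period rule of thumb is an assumption here,
    not a SPECFEM requirement."""
    return max(1, int(shortest_period / (samples_per_period * dt)))

# with DT = 0.01 s and a minimum resolved period of ~86.7 s (this run),
# even aggressive down-sampling would keep the waveform well sampled
n_max = max_downsample_factor(0.01, 86.678)
```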
decide if we save displacement, velocity, acceleration and/or pressure in forward runs (they can be set to true simultaneously); currently pressure seismograms are implemented in acoustic (i.e. fluid) elements only
SAVE_SEISMOGRAMS_DISPLACEMENT = .true.
SAVE_SEISMOGRAMS_VELOCITY = .false.
SAVE_SEISMOGRAMS_ACCELERATION = .false.
SAVE_SEISMOGRAMS_PRESSURE = .false. # currently implemented in acoustic (i.e. fluid) elements only
option to save strain seismograms; this option is useful for the strain Green's tensor
SAVE_SEISMOGRAMS_STRAIN = .false.
save seismograms also when running the adjoint runs for an inverse problem (usually they are unused and not very meaningful, leave this off in almost all cases)
SAVE_SEISMOGRAMS_IN_ADJOINT_RUN = .true.
save seismograms in binary or ASCII format (binary is smaller but may not be portable between machines)
USE_BINARY_FOR_SEISMOGRAMS = .false.
output seismograms in Seismic Unix format (binary with 240-byte-headers)
SU_FORMAT = .false.
output seismograms in ASDF (requires asdf-library)
ASDF_FORMAT = .false.
output seismograms in HDF5 (requires hdf5-library and WRITE_SEISMOGRAMS_BY_MAIN)
HDF5_FORMAT = .false.
decide if main process writes all the seismograms or if all processes do it in parallel
WRITE_SEISMOGRAMS_BY_MAIN = .false.
save all seismograms in one large combined file instead of one file per seismogram to avoid overloading shared non-local file systems such as LUSTRE or GPFS for instance
SAVE_ALL_SEISMOS_IN_ONE_FILE = .false.
use a trick to increase accuracy of pressure seismograms in fluid (acoustic) elements: use the second derivative of the source for the source time function instead of the source itself, and then record -potential_acoustic() as pressure seismograms instead of -potential_dot_dot_acoustic(); this is mathematically equivalent, but numerically significantly more accurate because in the explicit Newmark time scheme acceleration is accurate at zeroth order while displacement is accurate at second order, thus in fluid elements potential_dot_dot_acoustic() is accurate at zeroth order while potential_acoustic() is accurate at second order and thus contains significantly less numerical noise.
USE_TRICK_FOR_BETTER_PRESSURE = .false.
Energy calculation
to plot energy curves, for instance to monitor how CPML absorbing layers behave; should be turned OFF in most cases because a bit expensive
OUTPUT_ENERGY = .false.
every how many time steps we compute energy (which is a bit expensive to compute)
NTSTEP_BETWEEN_OUTPUT_ENERGY = 10
Adjoint kernel outputs
interval in time steps for reading adjoint traces; 0 = read the whole adjoint sources at start time
NTSTEP_BETWEEN_READ_ADJSRC = 0
read adjoint sources using ASDF (requires asdf-library)
READ_ADJSRC_ASDF = .false.
this parameter must be set to .true. to compute anisotropic kernels in crust and mantle (related to the 21 Cij in geographical coordinates); default is .false. to compute isotropic kernels (related to alpha and beta)
ANISOTROPIC_KL = .false.
compute transverse isotropic kernels (alpha_v,alpha_h,beta_v,beta_h,eta,rho) rather than fully anisotropic kernels in case ANISOTROPIC_KL is set to .true.
SAVE_TRANSVERSE_KL = .false.
this parameter must be set to .true. to compute anisotropic kernels for cost function using velocity observable rather than displacement
ANISOTROPIC_VELOCITY_KL = .false.
outputs approximate Hessian for preconditioning
APPROXIMATE_HESS_KL = .false.
save Moho mesh and compute Moho boundary kernels
SAVE_MOHO_MESH = .false.
Coupling with an injection technique (DSM, AxiSEM, or FK)
COUPLE_WITH_INJECTION_TECHNIQUE = .false.
INJECTION_TECHNIQUE_TYPE = 3 # 1 = DSM, 2 = AxiSEM, 3 = FK
MESH_A_CHUNK_OF_THE_EARTH = .false.
TRACTION_PATH = DATA/AxiSEM_tractions/3/
FKMODEL_FILE = FKmodel
RECIPROCITY_AND_KH_INTEGRAL = .false. # does not work yet
Run modes
Simultaneous runs: added the ability to run several calculations (several earthquakes) in an embarrassingly parallel fashion from within the same run; this can be useful when using a very large supercomputer to compute many earthquakes in a catalog, in which case it can be better from a batch job submission point of view to start fewer and much larger jobs, each of them computing several earthquakes in parallel.
To turn that option on, set parameter NUMBER_OF_SIMULTANEOUS_RUNS to a value greater than 1. To implement that, we create NUMBER_OF_SIMULTANEOUS_RUNS MPI sub-communicators, each of them labeled "my_local_mpi_comm_world", and we use them in all the routines in "src/shared/parallel.f90", except in MPI_ABORT() because in that case we need to kill the entire run.
When that option is on, the number of processor cores used to start the code in the batch system must of course be a multiple of NUMBER_OF_SIMULTANEOUS_RUNS; all the individual runs must use the same number of processor cores, which as usual is NPROC in the Par_file, and thus the total number of processor cores to request from the batch system should be NUMBER_OF_SIMULTANEOUS_RUNS * NPROC. All the runs to perform must be placed in directories called run0001, run0002, run0003 and so on (with exactly four digits).
Imagine you have 10 independent calculations to do, each of them on 100 cores; you have three options:
1/ submit 10 jobs to the batch system
2/ submit a single job on 1000 cores to the batch, and in that script create a sub-array of jobs to start 10 jobs, each running on 100 cores (see e.g. http://www.schedmd.com/slurmdocs/job_array.html )
3/ submit a single job on 1000 cores to the batch, start SPECFEM3D on 1000 cores, create 10 sub-communicators, cd into one of 10 subdirectories (called e.g. run0001, run0002, ... run0010) depending on the sub-communicator your MPI rank belongs to, and run normally on 100 cores using that sub-communicator.
The option below implements 3/.
NUMBER_OF_SIMULTANEOUS_RUNS = 1
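Option 3/ above amounts to mapping each global MPI rank to a run group and a run directory. A plain-Python sketch of that mapping (the group index is what an MPI_Comm_split-style call would use as its color; the actual splitting lives in src/shared/parallel.f90):

```python
def run_assignment(rank, nproc_per_run):
    """Map a global MPI rank to its simultaneous-run group: the group is
    the 'color' an MPI_Comm_split-style call would use. A plain-Python
    sketch only; the real splitting is done in src/shared/parallel.f90."""
    group = rank // nproc_per_run       # which simultaneous run
    local_rank = rank % nproc_per_run   # rank inside that run's sub-communicator
    run_dir = f"run{group + 1:04d}"     # run0001, run0002, ... (four digits)
    return group, local_rank, run_dir

# 10 runs of 100 cores each: global rank 250 works in run0003 as local rank 50
```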
if we perform simultaneous runs in parallel, and if only the source and receivers vary between these runs but not the mesh nor the model (velocity and density), then we can also read the mesh and model files from a single run in the beginning and broadcast them to all the others; for a large number of simultaneous runs, for instance when solving inverse problems iteratively, this can DRASTICALLY reduce I/Os to disk in the solver (by a factor equal to NUMBER_OF_SIMULTANEOUS_RUNS), and reducing I/Os is crucial in the case of huge runs. Thus, always set this option to .true. if the mesh and the model are the same for all simultaneous runs. In that case there is no need to duplicate the mesh and model file database (the content of the DATABASES_MPI directories) in each of the run0001, run0002, ... directories; it is sufficient to have one in run0001 and the code will broadcast it to the others.
BROADCAST_SAME_MESH_AND_MODEL = .true.
set to true to use GPUs
GPU_MODE = .false.
ADIOS Options for I/Os
ADIOS_ENABLED = .false.
ADIOS_FOR_DATABASES = .false.
ADIOS_FOR_MESH = .false.
ADIOS_FOR_FORWARD_ARRAYS = .false.
ADIOS_FOR_KERNELS = .false.
ADIOS_FOR_UNDO_ATTENUATION = .false.
HDF5 Database I/O (note the flags for HDF5 and ADIOS are mutually exclusive, only one can be used)
HDF5_ENABLED = .false.
HDF5_FOR_MOVIES = .false.
HDF5_IO_NODES = 0 # HDF5 IO server with number of IO dedicated procs
I tried to widen the range of the mesh block. Now I see random oscillation but not a normal seismic event with P and S waves. Here is the output_solver.txt:
**********************************************
**** Specfem 3-D Solver - MPI version f90 ****
**********************************************
Running Git package version of the code: v4.0.0-109-gee9914a3
which is Git ee9914a3303dbe0855edcc1272689d91cb9e84ff
dating 2023-11-13 15:34:00 +0100
There are 1 MPI processes
Processes are numbered from 0 to 0
There is a total of 1 slices
NDIM = 3
NGLLX = 5
NGLLY = 5
NGLLZ = 5
using single precision for the calculations
smallest and largest possible floating-point numbers are: 1.17549435E-38 3.40282347E+38
velocity model: default
Reading mesh databases...
reads binary mesh files: proc***_external_mesh.bin
from directory : OUTPUT_FILES/DATABASES_MPI
simulation w/ acoustic domain: F
simulation w/ elastic domain: T
simulation w/ poroelastic domain: F
slice 0 has:
number of elements acoustic : 0
number of elements elastic : 20736
number of elements poroelastic: 0
done
total acoustic elements : 0
total elastic elements : 20736
total poroelastic elements : 0
Mesh resolution:
********
minimum and maximum number of elements
and points in the CUBIT + SCOTCH mesh:
NSPEC_global_min = 20736
NSPEC_global_max = 20736
NSPEC_global_max / NSPEC_global_min imbalance = 1.00000000 = 0.00000000 %
NSPEC_global_sum = 20736
NGLOB_global_min = 1366625
NGLOB_global_max = 1366625
NGLOB_global_max / NGLOB_global_min imbalance = 1.00000000 = 0.00000000 %
NGLOB_global_sum = 1366625
If you have elements of a single type (all acoustic, all elastic, all poroelastic, and without CPML)
in the whole mesh, then there should be no significant imbalance in the above numbers.
Otherwise, it is normal to have imbalance in elements and points because the domain decomposer
compensates for the different cost of different elements by partitioning them unevenly among processes.
********
********
Model: P velocity min,max = 7100.00000 7100.00000
Model: S velocity min,max = 3989.99976 3989.99976
Model: Poisson's ratio min,max = 0.269206375 0.269206375
********
*********************************************
*** Verification of simulation parameters ***
*********************************************
*** Xmin and Xmax of the model = -1180177.25 775853.750
*** Ymin and Ymax of the model = -4971431.50 4988912.00
*** Zmin and Zmax of the model = -80000.0000 0.00000000
*** Max GLL point distance = 90564.0000
*** Min GLL point distance = 863.365234
*** Max/min ratio = 104.896507
*** Max element size = 276676.500
*** Min element size = 5000.00000
*** Max/min ratio = 55.3353004
*** Minimum period resolved = 86.6781006
*** Maximum suggested time step = 6.04999997E-02
*** for DT : 1.0000000000000000E-002
*** Max stability for wave velocities = 8.22363421E-02
Elapsed time for checking mesh resolution in seconds = 2.4259737999999999E-002
saving VTK files for Courant number and minimum period
******************************************
There is a total of 1 slices
******************************************
kd-tree:
total data points: 20736
theoretical number of nodes: 41469
tree memory size: 1.26553345 MB
actual number of nodes: 41471
tree memory size: 1.26559448 MB
maximum depth : 16
creation timing : 5.74797392E-03 (s)
sources: 1
********************
locating sources
********************
reading source information from ./DATA/CMTSOLUTION file
UTM projection:
UTM zone: 32
source # 1
source located in slice 0
in element 19439
in elastic domain
using moment tensor source:
xi coordinate of source in that element: 0.48317567362510605
eta coordinate of source in that element: 0.21768486960810993
gamma coordinate of source in that element: 0.59999999999999976
source time function:
using (quasi) Heaviside source time function
half duration: 5.0000000000000003E-002 seconds
Source time function is a Heaviside, convolve later
time shift: 0.0000000000000000 seconds
magnitude of the source:
scalar moment M0 = 3.1878561676307634E+023 dyne-cm
moment magnitude Mw = 4.9689993362703930
original (requested) position of the source:
latitude: 44.049999999999997
longitude: 11.590000000000000
UTM x: 707478.93142233253
UTM y: 4880687.8939130642
depth: 6.0000000000000000 km
topo elevation: 0.0000000000000000
position of the source that will be used:
UTM x: 707478.93142233253
UTM y: 4880687.8939130642
depth: 6.0000000000000000 km
z: -6000.0000000000000
error in location of the source: 9.38570333E-10 m
maximum error in location of the sources: 9.38570333E-10 m
Elapsed time for detection of sources in seconds = 1.5197033999999943E-002
End of source detection - done
receivers:
there are 2 stations in file ./DATA/STATIONS
saving 1 stations inside the model in file ./DATA/STATIONS_FILTERED
excluding 1 stations located outside the model
Total number of receivers = 1
********************
locating receivers
********************
reading receiver information from ./DATA/STATIONS_FILTERED file
station # 1 IV CPGN
original latitude: 43.8011017
original longitude: 12.3205004
original UTM x: 767112.312
original UTM y: 4855142.50
original depth: 0.00000000 m
horizontal distance: 64.8746567
target x, y, z: 767112.312 4855142.50 0.00000000
closest estimate found: 0.00000000 m away
receiver located in slice 0
in element 20736
in elastic domain
at coordinates:
xi = 0.67823493510274269
eta = 3.3024253507015912E-002
gamma = 1.0000000000000000
rotation matrix:
nu1 = 1.00000000 0.00000000 0.00000000
nu2 = 0.00000000 1.00000000 0.00000000
nu3 = 0.00000000 0.00000000 1.00000000
UTM x: 767112.32831646129
UTM y: 4855142.2674377160
depth: 0.0000000000000000 m
z: 0.0000000000000000
maximum error in location of all the receivers: 0.00000000 m
Elapsed time for receiver detection in seconds = 3.6823504999999979E-002
End of receiver detection - done
found a total of 1 receivers in all the slices
source arrays:
number of sources is 1
size of source array = 1.43051147E-03 MB
= 1.39698386E-06 GB
seismograms:
seismograms written by all processes
Total number of simulation steps (NSTEP) = 5000
writing out seismograms at every NTSTEP_BETWEEN_OUTPUT_SEISMOS = 5000
number of subsampling steps for seismograms = 1
Total number of samples for seismograms = 5000
maximum number of local receivers is 1 in slice 0
size of maximum seismogram array = 5.72204590E-02 MB
= 5.58793545E-05 GB
Total number of samples for seismograms = 5000
Simulation setup:
no acoustic simulation
incorporating elastic simulation
no poroelastic simulation
no attenuation
no anisotropy
no oceans
no gravity
no movie simulation
preparing mass matrices
preparing constants
preparing wavefields
preparing fault simulation
no dynamic faults
no kinematic faults
no fault simulation
preparing gravity
no gravity simulation
preparing Stacey absorbing boundaries
preparing optimized arrays
number of regular shaped elements : 0
number of irregular shaped elements: 20736
fused array done
bandwidth test (STREAM TRIAD):
memory accesses = 46.9193459 MB
timing min/max = 1.55624305E-03 s / 1.62914000E-03 s
timing avg = 1.59210828E-03 s
bandwidth = 28.7792435 GB/s
Elapsed time for preparing timerun in seconds = 9.8996417000000059E-002
************
time loop
************
scheme: Newmark
time step: 9.99999978E-03 s
number of time steps: 5000
total simulated time: 50.0000000 seconds
start time: -0.100000001 seconds
All processes are synchronized before the time loop
Starting time iteration loop...
Time step # 5
Time: -5.99999987E-02 seconds
Elapsed time in seconds = 0.36766370399999992
Elapsed time in hh:mm:ss = 0 h 00 m 00 s
Mean elapsed time per time step in seconds = 7.35327378E-02
Max norm displacement vector U in all slices (m) = 5.45225184E-11
Time steps done = 5 out of 5000
Time steps remaining = 4995
Estimated remaining time in seconds = 367.296051
Estimated remaining time in hh:mm:ss = 0 h 06 m 07 s
Estimated total run time in seconds = 367.663696
Estimated total run time in hh:mm:ss = 0 h 06 m 07 s
We have done 0.100000001 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:25
************************************************************
**** BEWARE: the above time estimates are not very reliable
**** because fewer than 100 iterations have been performed
************************************************************
Time step # 500
Time: 4.88999987 seconds
Elapsed time in seconds = 40.461380478000002
Elapsed time in hh:mm:ss = 0 h 00 m 40 s
Mean elapsed time per time step in seconds = 8.09227601E-02
Max norm displacement vector U in all slices (m) = 1.18286182E-04
Time steps done = 500 out of 5000
Time steps remaining = 4500
Estimated remaining time in seconds = 364.152435
Estimated remaining time in hh:mm:ss = 0 h 06 m 04 s
Estimated total run time in seconds = 404.613800
Estimated total run time in hh:mm:ss = 0 h 06 m 44 s
We have done 10.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:25
Time step # 1000
Time: 9.89000034 seconds
Elapsed time in seconds = 80.704015229999996
Elapsed time in hh:mm:ss = 0 h 01 m 20 s
Mean elapsed time per time step in seconds = 8.07040185E-02
Max norm displacement vector U in all slices (m) = 1.11700239E-04
Time steps done = 1000 out of 5000
Time steps remaining = 4000
Estimated remaining time in seconds = 322.816071
Estimated remaining time in hh:mm:ss = 0 h 05 m 22 s
Estimated total run time in seconds = 403.520081
Estimated total run time in hh:mm:ss = 0 h 06 m 43 s
We have done 20.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:25
Time step # 1500
Time: 14.8900003 seconds
Elapsed time in seconds = 121.63306029600000
Elapsed time in hh:mm:ss = 0 h 02 m 01 s
Mean elapsed time per time step in seconds = 8.10887069E-02
Max norm displacement vector U in all slices (m) = 1.18085656E-04
Time steps done = 1500 out of 5000
Time steps remaining = 3500
Estimated remaining time in seconds = 283.810486
Estimated remaining time in hh:mm:ss = 0 h 04 m 43 s
Estimated total run time in seconds = 405.443542
Estimated total run time in hh:mm:ss = 0 h 06 m 45 s
We have done 30.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:26
Time step # 2000
Time: 19.8899994 seconds
Elapsed time in seconds = 162.23003953500000
Elapsed time in hh:mm:ss = 0 h 02 m 42 s
Mean elapsed time per time step in seconds = 8.11150223E-02
Max norm displacement vector U in all slices (m) = 1.07805106E-04
Time steps done = 2000 out of 5000
Time steps remaining = 3000
Estimated remaining time in seconds = 243.345062
Estimated remaining time in hh:mm:ss = 0 h 04 m 03 s
Estimated total run time in seconds = 405.575104
Estimated total run time in hh:mm:ss = 0 h 06 m 45 s
We have done 40.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:25
Time step # 2500
Time: 24.8899994 seconds
Elapsed time in seconds = 209.44900112099998
Elapsed time in hh:mm:ss = 0 h 03 m 29 s
Mean elapsed time per time step in seconds = 8.37796032E-02
Max norm displacement vector U in all slices (m) = 1.10632427E-04
Time steps done = 2500 out of 5000
Time steps remaining = 2500
Estimated remaining time in seconds = 209.449005
Estimated remaining time in hh:mm:ss = 0 h 03 m 29 s
Estimated total run time in seconds = 418.898010
Estimated total run time in hh:mm:ss = 0 h 06 m 58 s
We have done 50.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:25
Time step # 3000
Time: 29.8899994 seconds
Elapsed time in seconds = 250.85066614400000
Elapsed time in hh:mm:ss = 0 h 04 m 10 s
Mean elapsed time per time step in seconds = 8.36168900E-02
Max norm displacement vector U in all slices (m) = 1.08961394E-04
Time steps done = 3000 out of 5000
Time steps remaining = 2000
Estimated remaining time in seconds = 167.233780
Estimated remaining time in hh:mm:ss = 0 h 02 m 47 s
Estimated total run time in seconds = 418.084442
Estimated total run time in hh:mm:ss = 0 h 06 m 58 s
We have done 60.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:26
Time step # 3500
Time: 34.8899994 seconds
Elapsed time in seconds = 292.26057090199998
Elapsed time in hh:mm:ss = 0 h 04 m 52 s
Mean elapsed time per time step in seconds = 8.35030228E-02
Max norm displacement vector U in all slices (m) = 1.16462295E-04
Time steps done = 3500 out of 5000
Time steps remaining = 1500
Estimated remaining time in seconds = 125.254532
Estimated remaining time in hh:mm:ss = 0 h 02 m 05 s
Estimated total run time in seconds = 417.515106
Estimated total run time in hh:mm:ss = 0 h 06 m 57 s
We have done 70.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:26
Time step # 4000
Time: 39.8899994 seconds
Elapsed time in seconds = 333.55028477100001
Elapsed time in hh:mm:ss = 0 h 05 m 33 s
Mean elapsed time per time step in seconds = 8.33875686E-02
Max norm displacement vector U in all slices (m) = 1.16677249E-04
Time steps done = 4000 out of 5000
Time steps remaining = 1000
Estimated remaining time in seconds = 83.3875732
Estimated remaining time in hh:mm:ss = 0 h 01 m 23 s
Estimated total run time in seconds = 416.937866
Estimated total run time in hh:mm:ss = 0 h 06 m 56 s
We have done 80.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:25
Time step # 4500
Time: 44.8899994 seconds
Elapsed time in seconds = 374.66117114799999
Elapsed time in hh:mm:ss = 0 h 06 m 14 s
Mean elapsed time per time step in seconds = 8.32580402E-02
Max norm displacement vector U in all slices (m) = 1.18399788E-04
Time steps done = 4500 out of 5000
Time steps remaining = 500
Estimated remaining time in seconds = 41.6290207
Estimated remaining time in hh:mm:ss = 0 h 00 m 41 s
Estimated total run time in seconds = 416.290192
Estimated total run time in hh:mm:ss = 0 h 06 m 56 s
We have done 90.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:26
Time step # 5000
Time: 49.8899994 seconds
Elapsed time in seconds = 416.14767053899999
Elapsed time in hh:mm:ss = 0 h 06 m 56 s
Mean elapsed time per time step in seconds = 8.32295343E-02
Max norm displacement vector U in all slices (m) = 1.18382930E-04
Time steps done = 5000 out of 5000
Time steps remaining = 0
Estimated remaining time in seconds = 0.00000000
Estimated remaining time in hh:mm:ss = 0 h 00 m 00 s
Estimated total run time in seconds = 416.147675
Estimated total run time in hh:mm:ss = 0 h 06 m 56 s
We have done 100.000000 % of that
Writing the seismograms
Component: .semd
Total number of receivers saved is 1
Total number of time steps written: 5000
Writing the seismograms in parallel took 0.113481387 seconds
Time loop finished. Timing info:
Total elapsed time in seconds = 416.36161097199999
Total elapsed time in hh:mm:ss = 0 h 06 m 56 s
finalizing simulation
End of the simulation
Everything is kept the same; only this range changes:
LATITUDE_MIN = -43.00
LATITUDE_MAX = 45.00
LONGITUDE_MIN = -11.59
LONGITUDE_MAX = 12.50
DEPTH_BLOCK_KM = 80.d0
UTM_PROJECTION_ZONE = 32
SUPPRESS_UTM_PROJECTION = .false.
Yes, that was the issue. Set NSTEP = 20000 or even 50000 and look at the seismograms. Note that only a single station was properly located; your other station is outside the model range.
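The two numbers worth checking from output_solver.txt are the "Maximum suggested time step" (DT must stay below it, as noted earlier in the thread) and the total simulated time NSTEP * DT. A minimal sketch using the values reported in this run:

```python
def sanity_check(dt, dt_max_suggested, nstep):
    """Two quick checks against output_solver.txt numbers: DT must stay
    below the reported 'Maximum suggested time step' for stability, and
    NSTEP * DT gives the total simulated time in seconds."""
    assert dt < dt_max_suggested, "unstable: decrease DT"
    return nstep * dt

# this run: DT = 0.01 s, suggested max = 6.05e-2 s -> stable;
# NSTEP = 20000 would give 200 s of simulated signal
total_time = sanity_check(0.01, 6.05e-2, 20000)
```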
I tried to widen the range of the mesh block no I see reandom oscillation but not a normal seismic even with p and s wave here is the output solver .txt
Specfem 3-D Solver - MPI version f90
Running Git package version of the code: v4.0.0-109-gee9914a3 which is Git ee9914ahttps://github.com/SPECFEM/specfem3d/commit/ee9914a3303dbe0855edcc1272689d91cb9e84ff dating 2023-11-13 15:34:00 +0100
There are 1 MPI processes Processes are numbered from 0 to 0
There is a total of 1 slices
NDIM = 3
NGLLX = 5 NGLLY = 5 NGLLZ = 5
using single precision for the calculations
smallest and largest possible floating-point numbers are: 1.17549435E-38 3.40282347E+38
velocity model: default
Reading mesh databases... reads binary mesh files: proc***_external_mesh.bin from directory : OUTPUT_FILES/DATABASES_MPI
simulation w/ acoustic domain: F simulation w/ elastic domain: T simulation w/ poroelastic domain: F
slice 0 has: number of elements acoustic : 0 number of elements elastic : 20736 number of elements poroelastic: 0 done
total acoustic elements : 0 total elastic elements : 20736 total poroelastic elements : 0
Mesh resolution:
minimum and maximum number of elements and points in the CUBIT + SCOTCH mesh:
NSPEC_global_min = 20736 NSPEC_global_max = 20736 NSPEC_global_max / NSPEC_global_min imbalance = 1.00000000 = 0.00000000 % NSPEC_global_sum = 20736
NGLOB_global_min = 1366625 NGLOB_global_max = 1366625 NGLOB_global_max / NGLOB_global_min imbalance = 1.00000000 = 0.00000000 % NGLOB_global_sum = 1366625
If you have elements of a single type (all acoustic, all elastic, all poroelastic, and without CPML) in the whole mesh, then there should be no significant imbalance in the above numbers. Otherwise, it is normal to have imbalance in elements and points because the domain decomposer compensates for the different cost of different elements by partitioning them unevenly among processes.
Model: P velocity min,max = 7100.00000 7100.00000 Model: S velocity min,max = 3989.99976 3989.99976
Model: Poisson's ratio min,max = 0.269206375 0.269206375
Verification of simulation parameters
Xmin and Xmax of the model = -1180177.25 775853.750 Ymin and Ymax of the model = -4971431.50 4988912.00 *** Zmin and Zmax of the model = -80000.0000 0.00000000
Max GLL point distance = 90564.0000 Min GLL point distance = 863.365234 *** Max/min ratio = 104.896507
Max element size = 276676.500 Min element size = 5000.00000 *** Max/min ratio = 55.3353004
Minimum period resolved = 86.6781006 Maximum suggested time step = 6.04999997E-02
for DT : 1.0000000000000000E-002 Max stability for wave velocities = 8.22363421E-02
Elapsed time for checking mesh resolution in seconds = 2.4259737999999999E-002 saving VTK files for Courant number and minimum period
There is a total of 1 slices
kd-tree: total data points: 20736 theoretical number of nodes: 41469 tree memory size: 1.26553345 MB actual number of nodes: 41471 tree memory size: 1.26559448 MB maximum depth : 16 creation timing : 5.74797392E-03 (s)
sources: 1
locating sources
reading source information from ./DATA/CMTSOLUTION file
UTM projection: UTM zone: 32
source # 1 source located in slice 0 in element 19439 in elastic domain
using moment tensor source: xi coordinate of source in that element: 0.48317567362510605 eta coordinate of source in that element: 0.21768486960810993 gamma coordinate of source in that element: 0.59999999999999976
source time function: using (quasi) Heaviside source time function half duration: 5.0000000000000003E-002 seconds
Source time function is a Heaviside, convolve later
time shift: 0.0000000000000000 seconds
magnitude of the source: scalar moment M0 = 3.1878561676307634E+023 dyne-cm moment magnitude Mw = 4.9689993362703930
original (requested) position of the source:
latitude: 44.049999999999997
longitude: 11.590000000000000
UTM x: 707478.93142233253
UTM y: 4880687.8939130642
depth: 6.0000000000000000 km
topo elevation: 0.0000000000000000
position of the source that will be used:
UTM x: 707478.93142233253
UTM y: 4880687.8939130642
depth: 6.0000000000000000 km
z: -6000.0000000000000
error in location of the source: 9.38570333E-10 m
maximum error in location of the sources: 9.38570333E-10 m
Elapsed time for detection of sources in seconds = 1.5197033999999943E-002
End of source detection - done
receivers:
there are 2 stations in file ./DATA/STATIONS saving 1 stations inside the model in file ./DATA/STATIONS_FILTERED excluding 1 stations located outside the model
Total number of receivers = 1
locating receivers
reading receiver information from ./DATA/STATIONS_FILTERED file
station # 1 IV CPGN original latitude: 43.8011017 original longitude: 12.3205004 original UTM x: 767112.312 original UTM y: 4855142.50 original depth: 0.00000000 m horizontal distance: 64.8746567 target x, y, z: 767112.312 4855142.50 0.00000000 closest estimate found: 0.00000000 m away
receiver located in slice 0 in element 20736 in elastic domain at coordinates: xi = 0.67823493510274269 eta = 3.3024253507015912E-002 gamma = 1.0000000000000000 rotation matrix: nu1 = 1.00000000 0.00000000 0.00000000 nu2 = 0.00000000 1.00000000 0.00000000 nu3 = 0.00000000 0.00000000 1.00000000 UTM x: 767112.32831646129 UTM y: 4855142.2674377160 depth: 0.0000000000000000 m z: 0.0000000000000000
maximum error in location of all the receivers: 0.00000000 m
Elapsed time for receiver detection in seconds = 3.6823504999999979E-002
End of receiver detection - done
found a total of 1 receivers in all the slices
source arrays: number of sources is 1 size of source array = 1.43051147E-03 MB = 1.39698386E-06 GB
seismograms: seismograms written by all processes
Total number of simulation steps (NSTEP) = 5000 writing out seismograms at every NTSTEP_BETWEEN_OUTPUT_SEISMOS = 5000 number of subsampling steps for seismograms = 1 Total number of samples for seismograms = 5000
maximum number of local receivers is 1 in slice 0 size of maximum seismogram array = 5.72204590E-02 MB = 5.58793545E-05 GB
Total number of samples for seismograms = 5000
Simulation setup:
no acoustic simulation incorporating elastic simulation no poroelastic simulation
no attenuation no anisotropy no oceans no gravity no movie simulation
preparing mass matrices preparing constants preparing wavefields preparing fault simulation no dynamic faults no kinematic faults no fault simulation preparing gravity no gravity simulation preparing Stacey absorbing boundaries preparing optimized arrays number of regular shaped elements : 0 number of irregular shaped elements: 20736 fused array done bandwidth test (STREAM TRIAD): memory accesses = 46.9193459 MB timing min/max = 1.55624305E-03 s / 1.62914000E-03 s timing avg = 1.59210828E-03 s bandwidth = 28.7792435 GB/s
Elapsed time for preparing timerun in seconds = 9.8996417000000059E-002
time loop
scheme: Newmark
time step: 9.99999978E-03 s
number of time steps: 5000
total simulated time: 50.0000000 seconds
start time: -0.100000001 seconds
All processes are synchronized before the time loop
Starting time iteration loop...
Time step #          5
Time:   -5.99999987E-02 seconds
Elapsed time in seconds = 0.36766370399999992
Elapsed time in hh:mm:ss = 0 h 00 m 00 s
Mean elapsed time per time step in seconds = 7.35327378E-02
Max norm displacement vector U in all slices (m) = 5.45225184E-11
Time steps done = 5 out of 5000
Time steps remaining = 4995
Estimated remaining time in seconds = 367.296051
Estimated remaining time in hh:mm:ss = 0 h 06 m 07 s
Estimated total run time in seconds = 367.663696
Estimated total run time in hh:mm:ss = 0 h 06 m 07 s
We have done 0.100000001 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:25
BEWARE: the above time estimates are not very reliable because fewer than 100 iterations have been performed
Time step #        500
Time:   4.88999987 seconds
Elapsed time in seconds = 40.461380478000002
Elapsed time in hh:mm:ss = 0 h 00 m 40 s
Mean elapsed time per time step in seconds = 8.09227601E-02
Max norm displacement vector U in all slices (m) = 1.18286182E-04
Time steps done = 500 out of 5000
Time steps remaining = 4500
Estimated remaining time in seconds = 364.152435
Estimated remaining time in hh:mm:ss = 0 h 06 m 04 s
Estimated total run time in seconds = 404.613800
Estimated total run time in hh:mm:ss = 0 h 06 m 44 s
We have done 10.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:25

Time step #       1000
Time:   9.89000034 seconds
Elapsed time in seconds = 80.704015229999996
Elapsed time in hh:mm:ss = 0 h 01 m 20 s
Mean elapsed time per time step in seconds = 8.07040185E-02
Max norm displacement vector U in all slices (m) = 1.11700239E-04
Time steps done = 1000 out of 5000
Time steps remaining = 4000
Estimated remaining time in seconds = 322.816071
Estimated remaining time in hh:mm:ss = 0 h 05 m 22 s
Estimated total run time in seconds = 403.520081
Estimated total run time in hh:mm:ss = 0 h 06 m 43 s
We have done 20.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:25

Time step #       1500
Time:   14.8900003 seconds
Elapsed time in seconds = 121.63306029600000
Elapsed time in hh:mm:ss = 0 h 02 m 01 s
Mean elapsed time per time step in seconds = 8.10887069E-02
Max norm displacement vector U in all slices (m) = 1.18085656E-04
Time steps done = 1500 out of 5000
Time steps remaining = 3500
Estimated remaining time in seconds = 283.810486
Estimated remaining time in hh:mm:ss = 0 h 04 m 43 s
Estimated total run time in seconds = 405.443542
Estimated total run time in hh:mm:ss = 0 h 06 m 45 s
We have done 30.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:26

Time step #       2000
Time:   19.8899994 seconds
Elapsed time in seconds = 162.23003953500000
Elapsed time in hh:mm:ss = 0 h 02 m 42 s
Mean elapsed time per time step in seconds = 8.11150223E-02
Max norm displacement vector U in all slices (m) = 1.07805106E-04
Time steps done = 2000 out of 5000
Time steps remaining = 3000
Estimated remaining time in seconds = 243.345062
Estimated remaining time in hh:mm:ss = 0 h 04 m 03 s
Estimated total run time in seconds = 405.575104
Estimated total run time in hh:mm:ss = 0 h 06 m 45 s
We have done 40.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:25

Time step #       2500
Time:   24.8899994 seconds
Elapsed time in seconds = 209.44900112099998
Elapsed time in hh:mm:ss = 0 h 03 m 29 s
Mean elapsed time per time step in seconds = 8.37796032E-02
Max norm displacement vector U in all slices (m) = 1.10632427E-04
Time steps done = 2500 out of 5000
Time steps remaining = 2500
Estimated remaining time in seconds = 209.449005
Estimated remaining time in hh:mm:ss = 0 h 03 m 29 s
Estimated total run time in seconds = 418.898010
Estimated total run time in hh:mm:ss = 0 h 06 m 58 s
We have done 50.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:25

Time step #       3000
Time:   29.8899994 seconds
Elapsed time in seconds = 250.85066614400000
Elapsed time in hh:mm:ss = 0 h 04 m 10 s
Mean elapsed time per time step in seconds = 8.36168900E-02
Max norm displacement vector U in all slices (m) = 1.08961394E-04
Time steps done = 3000 out of 5000
Time steps remaining = 2000
Estimated remaining time in seconds = 167.233780
Estimated remaining time in hh:mm:ss = 0 h 02 m 47 s
Estimated total run time in seconds = 418.084442
Estimated total run time in hh:mm:ss = 0 h 06 m 58 s
We have done 60.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:26

Time step #       3500
Time:   34.8899994 seconds
Elapsed time in seconds = 292.26057090199998
Elapsed time in hh:mm:ss = 0 h 04 m 52 s
Mean elapsed time per time step in seconds = 8.35030228E-02
Max norm displacement vector U in all slices (m) = 1.16462295E-04
Time steps done = 3500 out of 5000
Time steps remaining = 1500
Estimated remaining time in seconds = 125.254532
Estimated remaining time in hh:mm:ss = 0 h 02 m 05 s
Estimated total run time in seconds = 417.515106
Estimated total run time in hh:mm:ss = 0 h 06 m 57 s
We have done 70.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:26

Time step #       4000
Time:   39.8899994 seconds
Elapsed time in seconds = 333.55028477100001
Elapsed time in hh:mm:ss = 0 h 05 m 33 s
Mean elapsed time per time step in seconds = 8.33875686E-02
Max norm displacement vector U in all slices (m) = 1.16677249E-04
Time steps done = 4000 out of 5000
Time steps remaining = 1000
Estimated remaining time in seconds = 83.3875732
Estimated remaining time in hh:mm:ss = 0 h 01 m 23 s
Estimated total run time in seconds = 416.937866
Estimated total run time in hh:mm:ss = 0 h 06 m 56 s
We have done 80.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:25

Time step #       4500
Time:   44.8899994 seconds
Elapsed time in seconds = 374.66117114799999
Elapsed time in hh:mm:ss = 0 h 06 m 14 s
Mean elapsed time per time step in seconds = 8.32580402E-02
Max norm displacement vector U in all slices (m) = 1.18399788E-04
Time steps done = 4500 out of 5000
Time steps remaining = 500
Estimated remaining time in seconds = 41.6290207
Estimated remaining time in hh:mm:ss = 0 h 00 m 41 s
Estimated total run time in seconds = 416.290192
Estimated total run time in hh:mm:ss = 0 h 06 m 56 s
We have done 90.0000000 % of that
The run will finish approximately on (in local time): Wed Nov 22, 2023 15:26

Time step #       5000
Time:   49.8899994 seconds
Elapsed time in seconds = 416.14767053899999
Elapsed time in hh:mm:ss = 0 h 06 m 56 s
Mean elapsed time per time step in seconds = 8.32295343E-02
Max norm displacement vector U in all slices (m) = 1.18382930E-04
Time steps done = 5000 out of 5000
Time steps remaining = 0
Estimated remaining time in seconds = 0.00000000
Estimated remaining time in hh:mm:ss = 0 h 00 m 00 s
Estimated total run time in seconds = 416.147675
Estimated total run time in hh:mm:ss = 0 h 06 m 56 s
We have done 100.000000 % of that
Writing the seismograms
Component: .semd
Total number of receivers saved is 1
Total number of time steps written: 5000
Writing the seismograms in parallel took 0.113481387 seconds

Time loop finished. Timing info:
Total elapsed time in seconds = 416.36161097199999
Total elapsed time in hh:mm:ss = 0 h 06 m 56 s
finalizing simulation
End of the simulation
Dear Hom Nath,

That did not work either. Could you kindly check your email? I sent you a message with all the details.

Thank you again for your time; it is much appreciated. I hope the email reached you; please let me know if it did not.

Best, Hasan
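For context on the earlier suggestion to decrease DT: explicit time-marching schemes such as the Newmark scheme used in this run are only stable when the time step satisfies a CFL-type condition tied to the smallest grid spacing and the fastest wave speed. The sketch below is a rough back-of-the-envelope check only; the function name, the Courant number, and the GLL-spacing factor are illustrative assumptions, not SPECFEM's exact formula (the authoritative value is the "Maximum suggested time step" in output_solver.txt).

```python
def suggested_max_dt(min_element_size, max_vp, courant=0.3, ngll=5):
    """Rough CFL estimate: dt must not exceed the time a P wave needs
    to cross the smallest effective grid spacing, scaled by a Courant
    number. Inside a spectral element the effective spacing shrinks
    with the GLL point clustering, approximated here by (ngll - 1)."""
    effective_spacing = min_element_size / (ngll - 1)
    return courant * effective_spacing / max_vp

# Illustrative numbers only: a 2 km element edge and an 8 km/s P velocity.
dt_max = suggested_max_dt(min_element_size=2000.0, max_vp=8000.0)
# dt_max is 0.01875 s; a DT above this rough bound risks the kind of
# blow-up that ends in a SIGFPE.
```

The point of the estimate: a single small or distorted element anywhere in the mesh lowers `min_element_size` and therefore the stable DT for the whole run, which is why complex models may need a much smaller time step than the bulk of the mesh suggests.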
Please note that with your coarse mesh setup, you resolve signals only down to a minimum period of ~86 s:
...
*** Minimum period resolved = 86.6781006
Also, your CMTSOLUTION sets a half duration of 0:
...
half duration: 0.0000
...
With a zero half duration, the simulation output gives you Green's functions as traces. You will have to convolve these with an appropriate source time function (STF) later, in a separate step. And again, since your simulation only resolves periods down to ~86 s, the STF must be quite low-frequency. Be aware that with such long-period simulations it is hard to see sharp onsets for body waves, such as P- and S-wave signals.
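As a rough sketch of that post-processing step (the function names, the Gaussian STF choice, and the numerical values below are illustrative assumptions, not SPECFEM's built-in tooling), convolving a Green's-function trace with an STF might look like:

```python
import numpy as np

def gaussian_stf(half_duration, dt):
    """Gaussian source time function sampled at dt,
    truncated at +/- 3 half-durations."""
    t = np.arange(-3.0 * half_duration, 3.0 * half_duration + dt, dt)
    stf = np.exp(-(t / half_duration) ** 2) / (np.sqrt(np.pi) * half_duration)
    return stf * dt  # scale so the discrete convolution approximates the integral

def convolve_trace(trace, half_duration, dt):
    """Convolve a trace with the STF, keeping the original trace length."""
    stf = gaussian_stf(half_duration, dt)
    return np.convolve(trace, stf, mode="same")

# Illustrative example: an impulse (a delta-like Green's function sample)
# convolved with a long-period STF spreads into a smooth pulse while
# preserving its integrated amplitude.
dt = 0.1
trace = np.zeros(1000)
trace[500] = 1.0
out = convolve_trace(trace, half_duration=5.0, dt=dt)
```

Note the STF must be sampled at the same dt as the synthetics, and its duration must be short relative to the trace for `mode="same"` to behave as intended; for this run the relevant constraint is that the chosen half duration should be long enough that the STF's energy stays below the ~86 s resolution limit of the mesh.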