Closed. Lukas113 closed this issue 1 year ago.
I also got the very same error during the usual workflow tests. It was random to some extent, meaning re-running the job solved it for the moment.
It might be worth exploring why this happened.
Yup, we probably should. Flaky tests are terrible.
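Until the root cause is found, a possible stopgap (just a sketch, assuming the pytest-rerunfailures plugin were added to the test environment; it is not part of it today) would be to let pytest retry the flaky tests automatically instead of re-running the whole job by hand:

```python
# Sketch only: assumes pytest-rerunfailures is installed (pip install pytest-rerunfailures).
import pytest


@pytest.mark.flaky(reruns=3, reruns_delay=5)  # retry a failing test up to 3 times, 5 s apart
def test_compare_karabo_oskar_retried():
    """Placeholder body; the marker is what matters here."""
    ...


# Equivalent blanket option on the command line:
#   pytest --reruns 3 --reruns-delay 5 karabo/test
```

Marking only the affected tests would keep genuine regressions elsewhere visible.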
Solved by building Pinocchio correctly.
The following tests

TestPinocchio.testSimpleInstance
MyTestCase.test_compare_karabo_oskar

seem to fail sometimes (usually they pass) for a reason that has not yet been discovered. They can fail in local test runs as well as on the GitHub runners (a hedged sketch of guarding against the two error messages visible below is appended after the log). The remaining lines are just the test output:

0 1509.2 ============================= test session starts ==============================
0 1509.2 platform linux -- Python 3.9.16, pytest-7.2.1, pluggy-1.0.0
0 1509.2 rootdir: /workspace/tmp
0 1509.2 plugins: anyio-3.6.2, openfiles-0.5.0, mock-3.10.0, remotedata-0.4.0, arraydiff-0.3, doctestplus-0.12.1, astropy-header-0.1.2, hypothesis-6.70.0, filter-subpackage-0.1.1, asdf-2.14.4, cov-4.0.0, astropy-0.10.0
0 1509.2 collected 66 items
0 1509.2
0 1509.2 karabo/test/test_beam.py F.... [ 7%]
0 1509.2 karabo/test/test_data.py .. [ 10%]
0 1509.2 karabo/test/test_examples.py . [ 12%]
0 1509.2 karabo/test/test_file_handle.py ..... [ 19%]
0 1509.2 karabo/test/test_image.py ..... [ 27%]
0 1509.2 karabo/test/test_long_observation.py .... [ 33%]
0 1509.2 karabo/test/test_mock_mightee.py .. [ 36%]
0 1509.2 karabo/test/test_notebooks.py s.. [ 40%]
0 1509.2 karabo/test/test_observation.py .. [ 43%]
0 1509.2 karabo/test/test_pinocchio.py F [ 45%]
0 1509.2 karabo/test/test_simulation.py . [ 46%]
0 1509.2 karabo/test/test_skymodel.py ........ [ 59%]
0 1509.2 karabo/test/test_source_detection.py ..ssss. [ 69%]
0 1509.2 karabo/test/test_telescope.py ................... [ 98%]
0 1509.2 karabo/test/test_telescope_baselines.py . [100%]
0 1509.2
0 1509.2 =================================== FAILURES ===================================
0 1509.2 _______________ MyTestCase.test_compare_karabo_oskar _______________
0 1509.2
0 1509.2 self =
0 1509.2
0 1509.2 def test_compare_karabo_oskar(self):
0 1509.2 """
0 1509.2 We test the that oskar and karabo give the same output when
0 1509.2 using the same imager -> resulting plot should be zero everywhere.
0 1509.2 """
0 1509.2 # KARABO ----------------------------
0 1509.2 freq = 8.0e8
0 1509.2 precision = "single"
0 1509.2 beam_type = "Isotropic beam"
0 1509.2 vis_path = "./karabo/test/data/beam_vis"
0 1509.2 sky_txt = "./karabo/test/data/sky_model.txt"
0 1509.2 telescope_tm = "./karabo/data/meerkat.tm"
0 1509.2
0 1509.2 sky = SkyModel()
0 1509.2 sky_data = np.zeros((81, 12))
0 1509.2 a = np.arange(-32, -27.5, 0.5)
0 1509.2 b = np.arange(18, 22.5, 0.5)
0 1509.2 dec_arr, ra_arr = np.meshgrid(a, b)
0 1509.2 sky_data[:, 0] = ra_arr.flatten()
0 1509.2 sky_data[:, 1] = dec_arr.flatten()
0 1509.2 sky_data[:, 2] = 1
0 1509.2
0 1509.2 sky.add_point_sources(sky_data)
0 1509.2
0 1509.2 telescope = Telescope.get_MEERKAT_Telescope()
0 1509.2 # Remove beam if already present
0 1509.2 test = os.listdir(telescope.path)
0 1509.2 for item in test:
0 1509.2 if item.endswith(".bin"):
0 1509.2 os.remove(os.path.join(telescope.path, item))
0 1509.2 # ------------- Simulation Begins
0 1509.2 simulation = InterferometerSimulation(
0 1509.2 vis_path=vis_path + ".vis",
0 1509.2 channel_bandwidth_hz=2e7,
0 1509.2 time_average_sec=8,
0 1509.2 noise_enable=False,
0 1509.2 ignore_w_components=True,
0 1509.2 precision=precision,
0 1509.2 use_gpus=False,
0 1509.2 station_type=beam_type,
0 1509.2 gauss_beam_fwhm_deg=1.0,
0 1509.2 gauss_ref_freq_hz=1.5e9,
0 1509.2 )
0 1509.2 observation = Observation(
0 1509.2 mode="Tracking",
0 1509.2 phase_centre_ra_deg=20.0,
0 1509.2 start_date_and_time=datetime(2000, 3, 20, 12, 6, 39, 0),
0 1509.2 length=timedelta(hours=3, minutes=5, seconds=0, milliseconds=0),
0 1509.2 phase_centre_dec_deg=-30.0,
0 1509.2 number_of_time_steps=10,
0 1509.2 start_frequency_hz=freq,
0 1509.2 frequency_increment_hz=2e7,
0 1509.2 number_of_channels=1,
0 1509.2 )
0 1509.2 visibility = simulation.run_simulation(telescope, sky, observation)
0 1509.2 visibility.write_to_file(path=vis_path + ".ms")
0 1509.2
0 1509.2 # RASCIL IMAGING
0 1509.2 uvmax = 3000 / (3.0e8 / freq) # in wavelength units
0 1509.2 imager = Imager(
0 1509.2 visibility,
0 1509.2 imaging_npixel=4096,
0 1509.2 imaging_cellsize=2.13e-5,
0 1509.2 imaging_dopsf=True,
0 1509.2 imaging_weighting="uniform",
0 1509.2 imaging_uvmax=uvmax,
0 1509.2 imaging_uvmin=1,
0 1509.2 ) # imaging cellsize is over-written in the Imager based on max uv dist.
0 1509.2 dirty = imager.get_dirty_image()
0 1509.2 image_karabo = dirty.data[0][0]
0 1509.2
0 1509.2 # OSKAR -------------------------------------
0 1509.2
0 1509.2 # Setting tree
0 1509.2 params = {
0 1509.2 "simulator": {"use_gpus": True},
0 1509.2 "observation": {
0 1509.2 "num_channels": 1,
0 1509.2 "start_frequency_hz": freq,
0 1509.2 "frequency_inc_hz": 2e7,
0 1509.2 "phase_centre_ra_deg": 20,
0 1509.2 "phase_centre_dec_deg": -30,
0 1509.2 "num_time_steps": 10,
0 1509.2 "start_time_utc": "2000-03-20 12:06:39",
0 1509.2 "length": "03:05:00.000",
0 1509.2 },
0 1509.2 "telescope": {
0 1509.2 "input_directory": telescope_tm,
0 1509.2 "normalise_beams_at_phase_centre": True,
0 1509.2 "pol_mode": "Full",
0 1509.2 "allow_station_beam_duplication": True,
0 1509.2 "station_type": beam_type,
0 1509.2 "gaussian_beam/fwhm_deg": 1,
0 1509.2 "gaussian_beam/ref_freq_hz": 1.5e9, # Mid-frequency in
0 1509.2 # the redshift range
0 1509.2 },
0 1509.2 "interferometer": {
0 1509.2 "oskar_vis_filename": vis_path + ".vis",
0 1509.2 "channel_bandwidth_hz": 2e7,
0 1509.2 "time_average_sec": 8,
0 1509.2 "ignore_w_components": True,
0 1509.2 },
0 1509.2 }
0 1509.2
0 1509.2 settings = oskar.SettingsTree("oskar_sim_interferometer")
0 1509.2 settings.from_dict(params)
0 1509.2
0 1509.2 # Choose the numerical precision
0 1509.2 if precision == "single":
0 1509.2 settings["simulator/double_precision"] = False
0 1509.2
0 1509.2 # The following line depends on the mode with which we're loading the sky
0 1509.2 # (explained in documentation)
0 1509.2 np.savetxt(sky_txt, sky.sources[:, :3])
0 1509.2 sky_sim = oskar.Sky.load(sky_txt, precision)
0 1509.2
0 1509.2 sim = oskar.Interferometer(settings=settings)
0 1509.2 sim.set_sky_model(sky_sim)
0 1509.2 > sim.run()
0 1509.2
0 1509.2 karabo/test/test_beam.py:173:
0 1509.2
0 1509.2 /opt/conda/lib/python3.9/site-packages/oskar/interferometer.py:290: in run
0 1509.2 self.check_init()
0 1509.2 /opt/conda/lib/python3.9/site-packages/oskar/interferometer.py:166: in check_init
0 1509.2 self.set_telescope_model(self._settings.to_telescope())
0 1509.2
0 1509.2
0 1509.2 self = <oskar.settings_tree.SettingsTree object at 0x7fb35ad74610>
0 1509.2
0 1509.2 def to_telescope(self):
0 1509.2 """Returns a new telescope model from the current settings.
0 1509.2
0 1509.2 Returns:
0 1509.2 oskar.Telescope: A configured telescope model.
0 1509.2 """
0 1509.2 tel = Telescope()
0 1509.2 > tel.capsule = _apps_lib.settings_to_telescope(self._capsule)
0 1509.2 E RuntimeError: oskar_settings_to_telescope() failed with code -2 (file I/O error).
0 1509.2
0 1509.2 /opt/conda/lib/python3.9/site-packages/oskar/settings_tree.py:215: RuntimeError
0 1509.2 ----------------------------- Captured stdout call -----------------------------
0 1509.2 Parameter 'use_gpu' is None! Using function 'karabo.util.is_cuda_available()' to overwrite parameter 'use_gpu' to False.
0 1509.2 W|
0 1509.2 W|== WARNING: No GPU capability available.
0 1509.2 W|
0 1509.2 ----------------------------- Captured stderr call -----------------------------
0 1509.2 E|
0 1509.2 E|== ERROR: Telescope model directory './karabo/data/meerkat.tm' does not exist.
0 1509.2 E|
0 1509.2 _______________ TestPinocchio.testSimpleInstance _______________
0 1509.2
0 1509.2 self =
0 1509.2
0 1509.2 def testSimpleInstance(self) -> None:
0 1509.2 p = Pinocchio()
0 1509.2 p.setRunName("unittest")
0 1509.2 p.printConfig()
0 1509.2 p.printRedShiftRequest()
0 1509.2 p.runPlanner(16, 1)
0 1509.2 p.run(mpiThreads=2)
0 1509.2
0 1509.2 p.save(TestPinocchio.RESULT_FOLDER)
0 1509.2 > sky = p.getSkyModel()
0 1509.2
0 1509.2 /workspace/tmp/karabo/test/test_pinocchio.py:30:
0 1509.2
0 1509.2 /opt/conda/lib/python3.9/site-packages/karabo/simulation/pinocchio.py:690: in getSkyModel
0 1509.2 return Pinocchio.getSkyModelFromFiles(self.outLightConePath, near, far)
0 1509.2 /opt/conda/lib/python3.9/site-packages/karabo/simulation/pinocchio.py:721: in getSkyModelFromFiles
0 1509.2 (x, y, z) = np.loadtxt(path, unpack=True, usecols=(2, 3, 4))
0 1509.2 /opt/conda/lib/python3.9/site-packages/numpy/lib/npyio.py:1042: in loadtxt
0 1509.2 fh = np.lib._datasource.open(fname, 'rt', encoding=encoding)
0 1509.2 /opt/conda/lib/python3.9/site-packages/numpy/lib/_datasource.py:193: in open
0 1509.2 return ds.open(path, mode, encoding=encoding, newline=newline)
0 1509.2
0 1509.2
0 1509.2 self = <numpy.DataSource object at 0x7fb2fd960b50>
0 1509.2 path = '/workspace/tmp/karabo_folder/89f02710-00a1-442b-bc62-e7768fd6c11a/pinocchio.unittest.plc.out'
0 1509.2 mode = 'rt', encoding = None, newline = None
0 1509.2
0 1509.2 def open(self, path, mode='r', encoding=None, newline=None):
0 1509.2 """
0 1509.2 Open and return file-like object.
0 1509.2
0 1509.2         If `path` is an URL, it will be downloaded, stored in the
0 1509.2         `DataSource` directory and opened from there.
0 1509.2
0 1509.2 Parameters
0 1509.2 ----------
0 1509.2 path : str
0 1509.2 Local file path or URL to open.
0 1509.2 mode : {'r', 'w', 'a'}, optional
0 1509.2 Mode to open `path`. Mode 'r' for reading, 'w' for writing,
0 1509.2 'a' to append. Available modes depend on the type of object
0 1509.2 specified by `path`. Default is 'r'.
0 1509.2 encoding : {None, str}, optional
0 1509.2 Open text file with given encoding. The default encoding will be
0 1509.2 what `io.open` uses.
0 1509.2 newline : {None, str}, optional
0 1509.2 Newline to use when reading text file.
0 1509.2
0 1509.2 Returns
0 1509.2 -------
0 1509.2 out : file object
0 1509.2 File object.
0 1509.2
0 1509.2 """
0 1509.2
0 1509.2 # TODO: There is no support for opening a file for writing which
0 1509.2 # doesn't exist yet (creating a file). Should there be?
0 1509.2
0 1509.2 # TODO: Add a `subdir` parameter for specifying the subdirectory
0 1509.2 # used to store URLs in self._destpath.
0 1509.2
0 1509.2 if self._isurl(path) and self._iswritemode(mode):
0 1509.2 raise ValueError("URLs are not writeable")
0 1509.2
0 1509.2 # NOTE: _findfile will fail on a new file opened for writing.
0 1509.2 found = self._findfile(path)
0 1509.2 if found:
0 1509.2 _fname, ext = self._splitzipext(found)
0 1509.2 if ext == 'bz2':
0 1509.2 mode.replace("+", "")
0 1509.2 return _file_openers[ext](found, mode=mode,
0 1509.2 encoding=encoding, newline=newline)
0 1509.2 else:
0 1509.2 > raise FileNotFoundError(f"{path} not found.")
0 1509.2 E FileNotFoundError: /workspace/tmp/karabo_folder/89f02710-00a1-442b-bc62-e7768fd6c11a/pinocchio.unittest.plc.out not found.
0 1509.2
0 1509.2 /opt/conda/lib/python3.9/site-packages/numpy/lib/_datasource.py:532: FileNotFoundError
0 1509.2 ----------------------------- Captured stdout call -----------------------------
0 1509.2 RunFlag: has value = unittest and is active, comment = name of the run
0 1509.2 OutputList: has value = outputs and is active, comment = name of file with required output redshifts
0 1509.2 BoxSize: has value = 500.0 and is active, comment = physical size of the box in Mpc
0 1509.2 BoxInH100: is a flag and is active, comment = specify that the box is in Mpc/h
0 1509.2 GridSize: has value = 200 and is active, comment = number of grid points per side
0 1509.2 RandomSeed: has value = 486604 and is active, comment = random seed for initial conditions
0 1509.2 Omega0: has value = 0.25 and is active, comment = Omega_0 (total matter)
0 1509.2 OmegaLambda: has value = 0.75 and is active, comment = Omega_Lambda
0 1509.2 OmegaBaryon: has value = 0.044 and is active, comment = Omega_b (baryonic matter)
0 1509.2 Hubble100: has value = 0.70 and is active, comment = little h
0 1509.2 Sigma8: has value = 0.8 and is active, comment = sigma8; if 0, it is computed from the provided P(k)
0 1509.2 PrimordialIndex: has value = 0.96 and is active, comment = n_s
0 1509.2 DEw0: has value = -1.0 and is active, comment = w0 of parametric dark energy equation of state
0 1509.2 DEwa: has value = 0.0 and is active, comment = wa of parametric dark energy equation of state
0 1509.2 TabulatedEoSfile: has value = no and is active, comment = equation of state of dark energy tabulated in a file
0 1509.2 FileWithInputSpectrum: has value = no and is active, comment = P(k) tabulated in a file
0 1509.2 InputSpectrum_UnitLength_in_cm: has value = 0 and is active, comment = units of tabulated P(k), or 0 if it is in h/Mpc
0 1509.2 WDM_PartMass_in_kev: has value = 0.0 and is active, comment = WDM cut following Bode, Ostriker & Turok (2001)
0 1509.2 BoundaryLayerFactor: has value = 3.0 and is active, comment = width of the boundary layer for fragmentation
0 1509.2 MaxMem: has value = 3600 and is active, comment = max available memory to an MPI task in Mbyte
0 1509.2 MaxMemPerParticle: has value = 150 and is active, comment = max available memory in bytes per particle
0 1509.2 PredPeakFactor: has value = 0.8 and is active, comment = guess for the number of peaks in the subvolume
0 1509.2 CatalogInAscii: is a flag and is active, comment = catalogs are written in ascii and not in binary format
0 1509.2 OutputInH100: is a flag and is active, comment = units are in H=100 instead of the true H value
0 1509.2 NumFiles: has value = 1 and is active, comment = number of files in which each catalog is written
0 1509.2 MinHaloMass: has value = 10 and is active, comment = smallest halo that is given in output
0 1509.2 AnalyticMassFunction: has value = 9 and is active, comment = form of analytic mass function given in the .mf.out files
0 1509.2 WriteSnapshot: is a flag and is inactive, comment = writes a Gadget2 snapshot as an output
0 1509.2 DoNotWriteCatalogs: is a flag and is inactive, comment = skips the writing of full catalogs (including PLC)
0 1509.2 DoNotWriteHistories: is a flag and is inactive, comment = skips the writing of merger histories
0 1509.2 WriteFmax: is a flag and is inactive, comment = writes the values of the Fmax field, particle by particle
0 1509.2 WriteVmax: is a flag and is inactive, comment = writes the values of the Vmax field, particle by particle
0 1509.2 WriteRmax: is a flag and is inactive, comment = writes the values of the Rmax field, particle by particle
0 1509.2 WriteDensity: is a flag and is inactive, comment = writes the linear density, particle by particle
0 1509.2 StartingzForPLC: has value = 0.3 and is active, comment = starting (highest) redshift for the past light cone
0 1509.2 LastzForPLC: has value = 0.0 and is active, comment = final (lowest) redshift for the past light cone
0 1509.2 PLCAperture: has value = 30 and is active, comment = cone aperture for the past light cone
0 1509.2 DeltaF_PLC: has value = 0.90 and is active, comment = build_groups.c:102
0 1509.2 PLCProvideConeData: is a flag and is inactive, comment = read vertex and direction of cone from paramter file
0 1509.2 PLCCenter: has value = 0. 0. 0. and is inactive, comment = cone vertex in the same coordinates as the BoxSize
0 1509.2 PLCAxis: has value = 1. 1. 0. and is inactive, comment = un-normalized direction of the cone axis
0 1509.2 Redshift active: 0.0
0 1509.2 run_planner planning a run on nodes with 16 Gb or RAM each, with 1 tasks per node
0 1509.2
0 1509.2 **
0 1509.2 This is the standard PINOCCHIO output
0 1509.2 **
0 1509.2 [Thu Mar 23 2023 12:54:51] This is pinocchio V5.0, running on 1 MPI tasks
0 1509.2
0 1509.2 This version uses 3LPT displacements
0 1509.2 Radiation is not included in the Friedmann equations
0 1509.2 Ellipsoidal collapse will be computed as Monaco (1995)
0 1509.2
0 1509.2 Reading parameters from file /workspace/tmp/karabo_folder/89f02710-00a1-442b-bc62-e7768fd6c11a/parameter_file
0 1509.2 Flag for this run: unittest
0 1509.2
0 1509.2 PARAMETER VALUES from file /workspace/tmp/karabo_folder/89f02710-00a1-442b-bc62-e7768fd6c11a/parameter_file:
0 1509.2 Omega0 0.250000
0 1509.2 OmegaLambda 0.750000
0 1509.2 OmegaBaryon 0.044000
0 1509.2 DE EoS parameters -1.000000 0.000000
0 1509.2 Hubble100 0.700000
0 1509.2 Sigma8 0.800000
0 1509.2 PrimordialIndex 0.960000
0 1509.2 RandomSeed 486604
0 1509.2 OutputList outputs
0 1509.2 Number of outputs 1
0 1509.2 Output redshifts 0.000000
0 1509.2 GridSize 200 200 200
0 1509.2 BoxSize (true Mpc) 714.285714
0 1509.2 BoxSize (Mpc/h) 500.000000
0 1509.2 Particle Mass (true Msun) 1.54883e+12
0 1509.2 Particle Mass (Msun/h) 1.08418e+12
0 1509.2 Inter-part dist (true Mpc) 3.571429
0 1509.2 Inter-part dist (Mpc/h) 2.500000
0 1509.2 MinHaloMass (particles) 10
0 1509.2 MinHaloMass (Msun/h) 1.08418e+13
0 1509.2 BoundaryLayerFactor 3.000000
0 1509.2 MaxMem per task (Mb) 3600
0 1509.2 MaxMem per particle (b) 150.000000
0 1509.2 CatalogInAscii 1
0 1509.2 NumFiles 1
0 1509.2 DoNotWriteCatalogs 0
0 1509.2 DoNotWriteHistories 0
0 1509.2 WriteTimelessSnapshot 0
0 1509.2 OutputInH100 1
0 1509.2 WriteDensity 0
0 1509.2 WriteProducts 0
0 1509.2 DumpProducts 0
0 1509.2 ReadProductsFromDumps 0
0 1509.2 DeltaF_PLC 0.900000
0 1509.2 Using Watson et al. (2013) for the analytic mass function
0 1509.2
0 1509.2
0 1509.2 GENIC parameters:
0 1509.2 InputSpectrum_UnitLength_in_cm 0.000000
0 1509.2 FileWithInputSpectrum no
0 1509.2 WDM_PartMass_in_kev 0.000000
0 1509.2
0 1509.2 Power spectrum will be given by the Einsenstein & Hu fit
0 1509.2 Normalization constant for the power spectrum: 2.03146e+07
0 1509.2
0 1509.2 **
0 1509.2 End of standard PINOCCHIO output
0 1509.2 **
0 1509.2
0 1509.2 needed overhead: 0.403200
0 1509.2
0 1509.2 **
0 1509.2 set_subboxes output, IGNORE ERROR MESSAGES
0 1509.2 **
0 1509.2
0 1509.2
0 1509.2 FRAGMENTATION:
0 1509.2 Reference number of particles: 8000000
0 1509.2 Requested bytes per particle: 155
0 1509.2 Number of sub-boxes per dimension: 1 1 1
0 1509.2 Periodic boundary conditions: 1 1 1
0 1509.2 Core 0 will work on a grid: 200 200 200
0 1509.2 The resolved box will be: 200 200 200
0 1509.2 Boundary layer: 0 0 0
0 1509.2 Boundary layer factor: 3.000000
0 1509.2 Number of total particles for core 0: 8000000
0 1509.2 Number of good particles for core 0: 8000000
0 1509.2 Particles that core 0 will allocate: 7634991
0 1509.2 Allowed overhead for boundary layer: 0.954374
0 1509.2 Largest halo expected in this box at z=0.000000: 6.771607e+15 Msun
0 1509.2 its Lagrangian size: 14.982622 Mpc ( 4.20 grid points)
0 1509.2 this requires a boundary layer of 12.59 grid points
0 1509.2
0 1509.2 The mass function will be computed from Log M=13.189953 to Log M=15.980692 (56 bins)
0 1509.2
0 1509.2
0 1509.2 **
0 1509.2
0 1509.2 I tried MaxMemPerParticle = 155; Nnodes = 1; NTasks = 1; MyGrids[0].ParticlesPerTask = 8000000; new MaxMemPerParticle = 155
0 1509.2
0 1509.2 **
0 1509.2 Density std dev on grid for this run: 1.954428
0 1509.2 We assume a value of MaxMemPerParticle of 155
0 1509.2 The number of nodes needed for this run is: 1
0 1509.2 Number of MPI tasks used for the run: 1
0 1509.2
0 1509.2 A successful PINOCCHIO run is determined by these parameters:
0 1509.2 - MaxMem: is the memory available to the MPI task;
0 1509.2 - MaxMemPerParticle: is the memory per particle that can be allocated;
0 1509.2 - BoundaryLayerFactor: is the depth of the boundary layer.
0 1509.2
0 1509.2 Each task adds to the fragmentation sub-box a layer as deep
0 1509.2 as BoundaryLayerFactor times the Lagrangian size of the largest
0 1509.2 halo expected in the box. The augmented sub-box will be smaller than the whole box.
0 1509.2 In the new fragmentation, halos at the border of the fragmentation sub-volume
0 1509.2 are augmented by BoundaryLayerFactor times their Lagrangian size.
0 1509.2 Suggested value: 1.0 for the classic fragmentation, 3.0 for the new fragmentation.
0 1509.2
0 1509.2 - PredPeakFactor: the number of allocated peaks will be PredPeakFactor times 1/6 of
0 1509.2 the particles in the sub-box (without boundary layer). If this is kept small
0 1509.2 there might be not enough space to allocate peaks.
0 1509.2 Suggested value: 0.6 at low resolution (Mpart~1e11 Msun/h),
0 1509.2 0.8 at medium resolution (Mpart~1e9 Msun/h), >1.0 at higher resolution.
0 1509.2 At the end of a run, the code suggests a minimum value for PredPeakFactor,
0 1509.2 but keep a margin to it.
0 1509.2
0 1509.2
0 1509.2 Map of memory usage for Task 0:
0 1509.2 Task N. mem(MB) overhead products fields ffts fmax frag pr. groups fragment total bytes per particle
0 1509.2 0 1183 1.0 56.0 80.2 16.1 152.2 53.4 45.6 155.0 155.0
0 1509.2
0 1509.2 Complete memory map
0 1509.2 memory.prods: 448000000, 56.0 bpp
0 1509.2 memory.fields_to_keep 257280000, 32.2 bpp
0 1509.2 memory.fields 384000000, 48.0 bpp
0 1509.2 memory.first_allocated: 1089280000, 136.2 bpp
0 1509.2 memory.fft: 128640000, 16.1 bpp
0 1509.2 memory.fmax_total: 1217920000, 152.2 bpp
0 1509.2 memory.frag_prods: 427559496, 53.4 bpp
0 1509.2 memory.frag_arrays: 183239784, 22.9 bpp
0 1509.2 memory.groups: 181199904, 22.6 bpp
0 1509.2 memory.frag_allocated: 1239999184, 155.0 bpp
0 1509.2 memory.frag_total: 1239999184, 155.0 bpp
0 1509.2 memory.all: 1239999184, 155.0 bpp
0 1509.2
0 1509.2 Number of nodes needed for this run: 1
0 1509.2 Number of MPI tasks used for the run: 1
0 1509.2 Each MPI task will need at least 1183.556152 Mb
0 1509.2 This run will occupy memory of 0.07 nodes, 7.22 percent of available memory
0 1509.2 Density standard deviation on the grid: 1.954428
0 1509.2 Predicted overhead: 0.403200
0 1509.2 You can copy and paste these into the parameter file:
0 1509.2 MaxMem 16384
0 1509.2 MaxMemPerParticle 155
0 1509.2 PredPeakFactor 0.6
0 1509.2 BoundaryLayerFactor 3.0
0 1509.2
0 1509.2
0 1509.2 **
0 1509.2 I'm now checking parameters and directives, this may give error messages
0 1509.2 **
0 1509.2 **
0 1509.2 This is the number of sub-boxes per dimension: 1 1 1
0 1509.2 Their products MUST give the number of tasks.
0 1509.2 The more similar they are, the better (unless some of them is 1).
0 1509.2 If the run is large and the three numbers are not as similar as possible,
0 1509.2 try to change the number of tasks per node or to ask for a specific number of nodes to achieve a better balance.
0 1509.2
0 1509.2 run_planner done!
0 1509.2 past light cone at /workspace/tmp/karabo_folder/89f02710-00a1-442b-bc62-e7768fd6c11a/pinocchio.unittest.plc.out
0 1509.2 ----------------------------- Captured stderr call -----------------------------
0 1509.2 --------------------------------------------------------------------------
0 1509.2 mpirun has detected an attempt to run as root.
0 1509.2
0 1509.2 Running as root is strongly discouraged as any mistake (e.g., in
0 1509.2 defining TMPDIR) or bug can result in catastrophic damage to the OS
0 1509.2 file system, leaving your system in an unusable state.
0 1509.2
0 1509.2 We strongly suggest that you run mpirun as a non-root user.
0 1509.2
0 1509.2 You can override this protection by adding the --allow-run-as-root option
0 1509.2 to the cmd line or by setting two environment variables in the following way:
0 1509.2 the variable OMPI_ALLOW_RUN_AS_ROOT=1 to indicate the desire to override this
0 1509.2 protection, and OMPI_ALLOW_RUN_AS_ROOT_CONFIRM=1 to confirm the choice and
0 1509.2 add one more layer of certainty that you want to do so.
0 1509.2 We reiterate our advice against doing so - please proceed at your own risk.
0 1509.2 --------------------------------------------------------------------------
0 1509.2 =============================== warnings summary ===============================
0 1509.2 ../../opt/conda/lib/python3.9/site-packages/astropy/_erfa/__init__.py:12
0 1509.2 /opt/conda/lib/python3.9/site-packages/astropy/_erfa/__init__.py:12: AstropyDeprecationWarning: The private astropy._erfa module has been made into its own package, pyerfa, which is a dependency of astropy and can be imported directly using "import erfa"
0 1509.2 warnings.warn('The private astropy._erfa module has been made into its '
0 1509.2
0 1509.2 karabo/test/test_beam.py: 2 warnings
0 1509.2 karabo/test/test_image.py: 5 warnings
0 1509.2 karabo/test/test_long_observation.py: 1 warning
0 1509.2 karabo/test/test_mock_mightee.py: 9 warnings
0 1509.2 karabo/test/test_source_detection.py: 1 warning
0 1509.2 karabo/test/test_telescope_baselines.py: 1 warning
0 1509.2 /opt/conda/lib/python3.9/site-packages/ska_sdp_datamodels/visibility/vis_model.py:430: DeprecationWarning: flagged_imaging_weight is deprecated, please use flagged_weight instead
0 1509.2 warnings.warn(
0 1509.2
0 1509.2 karabo/test/test_beam.py: 2 warnings
0 1509.2 karabo/test/test_image.py: 5 warnings
0 1509.2 karabo/test/test_long_observation.py: 1 warning
0 1509.2 karabo/test/test_mock_mightee.py: 9 warnings
0 1509.2 karabo/test/test_source_detection.py: 1 warning
0 1509.2 karabo/test/test_telescope_baselines.py: 1 warning
0 1509.2 /opt/conda/lib/python3.9/site-packages/ska_sdp_datamodels/visibility/vis_model.py:179: DeprecationWarning: imaging_weight is deprecated, please use weight instead
0 1509.2 warnings.warn(
0 1509.2
0 1509.2 karabo/test/test_beam.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_long_observation.py::MyTestCase::test_eidosbeam
0 1509.2 /opt/conda/lib/python3.9/site-packages/eidos/create_beam.py:23: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray.
0 1509.2 return np.array(params), C['nu']
0 1509.2
0 1509.2 karabo/test/test_beam.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_beam.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_beam.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_beam.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_long_observation.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_long_observation.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_long_observation.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_long_observation.py::MyTestCase::test_eidosbeam
0 1509.2 /opt/conda/lib/python3.9/site-packages/eidos/spatial.py:28: DeprecationWarning: `np.complex` is a deprecated alias for the builtin `complex`. To silence this warning, use `complex` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.complex128` here.
0 1509.2 Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
0 1509.2 self.coeffs_J = self.coeffs_trunc_J = np.zeros((data.shape[0], data.shape[1], self.Nmodes), dtype=np.complex)
0 1509.2
0 1509.2 karabo/test/test_beam.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_beam.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_beam.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_beam.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_long_observation.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_long_observation.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_long_observation.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_long_observation.py::MyTestCase::test_eidosbeam
0 1509.2 /opt/conda/lib/python3.9/site-packages/eidos/spatial.py:29: DeprecationWarning: `np.complex` is a deprecated alias for the builtin `complex`. To silence this warning, use `complex` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.complex128` here.
0 1509.2 Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
0 1509.2 self.recon_full_J = self.recon_trunc_J = np.zeros(data.shape, dtype=np.complex)
0 1509.2
0 1509.2 karabo/test/test_beam.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_beam.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_beam.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_beam.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_long_observation.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_long_observation.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_long_observation.py::MyTestCase::test_eidosbeam
0 1509.2 karabo/test/test_long_observation.py::MyTestCase::test_eidosbeam
0 1509.2 /opt/conda/lib/python3.9/site-packages/eidos/spatial.py:30: DeprecationWarning: `np.complex` is a deprecated alias for the builtin `complex`. To silence this warning, use `complex` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.complex128` here.
0 1509.2 Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
0 1509.2 if 'recon' in self.mode: self.recon = np.zeros((2,2,self.npix,self.npix), dtype=np.complex)
0 1509.2
0 1509.2 karabo/test/test_beam.py: 16 warnings
0 1509.2 karabo/test/test_long_observation.py: 16 warnings
0 1509.2 /opt/conda/lib/python3.9/site-packages/eidos/spatial.py:108: DeprecationWarning: `np.float` is a deprecated alias for the builtin `float`. To silence this warning, use `float` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.float64` here.
0 1509.2 Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
0 1509.2 grid = (np.indices((nx, ny), dtype=np.float) - nx/2) / (nx*1./2) # create unit grid [-1,1]
0 1509.2
0 1509.2 karabo/test/test_beam.py: 16 warnings
0 1509.2 karabo/test/test_long_observation.py: 16 warnings
0 1509.2 /opt/conda/lib/python3.9/site-packages/eidos/spatial.py:154: DeprecationWarning: Calling np.sum(generator) is deprecated, and in the future will give a different result. Use np.sum(np.fromiter(generator)) or the python sum builtin instead.
0 1509.2 return np.sum(C[i] * self.zernikel(val, self.grid_rho, self.grid_phi)*self.grid_mask for (i, val) in enumerate(I))
0 1509.2
0 1509.2 karabo/test/test_beam.py: 6812 warnings
0 1509.2 karabo/test/test_long_observation.py: 6812 warnings
0 1509.2 /opt/conda/lib/python3.9/site-packages/eidos/spatial.py:63: DeprecationWarning: Using factorial() with floats is deprecated
0 1509.2 pre_fac = lambda k: (-1.0)**k * fac(n-k) / ( fac(k) * fac( (n+m)/2.0 - k ) * fac( (n-m)/2.0 - k ) )
0 1509.2
0 1509.2 karabo/test/test_data.py::TestData::test_download_gleam_and_make_sky_model
0 1509.2 karabo/test/test_image.py::TestImage::test_explore_sky
0 1509.2 karabo/test/test_skymodel.py::TestSkyModel::test_plot_gleam
0 1509.2 karabo/test/test_skymodel.py::TestSkyModel::test_read_healpix_map
0 1509.2 /opt/conda/lib/python3.9/site-packages/karabo/simulation/sky_model.py:453: RuntimeWarning: invalid value encountered in log10
0 1509.2 flux = cfun(flux)
0 1509.2
0 1509.2 karabo/test/test_long_observation.py::MyTestCase::test_long_observations
0 1509.2 karabo/test/test_long_observation.py::MyTestCase::test_long_observations
0 1509.2 /opt/conda/lib/python3.9/site-packages/erfa/core.py:154: ErfaWarning: ERFA function "d2dtf" yielded 1 of "dubious year (Note 5)"
0 1509.2 warnings.warn('ERFA function "{}" yielded {}'.format(func_name, wmsg),
0 1509.2
0 1509.2 karabo/test/test_mock_mightee.py::TestSystemNoise::test_mightee_download
0 1509.2 karabo/test/test_mock_mightee.py::TestSystemNoise::test_mightee_download
0 1509.2 karabo/test/test_mock_mightee.py::TestSystemNoise::test_mock_mightee
0 1509.2 /opt/conda/lib/python3.9/site-packages/astropy/units/core.py:2042: UnitsWarning: 'DEG' did not parse as fits unit: At col 0, Unit 'DEG' not supported by the FITS standard. Did you mean EG, Eg, dG, deg or dg? If this is meant to be a custom unit, define it with 'u.def_unit'. To have it recognized inside a file reader or other code, enable it with 'u.add_enabled_units'. For details, see https://docs.astropy.org/en/latest/units/combining_and_defining.html
0 1509.2 warnings.warn(msg, UnitsWarning)
0 1509.2
0 1509.2 karabo/test/test_mock_mightee.py::TestSystemNoise::test_mightee_download
0 1509.2 karabo/test/test_mock_mightee.py::TestSystemNoise::test_mightee_download
0 1509.2 karabo/test/test_mock_mightee.py::TestSystemNoise::test_mock_mightee
0 1509.2 /opt/conda/lib/python3.9/site-packages/astropy/units/core.py:2042: UnitsWarning: 'JY' did not parse as fits unit: At col 0, Unit 'JY' not supported by the FITS standard. Did you mean Jy, YJy, ZJy, yJy or zJy? If this is meant to be a custom unit, define it with 'u.def_unit'. To have it recognized inside a file reader or other code, enable it with 'u.add_enabled_units'. For details, see https://docs.astropy.org/en/latest/units/combining_and_defining.html
0 1509.2 warnings.warn(msg, UnitsWarning)
0 1509.2
0 1509.2 karabo/test/test_mock_mightee.py::TestSystemNoise::test_mightee_download
0 1509.2 karabo/test/test_mock_mightee.py::TestSystemNoise::test_mightee_download
0 1509.2 karabo/test/test_mock_mightee.py::TestSystemNoise::test_mock_mightee
0 1509.2 /opt/conda/lib/python3.9/site-packages/astropy/units/core.py:2042: UnitsWarning: 'JY/BEAM' did not parse as fits unit: At col 0, Unit 'JY' not supported by the FITS standard. Did you mean Jy, YJy, ZJy, yJy or zJy? If this is meant to be a custom unit, define it with 'u.def_unit'. To have it recognized inside a file reader or other code, enable it with 'u.add_enabled_units'. For details, see https://docs.astropy.org/en/latest/units/combining_and_defining.html
0 1509.2 warnings.warn(msg, UnitsWarning)
0 1509.2
0 1509.2 karabo/test/test_mock_mightee.py::TestSystemNoise::test_mightee_download
0 1509.2 karabo/test/test_mock_mightee.py::TestSystemNoise::test_mightee_download
0 1509.2 karabo/test/test_mock_mightee.py::TestSystemNoise::test_mock_mightee
0 1509.2 /opt/conda/lib/python3.9/site-packages/astropy/units/core.py:2042: UnitsWarning: 'HZ' did not parse as fits unit: At col 0, Unit 'HZ' not supported by the FITS standard. Did you mean Hz, YHz, ZHz, yHz or zHz? If this is meant to be a custom unit, define it with 'u.def_unit'. To have it recognized inside a file reader or other code, enable it with 'u.add_enabled_units'. For details, see https://docs.astropy.org/en/latest/units/combining_and_defining.html
0 1509.2 warnings.warn(msg, UnitsWarning)
0 1509.2
0 1509.2 karabo/test/test_mock_mightee.py::TestSystemNoise::test_mock_mightee
0 1509.2 /opt/conda/lib/python3.9/site-packages/astropy/wcs/wcsapi/fitswcs.py:604: AstropyUserWarning: No observer defined on WCS, SpectralCoord will be converted without any velocity frame change
0 1509.2 warnings.warn(f'{msg}, SpectralCoord '
0 1509.2
0 1509.2 karabo/test/test_notebooks.py::TestJupyterNotebooks::test_source_detection_assesment_notebook
0 1509.2 /opt/conda/lib/python3.9/site-packages/jupyter_client/connect.py:20: DeprecationWarning: Jupyter is migrating its paths to use standard platformdirs
0 1509.2 given by the platformdirs library. To remove this warning and
0 1509.2 see the appropriate new directories, set the environment variable
0 1509.2 `JUPYTER_PLATFORM_DIRS=1` and then run `jupyter --paths`.
0 1509.2 The use of platformdirs will be the default in `jupyter_core` v6
0 1509.2 from jupyter_core.paths import jupyter_data_dir, jupyter_runtime_dir, secure_write
0 1509.2
0 1509.2 karabo/test/test_source_detection.py::TestSourceDetection::test_bdsf_image_blanked
0 1509.2 /opt/conda/lib/python3.9/site-packages/bdsf/collapse.py:234: RuntimeWarning: invalid value encountered in true_divide
0 1509.2 ch0 = ch0/sumwts
0 1509.2
0 1509.2 karabo/test/test_source_detection.py::TestSourceDetection::test_bdsf_image_blanked
0 1509.2 /opt/conda/lib/python3.9/site-packages/bdsf/collapse.py:304: RuntimeWarning: invalid value encountered in double_scalars
0 1509.2 img.frequency = sumfrq / sumwts
0 1509.2
0 1509.2 karabo/test/test_source_detection.py::TestSourceDetection::test_source_detection_plot
0 1509.2 /opt/conda/lib/python3.9/site-packages/karabo/sourcedetection/evaluation.py:444: RuntimeWarning: More than 20 figures have been opened. Figures created through the pyplot interface (`matplotlib.pyplot.figure`) are retained until explicitly closed and may consume too much memory. (To control this warning, see the rcParam `figure.max_open_warning`). Consider using `matplotlib.pyplot.close()`.
0 1509.2 fig, ax = plt.subplots()
0 1509.2
0 1509.2 -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
0 1509.2 =========================== short test summary info ============================
0 1509.2 FAILED karabo/test/test_beam.py::MyTestCase::test_compare_karabo_oskar - Runt...
0 1509.2 FAILED karabo/test/test_pinocchio.py::TestPinocchio::testSimpleInstance - Fil...
0 1509.2 ===== 2 failed, 59 passed, 5 skipped, 13776 warnings in 1494.79s (0:24:54) =====
0 1509.2 WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
0 1509.2
0 1509.2
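For reference, both failures above boil down to a path that is expected to exist not being there at test time: OSKAR reports that the telescope model directory './karabo/data/meerkat.tm' (a path relative to the working directory) does not exist, and the Pinocchio test cannot find pinocchio.unittest.plc.out. A minimal sketch of failing fast with a clearer message; `assert_exists` is a hypothetical helper for illustration, not part of Karabo:

```python
# Sketch only; the helper name is illustrative and not Karabo API.
import os


def assert_exists(path: str, what: str) -> None:
    """Fail fast with an explicit message instead of OSKAR's generic
    'file I/O error' or numpy's FileNotFoundError deep inside loadtxt()."""
    if not os.path.exists(path):
        raise FileNotFoundError(
            f"{what} not found at {os.path.abspath(path)} (cwd={os.getcwd()})"
        )


# Possible usage in the tests, before sim.run() / before the catalog is loaded:
#   assert_exists("./karabo/data/meerkat.tm", "Telescope model directory")
#   assert_exists(p.outLightConePath, "Pinocchio past-light-cone catalog")
```

Including the current working directory in the message would show immediately whether the relative paths are the culprit on the runners.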