Closed: MGJvanGroeningen closed this issue 3 years ago.
Hi @MGJvanGroeningen,
Per my email from earlier today, I would actually suggest making sure you can get the sloshing example to work by itself first in pyXSIM.
What version of yt are you using for this SPH dataset? We will probably have to get you going on the beta version of yt 4.
Could you also post the sample dataset somewhere (Dropbox, for example) for me to look at? I just want to make sure the extract from Magneticum has everything we need in the header, etc.
I am using version 3.6.0
I sent you an email with the snapshot dataset.
@MGJvanGroeningen it looks like this dataset has some negative smoothing lengths. Can you check this out?
Yes, I also find this with the following code:
ds = yt.load(filename)
ad = ds.all_data()
print(min(ad[('Gas', 'SmoothingLength')]), max(ad[('Gas', 'SmoothingLength')]))
I have downloaded 6 clusters (the first three are very big, the last three smaller/average sized) and they give the following minima and maxima from the previous code,
min: -2.000000238418579 code_length, max: 3.4022080980779138e+38 code_length
min: -3.8101744086522266e+17 code_length, max: 2.0000009536743164 code_length
min: -32720715776.0 code_length, max: 3.3248459172471854e+38 code_length
min: -4.611116137110241e+36 code_length, max: 2.0 code_length
min: -3.6941305848172413e+31 code_length, max: 2.000000238418579 code_length
min: -2.0 code_length, max: 2.9278974591254463e+38 code_length
These don't seem like reasonable values for smoothing length to me, so I'm not sure what's going on. A pattern I can see is the recurring value 2.0 and -2.0.
I've previously been using the package pynbody to visualize the snapshots. With the following code
import pynbody as nb
ds = nb.load(filename)
print(min(ds.gas['smooth']), max(ds.gas['smooth']))
I get (in the same file order), in kpc a h^-1:
min: 3.4977043, max: 868.9412
min: 3.623284, max: 1062.2195
min: 3.0887933, max: 874.7076
min: 11.889386, max: 660.1877
min: 6.2942877, max: 415.8486
min: 10.250403, max: 497.17657
which seems more reasonable.
Something else I don't understand: the yt command ds.field_list shows a ('Gas', 'SmoothingLength') field, but no ('Halo', 'SmoothingLength') or ('Stars', 'SmoothingLength'), even though I presume the halo and star particles also have a smoothing length.
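(For reference, I'm checking which particle types carry that field with something like this quick sketch, where filename is the snapshot path from above:)
import yt

ds = yt.load(filename)
# print every on-disk field named SmoothingLength, with its particle type
print([f for f in ds.field_list if f[1] == "SmoothingLength"])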
If yt's loading and gridding of the data turns out to be a problem, we could use @jdonnert's sph2grid code to grid the data. It would be interesting to see how various gridding kernels impact the simulated images, e.g. how D20 compares to the SPH kernel of Price+ 2012. This is extra work, though, so I would only go down this route if yt really poses a problem.
@MGJvanGroeningen the halo and star particles should not have a smoothing length in yt, because this is only necessary for gas.
I'll be looking at this today.
@MGJvanGroeningen @Joshuaalbert By the way, one other quick question: does Magneticum allow you to dump the particle data in HDF5 format? That is far superior to the unformatted Gadget format and would make things a heck of a lot easier to diagnose.
So something is going on in the file which yt does not expect but pynbody does. It's either an issue with the header or the data structure itself. It may be an endian issue, or size of data structure, etc. But yt is supposed to be able to catch all of these things, so I'm not sure what's going on.
I have a question out to Klaus Dolag about this.
Thank you @jzuhone for your help in solving this! Re: hdf5 format, I think we can only get snapshots in gadget. Klaus may say otherwise though.
I doubt there is HDF5 for Magneticum; that library is not fun to use and (I have heard) doesn't really perform well at large scales. This is almost certainly Gadget format 2 with extra extensions, e.g. for the Peano-Hilbert index. It should be backwards compatible, though, so if one tool knows format 2 and the other doesn't, that could explain the issue. Format 2 includes a small header before every block containing 4 chars and a 4-byte int: the block name and the block size. So it gives you an offset of 8 bytes per block.
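If it helps, peeking at such a block header looks roughly like this (a sketch only, assuming little-endian data and the usual 4-byte Fortran record markers around each record; this is not the reader yt or pynbody actually uses):
import struct

def peek_format2_block(f):
    # each format-2 block is preceded by a small 8-byte header record:
    # 4 ASCII chars (block name) + 4-byte int (size of the next block)
    f.read(4)                                   # opening Fortran record marker
    name = f.read(4).decode("ascii").rstrip()   # e.g. "HEAD", "POS ", "HSML"
    (block_size,) = struct.unpack("<i", f.read(4))
    f.read(4)                                   # closing Fortran record marker
    return name, block_size

with open("snap_026", "rb") as f:
    print(peek_format2_block(f))                # first block should be "HEAD"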
@jdonnert yt knows format 2, so there's something about this particular file that's odd. I'm looking into it.
Ok, so one of the things is that the particle IDs are uint64, and there's a switch to account for that:
ds = yt.load("snap_026", long_ids=True)
That allows me to load the data, and the smoothing lengths look fine, but then I get negative particle masses. I'll keep digging, but this is a start.
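For reference, the quick check I'm doing on the masses is roughly this (a sketch, continuing from the load above):
ds = yt.load("snap_026", long_ids=True)
ad = ds.all_data()
masses = ad[("Gas", "Mass")]
# negative values here mean the read is still going wrong somewhere
print(masses.min(), masses.max())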
@Joshuaalbert @MGJvanGroeningen found the fix, more details tomorrow
@Joshuaalbert @MGJvanGroeningen
Ok, so here's the deal. First, you need to install yt 4.0, which is in beta and is awfully close to being released. You'll have to do this from the GitHub source.
If your Python installation is Conda, use these instructions:
https://yt-project.org/docs/dev/installing.html#building-yt-from-source-for-conda-based-installs
Otherwise, do this:
https://yt-project.org/docs/dev/installing.html#installing-yt-from-source
For that second option, you may not need the --user or --install-option flags; see the instructions there.
Once you have that going, the SimCut file can be loaded in the usual way, but there are a couple of tricks needed to make yt understand the dataset. This notebook shows how to handle them:
https://gist.github.com/jzuhone/af699eb8d2cc1bfd63938307ccda933b
Once you can get that notebook working let me know and we can proceed to pyXSIM.
@jzuhone Thank you! I've been able to run most of the code in the snap_026_yt.ipynb notebook you posted, after installing yt 4.0.
The SlicePlot and PhasePlot give very nice images, but yt.ProjectionPlot unfortunately gives me an error. In the IOHandlerGadgetBinary._read_particle_coords() function (the last one in the 2nd traceback), the variable pos gets a None value, because tp[ptype] is zero in pos = self._read_field_from_file(f, tp[ptype], "Coordinates"). It seems the 'Gas' key in the dictionary created by tp = data_file.total_particles has the value zero at that moment.
The IOHandlerGadgetBinary._read_particle_coords() function completes without errors about ~50 times in the cell with the ProjectionPlot call before giving the error. In those iterations, the value of the 'Gas' key is not zero, but 262144 for the first ~40 iterations and 91118 for the last ~10.
I also get an error while trying to import pyxsim, when using yt 4.0.
Traceback (most recent call last):
File "/home/matthijs/PycharmProjects/TensorflowTest/pyXSIM/snap_026_yt.py", line 3, in <module>
import pyxsim
File "/home/matthijs/anaconda3/envs/tf_py/lib/python3.8/site-packages/pyxsim/__init__.py", line 7, in <module>
from pyxsim.source_generators import \
File "/home/matthijs/anaconda3/envs/tf_py/lib/python3.8/site-packages/pyxsim/source_generators/__init__.py", line 1, in <module>
from .xray_binaries import \
File "/home/matthijs/anaconda3/envs/tf_py/lib/python3.8/site-packages/pyxsim/source_generators/xray_binaries.py", line 2, in <module>
from yt.frontends.stream.api import load_particles
ImportError: cannot import name 'load_particles' from 'yt.frontends.stream.api' (/home/matthijs/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/yt/frontends/stream/api.py)
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-6-72e5947b4bee> in <module>
1 # Make a projection of the gas density and the hot gas density.
2 # We set the center to the maximum density location, and the width to 4 Mpc.
----> 3 p = yt.ProjectionPlot(ds, "z", [("gas", "density"), ("hot_gas","density")], center='max', width=(4.0,"Mpc"))
4 p.show()
~/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/yt/visualization/plot_window.py in __init__(self, ds, axis, fields, center, width, axes_unit, weight_field, max_level, origin, right_handed, fontsize, field_parameters, data_source, method, proj_style, window_size, buff_size, aspect)
1769 max_level=max_level,
1770 )
-> 1771 PWViewerMPL.__init__(
1772 self,
1773 proj,
~/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/yt/visualization/plot_window.py in __init__(self, *args, **kwargs)
816 self._plot_type = kwargs.pop("plot_type")
817 self._splat_color = kwargs.pop("splat_color", None)
--> 818 PlotWindow.__init__(self, *args, **kwargs)
819
820 def _setup_origin(self):
~/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/yt/visualization/plot_window.py in __init__(self, data_source, bounds, buff_size, antialias, periodic, origin, oblique, right_handed, window_size, fields, fontsize, aspect, setup)
238 self._field_transform[field] = linear_transform
239 self.setup_callbacks()
--> 240 self._setup_plots()
241
242 def __iter__(self):
~/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/yt/visualization/plot_window.py in _setup_plots(self)
949 zlim = (None, None)
950
--> 951 image = self.frb[f]
952 if self._field_transform[f] == log_transform:
953 msg = None
~/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/yt/visualization/fixed_resolution.py in __getitem__(self, item)
144 bounds.append(b)
145
--> 146 buff = self.ds.coordinates.pixelize(
147 self.data_source.axis,
148 self.data_source,
~/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/yt/geometry/coordinates/cartesian_coordinates.py in pixelize(self, dimension, data_source, field, bounds, size, antialias, periodic)
215
216 elif self.axis_id.get(dimension, dimension) < 3:
--> 217 return self._ortho_pixelize(
218 data_source, field, bounds, size, antialias, dimension, periodic
219 )
~/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/yt/geometry/coordinates/cartesian_coordinates.py in _ortho_pixelize(self, data_source, field, bounds, size, antialias, dim, periodic)
366 pixelize_sph_kernel_projection(
367 buff,
--> 368 chunk[ptype, px_name].to("code_length"),
369 chunk[ptype, py_name].to("code_length"),
370 chunk[ptype, "smoothing_length"].to("code_length"),
~/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/yt/data_objects/data_containers.py in __getitem__(self, key)
246 return self.field_data[f]
247 else:
--> 248 self.get_data(f)
249 # fi.units is the unit expression string. We depend on the registry
250 # hanging off the dataset to define this unit object.
~/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/yt/data_objects/selection_objects/data_selection_objects.py in get_data(self, fields)
200 self.field_data[f].convert_to_units(finfos[f].output_units)
201
--> 202 read_particles, gen_particles = self.index._read_particle_fields(
203 particles, self, self._current_chunk
204 )
~/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/yt/geometry/geometry_handler.py in _read_particle_fields(self, fields, dobj, chunk)
206 self._identify_base_chunk(dobj)
207 chunks = self._chunk_io(dobj, cache=False)
--> 208 fields_to_return = self.io._read_particle_selection(
209 chunks, selector, fields_to_read
210 )
~/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/yt/utilities/io_handler.py in _read_particle_selection(self, chunks, selector, fields)
183 # psize maps the names of particle types to the number of
184 # particles of each type
--> 185 self._count_particles_chunks(psize, chunks, ptf, selector)
186
187 # Now we allocate
~/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/yt/frontends/sph/io.py in _count_particles_chunks(self, psize, chunks, ptf, selector)
29 psize[ptype] += data_file.total_particles[ptype]
30 else:
---> 31 for ptype, (x, y, z), hsml in self._read_particle_coords(chunks, ptf):
32 psize[ptype] += selector.count_points(x, y, z, hsml)
33 return dict(psize)
~/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/yt/frontends/gadget/io.py in _read_particle_coords(self, chunks, ptf)
368 else:
369 hsml = 0.0
--> 370 yield ptype, (pos[:, 0], pos[:, 1], pos[:, 2]), hsml
371 f.close()
372
TypeError: 'NoneType' object is not subscriptable
@MGJvanGroeningen the yt bug you showed is one I fixed in one of my own branches, which I'm going to submit as a PR to the main yt repository soon. If you want to try installing that branch, do this:
git clone https://github.com/jzuhone/yt yt-jz
cd yt-jz
git checkout gadget_fixes
pip install .
That should fix the yt bug. As for the pyxsim bug, I'll have to look into this, which I'll do today. yt-4.0 is a moving target (slowly moving, but moving nonetheless) and so the API appears to have changed recently.
I've also been working on pyxsim 3.0, which is designed to work better with yt-4.0; we may want to try that.
@MGJvanGroeningen actually, we don't need pyxsim-3.0, we just need to install it from source at the master branch. Let's get yt working first, though.
@jzuhone Your yt branch indeed fixes the bug. I can run your notebook without errors now! Thanks again.
Hi @MGJvanGroeningen,
Sorry for the delay on this, due to the Thanksgiving holiday and other issues.
You can install pyxsim from Github at the current master branch and it should work with yt-4.0:
git clone https://github.com/jzuhone/pyxsim
cd pyxsim
pip install .
Then you can try adapting the cookbook recipe for the galaxy cluster for your own use and see if it works.
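To sketch what that adaptation looks like with the pyxsim 2.x-style API used elsewhere in this thread (the center, radius, exposure time, area, and metallicity below are placeholders, not values from the cookbook, and the API may differ in other versions):
import yt
import pyxsim

ds = yt.load("snap_026", long_ids=True)
sp = ds.sphere("max", (1.0, "Mpc"))  # sphere around the density peak

# spectral model, emin (keV), emax (keV), number of channels, metallicity (solar)
source_model = pyxsim.ThermalSourceModel("apec", 0.1, 11.0, 1000, Zmet=0.3)

exp_time = (300.0, "ks")   # exposure time
area = (1000.0, "cm**2")   # nominal collecting area
redshift = ds.current_redshift

photons = pyxsim.PhotonList.from_data_source(sp, redshift, area, exp_time,
                                              source_model)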
Hi @jzuhone
Thanks for the update. I get the following output and error after installing the master branch of pyxsim and running the (adapted) cookbook recipe for the galaxy cluster.
WARNING: AstropyDeprecationWarning: block_reduce was moved to the astropy.nddata.blocks module. Please update your import statement. [astropy.nddata.utils]
yt : [INFO ] 2020-12-03 15:56:27,870 Calculating time from 6.802e-01 to be 2.815e+17 seconds
yt : [INFO ] 2020-12-03 15:56:27,871 Assuming length units are in kpc/h (comoving)
yt : [INFO ] 2020-12-03 15:56:27,921 Parameters: current_time = 2.8150762954803088e+17 s
yt : [INFO ] 2020-12-03 15:56:27,921 Parameters: domain_dimensions = [1 1 1]
yt : [INFO ] 2020-12-03 15:56:27,921 Parameters: domain_left_edge = [0. 0. 0.]
yt : [INFO ] 2020-12-03 15:56:27,921 Parameters: domain_right_edge = [640000. 640000. 640000.]
yt : [INFO ] 2020-12-03 15:56:27,922 Parameters: cosmological_simulation = 1
yt : [INFO ] 2020-12-03 15:56:27,922 Parameters: current_redshift = 0.4701940848858872
yt : [INFO ] 2020-12-03 15:56:27,922 Parameters: omega_lambda = 0.728
yt : [INFO ] 2020-12-03 15:56:27,922 Parameters: omega_matter = 0.272
yt : [INFO ] 2020-12-03 15:56:27,922 Parameters: omega_radiation = 0.0
yt : [INFO ] 2020-12-03 15:56:27,922 Parameters: hubble_constant = 0.704
yt : [INFO ] 2020-12-03 15:56:27,927 Allocating for 3.504e+06 particles
Loading particle index: 100%|██████████| 6/6 [00:00<00:00, 117597.31it/s]
Loading particle index: 100%|██████████| 6/6 [00:00<00:00, 207982.02it/s]
yt : [INFO ] 2020-12-03 15:56:29,137 max value is 2.14216e-23 at 416853.6562500000000000 456102.5625000000000000 72669.3359375000000000
soxs : [INFO ] 2020-12-03 15:56:29,138 Using APEC version 3.0.9.
soxs : [INFO ] 2020-12-03 15:56:29,138 Using apec_v3.0.9_line.fits for generating spectral lines.
soxs : [INFO ] 2020-12-03 15:56:29,138 Using apec_v3.0.9_coco.fits for generating the continuum.
pyxsim : [INFO ] 2020-12-03 15:56:29,196 Cosmology: h = 0.704, omega_matter = 0.272, omega_lambda = 0.728
pyxsim : [INFO ] 2020-12-03 15:56:29,200 Using emission measure field '(Gas, emission_measure)'.
pyxsim : [INFO ] 2020-12-03 15:56:29,200 Using temperature field '(Gas, Temperature)'.
Preparing spectrum table : 100%|██████████| 51/51 [00:28<00:00, 1.78it/s]
Processing cells/particles : 24%|██▍ | 256871/1075654 [00:09<00:53, 15204.68it/s]
Traceback (most recent call last):
File "/home/matthijs/PycharmProjects/TensorflowTest/pyXSIM/pyXSIM_test2.py", line 86, in <module>
photons = pyxsim.PhotonList.from_data_source(sp, redshift, area, exp_time, source_model)
File "/home/matthijs/anaconda3/envs/tf_py/lib/python3.8/site-packages/pyxsim/photon_list.py", line 427, in from_data_source
photons["pos"].append(np.array([chunk[p_fields[0]].d[idxs],
File "/home/matthijs/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/data_objects/data_containers.py", line 248, in __getitem__
self.get_data(f)
File "/home/matthijs/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/data_objects/selection_objects/data_selection_objects.py", line 211, in get_data
self._generate_fields(fields_to_generate)
File "/home/matthijs/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/data_objects/selection_objects/data_selection_objects.py", line 232, in _generate_fields
fd = self._generate_field(field)
File "/home/matthijs/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/data_objects/data_containers.py", line 289, in _generate_field
tr = self._generate_fluid_field(field)
File "/home/matthijs/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/data_objects/data_containers.py", line 308, in _generate_fluid_field
rv = finfo(gen_obj)
File "/home/matthijs/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/fields/derived_field.py", line 288, in __call__
dd = self._function(self, data)
File "/home/matthijs/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/geometry/coordinates/coordinate_handler.py", line 21, in _coords
rv = data.ds.arr(data.fcoords[..., axi].copy(), units)
File "/home/matthijs/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/data_objects/selection_objects/data_selection_objects.py", line 418, in fcoords
return self._current_chunk.fcoords
File "/home/matthijs/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/geometry/geometry_handler.py", line 253, in cached_func
tr = self._accumulate_values(n[1:])
File "/home/matthijs/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/geometry/geometry_handler.py", line 290, in _accumulate_values
arrs.append(f(self.dobj))
File "/home/matthijs/anaconda3/envs/tf_py/lib/python3.8/site-packages/yt/data_objects/index_subobjects/particle_container.py", line 16, in _func_non_indexed
raise YTNonIndexedDataContainer(self)
yt.utilities.exceptions.YTNonIndexedDataContainer: The data container (<class 'yt.data_objects.index_subobjects.particle_container.ParticleContainer'>) is an unindexed type. Operations such as ires, icoords, fcoords and fwidth will not work on it.
@MGJvanGroeningen try pulling from the latest GitHub master of pyxsim now; it should work. I've created a notebook which goes through the steps I used. Crucially, when creating the ThermalSourceModel I've set a minimum temperature to avoid using gas which shouldn't be X-ray emitting. It's here:
https://gist.github.com/jzuhone/56d51566e6d8d51494e19bf8295ccefe
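Roughly, that minimum-temperature setting looks like this in the pyxsim API used above (the values are illustrative; the linked notebook is the authoritative version):
import pyxsim

# kT_min skips gas below 0.1 keV so cold particles contribute no X-ray photons
source_model = pyxsim.ThermalSourceModel("apec", 0.1, 11.0, 1000, Zmet=0.3,
                                         kT_min=0.1)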
Do you know if there are ElectronAbundance or Metallicity fields in this dataset? We can make this better by using those.
@jzuhone it works! I can run your notebook.
As for the electron abundance or metallicity, I think yes. The following paper has a table (Table 1) which I think lists the fields contained in the dataset.
https://ui.adsabs.harvard.edu/abs/2017A%26C....20...52R/abstract
I think the 'Zs' are the (11) metallicities (from Helium to Magnesium), but I'm not super sure.
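I could probably check which of these actually made it into the SimCut file with something like this (a quick sketch; the candidate field names are guesses based on that table, not confirmed):
import yt

ds = yt.load(filename, long_ids=True)
for f in ds.field_list:
    if f[1] in ("ElectronAbundance", "Metallicity", "Zs"):
        print(f)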
Hi @MGJvanGroeningen, @Joshuaalbert,
Please note that there is an example using Magneticum snapshots on the website now:
http://hea-www.cfa.harvard.edu/~jzuhone/pyxsim/cookbook/Advanced_Thermal_Emission.html
which includes using variable elements.
Important info regarding reading magneticum snapshots in yt is in these places:
https://yt-project.org/doc/examining/loading_data.html#field-specifications
https://yt-project.org/doc/examining/loading_data.html#long-particle-ids
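Roughly, those two options look like this when used together (a sketch; the field specification below is yt's documented default Gadget layout, not the actual Magneticum block list, so adjust it per those docs):
import yt

# Block layout of the snapshot; fields present only for some particle types
# are given as (name, particle_type) tuples.
my_field_spec = ("Coordinates", "Velocities", "ParticleIDs", "Mass",
                 ("InternalEnergy", "Gas"),
                 ("Density", "Gas"),
                 ("SmoothingLength", "Gas"))

ds = yt.load("snap_026", long_ids=True, field_spec=my_field_spec)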
I'm going to close this issue for now but please feel free to open a new one with any new questions.
Hi,
I'm doing a master's project with @joshuaalbert where I'm trying to simulate Chandra images from Magneticum cosmological simulation snapshots (boxsize = 640000 kpc), which use SPH particles. I've been looking at the cookbook mentioned in issue #14
http://hea-www.cfa.harvard.edu/~jzuhone/pyxsim/cookbook/sloshing.html
to visualize the data. I'm using the following code
ds = yt.load(file_name)
slc = yt.SlicePlot(ds, "z", ["density"], center=[417284, 455760, 72878], width=(6000.0, "kpc"))
proj = yt.ProjectionPlot(ds, "z", ["density"], center=[417284, 455760, 72878], width=(6000.0, "kpc"))
part = yt.ParticleProjectionPlot(ds, "z", [("Halo", "Mass")], center=[417284, 455760, 72878], width=(6000.0, "kpc"))
slc.display()
proj.display()
part.display()
This gives the output shown below. The ParticleProjectionPlot is mostly to verify that I'm actually looking at the right position, but I expected the SlicePlot and ProjectionPlot to look similar in terms of the mass distribution. Do you have any idea what might be going wrong here?
Cheers,
Matthijs