yt-project / yt

Main yt repository
http://yt-project.org

missing Groups when loading gadget data #2089

Closed: astro313 closed this issue 6 years ago

astro313 commented 6 years ago

Hi,

I am trying to load some simulation output files from Gadget, stored in HDF5 (.hdf5) format.

When I run h5ls -r <file>, it shows all the fields.

h5ls -r snap_m25n256_151.hdf5
/                        Group
/Header                  Group
/PartType0               Group
/PartType0/AGS-Softening Dataset {15642025}
/PartType0/Coordinates   Dataset {15642025, 3}
/PartType0/DelayTime     Dataset {15642025}
/PartType0/Density       Dataset {15642025}
/PartType0/Dust_Masses   Dataset {15642025}
/PartType0/Dust_Metallicity Dataset {15642025, 11}
/PartType0/ElectronAbundance Dataset {15642025}
/PartType0/FractionH2    Dataset {15642025}
/PartType0/GrackleHI     Dataset {15642025}
/PartType0/GrackleHII    Dataset {15642025}
/PartType0/GrackleHM     Dataset {15642025}
/PartType0/GrackleHeI    Dataset {15642025}
/PartType0/GrackleHeII   Dataset {15642025}
/PartType0/GrackleHeIII  Dataset {15642025}
/PartType0/HaloID        Dataset {15642025}
/PartType0/ID_Generations Dataset {15642025}
/PartType0/InternalEnergy Dataset {15642025}
/PartType0/Masses        Dataset {15642025}
/PartType0/Metallicity   Dataset {15642025, 11}
/PartType0/NWindLaunches Dataset {15642025}
/PartType0/NeutralHydrogenAbundance Dataset {15642025}
/PartType0/ParticleIDs   Dataset {15642025}
/PartType0/Potential     Dataset {15642025}
/PartType0/Sigma         Dataset {15642025}
/PartType0/SmoothingLength Dataset {15642025}
/PartType0/StarFormationRate Dataset {15642025}
/PartType0/Velocities    Dataset {15642025, 3}
/PartType1               Group
/PartType1/AGS-Softening Dataset {16777216}
/PartType1/Coordinates   Dataset {16777216, 3}
/PartType1/HaloID        Dataset {16777216}
/PartType1/ID_Generations Dataset {16777216}
/PartType1/Masses        Dataset {16777216}
/PartType1/ParticleIDs   Dataset {16777216}
/PartType1/Potential     Dataset {16777216}
/PartType1/Velocities    Dataset {16777216, 3}
/PartType4               Group
/PartType4/AGS-Softening Dataset {1113046}
/PartType4/Coordinates   Dataset {1113046, 3}
/PartType4/Dust_Masses   Dataset {1113046}
/PartType4/Dust_Metallicity Dataset {1113046, 11}
/PartType4/HaloID        Dataset {1113046}
/PartType4/ID_Generations Dataset {1113046}
/PartType4/Masses        Dataset {1113046}
/PartType4/Metallicity   Dataset {1113046, 11}
/PartType4/ParticleIDs   Dataset {1113046}
/PartType4/Potential     Dataset {1113046}
/PartType4/StellarFormationTime Dataset {1113046}
/PartType4/Velocities    Dataset {1113046, 3}
/PartType5               Group
/PartType5/AGS-Softening Dataset {718}
/PartType5/BH_AccretionLength Dataset {718}
/PartType5/BH_Mass       Dataset {718}
/PartType5/BH_Mass_AlphaDisk Dataset {718}
/PartType5/BH_Mdot       Dataset {718}
/PartType5/BH_NProgs     Dataset {718}
/PartType5/Coordinates   Dataset {718, 3}
/PartType5/HaloID        Dataset {718}
/PartType5/ID_Generations Dataset {718}
/PartType5/Masses        Dataset {718}
/PartType5/ParticleIDs   Dataset {718}
/PartType5/Potential     Dataset {718}
/PartType5/StellarFormationTime Dataset {718}
/PartType5/Velocities    Dataset {718, 3}

But when I run ds = yt.load(file); ds.field_list, it only shows fields for PartType0 and PartType1 (plus the 'all' union); PartType4 and PartType5 are missing.
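
For completeness, this is roughly all I am doing (a minimal version, assuming the snapshot file sits in the working directory); the output is shown below:

import yt

# load the Gadget snapshot and list the fields yt detects on disk
ds = yt.load("snap_m25n256_151.hdf5")
ds.field_list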

In [62]: ds.field_list
yt : [INFO     ] 2018-11-07 00:40:32,667 Allocating for 3.355e+07 particles (index particle type 'all')
yt : [INFO     ] 2018-11-07 00:40:37,390 Identified 2.397e+06 octs
Out[62]: 
[('PartType0', 'AGS-Softening'),
 ('PartType0', 'Coordinates'),
 ('PartType0', 'DelayTime'),
 ('PartType0', 'Density'),
 ('PartType0', 'Dust_Masses'),
 ('PartType0', 'Dust_Metallicity'),
 ('PartType0', 'ElectronAbundance'),
 ('PartType0', 'FractionH2'),
 ('PartType0', 'GrackleHI'),
 ('PartType0', 'GrackleHII'),
 ('PartType0', 'GrackleHM'),
 ('PartType0', 'GrackleHeI'),
 ('PartType0', 'GrackleHeII'),
 ('PartType0', 'GrackleHeIII'),
 ('PartType0', 'HaloID'),
 ('PartType0', 'ID_Generations'),
 ('PartType0', 'InternalEnergy'),
 ('PartType0', 'Masses'),
 ('PartType0', 'Metallicity_00'),
 ('PartType0', 'Metallicity_01'),
 ('PartType0', 'Metallicity_02'),
 ('PartType0', 'Metallicity_03'),
 ('PartType0', 'Metallicity_04'),
 ('PartType0', 'Metallicity_05'),
 ('PartType0', 'Metallicity_06'),
 ('PartType0', 'Metallicity_07'),
 ('PartType0', 'Metallicity_08'),
 ('PartType0', 'Metallicity_09'),
 ('PartType0', 'Metallicity_10'),
 ('PartType0', 'NWindLaunches'),
 ('PartType0', 'NeutralHydrogenAbundance'),
 ('PartType0', 'ParticleIDs'),
 ('PartType0', 'Potential'),
 ('PartType0', 'Sigma'),
 ('PartType0', 'SmoothingLength'),
 ('PartType0', 'StarFormationRate'),
 ('PartType0', 'Velocities'),
 ('PartType1', 'AGS-Softening'),
 ('PartType1', 'Coordinates'),
 ('PartType1', 'HaloID'),
 ('PartType1', 'ID_Generations'),
 ('PartType1', 'Masses'),
 ('PartType1', 'ParticleIDs'),
 ('PartType1', 'Potential'),
 ('PartType1', 'Velocities'),
 ('all', 'AGS-Softening'),
 ('all', 'Coordinates'),
 ('all', 'HaloID'),
 ('all', 'ID_Generations'),
 ('all', 'Masses'),
 ('all', 'ParticleIDs'),
 ('all', 'Potential'),
 ('all', 'Velocities')]

Is there a way to force yt to load all of the Groups from the HDF5 file?

Thanks!

ngoldbaum commented 6 years ago

We should be loading all of them.

Any chance you can share this output file? You can use yt upload some_file.tar.gz to share it publicly. I'm also happy to receive it privately if you're not comfortable sharing the data publicly.
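
In the meantime, a quick sanity check you could run on your end is something like this rough sketch (it just compares the PartType groups in the file against the particle types yt detected, and assumes h5py is installed):

import h5py
import yt

fn = "snap_m25n256_151.hdf5"

# particle groups actually present in the HDF5 file
with h5py.File(fn, "r") as f:
    on_disk = sorted(k for k in f if k.startswith("PartType"))

# particle types yt detected when it indexed the dataset
ds = yt.load(fn)
detected = sorted(ds.particle_types_raw)

print("in file:       ", on_disk)
print("seen by yt:    ", detected)
print("missing in yt: ", sorted(set(on_disk) - set(detected)))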

astro313 commented 6 years ago

I just uploaded the file: test.tgz.

astro313 commented 6 years ago

I am using yt version 3.4.1.

ngoldbaum commented 6 years ago

You need to share the URL that gets printed out. The latest yt version is 3.5.0; you might also try updating yt.
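
If you are not sure which version you have installed, a quick way to check is:

import yt
print(yt.__version__)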

astro313 commented 6 years ago

Here it is: http://use.yt/upload/7d143578

ngoldbaum commented 6 years ago

When I try to extract that file I get an error:

$ tar xzvf test.tgz
s48/snap_m25n256_151.hdf5
tar: s48/snap_m25n256_151.hdf5: Wrote only 512 of 10240 bytes
tar: Exiting with failure status due to previous errors

Are you able to extract that file on your end?

ngoldbaum commented 6 years ago

$ ls -l s48/snap_m25n256_151.hdf5
-rw-r--r--. 1 goldbaum goldbaum 3852308480 Nov  6 18:02 s48/snap_m25n256_151.hdf5

Is that the right size?

astro313 commented 6 years ago

The file size of the original hdf5 is 4089575170 bytes.

After uncompressing test.tgz, I get the same size. Should I try uploading the file another way?
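
If it helps, here is a rough sketch of how we could compare checksums on both ends to rule out a corrupted transfer (md5sum here is just an illustrative helper, not something from yt):

import hashlib

def md5sum(path, chunk=1024 * 1024):
    # stream the file through MD5 so the ~4 GB snapshot never has to fit in memory
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

print(md5sum("snap_m25n256_151.hdf5"))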

ngoldbaum commented 6 years ago

Sure, maybe try google drive or dropbox?

astro313 commented 6 years ago

Could you please try this link: https://www.dropbox.com/s/2jobx9dosihrts8/test.tgz?dl=0

ngoldbaum commented 6 years ago

So I'm still seeing the same errors from tar when I try to extract the file. Can you just upload the raw, uncompressed hdf5 file?

astro313 commented 6 years ago

How about this one: https://www.dropbox.com/s/6i53yyud39qnvlp/snap_m25n256_151.hdf5?dl=0

ngoldbaum commented 6 years ago

OK, with yt 3.5.0 I see:

In [1]: ds.field_list
Out[1]:
[('PartType0', 'AGS-Softening'),
 ('PartType0', 'Coordinates'),
 ('PartType0', 'DelayTime'),
 ('PartType0', 'Density'),
 ('PartType0', 'Dust_Masses'),
 ('PartType0', 'Dust_Metallicity'),
 ('PartType0', 'ElectronAbundance'),
 ('PartType0', 'FractionH2'),
 ('PartType0', 'GrackleHI'),
 ('PartType0', 'GrackleHII'),
 ('PartType0', 'GrackleHM'),
 ('PartType0', 'GrackleHeI'),
 ('PartType0', 'GrackleHeII'),
 ('PartType0', 'GrackleHeIII'),
 ('PartType0', 'HaloID'),
 ('PartType0', 'ID_Generations'),
 ('PartType0', 'InternalEnergy'),
 ('PartType0', 'Masses'),
 ('PartType0', 'Metallicity_00'),
 ('PartType0', 'Metallicity_01'),
 ('PartType0', 'Metallicity_02'),
 ('PartType0', 'Metallicity_03'),
 ('PartType0', 'Metallicity_04'),
 ('PartType0', 'Metallicity_05'),
 ('PartType0', 'Metallicity_06'),
 ('PartType0', 'Metallicity_07'),
 ('PartType0', 'Metallicity_08'),
 ('PartType0', 'Metallicity_09'),
 ('PartType0', 'Metallicity_10'),
 ('PartType0', 'NWindLaunches'),
 ('PartType0', 'NeutralHydrogenAbundance'),
 ('PartType0', 'ParticleIDs'),
 ('PartType0', 'Potential'),
 ('PartType0', 'Sigma'),
 ('PartType0', 'SmoothingLength'),
 ('PartType0', 'StarFormationRate'),
 ('PartType0', 'Velocities'),
 ('PartType1', 'AGS-Softening'),
 ('PartType1', 'Coordinates'),
 ('PartType1', 'HaloID'),
 ('PartType1', 'ID_Generations'),
 ('PartType1', 'Masses'),
 ('PartType1', 'ParticleIDs'),
 ('PartType1', 'Potential'),
 ('PartType1', 'Velocities'),
 ('PartType4', 'AGS-Softening'),
 ('PartType4', 'Coordinates'),
 ('PartType4', 'Dust_Masses'),
 ('PartType4', 'Dust_Metallicity'),
 ('PartType4', 'HaloID'),
 ('PartType4', 'ID_Generations'),
 ('PartType4', 'Masses'),
 ('PartType4', 'Metallicity_00'),
 ('PartType4', 'Metallicity_01'),
 ('PartType4', 'Metallicity_02'),
 ('PartType4', 'Metallicity_03'),
 ('PartType4', 'Metallicity_04'),
 ('PartType4', 'Metallicity_05'),
 ('PartType4', 'Metallicity_06'),
 ('PartType4', 'Metallicity_07'),
 ('PartType4', 'Metallicity_08'),
 ('PartType4', 'Metallicity_09'),
 ('PartType4', 'Metallicity_10'),
 ('PartType4', 'ParticleIDs'),
 ('PartType4', 'Potential'),
 ('PartType4', 'StellarFormationTime'),
 ('PartType4', 'Velocities'),
 ('PartType5', 'AGS-Softening'),
 ('PartType5', 'BH_AccretionLength'),
 ('PartType5', 'BH_Mass'),
 ('PartType5', 'BH_Mass_AlphaDisk'),
 ('PartType5', 'BH_Mdot'),
 ('PartType5', 'BH_NProgs'),
 ('PartType5', 'Coordinates'),
 ('PartType5', 'HaloID'),
 ('PartType5', 'ID_Generations'),
 ('PartType5', 'Masses'),
 ('PartType5', 'ParticleIDs'),
 ('PartType5', 'Potential'),
 ('PartType5', 'StellarFormationTime'),
 ('PartType5', 'Velocities'),
 ('all', 'AGS-Softening'),
 ('all', 'Coordinates'),
 ('all', 'HaloID'),
 ('all', 'ID_Generations'),
 ('all', 'Masses'),
 ('all', 'ParticleIDs'),
 ('all', 'Potential'),
 ('all', 'Velocities')]

Can you try updating your yt installation? I'd also strongly encourage you to try using the yt-4.0 branch when working with SPH datasets of this size, since that branch will produce higher fidelity visualizations and will be much, much faster. I can share more detail about yt-4.0 if you're curious about that.
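
Once you have updated, one quick way to confirm that every particle group is being picked up is something along these lines (just a sketch, not required):

import yt

ds = yt.load("snap_m25n256_151.hdf5")
# after updating, this should include PartType0, PartType1, PartType4, and PartType5
print(sorted(ds.particle_types_raw))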

astro313 commented 6 years ago

Indeed, updating to the latest version fixed this. Thanks for your help!