**torrinba** opened this issue 3 months ago (status: Open)
I think I might have addressed this inside the machine mapping function here: https://github.com/gafusion/omas/blob/e0bd766352d6b9ac6ae2b65ff7fa9060268b8942/omas/machine_mappings/d3d.py#L1386 Essentially, one function handles all core_profiles fields. But I think there is still a conflict here, and we might need to pull that part in there too: https://github.com/gafusion/omas/blob/e0bd766352d6b9ac6ae2b65ff7fa9060268b8942/omas/machine_mappings/d3d.json#L116-L117 https://github.com/gafusion/omas/blob/e0bd766352d6b9ac6ae2b65ff7fa9060268b8942/omas/machine_mappings/d3d.json#L119-L121
If you have a test we can use to reproduce this issue, that would be helpful (maybe something we should add to the OMFIT regression tests, since those currently seem to be insensitive to this problem, somewhat surprisingly).
A specific test would be to first load something that `core_profiles.profiles_1d`
provides, and then try to access `core_profiles.global_quantities.v_loop`.
Do you have an example of where the core_profiles machine mapping works?
```python
with ods.open('d3d', 1795780001):
    psi = ods['core_profiles.profiles_1d.:.grid.psi']
```
I've tried several shots but always see errors like this:
```
DEBUG (dynamic): Dynamic open {'machine': 'd3d', 'pulse': 1795780001, 'options': {}, 'branch': '', 'user_machine_mappings': None}
DEBUG (dynamic): Dynamic read {'machine': 'd3d', 'pulse': 1795780001, 'options': {}, 'branch': '', 'user_machine_mappings': None}: core_profiles.profiles_1d.:
size(\ZIPFIT01::TOP.PROFILES.EDENSFIT,1)
core_profiles.profiles_1d.: issue:TreeFOPENR('%TREE-E-FOPENR, Error opening file read-only.\n - server: atlas.gat.com:8000\n - treename: ZIPFIT01\n - pulse: 1795780001\n - TDI: size(\\ZIPFIT01::TOP.PROFILES.EDENSFIT,1)')
DEBUG (dynamic): Dynamic close {'machine': 'd3d', 'pulse': 1795780001, 'options': {}, 'branch': '', 'user_machine_mappings': None}
Traceback (most recent call last):
  File "test.py", line 11, in <module>
    psi = ods[f'core_profiles.profiles_1d.:.grid.psi']
  File "/fusion/projects/codes/atom/omfit_omega_v3.x/atom_git/OMFIT-source_unstable/omas/omas/omas_core.py", line 1324, in __getitem__
    return value.__getitem__(key[1:], cocos_and_coords)
  File "/fusion/projects/codes/atom/omfit_omega_v3.x/atom_git/OMFIT-source_unstable/omas/omas/omas_core.py", line 1324, in __getitem__
    return value.__getitem__(key[1:], cocos_and_coords)
  File "/fusion/projects/codes/atom/omfit_omega_v3.x/atom_git/OMFIT-source_unstable/omas/omas/omas_core.py", line 1239, in __getitem__
    raise ValueError('`%s` has no data' % self.location)
ValueError: `core_profiles.profiles_1d` has no data
```
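Note that the real failure (the MDS+ `TreeFOPENR` error) only shows up at DEBUG level, while the user-facing exception is a generic "has no data" ValueError. This is a common pattern in lazily-fetched containers: the backend fetch fails, the failure is logged and swallowed, and the later lookup finds an empty node. A minimal stand-in (not actual OMAS internals, just an illustration of the pattern) looks like:

```python
# Minimal stand-in showing how a swallowed lazy-fetch failure surfaces later
# as a generic "has no data" error. The class and function names here are
# illustrative, not OMAS code.
class LazyNode:
    def __init__(self, fetcher):
        self._fetcher = fetcher
        self._value = None
        self._fetched = False

    def get(self, location):
        if not self._fetched:
            try:
                self._value = self._fetcher()
            except OSError as exc:
                # The fetch failure is only logged, then swallowed
                print(f'DEBUG: fetch failed: {exc}')
            self._fetched = True
        if self._value is None:
            # ...so the user sees this generic error instead
            raise ValueError(f'`{location}` has no data')
        return self._value


def failing_fetch():
    raise OSError('TreeFOPENR: Error opening file read-only')


node = LazyNode(failing_fetch)
try:
    node.get('core_profiles.profiles_1d')
except ValueError as err:
    print(err)
```

This is why checking the DEBUG log (as done above) is the only way to see the actual MDS+ error.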
@AreWeDreaming reports that the machine mapping for DIII-D profiles, which OMAS autogenerates, pulls from two different places, so the MDS+ location changes every time it's regenerated.
What's the best way to sort this out @orso82?
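One way to catch this class of problem automatically would be to scan the mapping file for IDS branches whose entries are served from more than one MDS+ tree. The sketch below is a hedged illustration: the `treename` field name and the sample entries are assumptions shaped like the description above, not the real contents of `d3d.json`.

```python
# Hedged sketch: flag ODS branches whose mapping entries point at more than
# one MDS+ tree. Field names ('treename') and sample data are illustrative
# assumptions, not actual d3d.json contents.
from collections import defaultdict


def find_mixed_sources(mappings):
    """Group mapping paths by top-level IDS branch and return the branches
    whose entries reference more than one MDS+ tree."""
    trees_by_branch = defaultdict(set)
    for path, entry in mappings.items():
        tree = entry.get('treename')
        if tree:
            branch = path.split('.')[0]
            trees_by_branch[branch].add(tree)
    return {b: sorted(t) for b, t in trees_by_branch.items() if len(t) > 1}


# Toy example mimicking the reported situation: core_profiles entries
# pulling from two different trees.
sample = {
    'core_profiles.profiles_1d.:.electrons.density': {'treename': 'ZIPFIT01'},
    'core_profiles.global_quantities.v_loop': {'treename': 'D3D'},
    'equilibrium.time_slice.:.global_quantities.ip': {'treename': 'EFIT01'},
}
print(find_mixed_sources(sample))
```

A check like this could run over the autogenerated JSON in CI, so a regeneration that splits a branch across trees would fail loudly instead of silently changing the MDS+ location.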