Closed: prashjet closed this issue 2 months ago
Looking at the model folder, there are only two files in the case of the Python NNLS solver (the weights file `orbit_weights.ecsv` and a copy of the config file), but many additional `nn*` files when using `LegacyWeightSolver`, including the mentioned `nn_orbmat.out` as the largest one. Here's an assessment: `orbit_weights.ecsv` together with the orbit data in `datfil/` suffice, and DYNAMITE does not use any of the `nn*` files after `LegacyWeightSolver.solve()` has finished, so in principle they can all be deleted at the end of `LegacyWeightSolver.solve()`, once `orbit_weights.ecsv` has been generated.
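As a sketch of that cleanup idea (a hypothetical helper, not existing DYNAMITE API; the keep-list anticipates that `nnls.log` may be worth retaining for diagnostics, see below):

```python
# Hypothetical sketch: delete the leftover nn* files from a model directory
# once orbit_weights.ecsv has been written. Something like this could run at
# the end of LegacyWeightSolver.solve(); the function name and the keep-list
# are illustrative, not existing DYNAMITE API.
from pathlib import Path

def cleanup_legacy_files(model_dir, keep=("nnls.log",)):
    """Remove nn* files from model_dir, keeping anything listed in `keep`."""
    removed = []
    for f in sorted(Path(model_dir).glob("nn*")):
        if f.is_file() and f.name not in keep:
            f.unlink()
            removed.append(f.name)
    return removed
```

`orbit_weights.ecsv` and the orbit data in `datfil/` are untouched, since those are what DYNAMITE actually needs afterwards.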
Here is an assessment of the legacy files, with comments; some of them may help to identify how the same information can be extracted from other DYNAMITE classes:
- `nn_aphist.out`: not used by DYNAMITE; similar to `Decomposition.comps_aphist()`?
- `nn_con.out`: not used by DYNAMITE; use: mass constraint information?
- `nn_intrinsic_moments.out`: not used by DYNAMITE; replaced by `LegacyOrbitLibrary.get_model_intrinsic_moment_constructor()`, which is used e.g. in `Plotter.beta_plot()` calling `Plotter.anisotropy_single()`.
- `nn_kinem.out` and `nn_nnls.out`: read by `LegacyWeightSolver.read_chi2()`, which is not used anywhere in DYNAMITE. That method was taken from the old schwpy code to calculate chi2 and kinchi2; these differ from the chi2 definitions used in DYNAMITE. To keep those legacy chi2 values for potential use, we can add them to the metadata of `orbit_weights.ecsv` as `chi2_legacy` and `chi2_kin_legacy` (just like the existing `chi2_tot`, `chi2_kin`, and `chi2_kinmap`). Alternatively, if everyone agrees, we can eliminate `LegacyWeightSolver.read_chi2()` altogether and also (a) remove the method `LegacyWeightSolver.__read_file_element()`, which is called by `LegacyWeightSolver.read_chi2()` and not used anywhere else in DYNAMITE, and (b) remove the filenames' own class attributes (`self.fname_nn_kinem` and `self.fname_nn_nnls`, respectively).
- `nn_orb.out`: read by `LegacyWeightSolver.read_weights()`, which is not used anywhere in DYNAMITE. That method reads `nn_orb.out` and returns an astropy table with the columns `[orb_idx, E_idx, I2_idx, I3_idx, totalnotregularizable, orb_type, weight, lcut]`. Not sure about `totalnotregularizable` and `lcut`. Is `orb_type` the same as the classification in `LegacyOrbitLibrary.classify_orbits()`? The `weight` column should be the same as the corresponding entry in the weights file `orbit_weights.ecsv`.
- `nn_orbmat.out`: read by `LegacyWeightSolver.read_nnls_orbmat_rhs_and_solution()`, which is called by `LegacyWeightSolver.get_weights_and_chi2_from_orbmat_file()` to calculate `orbit_weights.ecsv` (including the three chi2 values). Not used by DYNAMITE any further.
- `nn.in`: not used by DYNAMITE; this is the input file to the Fortran `LegacyWeightSolver` executables.
- `nnls.log`: not used by DYNAMITE, but worthwhile to keep for diagnostic reasons.
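The metadata suggestion for `nn_kinem.out`/`nn_nnls.out` above could be sketched as follows (a plain-dict illustration with made-up numbers; in the actual code the keys would go into the table metadata before `orbit_weights.ecsv` is written, e.g. via astropy's `Table.meta`):

```python
# Illustrative only: keep the legacy schwpy chi2 values alongside the existing
# DYNAMITE ones in the metadata of orbit_weights.ecsv. All numbers are made up.
meta = {
    # existing metadata entries (DYNAMITE chi2 definitions)
    "chi2_tot": 1234.5,
    "chi2_kin": 987.6,
    "chi2_kinmap": 1020.3,
}

# values as LegacyWeightSolver.read_chi2() would return them (legacy definitions)
legacy_chi2, legacy_kinchi2 = 1300.1, 995.0

# proposed new keys, clearly separated from the DYNAMITE definitions
meta["chi2_legacy"] = legacy_chi2
meta["chi2_kin_legacy"] = legacy_kinchi2
```

The distinct `_legacy` suffix keeps the two chi2 conventions from being confused downstream.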
We are currently storing the matrix of size `(N_orb, N_bins * 6)` in `nn_orbmat.out`. This is >100× larger than any other output file. In previous versions (schwpy) this file wasn't stored. I decided to store it because it was useful when re-implementing weight solving in Python. For the dimensions of the problem we were using at the time, `nn_orbmat.out` was no larger than the other outputs, so I didn't think it would cause a problem. I didn't correctly extrapolate this change into the future, and more or less forgot about it since then. Clearly this was a mistake - my apologies!

Can we safely delete the `nn_orbmat.out` files? If so, let's do it! If not, then we should first work out which of the information it contains must be preserved elsewhere before we can safely delete `nn_orbmat.out`.
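To put the size difference in perspective, a back-of-envelope comparison (the shapes come from the description above; the example numbers are hypothetical):

```python
# The stored matrix has shape (N_orb, N_bins * 6), while the weights file
# stores essentially one value per orbit. The entry-count ratio is therefore
# 6 * N_bins, independent of N_orb -- well beyond the ">100x" noted above for
# any realistic number of apertures. Example numbers below are hypothetical.
def orbmat_to_weights_ratio(n_orb: int, n_bins: int) -> float:
    orbmat_entries = n_orb * n_bins * 6   # matrix of shape (N_orb, N_bins * 6)
    weight_entries = n_orb                # one weight per orbit
    return orbmat_entries / weight_entries

print(orbmat_to_weights_ratio(n_orb=10_000, n_bins=500))  # 3000.0
```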