sot / sparkles

ACA review of star catalogs from proseco

Do not use aca_t_ccd_penalty_limit via import #196

Closed: jeanconn closed this 1 year ago

jeanconn commented 1 year ago

Description

Do not use aca_t_ccd_penalty_limit via import. If we only access the penalty limit attribute directly within the test function, it will have a value that corresponds to CHANDRA_MODELS_DEFAULT_VERSION=3.48 set in conftest.py.

Fixes the test against proseco with https://github.com/sot/proseco/pull/386.
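A minimal sketch of the pattern change this PR describes, assuming the attribute is resolved when it is accessed rather than once at import (as the discussion below implies); the test name and body are hypothetical, not the actual sparkles test code:

```python
# Hypothetical illustration only, not the actual sparkles test.

# Before: the value is bound when the test module is imported, which can
# happen before the chandra_models version from conftest.py is in effect.
# from proseco.characteristics import aca_t_ccd_penalty_limit

# After: import the module and look up the attribute inside the test
# function, so it reflects CHANDRA_MODELS_DEFAULT_VERSION=3.48 from conftest.py.
import proseco.characteristics as ACA


def test_uses_penalty_limit():  # hypothetical test name
    t_ccd_penalty_limit = ACA.aca_t_ccd_penalty_limit
    assert t_ccd_penalty_limit is not None
```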

Interface impacts

Testing

Tested out of a ska3-masters environment on fido that included proseco up through https://github.com/sot/proseco/pull/386.

Unit tests

Independent check of unit tests by Javier

Functional tests

No functional testing.

jeanconn commented 1 year ago

Thanks Javier. I updated the description to indicate that I also tested out of an environment (a ska3 masters env) that has the latest proseco.

I think there would be an argument for updating proseco to handle these attributes more dynamically, but for now, in the sparkles tests, accessing the penalty limit only inside the test function (keeping the module import at the top but not importing the attribute directly) still seems consistent with how this is used in other places.

javierggt commented 1 year ago

Yes, as it is, one can have the opposite problem, right? In other words, the chandra models version that is set when one first uses aca_t_ccd_penalty_limit will determine its value, and any changes to the chandra models version will have no effect.

I'm guessing that can happen with sparkles tests if it starts requiring a newer chandra models version.

Did I get it right?
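A hedged sketch of the failure mode described above, using a toy cached lookup (not proseco's actual implementation, and the limit values are made up): once the value is computed at first use, later changes to the models version are ignored.

```python
import functools
import os


@functools.lru_cache(maxsize=None)
def get_penalty_limit():
    # Toy stand-in: pretend the limit depends on the chandra_models version.
    version = os.environ.get("CHANDRA_MODELS_DEFAULT_VERSION", "latest")
    return {"3.48": -2.0}.get(version, -1.0)  # made-up values


os.environ["CHANDRA_MODELS_DEFAULT_VERSION"] = "3.48"
print(get_penalty_limit())  # value for 3.48

os.environ["CHANDRA_MODELS_DEFAULT_VERSION"] = "3.49"
print(get_penalty_limit())  # still the 3.48 value: cached at first use
```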

jeanconn commented 1 year ago

If we update proseco to have more dynamic handling and define the opposite problem as "surprise that aca_t_ccd_penalty/planning limit has a value that changed when CHANDRA_MODELS_DEFAULT_VERSION changed", I think that the opposite problem is unlikely to bite us. It seems like that would be a very specific analysis task (for example, if you tried to use mismatched versions of models by setting CHANDRA_MODELS_DEFAULT_VERSION on purpose and then got confused).

jeanconn commented 1 year ago

With "I'm guessing that can happen with sparkles tests if it starts requiring a newer chandra models version." my thought was that we'd be explicit in https://github.com/sot/sparkles/blob/6349d0ff40d7b00ef7a9c7f20e8f2e6c58ea6e62/sparkles/tests/conftest.py#L7 or in individual tests that would most likely just have one version of the models within one test. But in the end we do need to define what the expected behavior is --- this PR basically just avoided the question by updating to use an interface that was working more the way the test expected.