[Closed] taldcroft closed this pull request 2 years ago
What do you think about using the sheet id instead of the spreadsheet id for scenarios? That way there could be one spreadsheet used for an anomaly response, with a different sheet for each scenario being considered (and one central document holding everything).
See the Google Sheets API docs for information about the differences.
@jskrist - Regarding sheet id, I'm a little worried about managing permissions. Right now the idea is that the "flight" Commands Sheet is editable by only a small set of people but viewable by the world. I think if you make another sheet (tab) then it will have the same limited edit permissions, whereas the idea is that anyone can make their own scenario sheet and edit it.
@taldcroft, that makes sense. I see the power and flexibility of allowing individuals to make scenarios in their own spreadsheets as they do analysis, but it brings back a little of the "where is the latest data?" problem that this central sheet was resolving.
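For context on the two identifiers: a Google spreadsheet is addressed by its spreadsheet id, while each sheet (tab) within it has a numeric `gid`. A minimal sketch using the standard Google Sheets CSV-export URL convention (the identifier values and helper name here are placeholders, not anything from this PR):

```python
def sheet_csv_url(spreadsheet_id: str, gid: int = 0) -> str:
    """Build the CSV-export URL for one sheet (tab) of a Google spreadsheet.

    `spreadsheet_id` is the long token from the spreadsheet URL;
    `gid` selects the individual sheet (tab) within it.
    """
    return (
        f"https://docs.google.com/spreadsheets/d/{spreadsheet_id}"
        f"/export?format=csv&gid={gid}"
    )

print(sheet_csv_url("SPREADSHEET_ID", gid=123456789))
```

With one spreadsheet per anomaly response, each scenario would then just be a different `gid` within the same document.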
@jeanconn - any progress on review of this PR?
Not really; I'll have more later today.
For review, the new commands `core.py` module started life as `kadi/commands/commands.py`, so you can see the diffs somewhat better with:

```
git show master:kadi/commands/commands.py > commands_master.py
opendiff commands_master.py kadi/commands/core.py
```
@jeanconn - I added a more complete functional test of the star and fid identification in the top description.
Looking at the diff for core / commands, what were your thoughts on this bit:

```python
# Convert 'date' from bytestring to unicode. This allows
# date2secs(out['date']) to work and will generally reduce weird problems.
out.convert_bytestring_to_unicode()
```
No longer necessary? It just wasn't clear to me from the original comment of "weird problems" what I should look for in testing to make sure it isn't a problem going forward.
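For reference, a numpy-level sketch of the bytes-vs-unicode distinction that `convert_bytestring_to_unicode` resolves (illustrative only; the astropy Table method applies the equivalent conversion to every bytes column in the table):

```python
import numpy as np

# Fixed-width ASCII bytes as stored on disk (numpy dtype kind 'S')
dates = np.array(
    [b"2022:001:00:00:00.000", b"2022:002:00:00:00.000"], dtype="S21"
)
print(dates.dtype.kind)  # S

# Roughly what Table.convert_bytestring_to_unicode does, column by column:
# re-type the data as fixed-width unicode (dtype kind 'U')
dates_u = dates.astype("U21")
print(dates_u.dtype.kind)  # U
print(dates_u[0])  # 2022:001:00:00:00.000
```

The "weird problems" arise when code compares or parses a bytes value where a `str` is expected; after the conversion the column holds ordinary unicode strings.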
Should there be a test that hits the new stuff in parse_cm or are the unit tests over there sufficient?
> Looking at the diff for core / commands, what were your thoughts on this bit
Great catch, that is an undesired API change for V1. I'll fix it. I also see how to fix `Chandra.Time.date2secs` so the original weirdness doesn't show up (done in https://github.com/sot/Chandra.Time/pull/53).
Regarding "I'll also"... did you decide the fix in Chandra.Time was sufficient so we don't need to bring back the bytestring_to_unicode?
Me, I'm OK with the unicode fix being only in `Chandra.Time.date2secs` and not keeping the `convert_bytestring_to_unicode` in the V1 code. I'm also OK with no tests in here that directly hit the new `parse_cm.csd` code. So I think the testing is sufficient for V1, and the testing and docs are sufficient for V2, so I'm going to mark this approved.
Thanks for the review @jeanconn! I actually had deaa502 in my local version for a couple of days but forgot to push it. My goal was to make V1 as close to the original as possible. Right now adding the `time` column is the only difference I know about, and that is very unlikely to break anything.
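(For illustration: the added `time` column is presumably the `date` value converted to seconds. kadi uses Chandra.Time / CXC seconds; the sketch below uses the Unix epoch and stdlib parsing only to show the `YYYY:DOY:HH:MM:SS.sss` date format, so the epoch and helper name are assumptions, not the kadi implementation.)

```python
from datetime import datetime, timezone

def date_to_secs(date: str) -> float:
    # Parse a Chandra-style 'YYYY:DOY:HH:MM:SS.sss' date string.
    # Epoch choice (Unix) is illustrative only; kadi's 'time' column
    # uses Chandra.Time's CXC epoch.
    dt = datetime.strptime(date, "%Y:%j:%H:%M:%S.%f")
    return dt.replace(tzinfo=timezone.utc).timestamp()

print(date_to_secs("2022:001:12:00:00.000"))  # 1641038400.0
```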
Testing the custom scenario with an RTS is a good idea. I just adapted the example in the narrative docs to a unit test in 30073a8, so thanks for the push.
Description
This PR implements the command archive v2.0 concept.
This requires https://github.com/sot/parse_cm/pull/35.
Testing
Interface impacts
New `pprint_like_backstop` method on `CommandTable`.
V1 commands (currently the default)

Add a `time` column to the `CommandTable` returned by `get_cmds`.
V2 commands (beta)

Remove the `timeline_id` column and add a `source` column to the `CommandTable`. New `get_observations` and `get_starcats` functions. `date` and `source` are kept as encoded ASCII `bytes` in the table instead of being converted to unicode, for improved performance. The astropy Table "unicode-sandwich" machinery should make this largely transparent to the user.

Functional testing
Star and fid identification [OK]
Confirm that with a 1.5 arcsec star search box and a 40 arcsec fid search box, stars and fids are identified for every observation in the archive. It appears there are a handful of failed IDs for fids before 2008. I suspect these are defects in the commands archive itself, but in any case this low level of problems for old observations is OK.
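The identification step above amounts to a positional box match; a hypothetical sketch (not the PR's actual code; the helper name and the yag/zag example angles in arcsec are made up):

```python
import numpy as np

def box_match(obs_yag, obs_zag, cat_yag, cat_zag, box_arcsec):
    """Return, for each observed position, the index of the first catalog
    entry within a square search box of half-width box_arcsec, or -1 if
    there is no match."""
    # Pairwise |obs - catalog| offsets in each axis (n_obs x n_cat)
    dy = np.abs(np.subtract.outer(np.asarray(obs_yag), np.asarray(cat_yag)))
    dz = np.abs(np.subtract.outer(np.asarray(obs_zag), np.asarray(cat_zag)))
    ok = (dy < box_arcsec) & (dz < box_arcsec)
    idx = np.argmax(ok, axis=1)          # first matching catalog index
    idx[~ok.any(axis=1)] = -1            # no catalog entry in the box
    return idx

# Stars would use a tight 1.5 arcsec box, fids a looser 40 arcsec box
print(box_match([100.0, 500.0], [200.0, 600.0],
                [100.5, -300.0], [199.9, 0.0], 1.5).tolist())  # [0, -1]
```

A failed ID corresponds to a -1 here: no catalog entry falls inside the search box for that observed position.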
Task schedule [OK]
On Mac with this PR installed to a test Ska3 env:
This ran successfully and updated `$SKA/data/kadi/cmds2.{h5,pkl}` as expected.

Configuration [OK]
Now edit the config file to set `default_lookback = 60` and re-run but with `overwrite=False`:
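A pure-Python sketch of what such a configuration override amounts to (`Conf` here is a stand-in for kadi's astropy-style config namespace, not the real class; only the temporary-override pattern is the point):

```python
from contextlib import contextmanager

class Conf:
    """Minimal stand-in for an astropy-style ConfigNamespace."""
    default_lookback = 30  # days of lookback for commands processing

    @contextmanager
    def set_temp(self, name, value):
        # Temporarily override a config item, restoring it on exit
        old = getattr(self, name)
        setattr(self, name, value)
        try:
            yield
        finally:
            setattr(self, name, old)

conf = Conf()
with conf.set_temp("default_lookback", 60):
    print(conf.default_lookback)  # 60
print(conf.default_lookback)  # 30
```

Editing the config file changes the default persistently; the context-manager form is handy for one-off testing like this.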
Tentative migration plan
See #216
Fixes #172