Closed: sjspielman closed this pull request 11 months ago
Noting I'm going to wait to officially request review here until #251 is merged, but it is ready (aka this is not "draft" status) if folks want to look sooner!
Noting the GHA error -
ERROR: THESE PACKAGES DO NOT MATCH THE HASHES FROM THE REQUIREMENTS FILE. If you have updated the package versions, please update the hashes. Otherwise, examine the package contents carefully; someone may have tampered with them.
tensorstore==0.1.36 from https://files.pythonhosted.org/packages/0c/7e/ec209ed096cc55b10e4d25a84f6683b8916d4808fa0f92881e24de5a30bb/tensorstore-0.1.36-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (from -r /tmp/Rtmp6pk8WG/renv-requirements-14d51b25703a.txt (line 131)):
Expected sha256 d50b27919cde623e3918fe6ba054f41e2da5d7dbf7817d46d43131b50bcc9df4
Got 0e5e398e0345ef50eae04967b739f9517b534c5d5a3250d402bfbd451c35a045
This can be safely ignored as a transient cache problem, unless we start to see it on every build. `renv` does not store the caches in the requirements file.
Passing locally! ✅
Closes #250
This (surprise!) PR deals with the metadata situation. To be frank, I did not formally diagnose whatever bug I thought might have been going on in #250 (indeed, there may have been no bug at all...), but I did get everything working as it should be, which more or less is the goal anyways!
This was definitely necessary because, while writing tests for #150, I realized we had yet to fully deal with altExp metadata in the first place, and this itself was distracting me while writing said tests. So, I went ahead and tackled the metadata situation now via a bit of code refactoring. IMO, this makes our code much more streamlined and modular, and much easier to follow (for me, at least...) compared to what it was before.
This PR consolidates metadata preparation code as follows:
- In `merge_sce_list()`, I check that the expected metadata fields are present. This is handled by the new function `check_metadata()` (see the sketch below).
- `prepare_merged_metadata()` - this function now handles every aspect of the metadata, which again I think greatly simplifies the script overall and makes it easier to follow.
- For `combineCols`, I also create a "MVM" (minimum viable metadata) list to include in the final merged object with `extract_metadata_for_altexp()`. All other potential metadata fields will therefore end up getting filled in with `NULL`.
  - `extract_metadata_for_altexp()` could also be an anonymous function in the `purrr::map()` directly?
- `prepare_merged_metadata()` takes a metadata list rather than an SCE so that we can use it in situations where there is no metadata, a la for an altExp that didn't originally exist.

All is tested and passing locally. Enjoy today's 🎢!
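
For reference, here is a rough, standalone sketch of how functions along these lines could fit together. This is illustrative only: the field names (`library_id`, `sample_id`, `total_reads`), default arguments, and exact signatures are placeholders, not the actual implementations in the package.

```r
# Illustrative sketch only; field names and defaults are made up.
library(purrr)

# fields every metadata list is expected to contain (placeholder names)
expected_fields <- c("library_id", "sample_id", "total_reads")

# check_metadata(): fail early if any expected field is missing
check_metadata <- function(metadata_list, expected = expected_fields) {
  missing_fields <- setdiff(expected, names(metadata_list))
  if (length(missing_fields) > 0) {
    stop("Metadata is missing field(s): ", paste(missing_fields, collapse = ", "))
  }
  invisible(metadata_list)
}

# extract_metadata_for_altexp(): keep only the "minimum viable metadata" (MVM)
# fields; everything else is dropped here and comes back as NULL once the
# merged metadata is assembled
extract_metadata_for_altexp <- function(metadata_list,
                                        mvm_fields = c("library_id", "sample_id")) {
  metadata_list[intersect(mvm_fields, names(metadata_list))]
}

# prepare_merged_metadata(): operates on a list of metadata lists (not SCEs),
# so it also covers altExps that never had metadata; absent fields become NULL
prepare_merged_metadata <- function(metadata_lists, expected = expected_fields) {
  metadata_lists |>
    map(\(meta) map(set_names(expected), \(field) meta[[field]]))
}

# toy usage: two libraries, the second deliberately missing a field
meta_list <- list(
  library1 = list(library_id = "L1", sample_id = "S1", total_reads = 1e6),
  library2 = list(library_id = "L2", sample_id = "S2")
)

# only check the MVM fields here, since library2 intentionally lacks total_reads
walk(meta_list, check_metadata, expected = c("library_id", "sample_id"))

merged_meta <- prepare_merged_metadata(meta_list)  # total_reads is NULL for library2
altexp_meta <- map(meta_list, extract_metadata_for_altexp)
str(merged_meta)
```

The key design point is that the merging helper works on plain metadata lists rather than SCEs, which (per the last bullet above) is what makes the altExp-without-original-metadata case easy to handle.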