sciencehistory / chf-sufia

sufia-based hydra app

Integrate triplestore cache to newly-generated hyrax app for external authority data #708

Closed: hackartisan closed this issue 7 years ago

hackartisan commented 7 years ago

This is the final piece of work for fulfilling the IMLS Sparks! grant.

hackartisan commented 7 years ago

Reading over the grant to determine the minimum required here. Will update.

hackartisan commented 7 years ago

Immediate success will be measured by the ability to create functional, modular code that can successfully be deployed. CHF will conduct the first test implementation, beginning by deploying the code to its staging server and checking for bugs and compatibility with the greater Hydra stack using Hydra’s Sufia 7. Once all tests pass, CHF will deploy to its production server, access to which is currently limited to staff with IP addresses within CHF’s Philadelphia office. We will then capture and share five use cases for how diverse in-house staff integrate the improved autocomplete metadata template into their workflows and how the new linked data functionality enhances our collection management abilities. With these initial measures, we will widely promote the new functionality as a successful linked data implementation model that other organizations can consider.

hackartisan commented 7 years ago

also:

After development, CHF expects to use months six and seven of the grant period for implementation and testing. This will begin with in-house efforts, as CHF will deploy this code to its production server and ask catalogers to integrate the new metadata template into their workflow. Once satisfied with the in-house response, Ms. Headley and Ms. Lu will share screenshots and reports on internal workflows to targeted Hydra partners working on linked data and descriptive metadata with the hope that they might consider implementing this code. When we are confident in the performance of the code, Ms. Headley will work on extraction and consolidation with other relevant codebases and will write extensive documentation in order to make it easy for others to implement the code. Documentation will be found in a ReadMe file that will be available with the code on GitHub. The ReadMe file will include a description of the problem as well as technical notations related to dependencies, installation, and configuration. It will also include usage examples and development information, including how to run tests and how to contribute code back into the project.

hackartisan commented 7 years ago

also:

We expect to use the final quarter of the project period to define more specific requirements and priorities for additional functionality based on user feedback, and to engage in an aggressive promotion and outreach effort. CHF plans to deliver a poster and possibly present a paper at the annual Hydra Connect conference in September 2017.

hackartisan commented 7 years ago

Timebox to a week.

MDiMeo commented 7 years ago

For reference, here's the final reporting form from the IMLS: https://www.imls.gov/sites/default/files/imls_final_performance_report_2016.pdf

I plan to attach an appendix with screenshots.

hackartisan commented 7 years ago

Thanks @MDiMeo, that's helpful.

I'm going to take notes on our wiki while I work this through: https://github.com/chemheritage/chf-sufia/wiki/Caching-linked-authority-data

Then if it makes sense I will PR it to the hyrax development documentation site: https://github.com/samvera/samvera.github.io

hackartisan commented 7 years ago

I worked through the documentation in a new hyrax 1.0.3 application and was able to get the functionality mostly working in a dev environment.
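
For anyone retracing this, the setup looks roughly like the sketch below. The gem name `qa-ldf` is assumed from the repository name, so check the qa-ldf README for the authoritative install steps.

```ruby
# Gemfile (sketch) for a freshly generated Rails app with Hyrax 1.0.3.
# The gem name 'qa-ldf' is assumed from the repository name; confirm it
# against the qa-ldf README before relying on it.
gem 'hyrax', '1.0.3'
gem 'qa-ldf'

# After `bundle install`, run the standard Hyrax install generator
# (`rails generate hyrax:install -f`) and then follow the qa-ldf README
# for wiring up the cache.
```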

I submitted PRs to the qa-ldf gem to update its dependencies and to improve its README (links are in a later comment on this ticket).

These PRs are pending review. Once they're accepted, the README of the qa-ldf gem sufficiently documents the process; there's no need to distribute or centralize the notes I took during this work.

I made small clarification updates to the Hyrax wiki page as well. That content was duplicated in the qa-ldf README, with discrepancies; once I figured out that the wiki page was the more up-to-date of the two, I submitted a PR to the README to remove the duplicated content (also pending review).
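
The underlying idea, independent of the gem's actual API, is a read-through cache: the local triplestore is checked first and the remote authority is only hit on a miss, with the fetched graph written back for next time. The sketch below is illustrative only; `TriplestoreCache` and `RemoteAuthority` style collaborators are hypothetical stand-ins, not classes from qa-ldf.

```ruby
# Illustrative sketch of the caching pattern only; not the qa-ldf API.
# The cache and remote objects here are hypothetical stand-ins.
class CachedAuthorityLookup
  def initialize(cache:, remote:)
    @cache  = cache   # local triplestore of previously fetched graphs
    @remote = remote  # external linked data authority (e.g. LCNAF)
  end

  # Return the graph for a URI, fetching and storing it on a cache miss.
  def find(uri)
    cached = @cache.get(uri)
    return cached if cached

    graph = @remote.fetch(uri)  # network call to the external authority
    @cache.store(uri, graph)    # persist so later lookups stay local
    graph
  end
end
```

In qa-ldf this kind of caching sits behind the Questioning Authority lookup path that the autocomplete fields on the edit forms use; the gem's README is the place to look for the real configuration.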

The code works, with 2 bugs:

@MDiMeo please reopen if you need anything more.

MDiMeo commented 7 years ago

@anna No need to reopen, but I'm going to use this ticket to add links to the code that aren't included in the ticket above. [I had trouble finding qa-ldf just now, but it's because I forgot it was in the curationexperts repo.]

PR to improve the README: https://github.com/curationexperts/qa-ldf/pull/27
PR to update dependencies: https://github.com/curationexperts/qa-ldf/pull/26