The-Academic-Observatory / oaebu-workflows

Telescopes, Workflows and Data Services for the 'Book Analytics Dashboard Project (2022-2025)', building upon the project 'Developing a Pilot Data Trust for Open Access eBook Usage (2020-2022)'
https://documentation.book-analytics.org/
Apache License 2.0

210 wp2 standardised data ingest pipelines workflows #105

Closed keegansmith21 closed 2 years ago

keegansmith21 commented 2 years ago

Integrated Crossref events and metadata retrieval into the Onix workflow to eliminate the dependency on The Academic Observatory repository.

The Onix workflow will now query the Crossref Metadata and Events APIs using the ISBNs from the Onix table. It then creates its own "book" table and uses this to create the book_products table for each publisher.

A similar process applies to the Oapen workflow, which also requires Crossref data.
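As a rough illustration of the ISBN-driven retrieval described above, the sketch below builds a Crossref Metadata API request URL from a list of ISBNs and extracts DOIs from a `/works` response (DOIs are what the Events API is queried by). This is a hypothetical minimal sketch, not the workflow's actual code: the function names, the `mailto` placeholder, and the response-parsing shape are assumptions.

```python
# Hypothetical sketch of querying Crossref by ISBN, as the Onix workflow might.
from urllib.parse import urlencode

CROSSREF_METADATA_API = "https://api.crossref.org/works"


def metadata_request_url(isbns, rows=1000, mailto="agent@example.org"):
    """Build a Crossref Metadata API URL filtering on one or more ISBNs.

    `mailto` is a placeholder; Crossref asks polite API users to identify
    themselves with a contact address.
    """
    filter_param = ",".join(f"isbn:{isbn}" for isbn in isbns)
    query = urlencode({"filter": filter_param, "rows": rows, "mailto": mailto})
    return f"{CROSSREF_METADATA_API}?{query}"


def extract_dois(metadata_response):
    """Pull DOIs out of a Crossref /works response.

    The DOIs can then be used as `obj-id` values when querying the
    Crossref Events API for usage events.
    """
    return [item["DOI"] for item in metadata_response["message"]["items"]]
```

In practice the workflow would page through results and then hit the Events API per DOI before loading both into BigQuery tables.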

codecov[bot] commented 2 years ago

Codecov Report

Merging #105 (2bd058c) into develop (60a051b) will decrease coverage by 0.42%. The diff coverage is 91.54%.

@@             Coverage Diff             @@
##           develop     #105      +/-   ##
===========================================
- Coverage    95.25%   94.82%   -0.43%     
===========================================
  Files           23       23              
  Lines         2359     2627     +268     
  Branches       302      338      +36     
===========================================
+ Hits          2247     2491     +244     
- Misses          55       70      +15     
- Partials        57       66       +9     
Impacted Files Coverage Δ
oaebu_workflows/workflows/onix_workflow.py 92.74% <90.22%> (-1.56%) ↓
oaebu_workflows/workflows/oapen_workflow.py 92.61% <96.61%> (+1.41%) ↑


keegansmith21 commented 2 years ago

@jdddog - Comments have been addressed and this is ready for re-review.

Yes, the workflow was run locally from end-to-end and I successfully imported the data into Kibana