
TIMS issue tracker.
https://github.com/orgs/timsbiomed/projects/9/views/1

CodeSystems: Ingests running on schedule #86

Open joeflack4 opened 1 year ago

joeflack4 commented 1 year ago

Overview

We have several CodeSystems, but no process yet to point to a stable release URI, download the latest release, run our tools to transform it, and upload the result to our FHIR server.

Sub-task list

Sub-task details

0. Figure out if there are some 'ingests' we can already leverage

https://terminology.hl7.org/external_terminologies.html

1. Develop initial ingests

The biggest issue here is finding a stable URI that doesn't change, from which we can download the latest version of each terminology.

RxNorm and SNOMED can be downloaded via UMLS, but it looks like the URLs change, so we may need to write a script that goes to the download page and finds the latest URL.

1.1 RxNorm

Can be downloaded via UMLS using an API key. We will probably need to handle the changing URL; see the sketch below. https://documentation.uts.nlm.nih.gov/automating-downloads.html
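
As a rough sketch of how a scheduled download could look, assuming the download endpoint described in the UMLS automation docs (https://uts-ws.nlm.nih.gov/download, taking the target file URL and an API key as query parameters) and a made-up release filename, which changes with each release:

```python
import os

import requests

# Hypothetical latest-release URL: the date-stamped filename changes with each
# release, so in practice we would discover it (e.g. by scraping the release
# page) or build it from the release date.
RXNORM_URL = "https://download.nlm.nih.gov/umls/kss/rxnorm/RxNorm_full_01022023.zip"

# Download endpoint described in the UMLS "automating downloads" documentation;
# it takes the target file URL and a UMLS API key as query parameters.
UTS_DOWNLOAD_ENDPOINT = "https://uts-ws.nlm.nih.gov/download"


def download_umls_release(file_url: str, out_path: str) -> None:
    """Download a UMLS-hosted release file via the UTS download endpoint."""
    api_key = os.environ["UMLS_API_KEY"]  # e.g. a secret in a GH Actions workflow
    with requests.get(
        UTS_DOWNLOAD_ENDPOINT,
        params={"url": file_url, "apiKey": api_key},
        stream=True,
        timeout=600,
    ) as resp:
        resp.raise_for_status()
        with open(out_path, "wb") as f:
            for chunk in resp.iter_content(chunk_size=1 << 20):
                f.write(chunk)


if __name__ == "__main__":
    download_umls_release(RXNORM_URL, "RxNorm_full.zip")
```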

1.2 LOINC

The problem here is that we have to log in first, then click a checkbox, then do some navigation. Perhaps there is some Python scripting we can do to automate that (see the sketch below)? Or maybe we can get some special treatment, i.e. a stable URI, if we contact them? https://loinc.org/file-access/?download-id=470626
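
Not at all sure this would survive contact with the real login/consent flow, but the kind of Python scripting meant here might look like the following; every endpoint path and form-field name in it is a hypothetical placeholder that would have to be replaced after inspecting the real forms in a browser:

```python
import os

import requests

# All URLs and form-field names below are hypothetical placeholders; they would
# need to be swapped for whatever the real loinc.org login/consent flow uses.
LOGIN_URL = "https://loinc.org/wp-login.php"              # hypothetical
DOWNLOAD_URL = "https://loinc.org/file-access/?download-id=470626"


def download_loinc(out_path: str) -> None:
    session = requests.Session()
    # Log in, keeping cookies in the session.
    session.post(
        LOGIN_URL,
        data={
            "log": os.environ["LOINC_USER"],   # hypothetical field name
            "pwd": os.environ["LOINC_PASS"],   # hypothetical field name
        },
        timeout=60,
    ).raise_for_status()
    # The "accept the terms" checkbox presumably maps to a form POST;
    # this field name is also a guess.
    resp = session.post(
        DOWNLOAD_URL, data={"tc_accepted": "1"}, stream=True, timeout=600
    )
    resp.raise_for_status()
    with open(out_path, "wb") as f:
        for chunk in resp.iter_content(chunk_size=1 << 20):
            f.write(chunk)


if __name__ == "__main__":
    download_loinc("Loinc.zip")
```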

1.3 ICD10CM

1.4 SNOMED CT

Can be downloaded via UMLS using an API key, with the same changing-URL problem as RxNorm. https://documentation.uts.nlm.nih.gov/automating-downloads.html

The US Edition page also looks like it has no stable URI; a scraping sketch follows. https://www.nlm.nih.gov/healthit/snomedct/us_edition.html
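
One option, sketched below under the assumption that the US Edition page embeds download.nlm.nih.gov links to the release zips (the page layout could change at any time), is to scrape the page for the newest link and then feed that URL into the same UTS download endpoint used for RxNorm:

```python
import re

import requests

US_EDITION_PAGE = "https://www.nlm.nih.gov/healthit/snomedct/us_edition.html"


def find_latest_snomed_url() -> str:
    """Scrape the US Edition page for the first SNOMED CT release zip link.

    Assumes the page embeds download.nlm.nih.gov links to the release zips;
    if the page layout changes, the regex will need updating.
    """
    html = requests.get(US_EDITION_PAGE, timeout=60).text
    matches = re.findall(
        r'https://download\.nlm\.nih\.gov/[^"\']*SnomedCT[^"\']*\.zip', html
    )
    if not matches:
        raise RuntimeError("No SNOMED CT release link found; page layout may have changed")
    return matches[0]


if __name__ == "__main__":
    print(find_latest_snomed_url())
    # The resulting URL can then be passed to the UTS download endpoint
    # together with a UMLS API key, as in the RxNorm sketch above.
```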

2. Use FHIR NPM packaging for ingests

Get our packages into the proper FHIR format. Once we do that, we can simply load them by pointing to their URLs (regardless of where those are hosted) in the HAPI configuration, and HAPI will automatically load them without the need for the bash loader scripts.

We need to follow what is specified here: https://confluence.hl7.org/display/FHIR/NPM+Package+Specification
An example package: http://hl7.org/fhir/us/core/package.tgz
Example of how HAPI is configured to use these packages: https://github.com/phenopackets/pheno-hapi-compose-setup/blob/dev/hapi/config/application-hs-ig-phenopackets.yaml
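
As a starting point, here is a rough sketch of assembling a package.tgz in the layout the NPM Package Specification describes (a package/ folder containing a package.json manifest plus the resource JSON files). The manifest fields and package name shown are guesses at a minimal set; the spec linked above is authoritative:

```python
import io
import json
import tarfile
from pathlib import Path


def build_fhir_package(resource_dir: str, out_tgz: str, name: str, version: str) -> None:
    """Bundle FHIR resource JSON files into an NPM-style package.tgz.

    Layout: a gzipped tarball whose entries live under "package/", with a
    package.json manifest alongside the resources. Manifest fields here are a
    minimal guess; see the HL7 NPM Package Specification for the full rules.
    """
    manifest = {
        "name": name,                 # e.g. "tims.terminology.rxnorm" (made-up id)
        "version": version,           # e.g. "2023.01.02"
        "fhirVersions": ["4.0.1"],
        "type": "Conformance",        # package type: a guess, check the spec
    }

    def add_bytes(tar: tarfile.TarFile, arcname: str, data: bytes) -> None:
        info = tarfile.TarInfo(name=arcname)
        info.size = len(data)
        tar.addfile(info, io.BytesIO(data))

    with tarfile.open(out_tgz, "w:gz") as tar:
        add_bytes(tar, "package/package.json", json.dumps(manifest, indent=2).encode())
        for path in sorted(Path(resource_dir).glob("*.json")):
            add_bytes(tar, f"package/{path.name}", path.read_bytes())


if __name__ == "__main__":
    # Hypothetical usage: bundle CodeSystem/ValueSet JSON produced by our transforms.
    build_fhir_package("output/rxnorm", "rxnorm-package.tgz", "tims.terminology.rxnorm", "2023.01.02")
```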

3. THO/Vulcan notify

These ingests could also be leveraged by the larger community. Davera suggests that once at least one of these ingests is available for consumption, we can notify the community; let's go to her at that time and she can relay that info.

Additional info

OMOP->OWL: Thread, Code

Related

ShahimEssaid commented 1 year ago
A Slack discussion about automation:

Shahim Essaid 10:11 AM
I'm in favor of using GH Actions for automation, and I can take the lead on this if we have some conceptual design.

Joe Flack 10:11 AM
once that's done, then, in addition to updating SNOMED, LOINC, etc., i think mostly what we need is to see if there's a stable URI where release artifacts are published
then we can have our stuff run on cron... in theory (edited)

Shahim Essaid 10:12 AM
And with GH hooks, we can really automate things.

Joe Flack 10:12 AM
i like GH Actions too. i also have some experience with it

Shahim Essaid 10:13 AM
Long term, we should release the FHIR stuff as proper FHIR packages (the packaging format) and then figure out where to deposit those packages (the repository).
FHIR uses NPM as the format/repo
For now we can work in the GH environment until our content is really FHIR/NPM compatible and then discuss how to move off GH

Joe Flack 10:15 AM
yeah i remember you talking about that
i agree with this strategy you propose very much (edited)

Shahim Essaid 10:17 AM
Let's talk soon about how to get our packages into the proper FHIR format. Once we do that, we can simply load them by pointing to their URLs (regardless of where those are) in the HAPI configuration and HAPI will automatically load them without the need for the hacky bash loader scripts. (edited)
We need to follow what is specified here: https://confluence.hl7.org/display/FHIR/NPM+Package+Specification
An example package: http://hl7.org/fhir/us/core/package.tgz
It would be nice if oak can output this

Joe Flack 10:21 AM
ah, in the HAPI configuration

Shahim Essaid 10:21 AM
along with some configuration options for what items go into a single package

Joe Flack 10:22 AM
i'll add those links to the issue I just made, which i linked in the agenda: https://github.com/HOT-Ecosystem/hapi-issues/issues/86

Shahim Essaid 10:24 AM
Yes, the HAPI configuration (which is Spring Boot based) allows for profile/incremental configuration. I use this Spring Boot feature to optionally load the Phenopackets IG like this: https://github.com/phenopackets/pheno-hapi-compose-setup/blob/dev/hapi/config/application-hs-ig-phenopackets.yaml
That configuration "fragment" is loaded when I start HAPI with a Spring Boot profile named "hs-ig-phenopackets" among any other profile names to customize how HAPI will start.
Various configuration fragments here: https://github.com/phenopackets/pheno-hapi-compose-setup/tree/dev/hapi/config and more to come as needed. An end user can simply enable what they want.
TIMS terminology packages can be loaded in a similar way. We just point to the package URL, id, and version, like you see on lines 5-7 here: https://github.com/phenopackets/pheno-hapi-compose-setup/blob/dev/hapi/config/application-hs-ig-phenopackets.yaml (edited)
Are you okay if I copy and paste the above to that issue?

Joe Flack 10:30 AM
I trust you, sure
you'll see i put some of what you said at the bottom of the issue but feel free to edit
i just want to clarify with you; what is the mechanism by which HAPI gets updated on a schedule?
we can automate through GitHub Actions on a schedule the creation of these artefacts, e.g. the NPM packages
but then, HAPI needs to get updated. Is there some config within HAPI to do that on cron? or do we have to do that outside of the HAPI codebase?

Shahim Essaid 10:32 AM
I can write a HAPI operation that gets called from GH Actions to make HAPI load packages
HAPI has a package loading service that gets called during boot to load the ones in the configuration. I'm almost sure it can be called during runtime as well.
But let's do one step at a time. FHIR packages existing somewhere is a must and then we can go from there.

Joe Flack 10:44 AM
ah ok, that sounds interesting. a HAPI operation for data management, so to speak

Shahim Essaid 10:47 AM
We can write any REST call we want in HAPI to do whatever we want. We'll just need to learn the internal HAPI APIs to use them from the REST call; that's the hard part. But I've already looked at this part a little to understand how it works, conceptually: https://github.com/hapifhir/hapi-fhir/blob/master/hapi-fhir-jpaserver-base/src/main/java/ca/uhn/fhir/jpa/packages/PackageInstallerSvcImpl.java (edited)
joeflack4 commented 1 year ago

Reached out on FHIR Zulip, terminology channel, to see if anyone could point me to an (ideally regularly updated) repository of CodeSystems: https://chat.fhir.org/#narrow/stream/179202-terminology

joeflack4 commented 1 year ago

@ShahimEssaid @putmantime I looked at the thread on OMOP vocab in OWL, and found some code there as well. From what I've read, though, OMOP vocabularies are not represented in super high / native fidelity.

I do think this would be cool, though; I did write an OMOP to FHIR converter, but I do wonder how useful the end product would be.

Also note that TermHub essentially has OMOP tables internally, and we are already migrating data from N3C, so for that use case I'm not sure if / how much this would help. Siggie would have a better idea.