schristley opened this issue 11 months ago
initial AK ontology has been created: https://github.com/airr-knowledge/ak-ontology
@jamesaoverton Is there a ROBOT workflow to keep this in sync with the ontologies used in the LinkML schema?
- [ ] Will we be publishing this on OBO Foundry?
Yes, though not until later; it depends on how difficult publishing turns out to be. If it's easy, we can aim to publish with the first beta release.
That's not something I've done before, but I'm sure we can make it work. I would start by playing with LinkML's dynamic enum feature. It seems to take the same parameters as ROBOT `extract`'s `--branch-from-term` option.
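For reference, a LinkML dynamic enum binds an enum's permissible values to an ontology query instead of a static list. A minimal sketch, following the LinkML docs (the enum name, ontology, and root term here are hypothetical placeholders, not anything from the AK schema):

```yaml
enums:
  NeuronTypeEnum:
    reachable_from:
      source_ontology: obo:cl
      source_nodes:
        - CL:0000540   # root term to branch from, analogous to --branch-from-term
      include_self: false
      relationship_types:
        - rdfs:subClassOf
```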
some info from James in an email:
In the ROBOT documentation <http://robot.obolibrary.org/> you'll find the 'extract' command for carving an ontology into pieces, the 'export' command for turning (chunks of) ontologies into tables, and the 'template' command for turning tables into (chunks of) ontologies. Those should be able to do all the heavy lifting. It might take some glue code to link those with LinkML (dynamic) enums, and/or the tool here: https://linkml.io/linkml/schemas/enums.html#tooling-to-support-dynamic-enums. I expect that the LinkML community on the OBO Slack and their regular calls would be interested in collaborating on that sort of tooling.
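A concrete sketch of that pipeline might look like the following (file names and terms are hypothetical; the flags are from the ROBOT documentation):

```shell
# Carve out a branch of an ontology (BOT = the seed term and everything below it)
robot extract --method BOT --input cl.owl --term CL:0000540 --output neuron-branch.owl

# Turn the chunk into a table
robot export --input neuron-branch.owl --header "ID|LABEL" --export neuron-branch.tsv

# Rebuild a chunk of ontology from a (possibly hand-edited) table
robot template --template neuron-template.tsv --output neuron-module.owl
```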
- [ ] How to incorporate ONTIE?
This should just be like including any other ontology? Actually, maybe not: just putting it in the YAML file with the other ontologies didn't work:
`INVALID ONTOLOGY FILE ERROR Could not load a valid ontology from file: ONTIE-download.owl`
- [ ] Fill out with additional ontologies
Added more based on the ones in the LinkML schema. Some have been commented out because of out-of-memory errors.
@jamesaoverton I think I set the parameter right to increase memory for ROBOT, but I'm not sure. How much memory do you think is needed for something like NCBITaxon? My docker env is set to 11GB but maybe I need more.
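For what it's worth, ROBOT reads JVM options from the `ROBOT_JAVA_ARGS` environment variable, so the heap can be raised like this (16G is just a guess for NCBITaxon; note the Docker memory limit has to be at least as large as the JVM heap):

```shell
# Give the ROBOT JVM a bigger heap before invoking robot.
# 16G is an assumed value; tune to whatever NCBITaxon actually needs.
export ROBOT_JAVA_ARGS="-Xmx16G"
echo "$ROBOT_JAVA_ARGS"
```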
@jamesaoverton I figured it out! I was running `make` directly inside the Docker container instead of using that special `run.sh` script.
Well, almost: it seemed to work for GAZ but still ran out of memory for NCBITaxon, so I probably need to bump it up further.
Also, we can't check these big ontologies into GitHub anyway, so we need to generate them for local dev.
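One way to handle that (a sketch; the `mirror/` path and Makefile layout are assumptions, not the repo's actual setup) is to treat the big ontologies as build artifacts: gitignore the download directory and fetch them at build time.

```make
# Sketch: fetch big ontologies locally at build time instead of committing them.
# Add mirror/ to .gitignore so they never land in the repo.
mirror/ncbitaxon.owl:
	mkdir -p mirror
	curl -L -o $@ http://purl.obolibrary.org/obo/ncbitaxon.owl
```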