INCATools / ontology-development-kit

Bootstrap an OBO Library ontology
http://incatools.github.io/ontology-development-kit/
BSD 3-Clause "New" or "Revised" License

Include OLS Dockerfile in seeded repo #360

Open · cmungall opened this issue 4 years ago

cmungall commented 4 years ago

See https://github.com/EBISPOT/OLS#building-the-docker-images-manually and https://github.com/MaastrichtUniversity/ols-docker/blob/master/Dockerfile

@matentzn do you have any advice on how to go about this?

Should this go in the top level of template/?

@hrshdhgd will need this for making the Allen OLS instance.

wdduncan commented 4 years ago

I have used the OLS-Docker service (https://github.com/EBISPOT/OLS-docker) to create local versions of the Ontology Lookup Service.

Do you want a make command for creating the Dockerfile?

From what I understand, we would need to include an ols-config.yaml file and seed it with some default values. Here is an example of one I did for the mixs-rdf repo:
https://github.com/GenomicsStandardsConsortium/mixs-rdf/blob/master/ols-docker/ols-config.yaml
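For reference, a seeded default could look something like the minimal sketch below. The field names follow the OBO-registry style that OLS consumes, but `myont` and its values are pure placeholders; check the linked mixs-rdf file for the authoritative field set.

```yaml
# Illustrative ols-config.yaml seed; "myont" and its PURL are placeholders
# that a seeding step would substitute with the ontology's real values.
ontologies:
  - id: myont
    preferredPrefix: MYONT
    title: My Example Ontology
    ontology_purl: http://purl.obolibrary.org/obo/myont.owl
```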

wdduncan commented 4 years ago

Also, it would be helpful to include scripts for starting and stopping the Docker image, e.g. the mixs-rdf docker-start.sh.

I also included shell scripts for entering the bash prompt, removing the container, etc., but these may be overkill. See the ols-docker folder in the mixs-rdf repo:
https://github.com/GenomicsStandardsConsortium/mixs-rdf/tree/master/ols-docker
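A start/stop pair along these lines can be quite small. The sketch below is illustrative, not the actual mixs-rdf scripts: the image and container names are placeholders, and the `DOCKER` variable is an added hook so the functions can be dry-run without a Docker daemon.

```shell
#!/bin/sh
# Hypothetical docker-start.sh / docker-stop.sh helpers for a local OLS.
IMAGE="ebispot/ols"          # placeholder: the locally built OLS image
CONTAINER="local-ols"        # placeholder: name for this OLS container
DOCKER="${DOCKER:-docker}"   # override hook, e.g. DOCKER=echo for a dry run

start_ols() {
    # Run OLS detached, exposing the web UI on localhost:8080
    "$DOCKER" run -d --name "$CONTAINER" -p 8080:8080 "$IMAGE"
}

stop_ols() {
    # Stop and remove the container so it can be restarted cleanly
    "$DOCKER" stop "$CONTAINER"
    "$DOCKER" rm "$CONTAINER"
}
```

Keeping both functions in one sourced file is just one option; splitting them into separate docker-start.sh and docker-stop.sh scripts, as in mixs-rdf, works just as well.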

matentzn commented 4 years ago

We have recently updated the whole infrastructure and have now moved to a docker-compose-style architecture rather than a monolithic OLS (@wdduncan I still used the same one for the Monarch OLS, but the more recent setups now use docker-compose).

I will work with @dosumis on a general solution to this. Our general idea is to build a seed_my_ols function into ODK as a separate process, not coupled with seed-my-repo but somewhat mirroring its flow, that can create a fully functioning OLS/OxO/Zooma instance, customised and ready to run with docker-compose. I have all the scripts and infrastructure ready for that; I just need to add the customisation part. What do you think? I would be pretty excited to add a seed_my_ols.py to ODK (with options to deploy Zooma and OxO as well!).
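For context, a docker-compose stack for OLS typically wires the web application to its Solr and MongoDB backends. The sketch below is purely illustrative: the service names, image tags, and ports are assumptions, not what the EBISPOT setup or ODK would actually ship.

```yaml
# Illustrative docker-compose.yml for an OLS stack.
# Image names/tags and ports are assumptions, not the real configuration.
version: "3"
services:
  ols-web:
    image: ebispot/ols-web:stable    # placeholder image tag
    ports:
      - "8080:8080"                  # OLS web UI on localhost:8080
    depends_on:
      - solr
      - mongo
  solr:
    image: ebispot/ols-solr:stable   # placeholder: Solr index backing OLS
  mongo:
    image: mongo:4.0                 # placeholder: MongoDB for ontology metadata
```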

dosumis commented 4 years ago

We should coordinate with @hrshdhgd to have the Allen Brain Data Standards ontologies be a test case.

wdduncan commented 4 years ago

@matentzn I think that would be great! I am happy to test your workflow too.

matentzn commented 3 years ago

This is not the same, but:

After a bit of testing by @hrshdhgd (thanks!):

A very experimental custom OLS setup: https://github.com/monarch-ebi-dev/ontotools-docker-config. Suggested way to use it:

  1. fork
  2. clone fork
  3. cd ontotools-docker-config
  4. sh redeploy.sh (to test the deployment on your local machine)
  5. Customize!

You are expected to have docker and docker-compose installed. I don't have time right now to integrate this into ODK, but I will get to it at some point.

cmungall commented 3 years ago

I'm trying to figure out the implications of this.

So if I have an ontology repo, I should not put any OLS stuff in that repo. Instead, I should fork the one above and tweak the config there (doing periodic rebases to stay in sync)?

matentzn commented 3 years ago

Yes - this turns out to be the easier workflow. In any case, there is always some adaptation that needs to happen beyond the blueprint. Do you have any concerns here? I know you want a run_ols command in your Makefile, but IMO supporting that goes too far: it stretches what we can provide support for, since there are 100 things that can go wrong with an OLS instance. I expect people deploying OLS to know what they are doing; with ODK users, I don't assume that so much. Let me know your thoughts though!