pdurbin opened this issue 6 years ago (status: Open)
Hi Phil, the images in vtycloud are indeed mine, but they are still in an experimental phase. We're having problems with Google persistent storage: all files (one volume) and assets such as images from the docroot (a second volume) need to be persistent, but at the moment Google limits each container to a single mounted volume, and we have to find a solution for that. That's why I'm not sharing those images with the community yet.
@4tikhonov that's fine. Please just let me know when you're ready to publish even somewhat experimental images on Docker Hub. Thanks!!
Hi @pdurbin, I've just published the latest image (Dataverse 4.9.4) with multilingual support on Docker Hub: vtycloud/dataverse:4.9.4
The other images are there too: vtycloud/solr7 and vtycloud/postgres
@4tikhonov great! Should we update the README.md in this repo to link to them?
@4tikhonov heads up that https://github.com/whole-tale/whole-tale/issues/49 indicates that the Whole Tale project intends to use dataverse-docker to spin up a Dataverse environment for development purposes (probably https://github.com/whole-tale/dashboard/issues/269 to start).
Some day we would like the Whole Tale team to offer an external tool to Dataverse installations: https://github.com/IQSS/dataverse/issues/5097
Please consider giving my https://github.com/IQSS/dataverse/issues/5292 a look and leaving a comment :-)
I'm just providing the full link so it's clickable: https://github.com/IQSS/dataverse/issues/5292
As part of my work on https://github.com/whole-tale/whole-tale/issues/49 I ran through the basic `docker-compose build` process. I ran into a minor issue where the `postgres` role didn't exist in the `db` container but is expected by the `dvinstall/install` script. After manually creating the role, I was able to deploy Dataverse. Per @4tikhonov's suggestion, I changed the DOI provider to EZID.
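For anyone else who hits the same error, the manual fix was roughly the following. Note that the container name (`db`) and the superuser to connect as (`dataverse`) are assumptions based on a typical dataverse-docker compose setup; adjust them to match your `docker-compose.yml`:

```shell
# Create the "postgres" role that dvinstall/install expects, inside the
# running database container. Only the role name "postgres" comes from the
# install script; the container name and connecting user are assumptions.
docker exec -it db \
  psql -U dataverse -d postgres \
  -c "CREATE ROLE postgres WITH SUPERUSER LOGIN;"
```

If the `db` container is based on the official postgres image, the same statement could instead go in a `.sql` file mounted under `/docker-entrypoint-initdb.d/`, which that image executes on first startup, avoiding the manual step entirely.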
I'm curious why the `initial.bash` approach is used to download dependencies outside of the Docker build process instead of adding the dependency downloads to the Dockerfiles themselves. This prevents me from setting up automated builds to create these images, which is a goal for https://github.com/nds-org/ndslabs-dataverse/issues/8.
Would you be open to a PR that moved the dependency preparation into the respective Dockerfiles, or is there a reason this isn't done?
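To make the idea concrete, the change would move each download into a `RUN` step so Docker Hub automated builds can produce the image with no local pre-download. This is only a sketch: the base image, release URL, and paths below are hypothetical placeholders, not the repo's actual values.

```dockerfile
# Sketch: fetch dependencies during the image build instead of via initial.bash.
# Base image, URL, version, and paths are placeholders for illustration.
FROM centos:7
RUN yum install -y curl unzip \
 && curl -fsSL -o /tmp/dvinstall.zip \
      https://github.com/IQSS/dataverse/releases/download/v4.9.4/dvinstall.zip \
 && unzip /tmp/dvinstall.zip -d /opt \
 && rm /tmp/dvinstall.zip
```

One trade-off worth noting: baking downloads into the Dockerfile re-fetches them on every cache-busting rebuild, whereas `initial.bash` keeps them on disk across builds, which may be why the local-download approach was convenient for testing multiple Dataverse versions.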
@craig-willis, the reason we download all dependencies in the initial.bash script is mostly historical: we had to test different versions of Dataverse at the same time, and it's quite convenient to have everything available locally during the container build process.
I would be very happy to accept pull requests that add automated builds.
@4tikhonov that makes sense, thank you. I'll try to put something together for review.
Judging from the slides at https://twitter.com/4tykhonov/status/1116229640232873984 ...
... https://hub.docker.com/u/vtycloud is still the place to find these Docker images. Thanks, @4tikhonov !!
Hi! Over at https://github.com/nds-org/ndslabs-dataverse/issues/8#issuecomment-422361354 @craig-willis seems to be suggesting that he's willing to have images pushed to https://hub.docker.com/r/ndslabs/ at least in the short term. Are the images already being pushed to some other Docker Hub organization? (I see that https://hub.docker.com/r/vtycloud has some Dataverse images.) We at @IQSS would really appreciate it if the community could push the images somewhere because we don't have the resources at this time to push them. Please let me know what you think. Thanks!