The Docker use case is simple enough: a set of integration tests depends on data from various supported data providers (JDBC, Cloudant, Mongo). I need a set of containers that, when started, always load a base state for the integration tests. This was very easy for MySQL and fairly easy for Mongo.
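The MySQL case is easy because the official image runs any script placed in /docker-entrypoint-initdb.d on first start (the official Mongo image supports the same directory). A minimal sketch; the filename is hypothetical:

```dockerfile
FROM mysql:8.0
# Any .sql/.sh files in this directory run automatically the first
# time the container initializes its data directory -- no extra
# "docker exec" step is needed to seed the base state.
COPY seed-data.sql /docker-entrypoint-initdb.d/
```

Because the seeding happens inside the image's own entrypoint, simply starting the container yields the base state the tests expect.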
For Cloudant I was not able to run any kind of script at container startup, but I was able to load a set of scripts that perform the load. They have to pull down Node.js runtimes, but they work, and an extra "docker exec" command loads the data after the container is started. I still need to find a way to auto-load the data and, if possible, to remove the Node.js dependencies.
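The two-step Cloudant flow looks roughly like this; the image name and script path are assumptions for illustration:

```shell
# Step 1: start the Cloudant container normally
docker run -d --name cloudant-test ibmcom/cloudant-developer

# Step 2 (the extra step to eliminate): once the service is up,
# run the Node.js-based load scripts inside the running container
docker exec cloudant-test /opt/load/load-data.sh
```

The goal is to fold step 2 into container startup so tests can rely on a single `docker run`.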
For DB2 I took a similar approach, but the scripts and files needed to load the data end with a "tail -f /dev/null", and the container performs very poorly. This solution feels very hacky.
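The DB2 workaround amounts to a wrapper entrypoint along these lines; the script name and load path are hypothetical, and the final line is the part that makes it feel like a hack:

```shell
#!/bin/sh
# entrypoint.sh -- hypothetical wrapper used as the container entrypoint:
# start the database, seed the base state, then block forever so the
# container does not exit when the script finishes.
db2start
/opt/load/load-data.sh
tail -f /dev/null   # keeps PID 1 alive; the container is "running" but idle
```

A cleaner alternative would be to `exec` the image's real foreground process after loading, so the database itself keeps the container alive instead of `tail`.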
See the project's container files for the various Dockerfiles and supporting data.