BluesparkLabs / spark

✨ Toolkit to develop, test and run Drupal websites.
MIT License

Solr config version customization and fix containers destroy. #39

Closed citlacom closed 4 years ago

citlacom commented 5 years ago

Hi @jameswilson - This pull request addresses a limitation of the deprecated Docker image https://github.com/wodby/drupal-solr, which does not support customizing the SearchAPI Solr configuration version used by the Solr service. I tried defining SEARCH_API_SOLR_VER both as an environment variable and as a build argument, but neither overrides the version set by the parent image's Make script: https://github.com/wodby/drupal-solr/blob/master/Makefile#L4. The new image https://github.com/wodby/solr that replaces the deprecated one appears to support overriding the config version, but I didn't want to make a major change in Spark at this point due to possible side effects.
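A rough sketch of the build-argument approach described above (the base image tag, download URL, and paths here are assumptions for illustration, not the exact code in this PR):

```dockerfile
# Hypothetical sketch: accept the config version as a build argument and fetch
# the matching search_api_solr config set ourselves, instead of relying on the
# version hardcoded in the parent image's Makefile.
FROM wodby/drupal-solr:8-5.5
ARG SEARCH_API_SOLR_VER=8.x-2.1
RUN curl -fsSL "https://ftp.drupal.org/files/projects/search_api_solr-${SEARCH_API_SOLR_VER}.tar.gz" \
    | tar -xz -C /tmp
```

The version could then be set per build with `docker build --build-arg SEARCH_API_SOLR_VER=8.x-2.1 .`.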

I have also fixed the way the Solr configuration gets updated, by cleaning up Docker images during the containers destroy. Previously, destroying the containers stopped them but did not remove the images built earlier; that cached image was then reused with its old state when the containers were brought up again after a destroy, so the new configuration was not applied if a previous image version existed.
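As an illustration of the destroy-time cleanup, a possible command sequence (these are standard docker-compose flags, not necessarily the exact ones Spark invokes):

```shell
# Stop the containers AND remove the images built by this compose file, so the
# next `up` rebuilds the Solr image with the new configuration instead of
# reusing the stale cached image.
docker-compose down --rmi local

# Optionally force a rebuild that bypasses the layer cache entirely:
docker-compose build --no-cache solr
docker-compose up -d
```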

I will comment on the IUL ticket that originated this issue with instructions on how to QA it in a local environment.

jameswilson commented 5 years ago

@citlacom the approach you're using here, exposing a variable and then using it to control how the Solr configs inside the Docker Solr image are built, does make sense.

I'm not totally opposed to committing this in order to get a quick fix in place; however, a couple of things:

1) The new https://github.com/wodby/solr image already takes care of some of this logic for grabbing the correct search_api_solr schema versions, so I'm not sure we should introduce our own code in the Dockerfile to do much the same thing that is already done in the new upstream.

https://github.com/wodby/solr/blob/d8f74c3f6ee2defcc2b62ca04d6255788b51563e/search-api-solr.sh#L11-L42

I haven't studied this closely enough to know whether it would be good enough for our needs, but it seems that code always grabs the latest version, and is therefore guaranteed to be more up to date than hardcoding `8.x-2.1` into Spark.

2) I would really love to see something similar to the core_config option from the .platform.app.yml setup in our .spark.yml file.

In an ideal world, any Drupal 8 project (or non-Drupal project) could specify its own Solr configs to copy into the image, instead of us blindly copying the configurations hardcoded into `docker-compose.drupal8.yml` for all projects. This way we could allow different versions per project, and track changes and releases of those configs right within the project's repository.
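As a sketch of what that could look like (the key names below are purely hypothetical; nothing in Spark implements this yet):

```yaml
# .spark.yml (proposed, illustrative only)
solr:
  config_dir: config/solr      # project-local Solr configs copied into the image
  config_version: 8.x-2.1      # search_api_solr config set version
```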

Perhaps we could translate a variable specified in .spark.yml into an environment variable on the fly before calling docker commands, and use that environment variable inside the Dockerfile.
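A minimal sketch of that translation step, assuming a hypothetical top-level `solr_config_version` key in .spark.yml (the key name and the `sed` extraction are illustrative; a real implementation would use a proper YAML parser):

```shell
# Example .spark.yml, written here only for illustration:
cat > .spark.yml <<'EOF'
solr_config_version: 8.x-2.1
EOF

# Extract the value and export it so docker-compose can forward it as a
# build argument to the Dockerfile's ARG.
SEARCH_API_SOLR_VER=$(sed -n 's/^solr_config_version:[[:space:]]*//p' .spark.yml)
export SEARCH_API_SOLR_VER
echo "$SEARCH_API_SOLR_VER"   # → 8.x-2.1

# docker-compose -f docker-compose.drupal8.yml build solr   # would pick up the variable
```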

@balintk I would really appreciate your input and ideas here too.