BluesparkLabs / spark

✨ Toolkit to develop, test and run Drupal websites.
MIT License

Introduce environment modes #28

Closed balintbrews closed 4 years ago

balintbrews commented 6 years ago

This is my first attempt to support what we've been calling the "hybrid approach": when PHP and the HTTP server are handled by the host machine, and Spark is only supposed to be running containers for the database, Solr, headless Chrome and whatever we introduce later.

The idea is a simple environment variable, SPARK_MODE, defined in the .env file. We can then rely on this value inside our commands to choose a different Docker Compose file and run Drush commands accordingly.

(This is a very simple approach which we may need to revisit later, but it should get us going.)
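To illustrate the mechanism described above, here is a minimal sketch. The SPARK_MODE variable name comes from this PR, but the "hybrid" value and the Compose file names are assumptions for illustration, not Spark's actual code:

```shell
# Illustrative sketch only: SPARK_MODE comes from the PR description;
# the "hybrid" value and file names below are assumptions.
SPARK_MODE="hybrid"   # normally read from the project's .env file

if [ "$SPARK_MODE" = "hybrid" ]; then
  # Host machine runs PHP and the HTTP server; containers cover the rest.
  COMPOSE_FILE="docker-compose.hybrid.yml"
else
  # Everything runs in containers.
  COMPOSE_FILE="docker-compose.yml"
fi

echo "Using $COMPOSE_FILE"
```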

balintbrews commented 6 years ago

@jameswilson That's totally fine, you just need to adjust your settings.php to connect to Solr, but use the local db connection if that's your preference. I realize the db container would still be running, but at this point I think it's better to not try supporting every edge case. Let's get this in, so that other, more important developments can advance, and we can iterate later.

Btw, @isholgueras had a proposal today for an option to supply your own Docker Compose file, which wouldn't be hard to implement and it would be a great way to support all kinds of use cases.

jameswilson commented 6 years ago

I realize the db container would still be running, but at this point I think it's better to not try supporting every edge case.

The thing is that this is not an edge case, because both Pablo and I have IULD8 set up this way. This will be the definitive setup I use on my local environments, because I need a local DB for my workflows, for example using phpMyAdmin to do cross-table, cross-column searches very quickly -- something that Sequel Pro cannot do.

I'm requesting that, at the very least, we support not just native PHP and Apache, but also native MySQL. I don't want or need a ton of unused db containers hanging around. I don't want our first client project fully using Spark to be an "exception to the rule" on typical usage. And I don't want to be unable to leverage the db commands file because it thinks I'm using a db container when it's just a dummy one hanging around.

balintbrews commented 6 years ago

We agreed that one of Spark's goals is to provide an environment for running a Drupal website. Using a single container out of the many that make up the environment is very much an edge case. I understand you have your reasons to keep it that way, and I wouldn't want to question that. But this PR doesn't block you or force you to change your workflows. You're free to set things up however is most convenient for you; it should always be that way. I'm aware of the momentary tradeoff: your database container would linger around, which is, of course, not acceptable, and I would like to address that problem too. Just not in this PR.

What I'm proposing is that we provide Docker Compose files for the platforms we support, and we do so with two variants: a) everything in containers vs. b) everything except PHP and the HTTP server. With these two we cover a good chunk of the use cases and make most people happy. (Btw, I'm hoping we can serve both D7 and D8 projects from the same Docker Compose file, but that's for later.)

Then what I would propose to be added in later PRs are two things:

  1. The ability to supply a list of container names (in .spark.yml and/or .env) that you would like to run. This could co-exist with the platform option and the SPARK_MODE .env variable which currently drive which Docker Compose file is used.
  2. The ability to define the path to your own Docker Compose file which lives in your project, completely replacing Spark's environment with the one you manage for your project.
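For what it's worth, docker-compose already supports pointing at an alternative file via its -f flag, so option 2 could be a thin wrapper. A sketch, where the variable names and fallback logic are assumptions, not Spark's actual code:

```shell
# Hypothetical wrapper: prefer a project-supplied Compose file if one is
# configured, otherwise fall back to Spark's bundled file.
PROJECT_COMPOSE=""                  # would be read from .spark.yml / .env
SPARK_COMPOSE="docker-compose.yml"  # Spark's bundled default

# Use the project's file when set, Spark's otherwise.
COMPOSE_FILE="${PROJECT_COMPOSE:-$SPARK_COMPOSE}"

echo "docker-compose -f $COMPOSE_FILE up -d"
```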

I'm happy to open a new issue for these and to discuss the ideas there. I hope you see there's a path towards satisfying most, if not all, use cases. But we need to start somewhere, and this PR is that first step.

jameswilson commented 6 years ago

I guess, if you're not going to budge on special-casing only the php and apache containers, then I'm fundamentally opposed to the architecture of this SPARK_MODE solution. I don't want to have to write a custom docker-compose.yml for every project just to get a MAMP-compatible workflow. That is not sustainable for noobs. I cannot in good faith approve this PR knowing that it's leaving the entire MAMP workflow as a second-class citizen. I am requesting that the SPARK_MODE architecture be completely re-thought to easily allow any of the "default" containers to be enabled or disabled without requiring an entirely parallel docker-compose.yml override.

balintbrews commented 6 years ago

You don't need to write a custom docker-compose.yml if you don't want to. Please see the other option I proposed. In your case, you can specify drupal8 as your platform, just as you already do, then you'll be able to add another property to .spark.yml:

containers:
    - solr

Here you can supply a list of container names you would like to actually run and use.
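As a sketch of how such a list could drive Docker Compose (the service name and variable handling here are assumptions): docker-compose up accepts explicit service names, starting only those services and their declared dependencies.

```shell
# The "containers" list from .spark.yml, flattened to service names.
# "solr" mirrors the example above; parsing .spark.yml is out of scope here.
CONTAINERS="solr"

# Only the listed services (plus their dependencies) would be started.
echo "docker-compose up -d $CONTAINERS"
```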

The custom Docker Compose file is another, more advanced use case, e.g. when you need some exotic PHP extension, an example mentioned by @isholgueras earlier today.