BluesparkLabs / spark

✨ Toolkit to develop, test and run Drupal websites.

Turn Spark into a global dependency #5

Open · balintbrews opened this issue 6 years ago

balintbrews commented 6 years ago

@isholgueras has a great proposal, which probably needs to be broken down into smaller parts. Let's start here.

balintbrews commented 6 years ago

For the record, we decided to hold off on this for now.

@isholgueras In the meantime, could you please document your concerns with the current architecture? What exactly would we gain by changing Spark to be a global dependency? What are the disadvantages of the current architecture where we include Spark as a local dependency for each project?

One problem I see with the global approach is that we'd need to solve versioning for the schema of the .spark.yml file, but you may have a good answer to this, @isholgueras.
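For illustration, here is a minimal sketch of what such a check could look like in a globally installed Spark, assuming symfony/yaml 3.4+ is available. The `schema_version` key and the refuse-to-run behavior are hypothetical, not part of the current schema:

```php
<?php

// Hypothetical guard in a global Spark binary: refuse to run against a
// .spark.yml written for a schema version this Spark doesn't understand.
use Symfony\Component\Yaml\Yaml;

// Schema versions this Spark release knows how to handle (invented value).
const SUPPORTED_SCHEMA_VERSIONS = ['1'];

$config = Yaml::parseFile(getcwd() . '/.spark.yml');
$schemaVersion = (string) ($config['schema_version'] ?? '');

if (!in_array($schemaVersion, SUPPORTED_SCHEMA_VERSIONS, true)) {
  fwrite(STDERR, sprintf(
    "Unsupported .spark.yml schema version '%s' (supported: %s)\n",
    $schemaVersion,
    implode(', ', SUPPORTED_SCHEMA_VERSIONS)
  ));
  exit(1);
}
```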

isholgueras commented 6 years ago

As we discussed in a PM, I see two advantages to having this tool as an external package:

  1. You don't need to adapt to the project's framework (Drupal 7 or 8, Symfony 2, 3, 4, ...), because you can use the most appropriate libraries and versions for the tool itself. If you require Spark as a library inside the project, the project has to use Composer, and you are forced to use compatible versions of the project's libraries.
  2. Having Spark isolated as a standalone tool allows you to create shared containers between projects (like Traefik, nginx, or HAProxy acting as a reverse proxy), because the manager is independent of the project.

That is very similar to the tool Matt Glaman built with platform-docker.

The steps to initialize a project are:

  1. platform-docker init. It creates the initial .platform/local.yml (or similar; I don't remember the exact name). If the file already exists, it doesn't perform any action.
  2. You modify this file to adapt it to your project.
  3. platform-docker start. It starts all of the containers defined in the local.yml file.
  4. You can go to another project and run the same commands, platform-docker init && platform-docker start, and thanks to the shared proxy container you have both projects up and running (on different ports by default, though you can configure the reverse proxy to serve the different domains on port 80).

That's basically what we discussed in the PM, just a bit more organized and written down :)

jameswilson commented 6 years ago

The main issue @isholgueras brought up, and the one that is indeed a significant limitation, is this:

"you are forced to use compatible versions of the project's libraries"

This is precisely a problem for things like the Drush functionality, and for other dependencies (even Robo itself). For Drush there are workarounds: we could write our code in a way that supports both Drush 8 and Drush 9 command execution. But for something like Robo, working across two major versions would be much harder.
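A rough sketch of that Drush workaround, for illustration. The command names are real (Drush 8 uses dashed names like cache-rebuild, Drush 9 switched to colon-separated names like cache:rebuild), but the helper functions here are invented, not existing Spark code:

```php
<?php

// Illustrative only: detect the installed Drush major version once, then
// map a logical command onto the syntax that version expects.
function drushMajorVersion(string $drushBin = 'drush'): int {
  // Drush 9 prints "Drush Commandline Tool 9.x.y"; Drush 8 prints a
  // "Drush Version : 8.x.y" line. Either way, grab the first "N." match.
  $output = (string) shell_exec(escapeshellcmd($drushBin) . ' --version 2>&1');
  return preg_match('/(\d+)\.\d+/', $output, $m) ? (int) $m[1] : 0;
}

function drushCacheRebuildCommand(): string {
  // Drush 8: "cache-rebuild"; Drush 9 renamed it to "cache:rebuild".
  return drushMajorVersion() >= 9 ? 'drush cache:rebuild' : 'drush cache-rebuild';
}
```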

"create shared containers between projects"

I'm on the fence about this. Technically, this is not a limitation, because we could have a task that checks for a shared HAProxy inside a separate folder, e.g. ~/.spark/, and starts it if it already exists and is not running, or creates it from some scaffolding if it doesn't exist yet. This doesn't demand Spark being a global dependency. On the other hand, you do start to run into issues as Spark evolves and different projects pin different Spark versions: trying to run a shared service that each version expects different things from could be problematic. I consider this to be a far-future possibility and a pretty big "what if".
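Something along these lines; a sketch only, where the container name, the ~/.spark/proxy path, and the compose scaffolding are all placeholders, not anything that exists today:

```php
<?php

// Sketch of the "ensure shared proxy" task described above. Assumes Robo
// is autoloaded and $HOME is set; all names and paths are invented.
class SharedProxyTasks extends \Robo\Tasks {

  public function proxyEnsure(): void {
    $name = 'spark-shared-proxy';

    // Already running? Nothing to do.
    if (trim((string) shell_exec("docker ps -q --filter name={$name}")) !== '') {
      return;
    }

    // Exists but stopped? Just start it again.
    if (trim((string) shell_exec("docker ps -aq --filter name={$name}")) !== '') {
      $this->taskExec("docker start {$name}")->run();
      return;
    }

    // Otherwise create it from scaffolding kept outside any one project.
    $this->taskExec('docker-compose up -d')
      ->dir(getenv('HOME') . '/.spark/proxy')
      ->run();
  }

}
```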