Closed: phobologic closed this issue 8 years ago
One idea is to do something like golang's import system:

```
github.com/phobologic/myblueprints/myblueprints/cool_blueprint.CoolBlueprint
```

Then make stacker pull that file down (it ends with `.py`; the `CoolBlueprint` part is the class), put it in a temp directory, and add that temp directory to your python path while it tries to import the blueprints.
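A minimal sketch of that temp-directory import flow (the network fetch itself is omitted, and the function and parameter names here are hypothetical, not part of stacker):

```python
import importlib
import os
import sys
import tempfile


def load_class_from_file(source_code, module_name, class_name):
    """Write already-fetched blueprint source into a temp directory, add
    that directory to sys.path, then import and return the named class."""
    tmp_dir = tempfile.mkdtemp(prefix='stacker_blueprints_')
    with open(os.path.join(tmp_dir, module_name + '.py'), 'w') as f:
        f.write(source_code)
    sys.path.insert(0, tmp_dir)
    try:
        module = importlib.import_module(module_name)
    finally:
        sys.path.remove(tmp_dir)
    return getattr(module, class_name)
```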
I vote for:

```yaml
# conf/example.yaml
stacks:
  - name: coolBluePrint
    blueprint:
      git: github.com/phobologic/myblueprints
      path: myblueprints.cool_blueprint.CoolBlueprint
```
I think that gives you more control than just having something like `blueprint: github.com/phobologic/myblueprints/myblueprints/cool_blueprint.CoolBlueprint` or `blueprint: stacker.blueprints.vpc.VPC`, with the logic driven by the prefix of `blueprint`. I would rather have something more verbose that gives you more room to expand it in the future.

I would also want to account for something like we did for the empire stacks, where the blueprint used other python files (e.g. `policies.py`), which means downloading more than just one file.
Either of those would work. Another alternative is to set a list of directories where Stacker would look for blueprints, set in the yaml file:

```yaml
blueprint_dirs:
  - ~/path/to/blueprints
  - ../another/path/to/blueprints
```
That, coupled with the ideas above, could work well. I'd also suggest changing some names — `source` and `class` — to be more general and specific, respectively:

```yaml
stacks:
  - name: coolBluePrint
    blueprint:
      # Only look in this repo for this blueprint
      source: github.com/phobologic/myblueprints
      class: myblueprints.cool_blueprint.CoolBlueprint
  - name: cache
    blueprint:
      # This would look through the set of blueprint_dirs,
      # in addition to any default dirs
      class: sekrt.elasticache.ElasticCache
```
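A rough sketch of how that lookup could work (the function name and search logic are hypothetical, not stacker's actual implementation): prepend each configured directory to `sys.path`, then import the class by its dotted path.

```python
import importlib
import os
import sys


def resolve_blueprint(class_path, blueprint_dirs):
    """Hypothetical resolver: search the configured blueprint_dirs
    (plus whatever is already on sys.path) for the dotted class path."""
    module_path, class_name = class_path.rsplit('.', 1)
    for d in blueprint_dirs:
        candidate = os.path.abspath(os.path.expanduser(d))
        if os.path.isdir(candidate) and candidate not in sys.path:
            sys.path.insert(0, candidate)
    module = importlib.import_module(module_path)
    return getattr(module, class_name)
```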
+1 to what @brianz is suggesting. I think that could work really well.
Generally +1 although fetching is a slippery slope given "go get" supports multiple VCS and then dealing with proxies, etc. Also, adding a way to specify branch or hash would be good for testing. See? Slippery slope of features...
What @markpeek said...totally agree. As a first step it'd be nice to simply add the ability to import local files and leave the other stuff as an exercise to the developer.
Yet another option: create your own blueprint packages, pip install them, done.

The local paths are the most important to me; I agree with the pip install option too.
```diff
diff --git a/stacker/util.py b/stacker/util.py
index 3f714a1..548ca9a 100644
--- a/stacker/util.py
+++ b/stacker/util.py
@@ -1,5 +1,6 @@
 import importlib
 import logging
+import os
 import re
 import sys
 import time
@@ -83,7 +84,11 @@ def load_object_from_string(fqcn):
     object_name = fqcn
     if '.' in fqcn:
         module_path, object_name = fqcn.rsplit('.', 1)
-        importlib.import_module(module_path)
+        try:
+            importlib.import_module(module_path)
+        except ImportError:
+            sys.path.append(os.path.curdir)
+            importlib.import_module(module_path)
     return getattr(sys.modules[module_path], object_name)
```
The version of stacker I run just has the above change, and I haven't had to think much about it after that. I think `blueprint_dirs` would be a better option, though.
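For reference, the patched function rendered as a self-contained sketch (only the dotted-path case from the diff is shown; dot-less names are omitted):

```python
import importlib
import os
import sys


def load_object_from_string(fqcn):
    """Resolve a dotted 'module.path.Name' string to the object it names,
    falling back to the current working directory when the module isn't
    importable from the existing sys.path."""
    module_path, object_name = fqcn.rsplit('.', 1)
    try:
        importlib.import_module(module_path)
    except ImportError:
        # Retry with the working directory on the path, so locally
        # checked-out blueprint files can be found.
        sys.path.append(os.path.curdir)
        importlib.import_module(module_path)
    return getattr(sys.modules[module_path], object_name)
```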
So, I think the `blueprint_dirs` idea works for most cases. Our workflow is that we have a repo, `private_stacks`, that has a setup.py. We have to make sure that before we update an environment we pull down the latest `private_stacks` and run `python setup.py install` in the virtualenv we're using.

The `blueprint_dirs` suggestion wouldn't help with ensuring the git repo is up to date, but it would help with not having to run `python setup.py install` — though it also means that everyone has to have the same layout of git directories. Hmmmm. Not sure how to solve that problem.
What if you added a `pre_hook` to pull the repo?
So... I'm liking the idea of blueprints as packages a lot. The reasons are:

- You can `pip install -e .` your package, and always get up-to-date blueprints while developing.

Something as basic as this works, again, with zero changes to stacker:
```python
from setuptools import setup, find_packages

setup(
    name='mycompany_blueprints',
    version='0.0.1',
    author='Brian Zambrano',
    author_email='brianz@gmail.com',
    url='https://github.com/brianz/mycompany-blueprints',
    description='Stacker blueprints for My Company',
    packages=find_packages(),
)
```
I buy all of that. +1
I think the introduction of `stacker_blueprints` as a separate package that has dependencies on specific versions of stacker mostly handles this. For personal packages, the same process should be followed (and if your stacks rely on `stacker_blueprints`, your setup.py should specify that in `install_requires` with a specific version). Closing this out.
Right now it's kind of annoying to use third-party blueprints in stacker: you have to fetch them (git pull), then install them on your python path (python setup.py install); only then can you use them.

I'm trying to think of a better way to do this, so I'm opening this up for discussion from 'the community' :)