mjmare opened this issue 3 years ago
I like this idea, not quite sure on a good CLI that wouldn't clash with existing commands/etc, but I'm keen to implement this or similar!
I like the spirit of this idea, but I think it'd be difficult to do without making deploys opinionated about the hosts they run on. @mjmare it seems to me like this might be achievable if inventory allowed you to specify which connector to use when executing tasks. Then you could do something like:
# inventory.py
build_servers = [
    # The hypothetical third argument tells pyinfra to use the @local connector
    ('localhost', {}, '@local'),
]
deploy_servers = ['whatever', 'you', 'want']
In fact you might already be able to do this: I can see some examples in the docs using the @ssh/hostname syntax, but I'm not sure whether that's supported in inventory. @Fizzadar can you weigh in?
Then inside your deploys you could have something like:
from pyinfra import host

if 'build_servers' in host.groups:
    do_some_build_task()  # placeholder: your build operations here
    # Do more things
This would solve the problem of having to specify different combinations of inventory, however you'd still need to run the pyinfra CLI twice to first run your build tasks, then run your deployment tasks. I think the answer to that is adding a high level Python script that uses packaged deploys to orchestrate this activity for you. Then your Python script would just be:
from packaged_deploys import build, deploy
build()
deploy()
and you could just run that high level script on the CLI from then on.
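Until packaged deploys cover this, that high level script could be sketched as a plain wrapper around the pyinfra CLI. This is only a sketch, not part of pyinfra: the file names and group names (inventory.py, 01-build.py, 02-deploy.py, build_servers, deploy_servers) are hypothetical, and --limit is the existing CLI flag for restricting a run to one group.

```python
# run.py - a hypothetical wrapper that runs the build deploy, then the
# deployment deploy, stopping on the first failure.
import subprocess


def build_steps(inventory: str = "inventory.py") -> list[list[str]]:
    """Return the two pyinfra CLI invocations, in execution order."""
    return [
        # --limit restricts the run to a single inventory group
        ["pyinfra", inventory, "--limit", "build_servers", "01-build.py"],
        ["pyinfra", inventory, "--limit", "deploy_servers", "02-deploy.py"],
    ]


def run_all(inventory: str = "inventory.py") -> None:
    for cmd in build_steps(inventory):
        # check=True raises CalledProcessError if a step fails, so the
        # deploy step never runs after a failed build
        subprocess.run(cmd, check=True)
```

A wrapper like this keeps the inventory/deploy pairing in one place without needing any changes to pyinfra itself.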
During my digging through the docs I also found this @mjmare: pyinfra seems to support a config.py file that will give you access to run tasks locally before making any remote connections. It appears this might suit your use case?
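For reference, a config.py along those lines might look like the following. This is only a sketch: the build command is an invented example, and the only pyinfra piece used is local.shell, which runs a command on the local machine.

```python
# config.py - executed by pyinfra before connecting to any hosts, so this
# code runs locally first (sketch; the webpack command is just an example)
from pyinfra import local

# Build the frontend bundle before any remote operations are generated
local.shell("npx webpack --mode production")
```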
Historically I have achieved this kind of thing using config.py as @benridley suggested, specifically for things like building frontend bundles with webpack before uploading. It should also be possible to achieve in one deploy (see below), but any operations that rely on local files generated by local commands will fail, because those files will not exist until after operations start to execute.
# inventory.py
local = ['@local']
remote = ['my-server.net']
# deploy.py
from pyinfra import host
from pyinfra.operations import server
if 'local' in host.groups:
    server.shell('echo this is a local command')

if 'remote' in host.groups:
    server.shell('echo this is a remote command, executed second')
Also note: @ssh/X works too. The SSH connector is just the default for any host names that don't begin with @, so it's recommended to leave it off as the most common target type.
I am now wondering if there's a better way to handle this within a single deploy file, like above but executed as two deploys rather than one. Or perhaps this is an unnecessary over-complication of pyinfra, and something that should be handled outside (via a script or similar). I'm unsure on the right approach here, open to thoughts.
I find the PyInfra CLI somewhat cumbersome. Maybe I'm using it wrong, or maybe not.
In a typical deploy I have to do some tasks on my local system, and only then proceed to actions on the remote system. Because PI forces me to specify the inventory, I have to split the deployment into two deploys, say 01-build.py and 02-deploy.py. Then, because I'm lazy, I create two xonsh (could be bash) scripts that call pyinfra with the appropriate inventory, so that I can call build.xsh and deploy.xsh instead of:
I know, not the end of the world, but it feels a bit redundant.
Describe the solution you'd like
I'd like to specify combinations of inventory and deploy once, and not keep them in my head. Any solution that lets me create a set of simple commands that I can run repeatedly using some shorthand would work. Maybe let me specify the inventory name in the deploy file. Or something like make.
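As a rough illustration of the "specify combinations once" idea, here is a hypothetical make-style wrapper, not pyinfra functionality: COMBOS maps a shorthand name to an inventory/deploy pair, and every file name in it is made up.

```python
# tasks.py - hypothetical shorthand runner, e.g. `python tasks.py build`
import subprocess

# Each shorthand maps to an (inventory, deploy) pair - all names invented
COMBOS = {
    "build": ("inventory-local.py", "01-build.py"),
    "deploy": ("inventory-remote.py", "02-deploy.py"),
}


def command_for(name: str) -> list[str]:
    """Build the pyinfra CLI invocation for a shorthand name."""
    inventory, deploy = COMBOS[name]
    return ["pyinfra", inventory, deploy]


def main(argv: list[str]) -> int:
    if len(argv) != 1 or argv[0] not in COMBOS:
        print("usage: tasks.py {" + "|".join(COMBOS) + "}")
        return 2
    return subprocess.run(command_for(argv[0])).returncode
```

Wiring `main(sys.argv[1:])` under an `if __name__ == "__main__":` guard gives the `build.xsh`/`deploy.xsh` shorthand from a single Python file.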
I hope this makes sense.