Closed: gcv closed this issue 6 years ago
I'm sorry, but I don't think it's a good idea to automatically propagate all of the provider.environment variables to local commands. The provider environment is meant to be used in the context of the remote functions, not in the context of the local shell. If you really want to propagate variables to your scripts, I think it's safer to pass each one explicitly, with something such as:

```shell
MY_VAR=${self:provider.environment.MY_VAR} my-script # my-script should have access to MY_VAR
```
Passing each variable explicitly as you suggest becomes brittle and difficult to maintain as the number of environment variables grows. I see serverless-plugin-scripts as a means to run code in the context of the overall serverless environment. The absence of provider.environment variables, which are available in process.env during normal serverless execution, is surprising behavior.
I agree with @gcv. The reason I want to run a script through this plugin is so that code will run in exactly the environment defined by serverless.yml. For example, if I run part of my codebase using:

```yaml
scripts:
  commands:
    foo: node src/myfile.js
```

then, as @gcv says, I have to explicitly re-inject whatever variables myfile.js needs, and it's very likely that a change to the env vars will break this script.
Is there a situation where you wouldn't want all the environment variables by default? It is hard for me to imagine.
Thanks, @gcv and @aneilbaboo, for your feedback, but I still don't think it's a good idea. The goal of this plugin is to provide an easy way to extend serverless CLI commands. Apart from invoke local, none of the built-in commands has access to provider.environment, and it doesn't seem right to me that the custom commands could behave differently.
What about:
```yaml
scripts:
  foo:
    commands: |
      node src/myfile.js
    environment:
      env1: val1
      env2: val2
```
or a similar structure, of course. I think it's a good idea, isn't it?
@mvila
@AhmedNourJamalElDin If we specify environment variables explicitly, then I'd rather do it the standard way:

```shell
ENV1=val1 ENV2=val2 node src/myfile.js
```
Yes, just to avoid repeating the same thing a hundred times. If some variables depend on the stage, then we type serverless some-command --stage dev and those variables get their values, so we don't have to write serverless some-command --stage dev --var1 var-dev --var2 var2-dev and so on.

Your solution would be amazing if you did it internally in the plugin: just concatenate the variables and pass them to the command as you wrote.
and by that I mean, in serverless.yml:

```yaml
scripts:
  foo:
    commands: |
      node src/myfile.js
    environment:
      env1: val1-${self:custom.stage}
      env2: val2-${self:custom.stage}
```

internally it's equivalent to:

```shell
env1=val1-dev env2=val2-dev node src/myfile.js
```
so we call it:

```shell
sls foo --stage dev
```

and if I want to do the same for staging or testing:

```shell
sls foo --stage staging
sls foo --stage testing
```

instead of typing these commands:

```shell
env1=val1-staging env2=val2-staging node src/myfile.js
env1=val1-testing env2=val2-testing node src/myfile.js
```
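The concatenation step described above could be sketched like this (a minimal illustration; buildEnvPrefix is a hypothetical helper, not part of the plugin):

```javascript
// Hypothetical helper: turn a resolved `environment` map from serverless.yml
// into the VAR=val prefix string that would be prepended to the command.
function buildEnvPrefix(envMap) {
  return Object.entries(envMap)
    .map(([name, value]) => `${name}=${value}`)
    .join(' ');
}

// With --stage dev resolved, { env1: 'val1-dev', env2: 'val2-dev' } yields
// the prefix 'env1=val1-dev env2=val2-dev'.
console.log(`${buildEnvPrefix({ env1: 'val1-dev', env2: 'val2-dev' })} node src/myfile.js`);
```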
I think you can place the Serverless expression in the command itself. For example:

```yaml
scripts:
  foo:
    commands: |
      MY_VAR=${self:provider.environment.MY_VAR} node src/myfile.js
```

Would it work for your use case?
Check out serverless-scriptable-plugin, which since 1.2.0 can call custom .js ad-hoc scripts that get the serverless object passed in. You can then populate the env from that object, e.g. simply Object.assign(process.env, service.provider.environment).
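Such a script might look roughly like this (a sketch; the exact shape of the object serverless-scriptable-plugin hands to the script should be checked against its docs, and the stub below stands in for the real serverless object):

```javascript
// Sketch of the env-injection idea: copy provider.environment into
// process.env before loading the rest of the code.
function injectProviderEnv(serverless) {
  Object.assign(process.env, serverless.service.provider.environment);
}

// Stubbed serverless object standing in for the one the plugin passes:
const stub = { service: { provider: { environment: { MY_VAR: 'hello' } } } };
injectProviderEnv(stub);
console.log(process.env.MY_VAR); // 'hello'
```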
In the current release of serverless-plugin-scripts, environment variables defined in serverless.yml do not propagate to the script. This patch modifies the invocation of execSync to include those environment variables.