miha-plesko closed this issue 6 years ago.
"One cannot make use of meta/run.yaml but simply run unikernel with:"

What does this mean?
Generally, I like the proposal. It somehow removes the necessity for the different runtime types (java, node, native) because the base packages can provide the template for the commands.
One thing that slightly worries me is the env vars coming from different packages. For example, I can imagine, e.g. for debugging, that one would compose a unikernel with Spark and OSv's HTTP server, both having the PORT env variable, rendering this impractical.

Should such a problem be left to the user/package maintainer? Namely, should package maintainers ensure that variables are named properly, i.e. OSv's HTTP server would have OSV_HTTP_PORT and Spark would have SPARK_PORT and SPARK_UI_PORT?
"One cannot make use of meta/run.yaml but simply run unikernel with:"
Oh, it's really strangely written; one can of course use meta/run.yaml. I just wanted to emphasize that inside it she cannot refer to master or slave, but has to copy-paste the bootcmd for master/slave even if she only wants to modify the port.
Re the environment variable clash: yes, I'd leave it to the package maintainer. In other words, I wouldn't worry too much about it, since for debugging purposes we usually set the bootcmd using --execute directly.
Ok, let's leave it for now. However, I suggest you name env variables verbosely in the capstan-packages repo. This will introduce a nice best practice. Please proceed with the implementation.
Closing since it's already implemented.
With a recent update of the OSv core we are able to boot a unikernel with runscript based on the current environment variables, i.e. the command is built dynamically when runscript is invoked. Below is an example of how the apache.spark package works:
meta/run.yaml in openjdk8-zulu-compact3-with-java-beans
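A minimal sketch of the idea (the actual bootcmd and variable names in the real package may differ):

```yaml
runtime: native

config_set:
  java:
    # Generic Java boot command; runscript expands the env vars at boot time.
    bootcmd: "/java.so -cp $CLASSPATH $MAIN $ARGS"
    env:
      CLASSPATH: "/"
      MAIN: "Main"
      ARGS: ""
```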
meta/run.yaml in apache.spark
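Again a sketch; the Spark main classes are real, but the paths, ports and variable names are illustrative:

```yaml
runtime: native

config_set:
  master:
    # bootcmd copy-pasted verbatim from openjdk8-zulu-compact3-with-java-beans.
    bootcmd: "/java.so -cp $CLASSPATH $MAIN $ARGS"
    env:
      CLASSPATH: "/spark/jars/*"
      MAIN: "org.apache.spark.deploy.master.Master"
      ARGS: "--port 7077 --webui-port 8080"
  slave:
    # ...and copy-pasted once more for the slave config set.
    bootcmd: "/java.so -cp $CLASSPATH $MAIN $ARGS"
    env:
      CLASSPATH: "/spark/jars/*"
      MAIN: "org.apache.spark.deploy.worker.Worker"
      ARGS: "spark://localhost:7077"
```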
Notice how the bootcmd is actually a copy-paste from the openjdk8-zulu-compact3-with-java-beans package, only with different environment variables set for it. This is what the proposal below aims to resolve.

meta/run.yaml in my package that uses apache.spark
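To change, say, the two port numbers, the whole bootcmd has to be copy-pasted yet again (a sketch with the same assumed names):

```yaml
runtime: native

config_set:
  my_master:
    # Copy-pasted a third time, just to change two port numbers.
    bootcmd: "/java.so -cp $CLASSPATH $MAIN $ARGS"
    env:
      CLASSPATH: "/spark/jars/*"
      MAIN: "org.apache.spark.deploy.master.Master"
      ARGS: "--port 9000 --webui-port 9001"
```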
One cannot make use of meta/run.yaml but simply run unikernel with:
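(a sketch; the image name and the exact arguments are illustrative)

```sh
capstan run my-spark-image --execute "/java.so -cp /spark/jars/* org.apache.spark.deploy.master.Master --port 9000 --webui-port 9001"
```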
Proposal

I suggest that we support recursive meta/run.yaml so that the user will be able to make use of an arbitrary config_set from an arbitrary package and only provide environment variables for it. This would result in apache.spark turning into this:

meta/run.yaml in openjdk8-zulu-compact3-with-java-beans (same as before)
meta/run.yaml in apache.spark
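A sketch of how this could look with the proposed base key (package and config_set names as assumed above):

```yaml
runtime: native

config_set:
  master:
    # Reuse the bootcmd template from the base package's java config set.
    base: "openjdk8-zulu-compact3-with-java-beans:java"
    env:
      CLASSPATH: "/spark/jars/*"
      MAIN: "org.apache.spark.deploy.master.Master"
      ARGS: "--port 7077 --webui-port 8080"
  slave:
    base: "openjdk8-zulu-compact3-with-java-beans:java"
    env:
      CLASSPATH: "/spark/jars/*"
      MAIN: "org.apache.spark.deploy.worker.Worker"
      ARGS: "spark://localhost:7077"
```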
Notice how we introduce base: "<package>:<config_set>" and then only contextualize that specific config_set with our own environment variables.

meta/run.yaml in my package that uses apache.spark
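Now only the environment needs to be provided (sketch):

```yaml
runtime: native

config_set:
  my_master:
    # No bootcmd copy-paste: inherit it from apache.spark's master config set
    # and only override the environment.
    base: "apache.spark:master"
    env:
      ARGS: "--port 9000 --webui-port 9001"

config_set_default: my_master
```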
Notice how we are now able to use apache.spark on our custom ports 9000 and 9001 by default:
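(illustrative image name)

```sh
capstan run my-spark-image
```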
or specify them ourselves (this is already supported, so nothing surprising):
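(assuming environment variables can be overridden on the command line, e.g. via an --env flag)

```sh
capstan run my-spark-image --env ARGS="--port 4000 --webui-port 4001"
```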