Open dirkjonker opened 6 years ago

Other Universe services such as Cassandra and Kafka run on the Universal Container Runtime (UCR), but Spark is still using the Docker runtime. Are there any plans to support running Spark on the UCR as well?
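For reference, the difference shows up in a Marathon app definition's `container.type`: the other services run their Docker images under `MESOS` (UCR), while Spark still launches containers with `DOCKER`. A minimal sketch with placeholder IDs and images:

```sh
# UCR (Universal Container Runtime): the Docker image runs without the Docker daemon.
cat > ucr-app.json <<'EOF'
{
  "id": "/example-ucr",
  "cmd": "sleep 3600",
  "cpus": 0.1,
  "mem": 128,
  "container": {
    "type": "MESOS",
    "docker": { "image": "library/busybox" }
  }
}
EOF

# Docker Engine runtime: the same image, but run through the Docker daemon on the agent.
cat > docker-app.json <<'EOF'
{
  "id": "/example-docker",
  "cmd": "sleep 3600",
  "cpus": 0.1,
  "mem": 128,
  "container": {
    "type": "DOCKER",
    "docker": { "image": "library/busybox" }
  }
}
EOF

dcos marathon app add ucr-app.json
dcos marathon app add docker-app.json
```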
I see that the current beta release can run on the UCR. I'll give it a try!
I can't get it working with a private registry, though, as there seems to be no way to configure the pull secret used to fetch the image.
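For comparison, this is roughly how a pull secret is wired up for a plain Marathon service on UCR with DC/OS Enterprise (a sketch only; the secret path, image, and app ID below are placeholders, and CLI flags can differ by version). The Spark package doesn't seem to expose an equivalent option:

```sh
# Store the Docker client config produced by `docker login` as a DC/OS secret
# (placeholder secret path; --value is used here to avoid version-specific file flags):
dcos security secrets create --value="$(cat ~/.docker/config.json)" /registry-pull-secret

# Reference that secret from the app's container spec so UCR can authenticate
# against the private registry when pulling the image:
cat > private-ucr-app.json <<'EOF'
{
  "id": "/example-private-ucr",
  "cmd": "sleep 3600",
  "cpus": 0.1,
  "mem": 128,
  "container": {
    "type": "MESOS",
    "docker": {
      "image": "registry.example.com/myteam/private-image:latest",
      "pullConfig": { "secret": "pull-secret" }
    }
  },
  "secrets": {
    "pull-secret": { "source": "/registry-pull-secret" }
  }
}
EOF

dcos marathon app add private-ucr-app.json
```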
@dirkjonker See instructions starting at Step 30 here: https://docs.mesosphere.com/1.10/administering-clusters/deploying-a-local-dcos-universe/
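(For anyone following along, the tail end of that guide boils down to pointing the cluster at the locally hosted repo; a sketch below, with the URL and port as given in the guide, so adjust to your setup:)

```sh
# After the local universe containers are running on the masters (earlier steps
# in the linked guide), swap the cluster's package repo over to the local one:
dcos package repo remove Universe
dcos package repo add local-universe http://master.mesos:8082/repo

# Packages (including Spark) then install from the locally hosted registry:
dcos package install spark --yes
```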
@dirkjonker We've made the latest release of Spark 2.5.0-2.2.1 work well with the UCR on both RHEL/CentOS and CoreOS (and other distros).
In addition, with the introduction of the Package Registry in DC/OS 1.12, we've made the operator experience for air-gapped environments a lot easier with pre-built .dcos files for certified packages: https://downloads.mesosphere.com/universe/packages/packages.html
Ref: https://docs.mesosphere.com/1.12/administering-clusters/repo/package-registry/
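A rough sketch of the air-gapped flow this enables (the filename is a placeholder, and the registry subcommand and flags are the ones described in the package-registry doc above, so double-check them against your DC/OS version):

```sh
# 1. On a connected machine, download the pre-built bundle for the certified
#    package from the downloads page linked above, e.g. spark-x.y.z.dcos.

# 2. Transfer the .dcos file into the air-gapped environment, then add it to
#    the cluster's package registry:
dcos registry add --dcos-file=spark-x.y.z.dcos

# 3. Install from the registry like any other package:
dcos package install spark --yes
```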