Hello,

Thanks again for releasing v3 of the image, it's great!

I have a stack of multiple images for local AWS product development:

- glue
- airflow
- minio (S3)

Now I need to submit Spark jobs to the Glue container, either from the Airflow container or from a shell on my local machine, which I am not able to do at the moment.

Any hints on how to achieve this? Thanks a lot!

For example, can I achieve it by running /sbin/start-all.sh, which starts the master and workers, so that spark://localhost:7077 becomes available for submitting Glue jobs? Roughly what I have in mind is sketched below.
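This is only a sketch of my intent, assuming the Glue container ships a standard Spark distribution under $SPARK_HOME, publishes port 7077 to the host, and that spark-submit is available wherever the job is submitted from (all of these are my guesses):

```sh
# Inside the Glue container: start the standalone master and workers
docker exec glue bash -c '$SPARK_HOME/sbin/start-all.sh'

# From my local shell: submit a job against the master exposed on the host
spark-submit \
  --master spark://localhost:7077 \
  my_glue_job.py

# From the Airflow container: same idea, but the master is reachable
# via the compose service name instead of localhost
docker exec airflow spark-submit \
  --master spark://glue:7077 \
  my_glue_job.py
```

Here my_glue_job.py and the container names glue / airflow are placeholders for my actual job script and compose service names.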