nimbly-dev / nyctripdata_project

Project to learn Data Engineering from: https://github.com/DataTalksClub/data-engineering-zoomcamp

DATAENG-2: Fix docker configuration so that there is no need to individually build mage-ai and spark image before running docker-compose #2

Open · nimbly-dev opened this issue 1 week ago

nimbly-dev commented 1 week ago

Currently, the startup steps are as follows:

1. Build the mage-ai image: run `docker build --no-cache -t mageai/mageai:latest .`

2. In `deployment/spark`: run `docker build --no-cache -t cluster-apache-spark:python3.10.14-spark3.5.1 .`

3. In the project root: run `docker-compose up -d`

Fix this so that there is no need to run the docker build commands manually before starting the stack. I think the builds should be handled in the docker-compose file itself, roughly as sketched below.
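A minimal sketch of what that could look like, assuming the mage-ai Dockerfile lives at the repository root and the Spark Dockerfile under `deployment/spark`; the service names (`magic`, `spark-master`) are placeholders, not the project's actual compose definitions:

```yaml
# Sketch only: service names and Dockerfile locations are assumptions,
# taken from the manual build commands listed above.
services:
  magic:
    image: mageai/mageai:latest            # tag applied to the locally built image
    build:
      context: .                           # assumed: mage-ai Dockerfile at repo root
      dockerfile: Dockerfile

  spark-master:
    image: cluster-apache-spark:python3.10.14-spark3.5.1
    build:
      context: ./deployment/spark          # matches the deployment/spark build step above
      dockerfile: Dockerfile
```

With `build:` and `image:` defined together, `docker-compose up -d --build` (or `docker-compose build` followed by `docker-compose up -d`) builds both images and starts the stack in one step, while the built images still receive the tags that other services reference.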