By default, Dagster tools automatically create a process on your local machine for each of your code locations. However, it's also possible to run your own gRPC server that's responsible for serving information about your code locations. This can be useful in more complex system architectures that deploy user code separately from the Dagster webserver.
I think each set of 'workflow' code needs its own code location (served separately), so that running jobs and history do not get lost when the 'user' code is updated.
https://docs.dagster.io/deployment/guides/docker#multi-container-docker-deployment
https://docs.dagster.io/concepts/code-locations/workspace-files#running-your-own-grpc-server
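Per the doc linked above, running your own server is two steps: start a gRPC server with the `dagster api grpc` CLI, then point the webserver's `workspace.yaml` at it. A minimal sketch (the file name `repo.py`, port 4000, and location name are assumptions, not fixed values):

```shell
# Serve a code location from repo.py on port 4000 (file name is hypothetical)
dagster api grpc --python-file repo.py --host 0.0.0.0 --port 4000
```

Then in the webserver's `workspace.yaml`:

```yaml
load_from:
  - grpc_server:
      host: localhost
      port: 4000
      location_name: "user_code"  # label shown in the Dagster UI (an assumption)
```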
https://github.com/dagster-io/dagster/tree/1.4.2/examples/deploy_docker
https://github.com/dagster-io/dagster/blob/1.4.2/examples/deploy_docker/Dockerfile_user_code
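A rough sketch of a user-code Dockerfile in the spirit of the example above (base image, pinned packages, file names, and port are assumptions; see `Dockerfile_user_code` for the actual version):

```dockerfile
FROM python:3.10-slim

# Dagster plus the libraries the deploy_docker example uses (versions assumed)
RUN pip install dagster dagster-postgres dagster-docker

# Copy the user code into the image (repo.py is a hypothetical entry point)
WORKDIR /opt/dagster/app
COPY repo.py /opt/dagster/app/

# Port the gRPC server listens on
EXPOSE 4000

# Serve this code location over gRPC
CMD ["dagster", "api", "grpc", "--python-file", "repo.py", "--host", "0.0.0.0", "--port", "4000"]
```

Keeping this image separate from the webserver image is what lets 'user' code be rebuilt and redeployed without touching the rest of the Dagster deployment.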