Running `docker-compose build` always rebuilds the python requirements. This is fast on the server, but really slow on most laptops used for development. Rebuilding even for small changes may be necessary, as adding a docker volume for the requirements would force us to delete the other volumes (e.g. the mysql database) whenever we want to rebuild the python requirements through `docker-compose down --volumes`.
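One common mitigation, not tried here and offered only as a sketch, is to order the Dockerfile so that Docker's layer cache is invalidated only when the requirements actually change. Assuming the dependencies live in a `requirements.txt` at the repository root (the file names, paths, and base image below are illustrative, not taken from this repository):

```dockerfile
FROM python:3.11
WORKDIR /app

# Copy only the dependency manifest first, so this layer -- and the
# pip install layer below -- stay cached until requirements.txt changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copying the rest of the source afterwards means ordinary code edits
# no longer invalidate the cached pip install layer.
COPY . .
```

With this ordering, editing application code reuses the cached pip install layer; only a change to `requirements.txt` triggers a dependency rebuild.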
A possible way of speeding up the build might be to use a lighter python image. Here's a potential lead: https://stackoverflow.com/questions/60086741/docker-so-slow-while-installing-pip-requirements
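Following that lead, the change could be as small as swapping the base image; the exact tag below is an assumption, not something this project currently pins:

```dockerfile
# The -slim variant drops compilers and most system libraries, which
# shrinks the image and speeds up pulls; packages that compile C
# extensions may need build tooling added back via apt-get.
FROM python:3.11-slim
```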
Another way of improving the restart speed when developing with docker may be to mount the repository as a volume, enabling "hot reloads". This, however, may not be appropriate in production.
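As a sketch of that idea, a development-only compose override could mount the checkout into the container; the service name `web` and the mount point `/app` are assumptions about this project's layout:

```yaml
# docker-compose.override.yml -- merged automatically by docker-compose
# in development, and simply never shipped to production.
version: "3"
services:
  web:
    volumes:
      # Mount the working tree over the baked-in source, so code changes
      # appear in the container without rebuilding the image.
      - .:/app
```

Combined with a dev server that watches the filesystem, this gives hot reloads without a rebuild, while production keeps using the image's baked-in copy of the code.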