acraven / microservice-outsidein-tests

Companion repo for Microservices and Outside-in Tests Using C# and Docker Compose blog post
https://medium.com/@acraven/microservices-and-outside-in-tests-using-c-and-docker-compose-22328f47627?source=friends_link&sk=86092d6e7dfc8a3324f57ba24aed9833

Combine RUN dotnet restores #1

Open jzabroski opened 4 years ago

jzabroski commented 4 years ago

https://github.com/acraven/microservice-outsidein-tests/blob/5f0b08cc359f9dc5402701d22f4c16ef17819e6f/app/Dockerfile#L9-L10

Why don't you combine these into one line?

acraven commented 4 years ago

Hey John, if the first line is omitted, the restore is delayed until the publish step, as the attached logs show. Adding --no-restore to the publish then causes it to fail because of a missing file that it expects the restore to generate.

Having the explicit restore before copying the remaining source should mean restores are only performed when packages are changed, rather than on every build.
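A minimal sketch of the layering described above (the base image tag and project file name are illustrative, not necessarily what the repo uses):

```dockerfile
FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build
WORKDIR /src

# Copy only the project file first, so the restore layer is
# invalidated only when package references change.
COPY MyService.csproj .
RUN dotnet restore

# Copying the remaining source invalidates layers from here down,
# but the cached restore layer above is reused.
COPY . .
RUN dotnet publish -c Release -o /app
```

With this ordering, an edit to a .cs file re-runs only the publish; the restore layer comes from the cache until the csproj changes.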

However, the docker-compose command in build.ps1 specifies the --no-cache arg, so layer caching is disabled and the restore will run on every build anyway.
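For reference, the difference is roughly this (assuming build.ps1 wraps docker-compose like the following; the exact invocation is a sketch):

```shell
# --no-cache forces every layer, including the restore, to be rebuilt
docker-compose build --no-cache

# without it, unchanged layers (e.g. the restore) are served from the cache
docker-compose build
```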

As an aside, I've also had to remove the --parallel arg from build.ps1 as it causes intermittent restore errors.

Attachments: no-restore.txt, explicit_restore.txt, implicit_restore.txt, restore-errors.txt

jzabroski commented 4 years ago

@acraven Ha, thanks - forgot I asked this question. I constantly scan other people's code trying to understand why they do things, and I've definitely learned a lot more about Docker since asking this. In particular, I now realize that whether a Docker layer is cached is a lot more complicated than I thought. The other trick I see people use is to copy only the csproj over and run the restore on that, which creates a layer containing only the restored packages. That's almost exactly what you're doing.

I wonder if --parallel also needs the IsDeterministic flag to avoid errors? Give it a shot. When I worked at erecruit, I used to get very similar errors, but that was back in 2013. My workaround at the time was to set up a MyGet mirror. From my limited debugging, it was as if nuget.org had blacklisted my IP address for making too many requests. You can think of it as an FTP server rate-limiting the number of concurrent downloads for any particular logged-in user.
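The mirror workaround mentioned above can be pointed at via a NuGet.config in the repo root, roughly like this (the feed URL is a placeholder, not a real mirror):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- drop inherited sources (including nuget.org) and use only the mirror -->
    <clear />
    <add key="mirror" value="https://www.myget.org/F/my-mirror/api/v3/index.json" />
  </packageSources>
</configuration>
```

Because restores inside the container then hit the mirror instead of nuget.org, concurrent restores are no longer subject to nuget.org's rate limiting.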