ducktors / turborepo-remote-cache

Open source implementation of the Turborepo custom remote cache server.
https://ducktors.github.io/turborepo-remote-cache/
MIT License

Cache not being utilized when called through Docker build #34

Closed mkotsollaris closed 2 years ago

mkotsollaris commented 2 years ago

I've successfully cloned the turborepo-remote-cache and I've hooked it up in AWS. Everything is working fine, up until I try to use the cache from docker. Here's my dockerfile:

FROM docker.artifactory.moveaws.com/node:16 as build
COPY . /app
WORKDIR /app
RUN yarn install
RUN yarn build
RUN npx turbo run build

Both RUN yarn build and RUN npx turbo run build fail to update the cache. Originally I thought my Docker instance couldn't communicate with the deployed service, but after adding a curl call inside the container, I verified that outbound calls are properly triggered.

My question is: has anyone run into this? Does anyone have an example Dockerfile out there? Or what could possibly prevent these commands from hitting the cache during the build phase?

fox1t commented 2 years ago

You can follow the instructions here: https://github.com/fox1t/turborepo-remote-cache#deploy-on-docker. You are not setting the environment variables.
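
For reference, a minimal sketch of running the cache server in Docker with its environment variables set. The variable names follow the project's README at the time; the image name, port, and token values are placeholders and may differ from your setup:

```shell
docker run -d \
  -p 3000:3000 \
  -e TURBO_TOKEN="yourSecretToken" \
  -e STORAGE_PROVIDER="local" \
  -e STORAGE_PATH="/tmp/turborepo-cache" \
  fox1t/turborepo-remote-cache
```

If the server starts without these variables, turbo clients will fail to authenticate and the cache will silently go unused.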

You can also use the official image we pushed on docker hub.

fox1t commented 2 years ago

You can find the Dockerfile here: https://github.com/fox1t/turborepo-remote-cache/blob/main/Dockerfile

StevenMatchett commented 2 years ago

@fox1t I believe @mkotsollaris is referring to building a Docker image that uses the remote cache, not deploying the remote cache itself. I am also running into issues reaching my deployed remote cache: when Docker runs a turbo command, it does not use the hosted remote cache.

fox1t commented 2 years ago

Ok, I finally got it. You are trying to connect to a remotely deployed remote-cache server from a docker container. Right?

mkotsollaris commented 2 years ago

> Ok, I finally got it. You are trying to connect to a remotely deployed remote-cache server from a docker container. Right?

Yes, we are basically just testing this implementation. It works great up until we try to use remote caching during docker build; for some reason it won't work! 🤔

fox1t commented 2 years ago

Can you check if the same setup works with Vercel's own remote caching? We should rule out some kind of unexpected setting/bug on the turborepo client side.
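
One way to run that check, sketched with turbo's standard CLI flags (the URL, token, and team values are placeholders):

```shell
# Test against Vercel's hosted remote cache (requires a Vercel account):
npx turbo login
npx turbo link
npx turbo run build

# Test against the self-hosted server by overriding the API endpoint:
npx turbo run build \
  --api="https://your-cache.example.com" \
  --token="yourSecretToken" \
  --team="team_myteam"
```

If the Vercel-hosted cache works but the self-hosted one does not, the problem is likely in the server setup rather than the turbo client.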

fox1t commented 2 years ago

ref: https://github.com/vercel/turborepo/issues/1508

StevenMatchett commented 2 years ago

We can likely close this issue. I was able to solve it by adding this hack, which it looks like they added for their own CI/CD:

Dockerfile

ENV VERCEL_ARTIFACTS_TOKEN=xxx
ENV VERCEL_ARTIFACTS_OWNER=team_xxx

package.json

{
  "scripts":{
      "build": "turbo run build --team=\"team_xxxx\" --token=\"xxx\""
  }
}
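
If you go this route, note that ENV values are baked into the image layers. A hedged sketch of passing the token at build time with ARG instead (the variable names come from the hack above; everything else is illustrative):

```dockerfile
FROM node:16 AS build
# Pass secrets at build time rather than hardcoding them in the image:
#   docker build --build-arg VERCEL_ARTIFACTS_TOKEN=xxx \
#                --build-arg VERCEL_ARTIFACTS_OWNER=team_xxx .
ARG VERCEL_ARTIFACTS_TOKEN
ARG VERCEL_ARTIFACTS_OWNER
ENV VERCEL_ARTIFACTS_TOKEN=$VERCEL_ARTIFACTS_TOKEN
ENV VERCEL_ARTIFACTS_OWNER=$VERCEL_ARTIFACTS_OWNER
COPY . /app
WORKDIR /app
RUN yarn install
RUN npx turbo run build
```

ARG values still appear in the image history, so for anything sensitive, BuildKit secret mounts are the safer option.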

fox1t commented 2 years ago

I think we should add this as documentation somewhere.

fox1t commented 2 years ago

@StevenMatchett, I couldn't make it work with your hack. Can you please write every step you took to make it work?

StevenMatchett commented 2 years ago

I created a new branch in my repo that includes the ENV variables in the Dockerfile. You will also need to update .turbo/config.json to point at your API and update the token.

https://github.com/StevenMatchett/turbo-repo-issue/tree/matchett/self-hosted-remote-cache

Let me know if that works. Though I feel this should be fixed in turborepo; we shouldn't promote using env variables that are undocumented and might change on a whim.

fox1t commented 2 years ago

I've just done some experiments, and it turned out that you don't need the VERCEL vars (for me, turborepo-remote-cache does not work using the VERCEL vars). I added my findings to the README: https://github.com/fox1t/turborepo-remote-cache#enable-remote-caching-in-docker

We also needed to update turborepo-remote-cache to version 1.4, which introduces support for the slug query string parameter. Unfortunately, when you use the --team CLI option, turborepo passes it as slug instead of teamId. I don't know why they decided that, or if it is just a bug. This is the ref: https://github.com/vercel/turborepo/blob/d83130a8b30b01ae2ce1627f4188e791598ea8b3/cli/internal/config/config.go#L172
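
Concretely, the difference shows up in the query string of the artifact requests. A sketch of the two shapes (host and hash are placeholders):

```
# What the server expected before 1.4:
PUT https://your-cache.example.com/v8/artifacts/<hash>?teamId=team_myteam

# What turbo actually sends when you pass --team:
PUT https://your-cache.example.com/v8/artifacts/<hash>?slug=team_myteam
```

Version 1.4 of the server accepts the slug parameter as well, which is why the upgrade is needed.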

StevenMatchett commented 2 years ago

Awesome, thank you! I am curious if it's a Linux thing rather than a Docker thing.

fox1t commented 2 years ago

Update: you were right! We also need to set the env vars to activate the caching. It is clearly a bug!

ENV VERCEL_ARTIFACTS_TOKEN=turbotoken
ENV VERCEL_ARTIFACTS_OWNER=team_

I am doing more tests, and if everything works as expected, I'll also add this part to the README.
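
Putting the pieces together, a sketch of a consumer Dockerfile based on these findings. The server URL, token, and team values are placeholders, and the VERCEL_ARTIFACTS_* workaround is undocumented and may change:

```dockerfile
FROM node:16 AS build
# Undocumented workaround: these vars activate remote caching inside Docker.
ENV VERCEL_ARTIFACTS_TOKEN=turbotoken
ENV VERCEL_ARTIFACTS_OWNER=team_myteam
COPY . /app
WORKDIR /app
RUN yarn install
# --api points turbo at the self-hosted turborepo-remote-cache server.
RUN npx turbo run build \
    --api="https://your-cache.example.com" \
    --token="turbotoken" \
    --team="team_myteam"
```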

fox1t commented 2 years ago

I added the instructions to make it work in Docker. https://github.com/fox1t/turborepo-remote-cache#enable-remote-caching-in-docker

StevenMatchett commented 2 years ago

@fox1t here is a PR for the fix in turborepo

https://github.com/vercel/turborepo/pull/1527

LucM commented 1 year ago

I still had the problem even with the latest updates.

❌ With node:16-slim, turbo was not fetching the cache from the remote server.

✅ With node:16-alpine, it works.

I do not know the reason, but it could be worth adding to the documentation :)

NorkzYT commented 1 year ago

@LucM

What Architecture are you running your setup on?

rahnarsson commented 1 year ago

Hello!

I had a similar issue myself, and after some Wireshark analysis of the connection between my local machine and turbo-cache, I noticed that the TLS handshake failed between my Docker build and the remote cache. No error about a bad certificate appeared in the build logs. The certificate I use on the remote cache is a valid AWS-issued certificate.

I used the node:18-slim base image, and after installing the ca-certificates package:

RUN apt-get update && apt-get install -y \
    ca-certificates \
&& rm -rf /var/lib/apt/lists/*

I was able to connect to the cache.
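
A quick way to surface this kind of failure during the build itself, sketched with standard tools (the host is a placeholder):

```dockerfile
# Verify the TLS handshake and CA trust from inside the container;
# the build fails loudly here if the CA bundle is missing or broken.
RUN apt-get update && apt-get install -y curl ca-certificates \
    && rm -rf /var/lib/apt/lists/*
RUN curl -sv https://your-cache.example.com/ > /dev/null
```

If curl fails with a certificate error while the same URL works on the host, the image's CA bundle is the likely culprit.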

WANZARGEN commented 1 year ago

@LucM As you said, I changed to the node:16-alpine version and it works! Thanks 🙏

In my case, ❌ with node:16, it didn't work.

FROM node:16
...

✅ With node:16-alpine, now it works! 😄

FROM node:16-alpine
RUN apk add --no-cache libc6-compat
RUN apk update
...