GoogleContainerTools / skaffold

Easy and Repeatable Kubernetes Development
https://skaffold.dev/
Apache License 2.0

k3s cluster - images can't be pulled and [maybe? no tailing of containers] #8452

Closed. omer-bar closed this 1 year ago

omer-bar commented 1 year ago

First, some context: on my own machine I run k3s, and I have one project that I finally got working with skaffold, only to find that the containers aren't being tailed as they should. I wanted to check whether it's a k3s problem, because on docker-desktop's "built-in" cluster the tailing works. So I opened a fresh VM and took the steps below to reproduce the behavior in the title (hence the "maybe? no tailing of containers": I can't even pull images with the examples to find out whether it's consistent with k3s or just a problem on my end).

Expected behavior

The images should be pulled and the container logs should be tailed.

Actual behavior

Images fail to pull, and there is no tailing while they are being pulled.

Information

Steps to reproduce the behavior

  1. On a new environment/machine (a VM just for this), cloned skaffold/examples.
  2. Installed k3s using their quick-start script.
  3. Ran `skaffold dev` inside the getting-started example folder (tried a bunch of them to see if it's consistent).
  4. Profit?
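
The steps above, sketched as shell commands (the quick-start URL is the one from the k3s docs; everything else is as described):

```shell
curl -sfL https://get.k3s.io | sh -                              # 2. install k3s via quick-start script
git clone https://github.com/GoogleContainerTools/skaffold.git   # 1. clone the examples repo
cd skaffold/examples/getting-started
skaffold dev                                                     # 3. fails to pull the image
```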

`skaffold dev -v debug` log in the nodejs example:


DEBU[0000] skaffold API not starting as it's not requested  subtask=-1 task=DevLoop
INFO[0000] Skaffold &{Version:v2.1.0 ConfigVersion:skaffold/v4beta2 GitVersion: GitCommit:c037d6f51276e178a2c05c1c59665956ff34aa4c BuildDate:2023-01-23T09:57:08Z GoVersion:go1.19.1 Compiler:gc Platform:linux/amd64 User:}  subtask=-1 task=DevLoop
INFO[0000] Loaded Skaffold defaults from "/home/eryx/.skaffold/config"  subtask=-1 task=DevLoop
DEBU[0000] parsed 1 configs from configuration file /home/eryx/skaffold-dev/skaffold/examples/nodejs/skaffold.yaml  subtask=-1 task=DevLoop
INFO[0000] applying profile: dev                         subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field Build  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field artifacts  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field insecureRegistries  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field tagPolicy  subtask=-1 task=DevLoop
INFO[0000] no values found in profile for field TagPolicy, using original config values  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field platforms  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field BuildType  subtask=-1 task=DevLoop
INFO[0000] no values found in profile for field BuildType, using original config values  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field Test   subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field Render  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field Generate  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field rawYaml  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field remoteManifests  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field kustomize  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field helm   subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field kpt    subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field hooks  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field before  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field after  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field transform  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field validate  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field output  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field Deploy  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field DeployType  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field docker  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field helm   subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field kpt    subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field kubectl  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field cloudrun  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field statusCheck  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field statusCheckDeadlineSeconds  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field tolerateFailuresUntilDeadline  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field kubeContext  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field logs   subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field prefix  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field jsonParse  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field fields  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field -      subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field PortForward  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field ResourceSelector  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field allow  subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field deny   subtask=-1 task=DevLoop
DEBU[0000] overlaying profile on config for field Verify  subtask=-1 task=DevLoop
DEBU[0000] Defaulting deploy type to kubectl             subtask=-1 task=DevLoop
INFO[0000] map entry found when executing locate for &{node-example backend 0xc000532e70 {0xc0006aa630 <nil> <nil> <nil> <nil> <nil> <nil>} [] {[] []} [] } of type *latest.Artifact and pointer: 824641840352  subtask=-1 task=DevLoop
DEBU[0000] skipping validating `kubectl` deployer manifests since only the default manifest list is defined  subtask=-1 task=DevLoop
INFO[0000] Using kubectl context: default                subtask=-1 task=DevLoop
DEBU[0000] getting client config for kubeContext: `default`  subtask=-1 task=DevLoop
DEBU[0000] Running command: [minikube version --output=json]  subtask=-1 task=DevLoop
DEBU[0000] setting Docker user agent to skaffold-v2.1.0  subtask=-1 task=DevLoop
INFO[0000] no kpt renderer or deployer found, skipping hydrated-dir creation  subtask=-1 task=DevLoop
DEBU[0000] Running command: [kubectl config view --minify -o jsonpath='{..namespace}']  subtask=-1 task=DevLoop
DEBU[0000] Command output: ['default']                   subtask=-1 task=DevLoop
DEBU[0000] Running command: [kubectl config view --minify -o jsonpath='{..namespace}']  subtask=-1 task=DevLoop
DEBU[0000] Command output: ['default']                   subtask=-1 task=DevLoop
DEBU[0000] Running command: [kubectl config view --minify -o jsonpath='{..namespace}']  subtask=-1 task=DevLoop
DEBU[0001] Command output: ['default']                   subtask=-1 task=DevLoop
DEBU[0001] Running command: [kubectl config view --minify -o jsonpath='{..namespace}']  subtask=-1 task=DevLoop
DEBU[0001] Command output: ['default']                   subtask=-1 task=DevLoop
DEBU[0001] Running command: [kubectl config view --minify -o jsonpath='{..namespace}']  subtask=-1 task=DevLoop
DEBU[0001] Command output: ['default']                   subtask=-1 task=DevLoop
DEBU[0001] Running command: [kubectl config view --minify -o jsonpath='{..namespace}']  subtask=-1 task=DevLoop
DEBU[0001] Command output: ['default']                   subtask=-1 task=DevLoop
DEBU[0001] Running command: [kubectl config view --minify -o jsonpath='{..namespace}']  subtask=-1 task=DevLoop
DEBU[0001] Command output: ['default']                   subtask=-1 task=DevLoop
DEBU[0001] CLI platforms provided: ""                    subtask=-1 task=DevLoop
DEBU[0001] getting client config for kubeContext: `default`  subtask=-1 task=DevLoop
DEBU[0001] platforms detected from active kubernetes cluster nodes: "linux/amd64"  subtask=-1 task=DevLoop
DEBU[0001] platforms selected for artifact "node-example": "linux/amd64"  subtask=-1 task=DevLoop
DEBU[0001] Using builder: local                          subtask=-1 task=DevLoop
INFO[0001] build concurrency first set to 1 parsed from *local.Builder[0]  subtask=-1 task=DevLoop
INFO[0001] final build concurrency value is 1            subtask=-1 task=DevLoop
Generating tags...
 - node-example -> DEBU[0001] Running command: [git describe --tags --always]  subtask=-1 task=Build
DEBU[0001] Command output: [v2.1.0-35-g10254871a
]       subtask=-1 task=Build
DEBU[0001] Running command: [git status . --porcelain]   subtask=-1 task=Build
DEBU[0001] Command output: []                            subtask=-1 task=Build
node-example:v2.1.0-35-g10254871a
INFO[0001] Tags generated in 75.308089ms                 subtask=-1 task=Build
Checking cache...
DEBU[0001] Executing template &{envTemplate 0xc000a12000 0xc00058bae0  } with environment map[DBUS_SESSION_BUS_ADDRESS:unix:path=/run/user/1000/bus HOME:/home/eryx HOMEBREW_CELLAR:/home/linuxbrew/.linuxbrew/Cellar HOMEBREW_PREFIX:/home/linuxbrew/.linuxbrew HOMEBREW_REPOSITORY:/home/linuxbrew/.linuxbrew/Homebrew INFOPATH:/home/linuxbrew/.linuxbrew/share/info: KUBECONFIG:/tmp/kubie-configKBhrt7.yaml KUBIE_ACTIVE:1 KUBIE_DEPTH:1 KUBIE_FISH_USE_RPROMPT:0 KUBIE_KUBECONFIG:/tmp/kubie-configKBhrt7.yaml KUBIE_PROMPT_DISABLE:0 KUBIE_SESSION:/tmp/kubie-sessionAemXrS.json KUBIE_SHELL:bash KUBIE_STATE:/home/eryx/.local/share/kubie/state.json KUBIE_XONSH_USE_RIGHT_PROMPT:0 KUBIE_ZSH_USE_RPS1:0 LANG:C.UTF-8 LC_ADDRESS:C.UTF-8 LC_IDENTIFICATION:C.UTF-8 LC_MEASUREMENT:C.UTF-8 LC_MONETARY:C.UTF-8 LC_NAME:C.UTF-8 LC_NUMERIC:C.UTF-8 LC_PAPER:C.UTF-8 LC_TELEPHONE:C.UTF-8 LC_TIME:C.UTF-8 LESSCLOSE:/usr/bin/lesspipe %s %s LESSOPEN:| /usr/bin/lesspipe %s LOGNAME:eryx LS_COLORS:rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=00:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.zst=01;31:*.tzst=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.wim=01;31:*.swm=01;31:*.dwm=01;31:*.esd=01;31:*.jpg=01;35:*.jpeg=01;35:*.mjpg=01;35:*.mjpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.webp=01;35:*.o
gm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.m4a=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.oga=00;36:*.opus=00;36:*.spx=00;36:*.xspf=00;36: MANPATH:/home/linuxbrew/.linuxbrew/share/man: OLDPWD:/home/eryx/skaffold-dev/skaffold/examples PATH:/home/linuxbrew/.linuxbrew/bin:/home/linuxbrew/.linuxbrew/sbin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin PWD:/home/eryx/skaffold-dev/skaffold/examples/nodejs SHELL:/bin/bash SHLVL:2 SSH_CLIENT:147.235.202.182 12910 22 SSH_CONNECTION:147.235.202.182 12910 165.227.147.57 22 SSH_TTY:/dev/pts/0 TERM:xterm-256color USER:eryx XDG_DATA_DIRS:/usr/local/share:/usr/share:/var/lib/snapd/desktop XDG_RUNTIME_DIR:/run/user/1000 XDG_SESSION_CLASS:user XDG_SESSION_ID:14 XDG_SESSION_TYPE:tty _:/usr/local/bin/skaffold]  subtask=-1 task=DevLoop
DEBU[0001] Executing template &{envTemplate 0xc000a12c60 0xc0005fa000  } with environment map[DBUS_SESSION_BUS_ADDRESS:unix:path=/run/user/1000/bus HOME:/home/eryx HOMEBREW_CELLAR:/home/linuxbrew/.linuxbrew/Cellar HOMEBREW_PREFIX:/home/linuxbrew/.linuxbrew HOMEBREW_REPOSITORY:/home/linuxbrew/.linuxbrew/Homebrew INFOPATH:/home/linuxbrew/.linuxbrew/share/info: KUBECONFIG:/tmp/kubie-configKBhrt7.yaml KUBIE_ACTIVE:1 KUBIE_DEPTH:1 KUBIE_FISH_USE_RPROMPT:0 KUBIE_KUBECONFIG:/tmp/kubie-configKBhrt7.yaml KUBIE_PROMPT_DISABLE:0 KUBIE_SESSION:/tmp/kubie-sessionAemXrS.json KUBIE_SHELL:bash KUBIE_STATE:/home/eryx/.local/share/kubie/state.json KUBIE_XONSH_USE_RIGHT_PROMPT:0 KUBIE_ZSH_USE_RPS1:0 LANG:C.UTF-8 LC_ADDRESS:C.UTF-8 LC_IDENTIFICATION:C.UTF-8 LC_MEASUREMENT:C.UTF-8 LC_MONETARY:C.UTF-8 LC_NAME:C.UTF-8 LC_NUMERIC:C.UTF-8 LC_PAPER:C.UTF-8 LC_TELEPHONE:C.UTF-8 LC_TIME:C.UTF-8 LESSCLOSE:/usr/bin/lesspipe %s %s LESSOPEN:| /usr/bin/lesspipe %s LOGNAME:eryx LS_COLORS:rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=00:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.zst=01;31:*.tzst=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.wim=01;31:*.swm=01;31:*.dwm=01;31:*.esd=01;31:*.jpg=01;35:*.jpeg=01;35:*.mjpg=01;35:*.mjpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.webp=01;35:*.o
gm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.m4a=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.oga=00;36:*.opus=00;36:*.spx=00;36:*.xspf=00;36: MANPATH:/home/linuxbrew/.linuxbrew/share/man: OLDPWD:/home/eryx/skaffold-dev/skaffold/examples PATH:/home/linuxbrew/.linuxbrew/bin:/home/linuxbrew/.linuxbrew/sbin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin PWD:/home/eryx/skaffold-dev/skaffold/examples/nodejs SHELL:/bin/bash SHLVL:2 SSH_CLIENT:147.235.202.182 12910 22 SSH_CONNECTION:147.235.202.182 12910 165.227.147.57 22 SSH_TTY:/dev/pts/0 TERM:xterm-256color USER:eryx XDG_DATA_DIRS:/usr/local/share:/usr/share:/var/lib/snapd/desktop XDG_RUNTIME_DIR:/run/user/1000 XDG_SESSION_CLASS:user XDG_SESSION_ID:14 XDG_SESSION_TYPE:tty _:/usr/local/bin/skaffold]  subtask=-1 task=DevLoop
DEBU[0001] Found dependencies for dockerfile: [{package-lock.json /home/node/app true 12 12} {package.json /home/node/app true 12 12} {. /home/node/app true 15 15}]  subtask=-1 task=DevLoop
DEBU[0001] Executing template &{envTemplate 0xc000974c60 0xc0005fa5a0  } with environment map[DBUS_SESSION_BUS_ADDRESS:unix:path=/run/user/1000/bus HOME:/home/eryx HOMEBREW_CELLAR:/home/linuxbrew/.linuxbrew/Cellar HOMEBREW_PREFIX:/home/linuxbrew/.linuxbrew HOMEBREW_REPOSITORY:/home/linuxbrew/.linuxbrew/Homebrew INFOPATH:/home/linuxbrew/.linuxbrew/share/info: KUBECONFIG:/tmp/kubie-configKBhrt7.yaml KUBIE_ACTIVE:1 KUBIE_DEPTH:1 KUBIE_FISH_USE_RPROMPT:0 KUBIE_KUBECONFIG:/tmp/kubie-configKBhrt7.yaml KUBIE_PROMPT_DISABLE:0 KUBIE_SESSION:/tmp/kubie-sessionAemXrS.json KUBIE_SHELL:bash KUBIE_STATE:/home/eryx/.local/share/kubie/state.json KUBIE_XONSH_USE_RIGHT_PROMPT:0 KUBIE_ZSH_USE_RPS1:0 LANG:C.UTF-8 LC_ADDRESS:C.UTF-8 LC_IDENTIFICATION:C.UTF-8 LC_MEASUREMENT:C.UTF-8 LC_MONETARY:C.UTF-8 LC_NAME:C.UTF-8 LC_NUMERIC:C.UTF-8 LC_PAPER:C.UTF-8 LC_TELEPHONE:C.UTF-8 LC_TIME:C.UTF-8 LESSCLOSE:/usr/bin/lesspipe %s %s LESSOPEN:| /usr/bin/lesspipe %s LOGNAME:eryx LS_COLORS:rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=00:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.zst=01;31:*.tzst=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.wim=01;31:*.swm=01;31:*.dwm=01;31:*.esd=01;31:*.jpg=01;35:*.jpeg=01;35:*.mjpg=01;35:*.mjpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.webp=01;35:*.o
gm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.m4a=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.oga=00;36:*.opus=00;36:*.spx=00;36:*.xspf=00;36: MANPATH:/home/linuxbrew/.linuxbrew/share/man: OLDPWD:/home/eryx/skaffold-dev/skaffold/examples PATH:/home/linuxbrew/.linuxbrew/bin:/home/linuxbrew/.linuxbrew/sbin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin PWD:/home/eryx/skaffold-dev/skaffold/examples/nodejs SHELL:/bin/bash SHLVL:2 SSH_CLIENT:147.235.202.182 12910 22 SSH_CONNECTION:147.235.202.182 12910 165.227.147.57 22 SSH_TTY:/dev/pts/0 TERM:xterm-256color USER:eryx XDG_DATA_DIRS:/usr/local/share:/usr/share:/var/lib/snapd/desktop XDG_RUNTIME_DIR:/run/user/1000 XDG_SESSION_CLASS:user XDG_SESSION_ID:14 XDG_SESSION_TYPE:tty _:/usr/local/bin/skaffold]  subtask=-1 task=DevLoop
 - node-example: Found Locally
INFO[0001] Cache check completed in 70.600584ms          subtask=-1 task=Build
INFO[0001] Starting render...                            subtask=-1 task=DevLoop
INFO[0001] starting render process                       subtask=0 task=Render
DEBU[0001] Executing template &{envTemplate 0xc000975d40 0xc0005fa730  } with environment map[DBUS_SESSION_BUS_ADDRESS:unix:path=/run/user/1000/bus HOME:/home/eryx HOMEBREW_CELLAR:/home/linuxbrew/.linuxbrew/Cellar HOMEBREW_PREFIX:/home/linuxbrew/.linuxbrew HOMEBREW_REPOSITORY:/home/linuxbrew/.linuxbrew/Homebrew INFOPATH:/home/linuxbrew/.linuxbrew/share/info: KUBECONFIG:/tmp/kubie-configKBhrt7.yaml KUBIE_ACTIVE:1 KUBIE_DEPTH:1 KUBIE_FISH_USE_RPROMPT:0 KUBIE_KUBECONFIG:/tmp/kubie-configKBhrt7.yaml KUBIE_PROMPT_DISABLE:0 KUBIE_SESSION:/tmp/kubie-sessionAemXrS.json KUBIE_SHELL:bash KUBIE_STATE:/home/eryx/.local/share/kubie/state.json KUBIE_XONSH_USE_RIGHT_PROMPT:0 KUBIE_ZSH_USE_RPS1:0 LANG:C.UTF-8 LC_ADDRESS:C.UTF-8 LC_IDENTIFICATION:C.UTF-8 LC_MEASUREMENT:C.UTF-8 LC_MONETARY:C.UTF-8 LC_NAME:C.UTF-8 LC_NUMERIC:C.UTF-8 LC_PAPER:C.UTF-8 LC_TELEPHONE:C.UTF-8 LC_TIME:C.UTF-8 LESSCLOSE:/usr/bin/lesspipe %s %s LESSOPEN:| /usr/bin/lesspipe %s LOGNAME:eryx LS_COLORS:rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=00:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.zst=01;31:*.tzst=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.wim=01;31:*.swm=01;31:*.dwm=01;31:*.esd=01;31:*.jpg=01;35:*.jpeg=01;35:*.mjpg=01;35:*.mjpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.webp=01;35:*.o
gm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.m4a=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.oga=00;36:*.opus=00;36:*.spx=00;36:*.xspf=00;36: MANPATH:/home/linuxbrew/.linuxbrew/share/man: OLDPWD:/home/eryx/skaffold-dev/skaffold/examples PATH:/home/linuxbrew/.linuxbrew/bin:/home/linuxbrew/.linuxbrew/sbin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin PWD:/home/eryx/skaffold-dev/skaffold/examples/nodejs SHELL:/bin/bash SHLVL:2 SSH_CLIENT:147.235.202.182 12910 22 SSH_CONNECTION:147.235.202.182 12910 165.227.147.57 22 SSH_TTY:/dev/pts/0 TERM:xterm-256color USER:eryx XDG_DATA_DIRS:/usr/local/share:/usr/share:/var/lib/snapd/desktop XDG_RUNTIME_DIR:/run/user/1000 XDG_SESSION_CLASS:user XDG_SESSION_ID:14 XDG_SESSION_TYPE:tty _:/usr/local/bin/skaffold]  subtask=-1 task=DevLoop
DEBU[0001] manifests with tagged images:apiVersion: v1
kind: Service
metadata:
  name: node
spec:
  ports:
  - port: 3000
  selector:
    app: node
  type: LoadBalancer
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: node
spec:
  selector:
    matchLabels:
      app: node
  template:
    metadata:
      labels:
        app: node
    spec:
      containers:
      - image: node-example:b4f3ba3f51434a6eb4000d4ea18168bdc0fd3458942a54c8755058d0baab5db6
        name: node
        ports:
        - containerPort: 3000  subtask=0 task=Render
DEBU[0001] manifests with labels apiVersion: v1
kind: Service
metadata:
  labels:
    skaffold.dev/run-id: 23ca34f1-bb6e-4e59-a1b1-2172b0d2d1d7
  name: node
spec:
  ports:
  - port: 3000
  selector:
    app: node
  type: LoadBalancer
---
apiVersion: apps/v1
kind: Deployment
metadata:
  labels:
    skaffold.dev/run-id: 23ca34f1-bb6e-4e59-a1b1-2172b0d2d1d7
  name: node
spec:
  selector:
    matchLabels:
      app: node
  template:
    metadata:
      labels:
        app: node
        skaffold.dev/run-id: 23ca34f1-bb6e-4e59-a1b1-2172b0d2d1d7
    spec:
      containers:
      - image: node-example:b4f3ba3f51434a6eb4000d4ea18168bdc0fd3458942a54c8755058d0baab5db6
        name: node
        ports:
        - containerPort: 3000  subtask=-1 task=DevLoop
DEBU[0001] manifests set with namespace apiVersion: v1
kind: Service
metadata:
  labels:
    skaffold.dev/run-id: 23ca34f1-bb6e-4e59-a1b1-2172b0d2d1d7
  name: node
  namespace: default
spec:
  ports:
  - port: 3000
  selector:
    app: node
  type: LoadBalancer
---
apiVersion: apps/v1
kind: Deployment
metadata:
  labels:
    skaffold.dev/run-id: 23ca34f1-bb6e-4e59-a1b1-2172b0d2d1d7
  name: node
  namespace: default
spec:
  selector:
    matchLabels:
      app: node
  template:
    metadata:
      labels:
        app: node
        skaffold.dev/run-id: 23ca34f1-bb6e-4e59-a1b1-2172b0d2d1d7
    spec:
      containers:
      - image: node-example:b4f3ba3f51434a6eb4000d4ea18168bdc0fd3458942a54c8755058d0baab5db6
        name: node
        ports:
        - containerPort: 3000  subtask=-1 task=DevLoop
INFO[0001] Render completed in 11.174278ms               subtask=-1 task=DevLoop
Tags used in deployment:
 - node-example -> node-example:b4f3ba3f51434a6eb4000d4ea18168bdc0fd3458942a54c8755058d0baab5db6
DEBU[0001] Local images can't be referenced by digest.
They are tagged and referenced by a unique, local only, tag instead.
See https://skaffold.dev/docs/pipeline-stages/taggers/#how-tagging-works  subtask=-1 task=Deploy
Starting deploy...
DEBU[0001] getting client config for kubeContext: `default`  subtask=-1 task=DevLoop
DEBU[0001] Running command: [kubectl --context default get -f - --ignore-not-found -ojson]  subtask=0 task=Deploy
DEBU[0002] Command output: []                            subtask=0 task=Deploy
DEBU[0002] 2 manifests to deploy. 2 are updated or new   subtask=0 task=Deploy
DEBU[0002] Running command: [kubectl --context default apply -f -]  subtask=0 task=Deploy
 - service/node created
 - deployment.apps/node created
Waiting for deployments to stabilize...
DEBU[0002] getting client config for kubeContext: `default`  subtask=-1 task=DevLoop
DEBU[0002] getting client config for kubeContext: `default`  subtask=-1 task=DevLoop
DEBU[0003] checking status deployment/node               subtask=0 task=Deploy
DEBU[0004] Running command: [kubectl --context default rollout status deployment node --namespace default --watch=false]  subtask=0 task=Deploy
DEBU[0004] Command output: [Waiting for deployment "node" rollout to finish: 0 of 1 updated replicas are available...
]  subtask=0 task=Deploy
DEBU[0004] Pod "node-5548588b65-tnchn" scheduled but not ready: checking container statuses  subtask=-1 task=DevLoop
DEBU[0005] Running command: [kubectl --context default rollout status deployment node --namespace default --watch=false]  subtask=0 task=Deploy
DEBU[0005] Command output: [Waiting for deployment "node" rollout to finish: 0 of 1 updated replicas are available...
]  subtask=0 task=Deploy
DEBU[0005] Pod "node-5548588b65-tnchn" scheduled but not ready: checking container statuses  subtask=-1 task=DevLoop
DEBU[0005] marking resource failed due to error code STATUSCHECK_IMAGE_PULL_ERR  subtask=0 task=Deploy
 - deployment/node: container node is waiting to start: node-example:b4f3ba3f51434a6eb4000d4ea18168bdc0fd3458942a54c8755058d0baab5db6 can't be pulled
    - pod/node-5548588b65-tnchn: container node is waiting to start: node-example:b4f3ba3f51434a6eb4000d4ea18168bdc0fd3458942a54c8755058d0baab5db6 can't be pulled
 - deployment/node failed. Error: container node is waiting to start: node-example:b4f3ba3f51434a6eb4000d4ea18168bdc0fd3458942a54c8755058d0baab5db6 can't be pulled.
DEBU[0005] setting skaffold deploy status to STATUSCHECK_IMAGE_PULL_ERR.  subtask=0 task=Deploy
Cleaning up...
DEBU[0005] Running command: [kubectl --context default delete --ignore-not-found=true --wait=false -f -]  subtask=-1 task=DevLoop
 - service "node" deleted
 - deployment.apps "node" deleted
INFO[0006] Cleanup completed in 346.801682ms             subtask=-1 task=DevLoop
DEBU[0006] Running command: [tput colors]                subtask=-1 task=DevLoop
DEBU[0006] Command output: [256
]                        subtask=-1 task=DevLoop
1/1 deployment(s) failed

ericzzzzzzz commented 1 year ago

It seems like the image was not loaded into your cluster.

Could you try configuring an after-hook stanza under `build` in your skaffold.yaml, something similar to this:

apiVersion: skaffold/v4beta1
kind: Config
build:
  artifacts:
  - image: skaffold-example
    hooks:
      after:
        - command:
            - "bash"
            - "-c"
            - "k3d image import --cluster /your/cluster/name $SKAFFOLD_IMAGE"

This will load your image into your cluster after `skaffold build`.

omer-bar commented 1 year ago

Tried what you suggested; it still can't pull the image:

apiVersion: skaffold/v4beta2
kind: Config
build:
  artifacts:
    - image: skaffold-example
      hooks:
        after:
          - command:
              - "bash"
              - "-c"
              - "sudo k3s ctr images import ./skaffold-example.tar"
manifests:
  rawYaml:
    - k8s-pod.yaml

Somewhere in the deployment-creation stage on the k3s cluster it just fails to pull the image... I also tried a bunch of things to make it work: added an imagePullSecret with the Docker Hub credentials to the deployment, before and after hooks with the built image, and put every kind of registry auth into k3s for a bunch of registries (Quay, Docker Hub, GHCR etc.) on every node... I pretty much made sure that whatever skaffold or k3s tries to do with external and internal services, they can.
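
For reference, k3s reads per-registry auth from `/etc/rancher/k3s/registries.yaml` on each node (k3s must be restarted after editing it). A minimal sketch of the kind of configuration described above, with placeholder credentials:

```yaml
# /etc/rancher/k3s/registries.yaml
configs:
  "registry-1.docker.io":      # Docker Hub; add quay.io, ghcr.io, etc. similarly
    auth:
      username: myuser         # placeholder
      password: mypassword     # placeholder
```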

ericzzzzzzz commented 1 year ago

@EryX666 ~can you try to replace `- "sudo k3s ctr images import ./skaffold-example.tar"` with `sudo k3s ctr images import $SKAFFOLD_IMAGE`~ Skaffold injects the built image reference into the `$SKAFFOLD_IMAGE` environment variable so a post-hook on the host machine can use it. More info can be found here: https://skaffold.dev/docs/pipeline-stages/lifecycle-hooks/#environment-variables

Also, could you double-check the `imagePullPolicy` in your manifest? It should be set to `Never` or `IfNotPresent` so your cluster doesn't need to fetch the image from a remote registry when it is already available on the cluster nodes.
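
As a concrete sketch, the container spec in `k8s-pod.yaml` would look something like this (field values mirror the rendered manifest in the log above; the policy line is the addition):

```yaml
containers:
- name: node
  image: node-example
  imagePullPolicy: IfNotPresent  # or Never: never fetch from a remote registry
  ports:
  - containerPort: 3000
```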

Finally got this working on my machine. Because skaffold uses the Docker image ID as the tag to identify the image for deployment in local dev, the previous suggestion doesn't work, so we need a small trick to get the correct image ID for deployment. Could you try something like the below?

kind: Config
build:
  artifacts:
  - image: skaffold-example
    hooks:
      after:
        - command:
            - "bash"
            - "-c"
            - docker save skaffold-example:$(docker inspect --format="{{slice .Id 7}}" $SKAFFOLD_IMAGE) | sudo k3s ctr images import -
  local:
    push: false
manifests:
  rawYaml:
  - k8s-pod.yaml
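
The `{{slice .Id 7}}` in the hook strips the `sha256:` prefix from the Docker image ID, because skaffold's local tag is the bare digest (as in the `node-example:b4f3ba3f...` tag seen in the log above). A minimal sketch of that slicing in bash, using the digest from the log:

```shell
# Docker reports image IDs as "sha256:" plus a 64-char hex digest; skaffold's
# local tag uses only the digest, so the hook drops the 7-char "sha256:" prefix.
image_id="sha256:b4f3ba3f51434a6eb4000d4ea18168bdc0fd3458942a54c8755058d0baab5db6"
tag="${image_id:7}"   # bash equivalent of the Go template {{slice .Id 7}}
echo "skaffold-example:${tag}"
```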

I have to specify the kubeconfig for the k3s cluster. If you also need to do this, see the following example `dev` command:

 skaffold dev --kubeconfig /etc/rancher/k3s/k3s.yaml  --cache-artifacts=false

I also disabled the cache, just to make sure the build post-hook is always triggered.

ericzzzzzzz commented 1 year ago

You can also start a local registry with Docker, then use it as the default registry with skaffold. Example command:
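
The command itself wasn't included above; a hedged sketch of the usual setup (the registry name and port are assumptions, and k3s may additionally need a mirror entry in its `registries.yaml` to trust `localhost:5000`):

```shell
docker run -d -p 5000:5000 --name registry registry:2   # local registry on port 5000
skaffold dev --default-repo localhost:5000              # rewrite image refs to use it
```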

omer-bar commented 1 year ago

Ok, yeah, it worked. The problem now is that for some reason my personal project doesn't tail logs from the containers, but getting-started does... I don't know exactly why this happens... the project does tail the logs on a Windows docker-desktop cluster and on a Linux docker-desktop cluster, just not in k3s for some reason.