openshift / source-to-image

A tool for building artifacts from source and injecting into container images
Apache License 2.0

Incremental build error (read/write on closed pipe) #301

Closed: babelop closed this issue 7 years ago

babelop commented 9 years ago

Saving incremental artifacts seems to fail randomly on (perhaps just large) projects. Could be a regression of #171, although ulimit and filesystem hard limits don't seem to affect it.

I've created a deployable project to replicate the issue, available at:

https://github.com/binarybabel/sti-rails-incremental-demo Build config and logs are available in the extra directory.

Level 5 build log illustrating the problem:

https://raw.githubusercontent.com/binarybabel/sti-rails-incremental-demo/master/extra/incremental-failure.log

babelop commented 9 years ago

@bparees The new issue as you requested. Build logs available in description.

mfojtik commented 8 years ago

@binarybabel can you check if https://github.com/openshift/source-to-image/pull/343 fixes this issue?

bparees commented 8 years ago

@rhcarvalho @mfojtik can you guys re-evaluate this and close if you think it's fixed/non-reproducible?

bparees commented 8 years ago

closing based on age, lack of info, and that we've reworked this code numerous times since this issue was reported.

accursoft commented 8 years ago

@bparees I'm running into a variation of this, so requesting to re-open.

  1. Clone https://github.com/accursoft/Haskell-Cloud
  2. Run test (this will take a long time for the first run while it builds the image)

There is a pause of two minutes between incremental builds, caused by docker.holdHijackedConnection hanging in d.redirectResponseToOutputStream until RunContainer times out with case <-time.After(DefaultDockerTimeout).

I have traced this to stdcopy.StdCopy, where out.Write(buf[StdWriterPrefixLen : frameSize+StdWriterPrefixLen]) hangs for two minutes, then exits with the closed pipe error. The logrus output does not seem to go anywhere; I had to change it to Println to see the error.

I'm running this on Debian Stretch.

(One of the last tests hangs indefinitely, but that's a different issue, possibly on my side.)

accursoft commented 8 years ago

PS Shouldn't there be some kind of warning logged if RunContainer exits with a time out?
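A hedged sketch of the guard being suggested (the constant and function names here are invented stand-ins for RunContainer's internals, and the timeout is shortened):

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// defaultTimeout is a shortened stand-in for DefaultDockerTimeout.
const defaultTimeout = 50 * time.Millisecond

var errRunTimeout = errors.New("container I/O timed out")

// waitWithWarning is a simplified sketch of the select inside RunContainer:
// if the attach goroutine never finishes, log a warning instead of timing
// out silently -- the behavior the comment above asks for.
func waitWithWarning(done <-chan error) error {
	select {
	case err := <-done:
		return err
	case <-time.After(defaultTimeout):
		fmt.Println("warning: timed out waiting for container output")
		return errRunTimeout
	}
}

func main() {
	stuck := make(chan error) // never written to, like the hung StdCopy goroutine
	fmt.Println(waitWithWarning(stuck))
}
```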

rhcarvalho commented 8 years ago

@accursoft hi, what version of s2i are you running?

$ s2i version
accursoft commented 8 years ago

s2i v1.1.2-38-gddb10f1

rhcarvalho commented 8 years ago

I'm trying to reproduce with:

$ s2i version
s2i v1.1.3
$ git clone https://github.com/accursoft/Haskell-Cloud.git
$ cd Haskell-Cloud
$ ./test 
Sending build context to Docker daemon 13.82 kB
Step 1 : FROM accursoft/micro-debian
Trying to pull repository docker.io/accursoft/micro-debian ... 
...
rhcarvalho commented 8 years ago

Looking like it's gonna take forever, but still running ghc :smile:

bparees commented 8 years ago

fyi i just did a new release last night. which i guess @rhcarvalho already picked up :)

rhcarvalho commented 8 years ago

@bparees yes I did, I'm testing with the latest release.

rhcarvalho commented 8 years ago

@accursoft does this resemble what you're observing?

...
Step 16 : COPY s2i /opt/s2i/
 ---> a38370cab215
Removing intermediate container e5a94d27cc8b
Successfully built a38370cab215
Sending build context to Docker daemon 2.048 kB
Step 1 : FROM ghc-temp
 ---> a38370cab215
Step 2 : MAINTAINER Gideon Sireling <code@accursoft.com>
 ---> Running in 00103e06c964
 ---> 3975c9b373fd
Removing intermediate container 00103e06c964
Step 3 : LABEL io.openshift.s2i.scripts-url "image://opt/s2i" io.k8s.display-name "Haskell" io.k8s.description "GHC and cabal-install" io.openshift.tags "haskell,ghc,network,ghc-8.0.1,cabal-install-1.24.0.0,network-2.6.3.1,builder" io.openshift.expose-services "8080:http"
 ---> Running in 3ad2c0804727
 ---> fe41e4ce0010
Removing intermediate container 3ad2c0804727
Successfully built fe41e4ce0010
Untagged: ghc-temp:latest
  * Testing create app ...
Running S2I version "v1.1.3"
Preparing to build test
Copying sources from "/vagrant/Haskell-Cloud/server" to "/tmp/s2i528190197/upload/src"
Image "test:latest" not available locally, pulling ...
Clean build will be performed
Running "assemble" in "test"

Downloading the latest package list from hackage.haskell.org
3 CPUs available for parallel builds
Resolving dependencies...
Configuring server-0.0...
Building server-0.0...
Installed server-0.0

  * Testing enable flags ...
Running S2I version "v1.1.3"
Preparing to build test
Copying sources from "/vagrant/Haskell-Cloud/server" to "/tmp/s2i915776168/upload/src"
Existing image for tag test detected for incremental build
Saving build artifacts from image test to path /tmp/s2i915776168/upload/artifacts
Running "assemble" in "test"

3 CPUs available for parallel builds
marker: cabal_flags
b
Resolving dependencies...
Configuring server-0.0...
Building server-0.0...
Installed server-0.0

Removing previously-tagged image sha256:9662fc589e27ec609225596df921b9e46a0591b9dceeec6d62f3346067b2bced
curl: (56) Recv failure: Connection reset by peer
  * enable flags FAILED
rhcarvalho commented 8 years ago

Running with bash -x, I see the curl call is not part of the build:

+ docker run --name test -d -p 8080:8080 -u 1001 test
+ sleep 0.01
+ curl -sS localhost:8080
+ grep -q Greetings
curl: (56) Recv failure: Connection reset by peer
+ echo '  * pre_build hook FAILED'
  * pre_build hook FAILED
+ exit 1

So that looks like an error in the image. @accursoft can you confirm / provide another way to reproduce the s2i build problem? Thanks!

accursoft commented 8 years ago

Increasing the value of sleep may help on slow systems. Either way, the problem should manifest as a two-minute pause between the first two tests, so you could just if false out all the others and skip to the clean-up at the end.

test sets --loglevel=1; if you change this to 2 you should see the pause immediately after unpacking the tar stream from save-artifacts.

I'm running Docker version 1.12.2, build bb80604, from apt.dockerproject.org.
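One way to make the smoke test less timing-sensitive than a fixed sleep is to poll until the check succeeds (a sketch; retry is a helper invented here, not part of the test script):

```shell
# retry: poll a command until it succeeds, instead of sleeping once.
retry() {
  attempts=$1; shift
  i=1
  while [ "$i" -le "$attempts" ]; do
    if "$@"; then return 0; fi
    sleep 0.1
    i=$((i + 1))
  done
  return 1
}

# In the test script this would wrap the curl check, e.g.:
#   retry 50 sh -c 'curl -sS localhost:8080 | grep -q Greetings'
retry 5 true && echo "check passed"
```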

rhcarvalho commented 8 years ago

@accursoft would you mind making the proposed edits in your code and sending it over? It's a holiday tomorrow, so I may only look into this again next week. Thanks.

rhcarvalho commented 8 years ago

@accursoft I was able to reproduce what you've described, no need to send over code changes.

Changes I've made:

diff --git a/test b/test
index 55fa2c2..af3c2f7 100755
--- a/test
+++ b/test
@@ -1,6 +1,6 @@
 #!/bin/bash -eu

-s2i="s2i build --rm --incremental --loglevel=1 . accursoft/ghc-network test"
+s2i="s2i build --rm --incremental --loglevel=2 . accursoft/ghc-network test"

 function check {
   echo "  * Testing $1 ..."

Here's what I see:

$ bash -x test
[...]
+ s2i build --rm --incremental --loglevel=2 . accursoft/ghc-network test
Running S2I version "v1.1.3"

Builder Name:                   Haskell
Builder Image:                  accursoft/ghc-network
Source:                         .
Output Image Tag:               test
Environment:
Labels:
Incremental Build:              enabled
Incremental Image Pull User:
Remove Old Build:               enabled
Builder Pull Policy:            if-not-present
Previous Image Pull Policy:     if-not-present
Quiet:                          disabled
Layered Build:                  disabled
Docker Endpoint:                unix:///var/run/docker.sock

Image sha256:fe41e4ce0010a5c5cc3bef2cd5b1fd6587c4b9a17b03d8b6d83ba21f93cbfcf1 contains io.openshift.s2i.scripts-url set to "image://opt/s2i"
Preparing to build test
Copying sources from "/vagrant/Haskell-Cloud/server" to "/tmp/s2i209367094/upload/src"
Existing image for tag test detected for incremental build
Performing source build from file:///vagrant/Haskell-Cloud/server
Saving build artifacts from image test to path /tmp/s2i209367094/upload/artifacts
Image sha256:d800c645e21292816ab8ca6b5b6222d11a76e3952f28edfde14b90584faf6ead contains io.openshift.s2i.scripts-url set to "image://opt/s2i"
Base directory for S2I scripts is 'opt/s2i'. Untarring destination is '/tmp'.
Creating container with options {Name:"s2i_test_3d9ae978" Config:&{Hostname: Domainname: User:build AttachStdin:false AttachStdout:true AttachStderr:false ExposedPorts:map[] Tty:false OpenStdin:false StdinOnce:false Env:[] Cmd:[opt/s2i/save-artifacts] Healthcheck:<nil> ArgsEscaped:false Image:test:latest Volumes:map[] WorkingDir: Entrypoint:[] NetworkDisabled:false MacAddress: OnBuild:[] Labels:map[] StopSignal: StopTimeout:<nil> Shell:[]} HostConfig:&{Binds:[] ContainerIDFile: LogConfig:{Type: Config:map[]} NetworkMode: PortBindings:map[] RestartPolicy:{Name: MaximumRetryCount:0} AutoRemove:false VolumeDriver: VolumesFrom:[] CapAdd:[] CapDrop:[] DNS:[] DNSOptions:[] DNSSearch:[] ExtraHosts:[] GroupAdd:[] IpcMode: Cgroup: Links:[] OomScoreAdj:0 PidMode: Privileged:false PublishAllPorts:false ReadonlyRootfs:false SecurityOpt:[] StorageOpt:map[] Tmpfs:map[] UTSMode: UsernsMode: ShmSize:0 Sysctls:map[] Runtime: ConsoleSize:[0 0] Isolation: Resources:{CPUShares:0 Memory:0 CgroupParent: BlkioWeight:0 BlkioWeightDevice:[] BlkioDeviceReadBps:[] BlkioDeviceWriteBps:[] BlkioDeviceReadIOps:[] BlkioDeviceWriteIOps:[] CPUPeriod:0 CPUQuota:0 CpusetCpus: CpusetMems: Devices:[] DiskQuota:0 KernelMemory:0 MemoryReservation:0 MemorySwap:0 MemorySwappiness:<nil> OomKillDisable:<nil> PidsLimit:0 Ulimits:[] CPUCount:0 CPUPercent:0 IOMaximumIOps:0 IOMaximumBandwidth:0} Mounts:[]}} ...
Attaching to container "a06e3a5e13a4aa6bf5a91e4088bf4d0944abffc4c0aee06eca849f26b3375323" ...
Starting container "a06e3a5e13a4aa6bf5a91e4088bf4d0944abffc4c0aee06eca849f26b3375323" ...
Done extracting tar stream

<<< long pause >>>

Running "assemble" in "test"
Using image name accursoft/ghc-network
starting the source uploading ...

Image sha256:fe41e4ce0010a5c5cc3bef2cd5b1fd6587c4b9a17b03d8b6d83ba21f93cbfcf1 contains io.openshift.s2i.scripts-url set to "image://opt/s2i"
Base directory for S2I scripts is 'opt/s2i'. Untarring destination is '/tmp'.
[...]

The pause by itself doesn't necessarily mean there's something wrong; it might just be the log messages fooling us. But yes, it looks suspicious and deserves investigation.


Here's a goroutine dump taken during the pause for (my) reference later:

``` + s2i build --rm --incremental --loglevel=2 . accursoft/ghc-network test Running S2I version "v1.1.3" Builder Name: Haskell Builder Image: accursoft/ghc-network Source: . Output Image Tag: test Environment: Labels: Incremental Build: enabled Incremental Image Pull User: Remove Old Build: enabled Builder Pull Policy: if-not-present Previous Image Pull Policy: if-not-present Quiet: disabled Layered Build: disabled Docker Endpoint: unix:///var/run/docker.sock Image sha256:fe41e4ce0010a5c5cc3bef2cd5b1fd6587c4b9a17b03d8b6d83ba21f93cbfcf1 contains io.openshift.s2i.scripts-url set to "image://opt/s2i" Preparing to build test Copying sources from "/vagrant/Haskell-Cloud/server" to "/tmp/s2i383991059/upload/src" Existing image for tag test detected for incremental build Performing source build from file:///vagrant/Haskell-Cloud/server Saving build artifacts from image test to path /tmp/s2i383991059/upload/artifacts Image sha256:531dccfefc6728198d2eddd7b60d1f73f9830b0da8de3c23e999b3a41b358cdf contains io.openshift.s2i.scripts-url set to "image://opt/s2i" Base directory for S2I scripts is 'opt/s2i'. Untarring destination is '/tmp'. 
Creating container with options {Name:"s2i_test_36301125" Config:&{Hostname: Domainname: User:build AttachStdin:false AttachStdout:true AttachStderr:false ExposedPorts:map[] Tty:false OpenStdin:false StdinOnce:false Env:[] Cmd:[opt/s2i/save-artifacts] Healthcheck: ArgsEscaped:false Image:test:latest Volumes:map[] WorkingDir: Entrypoint:[] NetworkDisabled:false MacAddress: OnBuild:[] Labels:map[] StopSignal: StopTimeout: Shell:[]} HostConfig:&{Binds:[] ContainerIDFile: LogConfig:{Type: Config:map[]} NetworkMode: PortBindings:map[] RestartPolicy:{Name: MaximumRetryCount:0} AutoRemove:false VolumeDriver: VolumesFrom:[] CapAdd:[] CapDrop:[] DNS:[] DNSOptions:[] DNSSearch:[] ExtraHosts:[] GroupAdd:[] IpcMode: Cgroup: Links:[] OomScoreAdj:0 PidMode: Privileged:false PublishAllPorts:false ReadonlyRootfs:false SecurityOpt:[] StorageOpt:map[] Tmpfs:map[] UTSMode: UsernsMode: ShmSize:0 Sysctls:map[] Runtime: ConsoleSize:[0 0] Isolation: Resources:{CPUShares:0 Memory:0 CgroupParent: BlkioWeight:0 BlkioWeightDevice:[] BlkioDeviceReadBps:[] BlkioDeviceWriteBps:[] BlkioDeviceReadIOps:[] BlkioDeviceWriteIOps:[] CPUPeriod:0 CPUQuota:0 CpusetCpus: CpusetMems: Devices:[] DiskQuota:0 KernelMemory:0 MemoryReservation:0 MemorySwap:0 MemorySwappiness: OomKillDisable: PidsLimit:0 Ulimits:[] CPUCount:0 CPUPercent:0 IOMaximumIOps:0 IOMaximumBandwidth:0} Mounts:[]}} ... Attaching to container "b20e643f4b181c3c73b67b1be6d4ca692cfc02804e86c20a17414b1bbbbf4622" ... Starting container "b20e643f4b181c3c73b67b1be6d4ca692cfc02804e86c20a17414b1bbbbf4622" ... 
Done extracting tar stream ^\goroutine 53 [running]: github.com/openshift/source-to-image/pkg/docker.(*stiDocker).RunContainer.func2(0xc5ad80, 0xc4203c3100) /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/pkg/docker/docker.go:958 +0xf4 github.com/openshift/source-to-image/pkg/util/interrupt.(*Handler).signal.func1() /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/pkg/util/interrupt/interrupt.go:102 +0xb5 sync.(*Once).Do(0xc42030ef20, 0xc420047f20) /usr/local/go/src/sync/once.go:44 +0xdb github.com/openshift/source-to-image/pkg/util/interrupt.(*Handler).signal(0xc42030ef00, 0xc5ad80, 0xc4203c3100) /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/pkg/util/interrupt/interrupt.go:103 +0x60 github.com/openshift/source-to-image/pkg/util/interrupt.(*Handler).Run.func1(0xc420091560, 0xc42030ef00) /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/pkg/util/interrupt/interrupt.go:75 +0x82 created by github.com/openshift/source-to-image/pkg/util/interrupt.(*Handler).Run /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/pkg/util/interrupt/interrupt.go:76 +0x103 goroutine 1 [select]: github.com/openshift/source-to-image/pkg/docker.(*stiDocker).RunContainer.func3(0x0, 0x0) /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/pkg/docker/docker.go:1068 +0x9fe github.com/openshift/source-to-image/pkg/util/interrupt.(*Handler).Run(0xc42030ef00, 0xc4201c5110, 0x0, 0x0) /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/pkg/util/interrupt/interrupt.go:78 +0x133 
github.com/openshift/source-to-image/pkg/docker.(*stiDocker).RunContainer(0xc42036c280, 0x7ffe77262674, 0x4, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...) /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/pkg/docker/docker.go:1074 +0x930 github.com/openshift/source-to-image/pkg/build/strategies/sti.(*STI).Save(0xc420321000, 0xc420214380, 0x0, 0x0) /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/pkg/build/strategies/sti/sti.go:465 +0xb5d github.com/openshift/source-to-image/pkg/build/strategies/sti.(*STI).Build(0xc420321000, 0xc420214380, 0x0, 0x0, 0x0) /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/pkg/build/strategies/sti/sti.go:193 +0xd25 main.newCmdBuild.func1(0xc4202cc5a0, 0xc42020eea0, 0x3, 0x6) /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/cmd/s2i/main.go:156 +0x75c github.com/openshift/source-to-image/vendor/github.com/spf13/cobra.(*Command).execute(0xc4202cc5a0, 0xc42020ede0, 0x6, 0x6, 0xc4202cc5a0, 0xc42020ede0) /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/vendor/github.com/spf13/cobra/command.go:547 +0x411 github.com/openshift/source-to-image/vendor/github.com/spf13/cobra.(*Command).Execute(0xc4202cc1e0, 0xc4201c5ea8, 0x1) /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/vendor/github.com/spf13/cobra/command.go:630 +0x398 main.main() /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/cmd/s2i/main.go:415 +0x664 goroutine 17 [syscall, locked to thread]: runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:2086 +0x1 goroutine 5 [chan receive]: 
github.com/openshift/source-to-image/vendor/github.com/golang/glog.(*loggingT).flushDaemon(0xc71ba0) /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/vendor/github.com/golang/glog/glog.go:882 +0x7a created by github.com/openshift/source-to-image/vendor/github.com/golang/glog.init.1 /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/vendor/github.com/golang/glog/glog.go:410 +0x21d goroutine 6 [syscall]: os/signal.signal_recv(0xc5ad80) /usr/local/go/src/runtime/sigqueue.go:116 +0x157 os/signal.loop() /usr/local/go/src/os/signal/signal_unix.go:22 +0x22 created by os/signal.init.1 /usr/local/go/src/os/signal/signal_unix.go:28 +0x41 goroutine 54 [chan receive]: github.com/openshift/source-to-image/pkg/docker.(*stiDocker).holdHijackedConnection(0xc42036c280, 0x899400, 0x0, 0x0, 0xc56f40, 0xc4202ec168, 0xc56f40, 0xc4202ec178, 0xc5e720, 0xc4202ec1b8, ...) 
/home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/pkg/docker/docker.go:874 +0x27c github.com/openshift/source-to-image/pkg/docker.(*stiDocker).RunContainer.func3.1(0xc42036c280, 0xc420094b00, 0x40, 0x0, 0x0, 0x0, 0xc420300820, 0xc4203c45a0, 0xc4203c2930, 0xc420317f10) /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/pkg/docker/docker.go:987 +0x47a created by github.com/openshift/source-to-image/pkg/docker.(*stiDocker).RunContainer.func3 /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/pkg/docker/docker.go:989 +0x2c7 goroutine 8 [IO wait]: net.runtime_pollWait(0x7f066be20220, 0x72, 0x3) /usr/local/go/src/runtime/netpoll.go:160 +0x59 net.(*pollDesc).wait(0xc4202d1480, 0x72, 0xc4200399d0, 0xc420012068) /usr/local/go/src/net/fd_poll_runtime.go:73 +0x38 net.(*pollDesc).waitRead(0xc4202d1480, 0xc58e80, 0xc420012068) /usr/local/go/src/net/fd_poll_runtime.go:78 +0x34 net.(*netFD).Read(0xc4202d1420, 0xc4202fe000, 0x1000, 0x1000, 0x0, 0xc58e80, 0xc420012068) /usr/local/go/src/net/fd_unix.go:243 +0x1a1 net.(*conn).Read(0xc4200283f0, 0xc4202fe000, 0x1000, 0x1000, 0x0, 0x0, 0x0) /usr/local/go/src/net/net.go:173 +0x70 net/http.(*persistConn).Read(0xc4202ce800, 0xc4202fe000, 0x1000, 0x1000, 0x62b260, 0xc420039b58, 0x409a9d) /usr/local/go/src/net/http/transport.go:1261 +0x154 bufio.(*Reader).fill(0xc42020f320) /usr/local/go/src/bufio/bufio.go:97 +0x10c bufio.(*Reader).Peek(0xc42020f320, 0x1, 0xc420039bbd, 0x1, 0x0, 0xc4202f4420, 0x0) /usr/local/go/src/bufio/bufio.go:129 +0x62 net/http.(*persistConn).readLoop(0xc4202ce800) /usr/local/go/src/net/http/transport.go:1418 +0x1a1 created by net/http.(*Transport).dialConn /usr/local/go/src/net/http/transport.go:1062 +0x4e9 goroutine 9 [select]: net/http.(*persistConn).writeLoop(0xc4202ce800) 
/usr/local/go/src/net/http/transport.go:1646 +0x3bd created by net/http.(*Transport).dialConn /usr/local/go/src/net/http/transport.go:1063 +0x50e goroutine 32 [select, locked to thread]: runtime.gopark(0x899dc0, 0x0, 0x858d00, 0x6, 0x18, 0x2) /usr/local/go/src/runtime/proc.go:259 +0x13a runtime.selectgoImpl(0xc42031e730, 0x0, 0x18) /usr/local/go/src/runtime/select.go:423 +0x11d9 runtime.selectgo(0xc42031e730) /usr/local/go/src/runtime/select.go:238 +0x1c runtime.ensureSigM.func1() /usr/local/go/src/runtime/signal1_unix.go:304 +0x2f3 runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:2086 +0x1 goroutine 40 [IO wait]: net.runtime_pollWait(0x7f066be20160, 0x72, 0x5) /usr/local/go/src/runtime/netpoll.go:160 +0x59 net.(*pollDesc).wait(0xc4203165a0, 0x72, 0xc42003a9d0, 0xc420012068) /usr/local/go/src/net/fd_poll_runtime.go:73 +0x38 net.(*pollDesc).waitRead(0xc4203165a0, 0xc58e80, 0xc420012068) /usr/local/go/src/net/fd_poll_runtime.go:78 +0x34 net.(*netFD).Read(0xc420316540, 0xc420354000, 0x1000, 0x1000, 0x0, 0xc58e80, 0xc420012068) /usr/local/go/src/net/fd_unix.go:243 +0x1a1 net.(*conn).Read(0xc4202ec068, 0xc420354000, 0x1000, 0x1000, 0x0, 0x0, 0x0) /usr/local/go/src/net/net.go:173 +0x70 net/http.(*persistConn).Read(0xc42006a200, 0xc420354000, 0x1000, 0x1000, 0x62b260, 0xc42003ab58, 0x409a9d) /usr/local/go/src/net/http/transport.go:1261 +0x154 bufio.(*Reader).fill(0xc4202f4cc0) /usr/local/go/src/bufio/bufio.go:97 +0x10c bufio.(*Reader).Peek(0xc4202f4cc0, 0x1, 0xc42003abbd, 0x1, 0x0, 0xc4202f4d20, 0x0) /usr/local/go/src/bufio/bufio.go:129 +0x62 net/http.(*persistConn).readLoop(0xc42006a200) /usr/local/go/src/net/http/transport.go:1418 +0x1a1 created by net/http.(*Transport).dialConn /usr/local/go/src/net/http/transport.go:1062 +0x4e9 goroutine 41 [select]: net/http.(*persistConn).writeLoop(0xc42006a200) /usr/local/go/src/net/http/transport.go:1646 +0x3bd created by net/http.(*Transport).dialConn /usr/local/go/src/net/http/transport.go:1063 +0x50e goroutine 22 [IO 
wait]: net.runtime_pollWait(0x7f066be200a0, 0x72, 0x6) /usr/local/go/src/runtime/netpoll.go:160 +0x59 net.(*pollDesc).wait(0xc420310300, 0x72, 0xc4203099d0, 0xc420012068) /usr/local/go/src/net/fd_poll_runtime.go:73 +0x38 net.(*pollDesc).waitRead(0xc420310300, 0xc58e80, 0xc420012068) /usr/local/go/src/net/fd_poll_runtime.go:78 +0x34 net.(*netFD).Read(0xc4203102a0, 0xc42037c000, 0x1000, 0x1000, 0x0, 0xc58e80, 0xc420012068) /usr/local/go/src/net/fd_unix.go:243 +0x1a1 net.(*conn).Read(0xc420096030, 0xc42037c000, 0x1000, 0x1000, 0x0, 0x0, 0x0) /usr/local/go/src/net/net.go:173 +0x70 net/http.(*persistConn).Read(0xc420372000, 0xc42037c000, 0x1000, 0x1000, 0x62b260, 0xc420309b58, 0x409a9d) /usr/local/go/src/net/http/transport.go:1261 +0x154 bufio.(*Reader).fill(0xc420090660) /usr/local/go/src/bufio/bufio.go:97 +0x10c bufio.(*Reader).Peek(0xc420090660, 0x1, 0xc420309bbd, 0x1, 0x0, 0xc4202f5140, 0x0) /usr/local/go/src/bufio/bufio.go:129 +0x62 net/http.(*persistConn).readLoop(0xc420372000) /usr/local/go/src/net/http/transport.go:1418 +0x1a1 created by net/http.(*Transport).dialConn /usr/local/go/src/net/http/transport.go:1062 +0x4e9 goroutine 23 [select]: net/http.(*persistConn).writeLoop(0xc420372000) /usr/local/go/src/net/http/transport.go:1646 +0x3bd created by net/http.(*Transport).dialConn /usr/local/go/src/net/http/transport.go:1063 +0x50e goroutine 27 [IO wait]: net.runtime_pollWait(0x7f066be1ffe0, 0x72, 0x7) /usr/local/go/src/runtime/netpoll.go:160 +0x59 net.(*pollDesc).wait(0xc420310610, 0x72, 0xc42030b9d0, 0xc420012068) /usr/local/go/src/net/fd_poll_runtime.go:73 +0x38 net.(*pollDesc).waitRead(0xc420310610, 0xc58e80, 0xc420012068) /usr/local/go/src/net/fd_poll_runtime.go:78 +0x34 net.(*netFD).Read(0xc4203105b0, 0xc42038a000, 0x1000, 0x1000, 0x0, 0xc58e80, 0xc420012068) /usr/local/go/src/net/fd_unix.go:243 +0x1a1 net.(*conn).Read(0xc420096078, 0xc42038a000, 0x1000, 0x1000, 0x0, 0x0, 0x0) /usr/local/go/src/net/net.go:173 +0x70 
net/http.(*persistConn).Read(0xc420372100, 0xc42038a000, 0x1000, 0x1000, 0x30, 0xc42030bb58, 0x43eaac) /usr/local/go/src/net/http/transport.go:1261 +0x154 bufio.(*Reader).fill(0xc420090f00) /usr/local/go/src/bufio/bufio.go:97 +0x10c bufio.(*Reader).Peek(0xc420090f00, 0x1, 0x0, 0x1, 0x0, 0xc4203c5020, 0x0) /usr/local/go/src/bufio/bufio.go:129 +0x62 net/http.(*persistConn).readLoop(0xc420372100) /usr/local/go/src/net/http/transport.go:1418 +0x1a1 created by net/http.(*Transport).dialConn /usr/local/go/src/net/http/transport.go:1062 +0x4e9 goroutine 28 [select]: net/http.(*persistConn).writeLoop(0xc420372100) /usr/local/go/src/net/http/transport.go:1646 +0x3bd created by net/http.(*Transport).dialConn /usr/local/go/src/net/http/transport.go:1063 +0x50e goroutine 47 [IO wait]: net.runtime_pollWait(0x7f066be1ff20, 0x72, 0x9) /usr/local/go/src/runtime/netpoll.go:160 +0x59 net.(*pollDesc).wait(0xc420317bf0, 0x72, 0xc4200369d0, 0xc420012068) /usr/local/go/src/net/fd_poll_runtime.go:73 +0x38 net.(*pollDesc).waitRead(0xc420317bf0, 0xc58e80, 0xc420012068) /usr/local/go/src/net/fd_poll_runtime.go:78 +0x34 net.(*netFD).Read(0xc420317b90, 0xc4203ba000, 0x1000, 0x1000, 0x0, 0xc58e80, 0xc420012068) /usr/local/go/src/net/fd_unix.go:243 +0x1a1 net.(*conn).Read(0xc4202ec150, 0xc4203ba000, 0x1000, 0x1000, 0x0, 0x0, 0x0) /usr/local/go/src/net/net.go:173 +0x70 net/http.(*persistConn).Read(0xc42006a500, 0xc4203ba000, 0x1000, 0x1000, 0x62b260, 0xc420036b58, 0x409a9d) /usr/local/go/src/net/http/transport.go:1261 +0x154 bufio.(*Reader).fill(0xc4202f5aa0) /usr/local/go/src/bufio/bufio.go:97 +0x10c bufio.(*Reader).Peek(0xc4202f5aa0, 0x1, 0xc420036bbd, 0x1, 0x0, 0xc4200913e0, 0x0) /usr/local/go/src/bufio/bufio.go:129 +0x62 net/http.(*persistConn).readLoop(0xc42006a500) /usr/local/go/src/net/http/transport.go:1418 +0x1a1 created by net/http.(*Transport).dialConn /usr/local/go/src/net/http/transport.go:1062 +0x4e9 goroutine 48 [select]: net/http.(*persistConn).writeLoop(0xc42006a500) 
/usr/local/go/src/net/http/transport.go:1646 +0x3bd created by net/http.(*Transport).dialConn /usr/local/go/src/net/http/transport.go:1063 +0x50e goroutine 55 [semacquire]: sync.runtime_notifyListWait(0xc42034a1f8, 0xc40000023e) /usr/local/go/src/runtime/sema.go:267 +0x122 sync.(*Cond).Wait(0xc42034a1e8) /usr/local/go/src/sync/cond.go:57 +0x80 io.(*pipe).write(0xc42034a180, 0xc420412008, 0x7800, 0x8001, 0x0, 0xc566c0, 0xc420012160) /usr/local/go/src/io/pipe.go:89 +0x191 io.(*PipeWriter).Write(0xc4202ec168, 0xc420412008, 0x7800, 0x8001, 0x1, 0x0, 0x0) /usr/local/go/src/io/pipe.go:156 +0x4c github.com/openshift/source-to-image/vendor/github.com/docker/docker/pkg/stdcopy.StdCopy(0xc56f40, 0xc4202ec168, 0xc56f40, 0xc4202ec178, 0xc56240, 0xc4203c4840, 0xc420027710, 0xc4203c4120, 0x0) /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/vendor/github.com/docker/docker/pkg/stdcopy/stdcopy.go:151 +0x41d github.com/openshift/source-to-image/pkg/docker.(*stiDocker).redirectResponseToOutputStream(0xc42036c280, 0xc420027700, 0xc56f40, 0xc4202ec168, 0xc56f40, 0xc4202ec178, 0xc56240, 0xc4203c4840, 0x0, 0x0) /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/pkg/docker/docker.go:845 +0xd2 github.com/openshift/source-to-image/pkg/docker.(*stiDocker).holdHijackedConnection.func1(0xc4203c48a0, 0xc42036c280, 0xc420045d00, 0xc56f40, 0xc4202ec168, 0xc56f40, 0xc4202ec178, 0xc4203d8120) /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/pkg/docker/docker.go:856 +0x8c created by github.com/openshift/source-to-image/pkg/docker.(*stiDocker).holdHijackedConnection /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/pkg/docker/docker.go:857 +0x11d goroutine 66 [runnable]: 
github.com/openshift/source-to-image/vendor/github.com/docker/engine-api/client/transport/cancellable.Do.func3(0xc5c840, 0xc4203c4ea0, 0xc4203c31a0, 0xc4203f54a0) /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/vendor/github.com/docker/engine-api/client/transport/cancellable/cancellable.go:78 created by github.com/openshift/source-to-image/vendor/github.com/docker/engine-api/client/transport/cancellable.Do /home/rodolfo/openshift/src/github.com/openshift/source-to-image/_output/local/go/src/github.com/openshift/source-to-image/vendor/github.com/docker/engine-api/client/transport/cancellable/cancellable.go:85 +0x2c3 ```

And here is the output with --loglevel=7 just in case:

``` + s2i build --rm --incremental --loglevel=7 . accursoft/ghc-network test I1027 21:05:24.270559 5142 main.go:68] Running S2I version "v1.1.3" I1027 21:05:24.274647 5142 docker.go:515] Using locally available image "accursoft/ghc-network:latest" I1027 21:05:24.276769 5142 main.go:147] Builder Name: Haskell Builder Image: accursoft/ghc-network Source: . Output Image Tag: test Environment: Labels: Incremental Build: enabled Incremental Image Pull User: Remove Old Build: enabled Builder Pull Policy: if-not-present Previous Image Pull Policy: if-not-present Quiet: disabled Layered Build: disabled Docker Endpoint: unix:///var/run/docker.sock I1027 21:05:24.293843 5142 docker.go:515] Using locally available image "accursoft/ghc-network:latest" I1027 21:05:24.297759 5142 docker.go:515] Using locally available image "accursoft/ghc-network:latest" I1027 21:05:24.297850 5142 docker.go:726] Image sha256:fe41e4ce0010a5c5cc3bef2cd5b1fd6587c4b9a17b03d8b6d83ba21f93cbfcf1 contains io.openshift.s2i.scripts-url set to "image://opt/s2i" I1027 21:05:24.297918 5142 scm.go:21] DownloadForSource . I1027 21:05:24.297973 5142 git.go:208] makePathAbsolute . 
I1027 21:05:24.297999 5142 git.go:211] makePathAbsolute new path /vagrant/Haskell-Cloud/server err
I1027 21:05:24.298010 5142 scm.go:28] return from ParseFile file exists true proto specified false use copy true
I1027 21:05:24.298019 5142 scm.go:40] new source from parse file /vagrant/Haskell-Cloud/server
I1027 21:05:24.298053 5142 sti.go:178] Preparing to build test
I1027 21:05:24.298644 5142 download.go:30] Copying sources from "/vagrant/Haskell-Cloud/server" to "/tmp/s2i420240331/upload/src"
I1027 21:05:24.298773 5142 fs.go:117] F "/vagrant/Haskell-Cloud/server/.hg_archival.txt" -> "/tmp/s2i420240331/upload/src/.hg_archival.txt"
I1027 21:05:24.298881 5142 fs.go:117] F "/vagrant/Haskell-Cloud/server/Main.hs" -> "/tmp/s2i420240331/upload/src/Main.hs"
I1027 21:05:24.298976 5142 fs.go:117] F "/vagrant/Haskell-Cloud/server/server.cabal" -> "/tmp/s2i420240331/upload/src/server.cabal"
I1027 21:05:24.299088 5142 fs.go:117] F "/vagrant/Haskell-Cloud/server/response" -> "/tmp/s2i420240331/upload/src/response"
I1027 21:05:24.299130 5142 fs.go:104] D "/vagrant/Haskell-Cloud/server/.s2i" -> "/tmp/s2i420240331/upload/src/.s2i"
I1027 21:05:24.299208 5142 fs.go:104] D "/vagrant/Haskell-Cloud/server/.s2i/hooks" -> "/tmp/s2i420240331/upload/src/.s2i/hooks"
I1027 21:05:24.299295 5142 fs.go:104] D "/vagrant/Haskell-Cloud/server/.s2i/markers" -> "/tmp/s2i420240331/upload/src/.s2i/markers"
I1027 21:05:24.299490 5142 install.go:249] Using "assemble" installed from "image://opt/s2i/assemble"
I1027 21:05:24.299532 5142 install.go:249] Using "run" installed from "image://opt/s2i/run"
I1027 21:05:24.299558 5142 install.go:249] Using "save-artifacts" installed from "image://opt/s2i/save-artifacts"
I1027 21:05:24.299583 5142 ignore.go:63] .s2iignore file does not exist
I1027 21:05:24.302095 5142 docker.go:515] Using locally available image "test:latest"
I1027 21:05:24.303736 5142 sti.go:186] Existing image for tag test detected for incremental build
I1027 21:05:24.303771 5142 sti.go:191] Performing source build from file:///vagrant/Haskell-Cloud/server
I1027 21:05:24.303859 5142 sti.go:431] Saving build artifacts from image test to path /tmp/s2i420240331/upload/artifacts
I1027 21:05:24.305396 5142 sti.go:443] The assemble user is not set, defaulting to "build" user
I1027 21:05:24.308002 5142 docker.go:726] Image sha256:531dccfefc6728198d2eddd7b60d1f73f9830b0da8de3c23e999b3a41b358cdf contains io.openshift.s2i.scripts-url set to "image://opt/s2i"
I1027 21:05:24.308030 5142 docker.go:801] Base directory for S2I scripts is 'opt/s2i'. Untarring destination is '/tmp'.
I1027 21:05:24.308042 5142 docker.go:930] Setting "opt/s2i/save-artifacts" command for container ...
I1027 21:05:24.308148 5142 docker.go:935] Creating container with options {Name:"s2i_test_dac254ac" Config:&{Hostname: Domainname: User:build AttachStdin:false AttachStdout:true AttachStderr:false ExposedPorts:map[] Tty:false OpenStdin:false StdinOnce:false Env:[] Cmd:[opt/s2i/save-artifacts] Healthcheck: ArgsEscaped:false Image:test:latest Volumes:map[] WorkingDir: Entrypoint:[] NetworkDisabled:false MacAddress: OnBuild:[] Labels:map[] StopSignal: StopTimeout: Shell:[]} HostConfig:&{Binds:[] ContainerIDFile: LogConfig:{Type: Config:map[]} NetworkMode: PortBindings:map[] RestartPolicy:{Name: MaximumRetryCount:0} AutoRemove:false VolumeDriver: VolumesFrom:[] CapAdd:[] CapDrop:[] DNS:[] DNSOptions:[] DNSSearch:[] ExtraHosts:[] GroupAdd:[] IpcMode: Cgroup: Links:[] OomScoreAdj:0 PidMode: Privileged:false PublishAllPorts:false ReadonlyRootfs:false SecurityOpt:[] StorageOpt:map[] Tmpfs:map[] UTSMode: UsernsMode: ShmSize:0 Sysctls:map[] Runtime: ConsoleSize:[0 0] Isolation: Resources:{CPUShares:0 Memory:0 CgroupParent: BlkioWeight:0 BlkioWeightDevice:[] BlkioDeviceReadBps:[] BlkioDeviceWriteBps:[] BlkioDeviceReadIOps:[] BlkioDeviceWriteIOps:[] CPUPeriod:0 CPUQuota:0 CpusetCpus: CpusetMems: Devices:[] DiskQuota:0 KernelMemory:0 MemoryReservation:0 MemorySwap:0 MemorySwappiness: OomKillDisable: PidsLimit:0 Ulimits:[] CPUCount:0 CPUPercent:0 IOMaximumIOps:0 IOMaximumBandwidth:0} Mounts:[]}} ...
I1027 21:05:24.607999 5142 docker.go:969] Attaching to container "77f30a1f4fd08373045f6cd37895dd5fe61ce03ff3d1ae38d0c49e53ecb8bb4c" ...
I1027 21:05:24.609588 5142 docker.go:1007] Starting container "77f30a1f4fd08373045f6cd37895dd5fe61ce03ff3d1ae38d0c49e53ecb8bb4c" ...
I1027 21:05:24.967117 5142 docker.go:1033] Waiting for container "77f30a1f4fd08373045f6cd37895dd5fe61ce03ff3d1ae38d0c49e53ecb8bb4c" to stop ...
I1027 21:05:24.976861 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal
I1027 21:05:24.977254 5142 tar.go:373] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal
I1027 21:05:24.977297 5142 tar.go:424] Creating /tmp/s2i420240331/upload/artifacts/.cabal/config
I1027 21:05:24.977362 5142 tar.go:434] Extracting/writing /tmp/s2i420240331/upload/artifacts/.cabal/config
I1027 21:05:24.977473 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/bin
I1027 21:05:24.977571 5142 tar.go:373] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/bin
I1027 21:05:24.977662 5142 tar.go:424] Creating /tmp/s2i420240331/upload/artifacts/.cabal/bin/server
I1027 21:05:24.977730 5142 tar.go:434] Extracting/writing /tmp/s2i420240331/upload/artifacts/.cabal/bin/server
I1027 21:05:25.286007 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/lib
I1027 21:05:25.286144 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/lib/x86_64-linux-ghc-8.0.1
I1027 21:05:25.286228 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/lib/x86_64-linux-ghc-8.0.1/groups-0.4.0.0-53wMJ09LBR64aEJqWveBMG
I1027 21:05:25.286321 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/lib/x86_64-linux-ghc-8.0.1/groups-0.4.0.0-53wMJ09LBR64aEJqWveBMG/Data
I1027 21:05:25.286412 5142 tar.go:373] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/lib/x86_64-linux-ghc-8.0.1/groups-0.4.0.0-53wMJ09LBR64aEJqWveBMG/Data
I1027 21:05:25.286441 5142 tar.go:424] Creating /tmp/s2i420240331/upload/artifacts/.cabal/lib/x86_64-linux-ghc-8.0.1/groups-0.4.0.0-53wMJ09LBR64aEJqWveBMG/Data/Group.hi
I1027 21:05:25.286502 5142 tar.go:434] Extracting/writing /tmp/s2i420240331/upload/artifacts/.cabal/lib/x86_64-linux-ghc-8.0.1/groups-0.4.0.0-53wMJ09LBR64aEJqWveBMG/Data/Group.hi
I1027 21:05:25.287685 5142 tar.go:373] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/lib/x86_64-linux-ghc-8.0.1/groups-0.4.0.0-53wMJ09LBR64aEJqWveBMG
I1027 21:05:25.287718 5142 tar.go:424] Creating /tmp/s2i420240331/upload/artifacts/.cabal/lib/x86_64-linux-ghc-8.0.1/groups-0.4.0.0-53wMJ09LBR64aEJqWveBMG/libHSgroups-0.4.0.0-53wMJ09LBR64aEJqWveBMG.a
I1027 21:05:25.287811 5142 tar.go:434] Extracting/writing /tmp/s2i420240331/upload/artifacts/.cabal/lib/x86_64-linux-ghc-8.0.1/groups-0.4.0.0-53wMJ09LBR64aEJqWveBMG/libHSgroups-0.4.0.0-53wMJ09LBR64aEJqWveBMG.a
I1027 21:05:25.289045 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/packages
I1027 21:05:25.289166 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org
I1027 21:05:25.289273 5142 tar.go:373] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org
I1027 21:05:25.289298 5142 tar.go:424] Creating /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/00-index.tar.gz
I1027 21:05:25.289408 5142 tar.go:434] Extracting/writing /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/00-index.tar.gz
I1027 21:05:26.142156 5142 tar.go:373] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org
I1027 21:05:26.142193 5142 tar.go:424] Creating /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/00-index.tar.gz.etag
I1027 21:05:26.142295 5142 tar.go:434] Extracting/writing /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/00-index.tar.gz.etag
I1027 21:05:26.142366 5142 tar.go:373] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org
I1027 21:05:26.142383 5142 tar.go:424] Creating /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/build-reports.log
I1027 21:05:26.142439 5142 tar.go:434] Extracting/writing /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/build-reports.log
I1027 21:05:26.142495 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/groups
I1027 21:05:26.301410 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/groups/0.4.0.0
I1027 21:05:26.301634 5142 tar.go:373] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/groups/0.4.0.0
I1027 21:05:26.301660 5142 tar.go:424] Creating /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/groups/0.4.0.0/groups-0.4.0.0.tar.gz
I1027 21:05:26.301711 5142 tar.go:434] Extracting/writing /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/groups/0.4.0.0/groups-0.4.0.0.tar.gz
I1027 21:05:26.302224 5142 tar.go:373] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/groups/0.4.0.0
I1027 21:05:26.302292 5142 tar.go:424] Creating /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/groups/0.4.0.0/groups-0.4.0.0.tar.gz.etag
I1027 21:05:26.302367 5142 tar.go:434] Extracting/writing /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/groups/0.4.0.0/groups-0.4.0.0.tar.gz.etag
I1027 21:05:26.302419 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/share
I1027 21:05:26.302490 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/share/doc
I1027 21:05:26.302553 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/share/doc/x86_64-linux-ghc-8.0.1
I1027 21:05:26.302635 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/share/doc/x86_64-linux-ghc-8.0.1/groups-0.4.0.0
I1027 21:05:26.303115 5142 tar.go:373] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/share/doc/x86_64-linux-ghc-8.0.1/groups-0.4.0.0
I1027 21:05:26.303144 5142 tar.go:424] Creating /tmp/s2i420240331/upload/artifacts/.cabal/share/doc/x86_64-linux-ghc-8.0.1/groups-0.4.0.0/LICENSE
I1027 21:05:26.303195 5142 tar.go:434] Extracting/writing /tmp/s2i420240331/upload/artifacts/.cabal/share/doc/x86_64-linux-ghc-8.0.1/groups-0.4.0.0/LICENSE
I1027 21:05:26.303276 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/share/x86_64-linux-ghc-8.0.1
I1027 21:05:26.303379 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/share/x86_64-linux-ghc-8.0.1/server-0.0
I1027 21:05:26.303504 5142 tar.go:373] Creating directory /tmp/s2i420240331/upload/artifacts/.cabal/share/x86_64-linux-ghc-8.0.1/server-0.0
I1027 21:05:26.303545 5142 tar.go:424] Creating /tmp/s2i420240331/upload/artifacts/.cabal/share/x86_64-linux-ghc-8.0.1/server-0.0/response
I1027 21:05:26.303614 5142 tar.go:434] Extracting/writing /tmp/s2i420240331/upload/artifacts/.cabal/share/x86_64-linux-ghc-8.0.1/server-0.0/response
I1027 21:05:26.303709 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.ghc
I1027 21:05:26.303809 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.ghc/x86_64-linux-8.0.1
I1027 21:05:26.303899 5142 tar.go:364] Creating directory /tmp/s2i420240331/upload/artifacts/.ghc/x86_64-linux-8.0.1/package.conf.d
I1027 21:05:26.303993 5142 tar.go:373] Creating directory /tmp/s2i420240331/upload/artifacts/.ghc/x86_64-linux-8.0.1/package.conf.d
I1027 21:05:26.304035 5142 tar.go:424] Creating /tmp/s2i420240331/upload/artifacts/.ghc/x86_64-linux-8.0.1/package.conf.d/groups-0.4.0.0-53wMJ09LBR64aEJqWveBMG.conf
I1027 21:05:26.304106 5142 tar.go:434] Extracting/writing /tmp/s2i420240331/upload/artifacts/.ghc/x86_64-linux-8.0.1/package.conf.d/groups-0.4.0.0-53wMJ09LBR64aEJqWveBMG.conf
I1027 21:05:26.304224 5142 tar.go:403] Done extracting tar stream

<<< 2-minute pause >>>

I1027 21:07:26.530096 5142 docker.go:948] Removing container "77f30a1f4fd08373045f6cd37895dd5fe61ce03ff3d1ae38d0c49e53ecb8bb4c" ...
I1027 21:07:26.870675 5142 docker.go:952] Removed container "77f30a1f4fd08373045f6cd37895dd5fe61ce03ff3d1ae38d0c49e53ecb8bb4c"
I1027 21:07:26.870805 5142 sti.go:202] Running "assemble" in "test"
I1027 21:07:26.870849 5142 sti.go:481] Using image name accursoft/ghc-network
I1027 21:07:26.870890 5142 sti.go:380] No user environment provided (no environment file found in application sources)
I1027 21:07:26.871127 5142 sti.go:606]
I1027 21:07:26.871127 5142 sti.go:589] starting the source uploading ...
I1027 21:07:26.871194 5142 tar.go:250] Adding "/tmp/s2i420240331/upload" to tar ...
I1027 21:07:26.871452 5142 tar.go:320] Adding to tar: /tmp/s2i420240331/upload/artifacts/.cabal/bin/server as artifacts/.cabal/bin/server
I1027 21:07:26.873699 5142 docker.go:726] Image sha256:fe41e4ce0010a5c5cc3bef2cd5b1fd6587c4b9a17b03d8b6d83ba21f93cbfcf1 contains io.openshift.s2i.scripts-url set to "image://opt/s2i"
I1027 21:07:26.873733 5142 docker.go:801] Base directory for S2I scripts is 'opt/s2i'. Untarring destination is '/tmp'.
I1027 21:07:26.873751 5142 docker.go:930] Setting "/bin/sh -c tar -C /tmp -xf - && opt/s2i/assemble" command for container ...
I1027 21:07:26.873887 5142 docker.go:935] Creating container with options {Name:"s2i_accursoft_ghc_network_7d3370e6" Config:&{Hostname: Domainname: User: AttachStdin:false AttachStdout:true AttachStderr:false ExposedPorts:map[] Tty:false OpenStdin:true StdinOnce:true Env:[] Cmd:[/bin/sh -c tar -C /tmp -xf - && opt/s2i/assemble] Healthcheck: ArgsEscaped:false Image:accursoft/ghc-network:latest Volumes:map[] WorkingDir: Entrypoint:[] NetworkDisabled:false MacAddress: OnBuild:[] Labels:map[] StopSignal: StopTimeout: Shell:[]} HostConfig:&{Binds:[] ContainerIDFile: LogConfig:{Type: Config:map[]} NetworkMode: PortBindings:map[] RestartPolicy:{Name: MaximumRetryCount:0} AutoRemove:false VolumeDriver: VolumesFrom:[] CapAdd:[] CapDrop:[] DNS:[] DNSOptions:[] DNSSearch:[] ExtraHosts:[] GroupAdd:[] IpcMode: Cgroup: Links:[] OomScoreAdj:0 PidMode: Privileged:false PublishAllPorts:false ReadonlyRootfs:false SecurityOpt:[] StorageOpt:map[] Tmpfs:map[] UTSMode: UsernsMode: ShmSize:0 Sysctls:map[] Runtime: ConsoleSize:[0 0] Isolation: Resources:{CPUShares:0 Memory:0 CgroupParent: BlkioWeight:0 BlkioWeightDevice:[] BlkioDeviceReadBps:[] BlkioDeviceWriteBps:[] BlkioDeviceReadIOps:[] BlkioDeviceWriteIOps:[] CPUPeriod:0 CPUQuota:0 CpusetCpus: CpusetMems: Devices:[] DiskQuota:0 KernelMemory:0 MemoryReservation:0 MemorySwap:0 MemorySwappiness: OomKillDisable: PidsLimit:0 Ulimits:[] CPUCount:0 CPUPercent:0 IOMaximumIOps:0 IOMaximumBandwidth:0} Mounts:[]}} ...
I1027 21:07:27.151610 5142 docker.go:969] Attaching to container "d6ad92246a06e81c55ffc5dc8e8a68072f8ce4dc2c57e9580b06e36b3ffd6d5e" ...
I1027 21:07:27.153251 5142 docker.go:1007] Starting container "d6ad92246a06e81c55ffc5dc8e8a68072f8ce4dc2c57e9580b06e36b3ffd6d5e" ...
I1027 21:07:27.395659 5142 docker.go:1033] Waiting for container "d6ad92246a06e81c55ffc5dc8e8a68072f8ce4dc2c57e9580b06e36b3ffd6d5e" to stop ...
I1027 21:07:27.410135 5142 tar.go:320] Adding to tar: /tmp/s2i420240331/upload/artifacts/.cabal/config as artifacts/.cabal/config
I1027 21:07:27.410434 5142 tar.go:320] Adding to tar: /tmp/s2i420240331/upload/artifacts/.cabal/lib/x86_64-linux-ghc-8.0.1/groups-0.4.0.0-53wMJ09LBR64aEJqWveBMG/Data/Group.hi as artifacts/.cabal/lib/x86_64-linux-ghc-8.0.1/groups-0.4.0.0-53wMJ09LBR64aEJqWveBMG/Data/Group.hi
I1027 21:07:27.410534 5142 tar.go:320] Adding to tar: /tmp/s2i420240331/upload/artifacts/.cabal/lib/x86_64-linux-ghc-8.0.1/groups-0.4.0.0-53wMJ09LBR64aEJqWveBMG/libHSgroups-0.4.0.0-53wMJ09LBR64aEJqWveBMG.a as artifacts/.cabal/lib/x86_64-linux-ghc-8.0.1/groups-0.4.0.0-53wMJ09LBR64aEJqWveBMG/libHSgroups-0.4.0.0-53wMJ09LBR64aEJqWveBMG.a
I1027 21:07:27.411344 5142 tar.go:320] Adding to tar: /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/00-index.tar.gz as artifacts/.cabal/packages/hackage.haskell.org/00-index.tar.gz
I1027 21:07:27.450347 5142 tar.go:320] Adding to tar: /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/00-index.tar.gz.etag as artifacts/.cabal/packages/hackage.haskell.org/00-index.tar.gz.etag
I1027 21:07:27.450951 5142 tar.go:320] Adding to tar: /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/build-reports.log as artifacts/.cabal/packages/hackage.haskell.org/build-reports.log
I1027 21:07:27.451124 5142 tar.go:320] Adding to tar: /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/groups/0.4.0.0/groups-0.4.0.0.tar.gz as artifacts/.cabal/packages/hackage.haskell.org/groups/0.4.0.0/groups-0.4.0.0.tar.gz
I1027 21:07:27.451267 5142 tar.go:320] Adding to tar: /tmp/s2i420240331/upload/artifacts/.cabal/packages/hackage.haskell.org/groups/0.4.0.0/groups-0.4.0.0.tar.gz.etag as artifacts/.cabal/packages/hackage.haskell.org/groups/0.4.0.0/groups-0.4.0.0.tar.gz.etag
I1027 21:07:27.451461 5142 tar.go:320] Adding to tar: /tmp/s2i420240331/upload/artifacts/.cabal/share/doc/x86_64-linux-ghc-8.0.1/groups-0.4.0.0/LICENSE as artifacts/.cabal/share/doc/x86_64-linux-ghc-8.0.1/groups-0.4.0.0/LICENSE
I1027 21:07:27.451586 5142 tar.go:320] Adding to tar: /tmp/s2i420240331/upload/artifacts/.cabal/share/x86_64-linux-ghc-8.0.1/server-0.0/response as artifacts/.cabal/share/x86_64-linux-ghc-8.0.1/server-0.0/response
I1027 21:07:27.451822 5142 tar.go:320] Adding to tar: /tmp/s2i420240331/upload/artifacts/.ghc/x86_64-linux-8.0.1/package.conf.d/groups-0.4.0.0-53wMJ09LBR64aEJqWveBMG.conf as artifacts/.ghc/x86_64-linux-8.0.1/package.conf.d/groups-0.4.0.0-53wMJ09LBR64aEJqWveBMG.conf
I1027 21:07:27.451963 5142 tar.go:320] Adding to tar: /tmp/s2i420240331/upload/src/.hg_archival.txt as src/.hg_archival.txt
I1027 21:07:27.452110 5142 tar.go:320] Adding to tar: /tmp/s2i420240331/upload/src/Main.hs as src/Main.hs
I1027 21:07:27.452173 5142 tar.go:320] Adding to tar: /tmp/s2i420240331/upload/src/response as src/response
I1027 21:07:27.452271 5142 tar.go:320] Adding to tar: /tmp/s2i420240331/upload/src/server.cabal as src/server.cabal
I1027 21:07:28.583213 5142 sti.go:621] 3 CPUs available for parallel builds
I1027 21:07:31.799779 5142 sti.go:621] Resolving dependencies...
I1027 21:07:31.968767 5142 sti.go:621] Configuring server-0.0...
I1027 21:07:32.213014 5142 sti.go:621] Building server-0.0...
I1027 21:07:33.156561 5142 sti.go:621] Installed server-0.0
I1027 21:07:33.593911 5142 docker.go:1062] Invoking PostExecute function
I1027 21:07:33.593944 5142 postexecutorstep.go:57] Executing step: store previous image
I1027 21:07:33.593951 5142 sti.go:624]
I1027 21:07:33.595561 5142 postexecutorstep.go:109] Executing step: commit image
I1027 21:07:33.598831 5142 docker.go:1105] Committing container with dockerOpts: {Reference:test Comment: Author: Changes:[] Pause:false Config:0xc4203acb40}, config: {Hostname: Domainname: User:build AttachStdin:false AttachStdout:false AttachStderr:false ExposedPorts:map[] Tty:false OpenStdin:false StdinOnce:false Env:[] Cmd:[opt/s2i/run] Healthcheck: ArgsEscaped:false Image: Volumes:map[] WorkingDir: Entrypoint:[] NetworkDisabled:false MacAddress: OnBuild:[] Labels:map[io.k8s.display-name:test io.openshift.expose-services:8080:http io.openshift.s2i.scripts-url:image://opt/s2i io.openshift.s2i.build.image:accursoft/ghc-network io.openshift.s2i.build.source-location:/vagrant/Haskell-Cloud/server io.openshift.tags:haskell,ghc,network,ghc-8.0.1,cabal-install-1.24.0.0,network-2.6.3.1,builder io.k8s.description:GHC and cabal-install] StopSignal: StopTimeout: Shell:[]}
I1027 21:07:34.902182 5142 postexecutorstep.go:370] Executing step: report success
I1027 21:07:34.902242 5142 postexecutorstep.go:375] Successfully built test
I1027 21:07:34.902276 5142 postexecutorstep.go:82] Executing step: remove previous image
I1027 21:07:34.902312 5142 postexecutorstep.go:96] Removing previously-tagged image sha256:531dccfefc6728198d2eddd7b60d1f73f9830b0da8de3c23e999b3a41b358cdf
I1027 21:07:35.083507 5142 postexecutorstep.go:393] Skipping step: invoke callback url
I1027 21:07:35.083532 5142 docker.go:948] Removing container "d6ad92246a06e81c55ffc5dc8e8a68072f8ce4dc2c57e9580b06e36b3ffd6d5e" ...
I1027 21:07:35.461912 5142 docker.go:952] Removed container "d6ad92246a06e81c55ffc5dc8e8a68072f8ce4dc2c57e9580b06e36b3ffd6d5e"
I1027 21:07:35.461985 5142 cleanup.go:33] Removing temporary directory /tmp/s2i420240331
I1027 21:07:35.461994 5142 fs.go:159] Removing directory '/tmp/s2i420240331'
```
rhcarvalho commented 8 years ago

And to increase the confidence that there is a bug:

$ git grep -n '2 \* time.Minute'
pkg/docker/docker.go:68:        DefaultDockerTimeout = 2 * time.Minute

I'll be happy to fix this issue :smile:

rhcarvalho commented 8 years ago

The code that causes the 2 min pause on the happy path is here: pkg/docker/docker.go#L1068-L1073

I could not write any small patch that would not introduce other problems :(

Trying to write an integration test that reproduces the problem, so that we get not just a fix, but also a regression test.

accursoft commented 8 years ago

@rhcarvalho please see my earlier comment - the root cause lies much deeper.

accursoft commented 7 years ago

Any progress with this?

rhcarvalho commented 7 years ago

@accursoft I haven't had the time I wished to devote to this, and had hoped #625 (though in spirit unrelated) would improve the situation (it contains some good commits cleaning up synchronization issues). I re-ran your code against that PR's branch and realized that the builds would actually hang forever, not the 2 min we see now.

It would be really helpful to have a smaller/quicker example that reproduces the issue, ideally in the form of an integration test (something I started, but then got pulled into other work).

@bparees if there is anybody with more bandwidth to work on this, feel free to reassign.

jim-minter commented 7 years ago

@accursoft taking a look at this - when I run ./test it eventually bugs out with:

  HC [stage 1] libraries/base/dist-install/build/GHC/Unicode.o
  HC [stage 1] libraries/base/dist-install/build/Text/ParserCombinators/ReadP.o
  HC [stage 1] libraries/base/dist-install/build/Text/ParserCombinators/ReadPrec.o
  HC [stage 1] libraries/base/dist-install/build/Text/Read/Lex.o
/usr/bin/ld: fatal error: -pie and -r are incompatible
collect2: error: ld returned 1 exit status
`gcc' failed in phase `Linker'. (Exit code: 1)
make[1]: *** [libraries/base/dist-install/build/GHC/Float.o] Error 1
libraries/base/ghc.mk:4: recipe for target 'libraries/base/dist-install/build/GHC/Float.o' failed
Makefile:129: recipe for target 'all' failed
make: *** [all] Error 2
The command '/bin/sh -c /tmp/ghc.sh' returned a non-zero code: 2

Any thoughts?

jim-minter commented 7 years ago

@accursoft if you've got working images sitting somewhere public I can pull them if that's easiest?

accursoft commented 7 years ago

https://www.mail-archive.com/debian-haskell@lists.debian.org/msg07168.html

I'll have a look at it - might just need to update some dependencies.

jim-minter commented 7 years ago

@accursoft https://ghc.haskell.org/trac/ghc/ticket/12759 looks possible?

accursoft commented 7 years ago

@jim-minter Yes, that's the culprit. I'm checking if this has been fixed in Stretch.

jim-minter commented 7 years ago

I'm trying to patch image/ghc.sh according to https://ghc.haskell.org/trac/ghc/ticket/12759#comment:5

rhcarvalho commented 7 years ago

@jim-minter @accursoft my local checkout is pointing at https://github.com/accursoft/Haskell-Cloud/commit/47b1dcc56f8b30effa9c410cd65860f4b04abdde, and I did not get that error. Maybe it was introduced in one of the more recent commits.

Building the images does take a lot of time though.

accursoft commented 7 years ago

The build error is fixed in ghc 8.0.2, currently in RC. I'll update the repo when it's released. You need to change 2 lines in image/ghc.sh to use the RC:

--- a/image/ghc.sh      Mon Oct 31 19:24:02 2016 +0000
+++ b/image/ghc.sh      Tue Nov 29 14:24:42 2016 +0000
@@ -34,7 +34,7 @@
 echo "silent
 show-error" >>~/.curlrc
 echo "Downloading GHC ..."
-curl https://downloads.haskell.org/~ghc/8.0.1/ghc-8.0.1-src.tar.xz | tar xJ
+curl https://downloads.haskell.org/~ghc/8.0.2-rc1/ghc-8.0.1.20161117-src.tar.xz | tar xJ
 cd ghc-*

 #hpc, hp2ps, runghc and iserv not needed
@@ -66,4 +66,4 @@
                        /usr/lib/gcc/x86_64-linux-gnu/6/*.a \
                        /usr/lib/gcc/x86_64-linux-gnu/6/cc1 \
                        /usr/lib/gcc/x86_64-linux-gnu/6/lto1 \
-                       /usr/local/lib/ghc-8.0.1/rts/*
\ No newline at end of file
+                       /usr/local/lib/ghc-*/rts/*
\ No newline at end of file
jim-minter commented 7 years ago

@accursoft thanks for the bug report. I now understand the issue - Debian's tar appends 6144 '\0' bytes to the end of its tar stream, which the Go tar reader doesn't consume. Because that data is stuck in flight, the container never ends up closing. The fix will be for s2i to drain any remaining data in the pipe to /dev/null. It's a straightforward fix which is best done as part of, or on top of, #625 since that PR reworks this area anyway.

jim-minter commented 7 years ago

Update - the patch that should fix this is in PR #625.