fabric8io / jenkins-docker

Dockerfile for a Jenkins Docker image

Permissions Problems with Jenkins #110

Closed magick93 closed 7 years ago

magick93 commented 7 years ago

Problem

When running a Jenkins job, the job fails with:


[Pipeline] sh
[copy-of-sw] Running shell script
Executing shell script inside container [maven] of pod [kubernetes-b23305f2ea4d4f5ca6225ee0fe1d9db0-311b84f28bd9e]
Executing command: sh -c echo $$ > '/home/jenkins/workspace/copy-of-sw@tmp/durable-84010d23/pid'; jsc=durable-e271a92b5b252a3996e9c5847d57b90c; JENKINS_SERVER_COOKIE=$jsc '/home/jenkins/workspace/copy-of-sw@tmp/durable-84010d23/script.sh' > '/home/jenkins/workspace/copy-of-sw@tmp/durable-84010d23/jenkins-log.txt' 2>&1; echo $? > '/home/jenkins/workspace/copy-of-sw@tmp/durable-84010d23/jenkins-result.txt' 
$ cd /home/jenkins/workspace/copy-of-sw
sh -c echo $$ > '/home/jenkins/workspace/copy-of-sw@tmp/durable-84010d23/pid'; jsc=durable-e271a92b5b252a3996e9c5847d57b90c; JENKINS_SERVER_COOKIE=$jsc '/home/jenkins/workspace/copy-of-sw@tmp/durable-84010d23/script.sh' > '/home/jenkins/workspace/copy-of-sw@tmp/durable-84010d23/jenkins-log.txt' 2>&1; echo $? > '/home/jenkins/workspace/copy-of-sw@tmp/durable-84010d23/jenkins-result.txt' 
exit
$ /bin/sh: 2: cannot create /home/jenkins/workspace/copy-of-sw@tmp/durable-84010d23/pid: Permission denied
/bin/sh: 2: cannot create /home/jenkins/workspace/copy-of-sw@tmp/durable-84010d23/jenkins-log.txt: Permission denied
/bin/sh: 2: cannot create /home/jenkins/workspace/copy-of-sw@tmp/durable-84010d23/jenkins-result.txt: Permission denied
$ command terminated with non-zero exit code: Error executing in Docker Container: 2[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] End of Pipeline
ERROR: script returned exit code -2
Finished: FAILURE

Describe Pod


Name:           jenkins-1-fmkhf
Namespace:      default
Security Policy:    jenkins
Node:           176.9.36.15/176.9.36.15
Start Time:     Mon, 06 Feb 2017 10:22:47 +0100
Labels:         deployment=jenkins-1
            deploymentconfig=jenkins
            group=io.fabric8.devops.apps
            project=jenkins
            provider=fabric8
            version=2.2.311
Status:         Running
IP:         172.17.0.5
Controllers:        ReplicationController/jenkins-1
Init Containers:
  init:
    Container ID:   docker://fba4e604ace68b779201c95fb650b15b051895ff56e7a937d2fe079765c9476d
    Image:      busybox
    Image ID:       docker-pullable://docker.io/busybox@sha256:817a12c32a39bbe394944ba49de563e085f1d3c5266eb8e9723256bc4448680e
    Port:       
    Command:
      chmod
      777
      /var/jenkins_home/workspace
      /var/jenkins_home/jobs
    State:      Terminated
      Reason:       Completed
      Exit Code:    0
      Started:      Mon, 06 Feb 2017 10:22:57 +0100
      Finished:     Mon, 06 Feb 2017 10:22:57 +0100
    Ready:      True
    Restart Count:  0
    Volume Mounts:
      /var/jenkins_home/jobs from jenkins-jobs (rw)
      /var/jenkins_home/workspace from jenkins-workspace (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from jenkins-token-t7v0j (ro)
    Environment Variables:  <none>
Containers:
  jenkins:
    Container ID:   docker://2d11d0ac717a8460c303d97ee6c6a80f3c828b2e268c3f18bb878935c31bb028
    Image:      fabric8/jenkins-docker:2.2.311
    Image ID:       docker-pullable://docker.io/fabric8/jenkins-docker@sha256:d2da5a18524d06ec9e310efa4c11d96d6cc2f77309a13f8077dfe08ba762a71e
    Ports:      50000/TCP, 8080/TCP
    Limits:
      cpu:  0
      memory:   0
    Requests:
      cpu:      0
      memory:       0
    State:      Running
      Started:      Mon, 06 Feb 2017 10:23:02 +0100
    Ready:      True
    Restart Count:  0
    Liveness:       http-get http://:8080/blue/ delay=120s timeout=10s period=10s #success=1 #failure=3
    Readiness:      http-get http://:8080/blue/ delay=20s timeout=10s period=10s #success=1 #failure=3
    Volume Mounts:
      /var/jenkins_home/jobs from jenkins-jobs (rw)
      /var/jenkins_home/workspace from jenkins-workspace (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from jenkins-token-t7v0j (ro)
    Environment Variables:
      PROJECT_VERSION:          <set to the key 'project-version' of config map 'jenkins'>
      PIPELINE_ELASTICSEARCH_PROTOCOL:  <set to the key 'pipeline-elasticsearch-protocol' of config map 'jenkins'>
      JENKINS_GOGS_PASSWORD:        <set to the key 'jenkins-gogs-password' of config map 'jenkins'>
      JENKINS_GOGS_USER:        <set to the key 'jenkins-gogs-user' of config map 'jenkins'>
      JENKINS_GOGS_EMAIL:       <set to the key 'jenkins-gogs-email' of config map 'jenkins'>
      PIPELINE_ELASTICSEARCH_HOST:  <set to the key 'pipeline-elasticsearch-host' of config map 'jenkins'>
      KUBERNETES_NAMESPACE:     default (v1:metadata.namespace)
      KUBERNETES_MASTER:        https://kubernetes.default
Conditions:
  Type      Status
  Initialized   True 
  Ready     True 
  PodScheduled  True 
Volumes:
  jenkins-jobs:
    Type:   PersistentVolumeClaim (a reference to a PersistentVolumeClaim in the same namespace)
    ClaimName:  jenkins-jobs
    ReadOnly:   false
  jenkins-workspace:
    Type:   PersistentVolumeClaim (a reference to a PersistentVolumeClaim in the same namespace)
    ClaimName:  jenkins-workspace
    ReadOnly:   false
  jenkins-token-t7v0j:
    Type:   Secret (a volume populated by a Secret)
    SecretName: jenkins-token-t7v0j
QoS Class:  BestEffort
Tolerations:    <none>
Events:
  FirstSeen LastSeen    Count   From            SubobjectPath           Type        Reason      Message
  --------- --------    -----   ----            -------------           --------    ------      -------
  9m        9m      1   {default-scheduler }                    Normal      Scheduled   Successfully assigned jenkins-1-fmkhf to 176.9.36.15
  9m        9m      1   {kubelet 176.9.36.15}   spec.initContainers{init}   Normal      Pulled      Container image "busybox" already present on machine
  9m        9m      1   {kubelet 176.9.36.15}   spec.initContainers{init}   Normal      Created     Created container with docker id fba4e604ace6; Security:[seccomp=unconfined]
  9m        9m      1   {kubelet 176.9.36.15}   spec.initContainers{init}   Normal      Started     Started container with docker id fba4e604ace6
  9m        9m      1   {kubelet 176.9.36.15}   spec.containers{jenkins}    Normal      Pulled      Container image "fabric8/jenkins-docker:2.2.311" already present on machine
  9m        9m      1   {kubelet 176.9.36.15}   spec.containers{jenkins}    Normal      Created     Created container with docker id 2d11d0ac717a; Security:[seccomp=unconfined]
  9m        9m      1   {kubelet 176.9.36.15}   spec.containers{jenkins}    Normal      Started     Started container with docker id 2d11d0ac717a
  8m        8m      2   {kubelet 176.9.36.15}   spec.containers{jenkins}    Warning     Unhealthy   Readiness probe failed: HTTP probe failed with statuscode: 503

Describe Jenkins SCC

$ oc describe scc jenkins
Name:                       jenkins
Priority:                   99
Access:                     
  Users:                    system:serviceaccount:default:admin,system:serviceaccount:default:jenkins,admin
  Groups:                   system:cluster-admins,system:nodes
Settings:                   
  Allow Privileged:             true
  Default Add Capabilities:         <none>
  Required Drop Capabilities:           <none>
  Allowed Capabilities:             <none>
  Allowed Volume Types:             *
  Allow Host Network:               true
  Allow Host Ports:             true
  Allow Host PID:               false
  Allow Host IPC:               false
  Read Only Root Filesystem:            false
  Run As User Strategy: RunAsAny        
    UID:                    <none>
    UID Range Min:              <none>
    UID Range Max:              <none>
  SELinux Context Strategy: RunAsAny        
    User:                   <none>
    Role:                   <none>
    Type:                   <none>
    Level:                  <none>
  FSGroup Strategy: RunAsAny            
    Ranges:                 <none>
  Supplemental Groups Strategy: RunAsAny    
    Ranges:                 <none>

Jenkinsfile

#!/usr/bin/groovy
@Library('github.com/fabric8io/fabric8-pipeline-library@master')
def localFailIfNoTests = ""
try {
  localFailIfNoTests = ITEST_FAIL_IF_NO_TEST
} catch (Throwable e) {
  localFailIfNoTests = "false"
}

def localItestPattern = ""
try {
  localItestPattern = ITEST_PATTERN
} catch (Throwable e) {
  localItestPattern = "*KT"
}

def versionPrefix = ""
try {
  versionPrefix = VERSION_PREFIX
} catch (Throwable e) {
  versionPrefix = "1.0"
}

def canaryVersion = "${versionPrefix}.${env.BUILD_NUMBER}"
def utils = new io.fabric8.Utils()
def buildLabel = "mylabel.${env.JOB_NAME}.${env.BUILD_NUMBER}".replace('-', '_').replace('/', '_')

podTemplate(label: buildLabel, 
 containers: [containerTemplate(alwaysPullImage: false, args: 'cat', command: '/bin/sh -c', 
        envVars: [
                    containerEnvVar(key: 'DOCKER_CONFIG', value: '/home/jenkins/.docker/')], 
        image: 'jhipster/jhipster', name: 'maven', privileged: true, resourceLimitCpu: '', resourceLimitMemory: '', resourceRequestCpu: '', resourceRequestMemory: '', ttyEnabled: true, workingDir: '/home/jenkins')],
 volumes: [hostPathVolume(hostPath: '/var/run/docker.sock', mountPath: '/var/run/docker.sock'), secretVolume(mountPath: '/root/.m2', secretName: 'jenkins-maven-settings'), secretVolume(mountPath: '/home/jenkins/.docker', secretName: 'jenkins-docker-cfg')],
 serviceAccount: 'jenkins') {
    node(buildLabel) {
        container(name: 'maven') {
            def envProd = 'shiftwork-production'

            checkout scm

            stage 'Canary Release'
            mavenCanaryRelease{
                version = canaryVersion
            }

            stage 'Integration Test'
            mavenIntegrationTest{
                environment = 'Testing'
                failIfNoTests = localFailIfNoTests
                itestPattern = localItestPattern
            }

            stage 'Rolling Upgrade Production'
            def rc = readFile 'target/classes/kubernetes.json'
            kubernetesApply(file: rc, environment: envProd)
        }
    }
}

Attempted fixes

rawlingsj commented 7 years ago

I wonder if this is related to using the old-style kubernetes-pipeline-plugin approach vs the new kubernetes-plugin style. I.e. you could try the new approach, assuming that jhipster/jhipster is your builder image, something like:

@Library('github.com/fabric8io/fabric8-pipeline-library@master')
def dummy
dockerTemplate{
  mavenNode(mavenImage: 'jhipster/jhipster') {

    container(name: 'maven') {
      checkout scm
      stage 'Canary Release'
      mavenCanaryRelease{
        version = canaryVersion
      }

      stage 'Integration Test'
      mavenIntegrationTest{
        environment = 'Testing'
        failIfNoTests = localFailIfNoTests
        itestPattern = localItestPattern
      }

      stage 'Rolling Upgrade Production'
      def rc = readFile 'target/classes/kubernetes.json'
      kubernetesApply(file: rc, environment: envProd)
    }
  }
}
magick93 commented 7 years ago

I get the same error with the above Jenkinsfile.

rawlingsj commented 7 years ago

Can you try one of the fabric8 "create new project" apps please? I.e. in the fabric8 console, create a new project, select, say, the Microservices project, and select the Canary Release and Stage pipeline to see if that goes through successfully.

magick93 commented 7 years ago

We are unable to log in to the console due to https://github.com/fabric8io/fabric8-console/issues/257

iocanel commented 7 years ago

I am wondering if this is related to the way arbitrary user IDs are handled in OpenShift.

magick93 commented 7 years ago

@rawlingsj - now that we can access the f8 console, we created a Spring Boot Web MVC QuickStart example app. It built fine the first time; subsequent builds fail with:

Timed out waiting for pods/services!
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 315.624 sec <<< FAILURE! - in org.example.KubernetesIntegrationKT
org.example.KubernetesIntegrationKT  Time elapsed: 315.623 sec  <<< ERROR!
java.lang.RuntimeException: java.lang.IllegalStateException: Failed to apply kubernetes configuration.

Nonetheless, this is hopefully helpful.

I should add that this test app was created in the same OpenShift project, and uses the same Jenkins, as our app that is having problems.

magick93 commented 7 years ago

@rawlingsj - when I tried the Jenkinsfile in the Spring Boot Web MVC QuickStart example app, it also failed with:

[f8test] Running shell script
Executing shell script inside container [maven] of pod [kubernetes-87f59e31eac64eeabef929673f2019e6-29cc6924696c]
Executing command: sh -c echo $$ > '/home/jenkins/workspace/f8test@tmp/durable-38812024/pid'; jsc=durable-8d9b47991d65e79b5ef07257f76d9da4; JENKINS_SERVER_COOKIE=$jsc '/home/jenkins/workspace/f8test@tmp/durable-38812024/script.sh' > '/home/jenkins/workspace/f8test@tmp/durable-38812024/jenkins-log.txt' 2>&1; echo $? > '/home/jenkins/workspace/f8test@tmp/durable-38812024/jenkins-result.txt' 
$ cd /home/jenkins/workspace/f8test
sh -c echo $$ > '/home/jenkins/workspace/f8test@tmp/durable-38812024/pid'; jsc=durable-8d9b47991d65e79b5ef07257f76d9da4; JENKINS_SERVER_COOKIE=$jsc '/home/jenkins/workspace/f8test@tmp/durable-38812024/script.sh' > '/home/jenkins/workspace/f8test@tmp/durable-38812024/jenkins-log.txt' 2>&1; echo $? > '/home/jenkins/workspace/f8test@tmp/durable-38812024/jenkins-result.txt' 
exit
$ /bin/sh: 2: cannot create /home/jenkins/workspace/f8test@tmp/durable-38812024/pid: Permission denied
/bin/sh: 2: cannot create /home/jenkins/workspace/f8test@tmp/durable-38812024/jenkins-log.txt: Permission denied
/bin/sh: 2: cannot create /home/jenkins/workspace/f8test@tmp/durable-38812024/jenkins-result.txt: Permission denied
$ command terminated with non-zero exit code: Error executing in Docker Container: 2[Pipeline] }
magick93 commented 7 years ago

@iocanel - can you help clarify: we have two Jenkinsfiles in this issue, and neither works. Which is the new style and which is the legacy? Or are they just different syntaxes for the same library? Are there alternatives that do what the Jenkinsfiles describe?

iocanel commented 7 years ago

Currently, regardless of how you express things, under the hood everything goes through the kubernetes-plugin.

The syntax that comes with the plugin is the following:

 podTemplate(...) {
 }

On top of that, the Fabric8 Pipeline Library adds some sugar to make it easier to compose podTemplate with multiple different containers. For example:

 dockerTemplate {
      mavenTemplate {
      }
 }

Even the old/legacy Kubernetes Pipeline Plugin internally uses the same concepts (e.g. podTemplate).

 kubernetes.pod('mypod').image('myimage').inside {
 } 

In any case, the problem you are having doesn't have to do with the syntax of the plugin, but with something else (not sure what, but my guess is that it's due to OpenShift's arbitrary user IDs).

magick93 commented 7 years ago

Thanks

not sure what, but my guess is that it's due to OpenShift's arbitrary user IDs

How can we investigate this hypothesis further?
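
One way might be to check which UID the build container actually runs as, since OpenShift can assign an arbitrary UID that owns nothing in the image. A minimal sketch using the same podTemplate syntax as above (the label and image are illustrative):

podTemplate(label: 'uid-check', containers: [
    containerTemplate(name: 'maven', image: 'jhipster/jhipster',
        command: '/bin/sh -c', args: 'cat', ttyEnabled: true)]) {
  node('uid-check') {
    container(name: 'maven') {
      // Effective user/group inside the container.
      sh 'id'
      // Ownership of the workspace and home directories that the
      // durable-task wrapper writes its pid/log/result files into.
      sh 'ls -ld . /home/jenkins || true'
    }
  }
}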

iocanel commented 7 years ago

My hypothesis is wrong. It's not due to arbitrary user IDs.

magick93 commented 7 years ago

Is there anything we can do to debug the issue?

magick93 commented 7 years ago

This may be related - https://issues.jenkins-ci.org/browse/JENKINS-37069

rawlingsj commented 7 years ago

Ah interesting - that jogged my memory; we have a doc on the SELinux workaround suggested in that link.

https://github.com/jenkinsci/kubernetes-pipeline-plugin/blob/master/kubernetes-steps/readme.md#technical-notes

iocanel commented 7 years ago

@magick93: can you try removing

 , workingDir: '/home/jenkins')

and tell me if it changes anything?

magick93 commented 7 years ago

...tell me if it changes anything?

Unfortunately no change - same error.

magick93 commented 7 years ago

@rawlingsj - the link you gave has a reference to another page:

https://github.com/jenkinsci/kubernetes-pipeline-plugin/blob/master/kubernetes-steps/readme.md#technical-notes

An example security context constraint that configures myserviceacccount in the default namespace can be found here

But the "here" link is a 404. Do you know where it should point?

magick93 commented 7 years ago

On https://github.com/jenkinsci/kubernetes-pipeline-plugin/blob/master/kubernetes-steps/readme.md#technical-notes, it says:

Under the hood the plugin is using hostPath mounts. This requires two things:

- A service account associated with a security context constraint that allows hostPath mounts.
- A host capable of hostPath mounts.

In the Jenkins SCC we have:

Settings:                   
  Allow Privileged:             true
  Default Add Capabilities:         <none>
  Required Drop Capabilities:           <none>
  Allowed Capabilities:             <none>
  Allowed Volume Types:             *
  Allow Host Network:               true
  Allow Host Ports:             true
  Allow Host PID:               false
  Allow Host IPC:               false

I did not see any setting specifically mentioning 'hostPath' in the SCC output.

magick93 commented 7 years ago

This looks related - https://issues.jenkins-ci.org/browse/JENKINS-37069

iocanel commented 7 years ago

The Docker Pipeline plugin and the Kubernetes plugin use the same principle: share the workspace via volumes, so that you can exec shell commands in the container.

What is quite different in your case is that sharing is done via pod volumes, so there shouldn't be any fancy permission issue. The fabric8/jenkins-docker image works nicely in OpenShift (no issues with arbitrary user IDs).

In some tests I ran locally, I managed to get your Jenkinsfile (a trimmed-down version of it) working by completely removing the workingDir directive.
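
A trimmed-down sketch of that shape, with the workingDir parameter simply omitted from the containerTemplate (illustrative, not the exact version tested):

def label = 'no-workingdir'
podTemplate(label: label, containers: [
    containerTemplate(name: 'maven', image: 'jhipster/jhipster',
        command: '/bin/sh -c', args: 'cat', ttyEnabled: true)]) {
  node(label) {
    container(name: 'maven') {
      checkout scm
      // With no workingDir override, the plugin's default working
      // directory is used for the durable-task files.
      sh 'pwd'
    }
  }
}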

magick93 commented 7 years ago

removing the workingDir directive

My Jenkinsfile doesn't specify a workingDir, and I don't have a Jenkins job setting for a workingDir.

How do you remove this directive?

iocanel commented 7 years ago

One of the variations you sent me did specify a workingDir (I am referring to the updated one).

magick93 commented 7 years ago

Can anyone suggest any workarounds? We have been trying to deploy our apps for a long time and would so appreciate finding anything that works.

rawlingsj commented 7 years ago

You shouldn't need it AFAIK but have you tried running the Jenkins master as a privileged pod? I think I suggested that along the way but I can't see it on this thread so perhaps it's not been attempted.

So oc edit dc jenkins and add the privileged security context like this example; note the last two lines:

          image: fabric8/jenkins-docker:2.2.311
          imagePullPolicy: IfNotPresent
          livenessProbe:
            httpGet:
              path: /blue/
              port: 8080
            initialDelaySeconds: 120
            timeoutSeconds: 10
          name: jenkins
          securityContext:
            privileged: true
magick93 commented 7 years ago

I have tried that, but ran into these issues: https://github.com/openshift/origin-web-console/issues/1221 https://github.com/openshift/origin/issues/12819

rawlingsj commented 7 years ago

It looks to me from those two issues that you're adding the security context to the wrong part of the YAML. Have you tried adding it to the container as in my example above? If it still fails, can you copy the DeploymentConfig here so I can see where you're adding the privileged securityContext?

magick93 commented 7 years ago

Ah yes, you're right. I was adding it to the wrong section.

I tried again, this time adding it to the container section. It saved correctly. But upon running the Jenkins job again, the same error occurs.

magick93 commented 7 years ago

Is there anything I can do to provide more helpful information? E.g., is there some logging or bash code that I can add to help understand this issue?

iocanel commented 7 years ago

@magick93: I am starting to believe that the permission issues are caused by your image.

Can you please create a simple pipeline project in Jenkins and use the following script: https://gist.github.com/iocanel/500b5dfc4ab2b65306f52f765fff74d3 and tell me if it works (runs without those permission errors)?

If it works, can you then replace it with your actual image so that we can crosscheck?

Here, it works fine with maven but doesn't work with jhipster/jhipster.
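
The gist itself is not inlined in this thread; judging from the output in the next comment, it is roughly this shape (a reconstruction, not the gist's exact contents - note the run below also contains a whomi typo):

podTemplate(label: 'perm-check', containers: [
    containerTemplate(name: 'maven', image: 'maven',
        command: '/bin/sh -c', args: 'cat', ttyEnabled: true)]) {
  node('perm-check') {
    container(name: 'maven') {
      sh 'pwd'            // exercises the durable-task wrapper's file writes
      sh 'whoami'         // the original run had 'whomi' here
      sh 'echo hello world'
    }
  }
}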

magick93 commented 7 years ago
[Pipeline] sh
[copy-of-sw] Running shell script
Executing shell script inside container [maven] of pod [kubernetes-888f3739fb764fdbb3b1cfe724692645-68d84cb387b4]
Executing command: sh -c echo $$ > '/home/jenkins/workspace/copy-of-sw@tmp/durable-faf13645/pid'; jsc=durable-e271a92b5b252a3996e9c5847d57b90c; JENKINS_SERVER_COOKIE=$jsc '/home/jenkins/workspace/copy-of-sw@tmp/durable-faf13645/script.sh' > '/home/jenkins/workspace/copy-of-sw@tmp/durable-faf13645/jenkins-log.txt' 2>&1; echo $? > '/home/jenkins/workspace/copy-of-sw@tmp/durable-faf13645/jenkins-result.txt' 
# cd /home/jenkins/workspace/copy-of-sw
sh -c echo $$ > '/home/jenkins/workspace/copy-of-sw@tmp/durable-faf13645/pid'; jsc=durable-e271a92b5b252a3996e9c5847d57b90c; JENKINS_SERVER_COOKIE=$jsc '/home/jenkins/workspace/copy-of-sw@tmp/durable-faf13645/script.sh' > '/home/jenkins/workspace/copy-of-sw@tmp/durable-faf13645/jenkins-log.txt' 2>&1; echo $? > '/home/jenkins/workspace/copy-of-sw@tmp/durable-faf13645/jenkins-result.txt' 
exit
# # + pwd
/home/jenkins/workspace/copy-of-sw
[Pipeline] sh
[copy-of-sw] Running shell script
Executing shell script inside container [maven] of pod [kubernetes-888f3739fb764fdbb3b1cfe724692645-68d84cb387b4]
Executing command: sh -c echo $$ > '/home/jenkins/workspace/copy-of-sw@tmp/durable-8e0ef96d/pid'; jsc=durable-e271a92b5b252a3996e9c5847d57b90c; JENKINS_SERVER_COOKIE=$jsc '/home/jenkins/workspace/copy-of-sw@tmp/durable-8e0ef96d/script.sh' > '/home/jenkins/workspace/copy-of-sw@tmp/durable-8e0ef96d/jenkins-log.txt' 2>&1; echo $? > '/home/jenkins/workspace/copy-of-sw@tmp/durable-8e0ef96d/jenkins-result.txt' 
# cd /home/jenkins/workspace/copy-of-sw
sh -c echo $$ > '/home/jenkins/workspace/copy-of-sw@tmp/durable-8e0ef96d/pid'; jsc=durable-e271a92b5b252a3996e9c5847d57b90c; JENKINS_SERVER_COOKIE=$jsc '/home/jenkins/workspace/copy-of-sw@tmp/durable-8e0ef96d/script.sh' > '/home/jenkins/workspace/copy-of-sw@tmp/durable-8e0ef96d/jenkins-log.txt' 2>&1; echo $? > '/home/jenkins/workspace/copy-of-sw@tmp/durable-8e0ef96d/jenkins-result.txt' 
exit
# # + whomi
/home/jenkins/workspace/copy-of-sw@tmp/durable-8e0ef96d/script.sh: 2: /home/jenkins/workspace/copy-of-sw@tmp/durable-8e0ef96d/script.sh: whomi: not found
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] End of Pipeline
ERROR: script returned exit code 127
Finished: FAILURE
iocanel commented 7 years ago

So, typo aside (whomi should be whoami), you don't have the permissions issue, right?

Can you fix the typo and try again?

magick93 commented 7 years ago

Yes, correct, no more permission issues:

[Pipeline] sh
[copy-of-sw] Running shell script
Executing shell script inside container [maven] of pod [kubernetes-19810d3295bf440788adde9199a30211-6af47a58ecd6]
Executing command: sh -c echo $$ > '/home/jenkins/workspace/copy-of-sw@tmp/durable-0914b9b6/pid'; jsc=durable-e271a92b5b252a3996e9c5847d57b90c; JENKINS_SERVER_COOKIE=$jsc '/home/jenkins/workspace/copy-of-sw@tmp/durable-0914b9b6/script.sh' > '/home/jenkins/workspace/copy-of-sw@tmp/durable-0914b9b6/jenkins-log.txt' 2>&1; echo $? > '/home/jenkins/workspace/copy-of-sw@tmp/durable-0914b9b6/jenkins-result.txt' 
# cd /home/jenkins/workspace/copy-of-sw
sh -c echo $$ > '/home/jenkins/workspace/copy-of-sw@tmp/durable-0914b9b6/pid'; jsc=durable-e271a92b5b252a3996e9c5847d57b90c; JENKINS_SERVER_COOKIE=$jsc '/home/jenkins/workspace/copy-of-sw@tmp/durable-0914b9b6/script.sh' > '/home/jenkins/workspace/copy-of-sw@tmp/durable-0914b9b6/jenkins-log.txt' 2>&1; echo $? > '/home/jenkins/workspace/copy-of-sw@tmp/durable-0914b9b6/jenkins-result.txt' 
exit
# # + pwd
/home/jenkins/workspace/copy-of-sw
[Pipeline] sh
[copy-of-sw] Running shell script
Executing shell script inside container [maven] of pod [kubernetes-19810d3295bf440788adde9199a30211-6af47a58ecd6]
Executing command: sh -c echo $$ > '/home/jenkins/workspace/copy-of-sw@tmp/durable-409cc4f2/pid'; jsc=durable-e271a92b5b252a3996e9c5847d57b90c; JENKINS_SERVER_COOKIE=$jsc '/home/jenkins/workspace/copy-of-sw@tmp/durable-409cc4f2/script.sh' > '/home/jenkins/workspace/copy-of-sw@tmp/durable-409cc4f2/jenkins-log.txt' 2>&1; echo $? > '/home/jenkins/workspace/copy-of-sw@tmp/durable-409cc4f2/jenkins-result.txt' 
# cd /home/jenkins/workspace/copy-of-sw
sh -c echo $$ > '/home/jenkins/workspace/copy-of-sw@tmp/durable-409cc4f2/pid'; jsc=durable-e271a92b5b252a3996e9c5847d57b90c; JENKINS_SERVER_COOKIE=$jsc '/home/jenkins/workspace/copy-of-sw@tmp/durable-409cc4f2/script.sh' > '/home/jenkins/workspace/copy-of-sw@tmp/durable-409cc4f2/jenkins-log.txt' 2>&1; echo $? > '/home/jenkins/workspace/copy-of-sw@tmp/durable-409cc4f2/jenkins-result.txt' 
exit
# # + whoami
root
[Pipeline] sh
[copy-of-sw] Running shell script
Executing shell script inside container [maven] of pod [kubernetes-19810d3295bf440788adde9199a30211-6af47a58ecd6]
Executing command: sh -c echo $$ > '/home/jenkins/workspace/copy-of-sw@tmp/durable-722305a5/pid'; jsc=durable-e271a92b5b252a3996e9c5847d57b90c; JENKINS_SERVER_COOKIE=$jsc '/home/jenkins/workspace/copy-of-sw@tmp/durable-722305a5/script.sh' > '/home/jenkins/workspace/copy-of-sw@tmp/durable-722305a5/jenkins-log.txt' 2>&1; echo $? > '/home/jenkins/workspace/copy-of-sw@tmp/durable-722305a5/jenkins-result.txt' 
# cd /home/jenkins/workspace/copy-of-sw
sh -c echo $$ > '/home/jenkins/workspace/copy-of-sw@tmp/durable-722305a5/pid'; jsc=durable-e271a92b5b252a3996e9c5847d57b90c; JENKINS_SERVER_COOKIE=$jsc '/home/jenkins/workspace/copy-of-sw@tmp/durable-722305a5/script.sh' > '/home/jenkins/workspace/copy-of-sw@tmp/durable-722305a5/jenkins-log.txt' 2>&1; echo $? > '/home/jenkins/workspace/copy-of-sw@tmp/durable-722305a5/jenkins-result.txt' 
exit
# # + echo hello world
hello world
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] End of Pipeline
Finished: SUCCESS
iocanel commented 7 years ago

So, the problem is with the jhipster/jhipster image.

Not sure exactly why it causes a problem (I will have to investigate further), but I think it's safe to assume that it's due to the image.

I would just use the maven image. After all, the jhipster image is only used for generating the initial project and is not required for the build (from a quick glance at the docs).

If for any reason your build does require a custom Docker image, then you had better create one of your own. (If I had to go down this road, I would try to modify the original one and change things that could affect permissions, etc. For example: not use a custom user.)
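
Using the dockerTemplate/mavenNode helpers shown earlier in this thread, the swap would look roughly like this (a sketch; the choice of the stock maven image and the build step are illustrative):

@Library('github.com/fabric8io/fabric8-pipeline-library@master')
def dummy
dockerTemplate {
  mavenNode(mavenImage: 'maven') {  // stock maven image instead of jhipster/jhipster
    container(name: 'maven') {
      checkout scm
      sh 'mvn -B clean install'
    }
  }
}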

magick93 commented 7 years ago

Ok, but how do we package our .war file so that it is deployed in the jhipster image?

And I'm not sure, but I believe the jhipster image is needed for building, as it uses both Maven and Node.

I do know this used to work; we used to be able to do this.

magick93 commented 7 years ago

We also do have a custom jhipster image. Currently we are blocked from using this due to https://github.com/openshift/origin/issues/12863

However, is it possible to build from a Dockerfile, push to the OpenShift registry, and then deploy using this image, all from Jenkins Groovy?

magick93 commented 7 years ago

This is what used to work, approximately a year ago.

Dockerfile

FROM node

RUN apt-get update && apt-get install -y --no-install-recommends \
        bzip2 \
        unzip \
        xz-utils \
    && rm -rf /var/lib/apt/lists/*

RUN echo 'deb http://httpredir.debian.org/debian jessie-backports main' > /etc/apt/sources.list.d/jessie-backports.list

# Default to UTF-8 file.encoding
ENV LANG C.UTF-8

# add a simple script that can auto-detect the appropriate JAVA_HOME value
# based on whether the JDK or only the JRE is installed
RUN { \
        echo '#!/bin/sh'; \
        echo 'set -e'; \
        echo; \
        echo 'dirname "$(dirname "$(readlink -f "$(which javac || which java)")")"'; \
    } > /usr/local/bin/docker-java-home \
    && chmod +x /usr/local/bin/docker-java-home

ENV JAVA_HOME /usr/lib/jvm/java-8-openjdk-amd64

ENV JAVA_VERSION 8u72
ENV JAVA_DEBIAN_VERSION 8u72-b15-1~bpo8+1

# see https://bugs.debian.org/775775
# and https://github.com/docker-library/java/issues/19#issuecomment-70546872
ENV CA_CERTIFICATES_JAVA_VERSION 20140324

RUN set -x \
    && apt-get update \
    && apt-get install -y \
        openjdk-8-jdk="$JAVA_DEBIAN_VERSION" \
        ca-certificates-java="$CA_CERTIFICATES_JAVA_VERSION" \
    && rm -rf /var/lib/apt/lists/* \
    && apt-get install -y git \
    && apt-get clean    \
    && [ "$JAVA_HOME" = "$(docker-java-home)" ]

# see CA_CERTIFICATES_JAVA_VERSION notes above
RUN /var/lib/dpkg/info/ca-certificates-java.postinst configure

ENV MAVEN_VERSION 3.3.3

RUN mkdir -p /usr/share/maven \
    && curl -fsSL http://apache.osuosl.org/maven/maven-3/$MAVEN_VERSION/binaries/apache-maven-$MAVEN_VERSION-bin.tar.gz \
       | tar -xzC /usr/share/maven --strip-components=1 \
    && ln -s /usr/share/maven/bin/mvn /usr/bin/mvn \
    && npm set progress=false \
    && npm install --global --progress=false gulp bower \
    && echo '{ "allow_root": true }' > /root/.bowerrc \
    && mkdir /workdir

ENV MAVEN_HOME /usr/share/maven

EXPOSE 8080

WORKDIR /workdir

CMD npm install && bower install && $MAVEN_HOME/bin/mvn -Pprod

COPY . /workdir

Jenkinsfile

#!/usr/bin/groovy
def localFailIfNoTests = ""
try {
  localFailIfNoTests = ITEST_FAIL_IF_NO_TEST
} catch (Throwable e) {
  localFailIfNoTests = "false"
}

def localItestPattern = ""
try {
  localItestPattern = ITEST_PATTERN
} catch (Throwable e) {
  localItestPattern = "*KT"
}

def versionPrefix = ""
try {
  versionPrefix = VERSION_PREFIX
} catch (Throwable e) {
  versionPrefix = "1.0"
}

def canaryVersion = "${versionPrefix}.${env.BUILD_NUMBER}"
def utils = new io.fabric8.Utils()

node {
  def envProd = 'shiftwork-production'

  checkout scm

  kubernetes.pod('buildpod').withImage('<ip address>:80/shiftwork/jhipster-build')
      .withPrivileged(true)
      .withHostPathMount('/var/run/docker.sock','/var/run/docker.sock')
      .withEnvVar('DOCKER_CONFIG','/home/jenkins/.docker/')
      .withSecret('jenkins-docker-cfg','/home/jenkins/.docker')
      .withSecret('jenkins-maven-settings','/root/.m2')
      .withServiceAccount('jenkins')
      .inside {

    stage 'Canary Release'
    mavenCanaryRelease{
      version = canaryVersion
    }

    stage 'Integration Test'
    mavenIntegrationTest{
      environment = 'Testing'
      failIfNoTests = localFailIfNoTests
      itestPattern = localItestPattern
    }

    stage 'Rolling Upgrade Production'
    def rc = readFile 'target/classes/kubernetes.json'
    kubernetesApply(file: rc, environment: envProd)

  }
}
rawlingsj commented 7 years ago

Are you able to push a test project to GitHub that contains an example of the type of project you're trying to build and deploy? If we have a test project and are able to recreate your issue I'm confident we can find a solution.

magick93 commented 7 years ago

Thanks James - I've pushed to https://github.com/magick93/shiftwork

magick93 commented 7 years ago

I found https://github.com/jhipster/jhipster-ci-stack, but even with this I still get the same errors as in previous attempts.

rawlingsj commented 7 years ago

@magick93 thanks for the project link, I'll try and take a look tonight. I did manage to recreate the issue with that jhipster image. I'll try and come up with something soon.

magick93 commented 7 years ago

@rawlingsj - You cannot understand how much I appreciate your help!

rawlingsj commented 7 years ago

That's ok, I understand what it's like.

FWIW I've got it running on minikube once I realised it was nodejs5, and I deployed PostgreSQL too. I've also upgraded the app to use the new fabric8 maven plugin. I recreated your issue btw; next I'll add a custom jhipster-builder image to use in your Jenkinsfile and you should be good to go. I've had to stop working at the moment but will try and push what I have before calling it a night. But I think we're pretty much there.

rawlingsj commented 7 years ago

Ok, just submitted a PR. I've not been able to test on OpenShift yet I'm afraid, but I'm out of time this weekend. It all seems to be working on minikube, so hopefully it will be fine for you. Note that the Jenkinsfile is pointing to my library at the moment, where I've added a jhipster node and builder image. I've also added a few extras, which means you get a few more Kubernetes benefits.

NOTE: the first shiftwork deployment will fail its readiness check because it's waiting for the postgres service to be ready. Kubernetes will automatically restart the pod after a short while and connect to postgres.

I'll be around from Monday in case you have any questions or run into problems @magick93, fingers crossed this gets you going!

rawlingsj commented 7 years ago

BTW I forgot to say: rather than importing the project in the fabric8 UI, for now I just created a new Jenkins pipeline job and pointed the pipeline (using the SCM option) to the GitHub project. The rest should just work.

magick93 commented 7 years ago

Thank you so much!

I'm testing this, and certainly seem to be getting further; however, the build still fails with the below. Maybe this is an OpenShift error - I'm not sure. Does this error mean the image cannot be pushed to the internal registry?

[INFO] --- fabric8-maven-plugin:3.2.20:build (default) @ shiftwork ---
[INFO] F8: Using OpenShift build with strategy S2I
[INFO] Copying files to /home/jenkins/workspace/copy-of-sw/target/docker/staffrostering/shiftwork/1.0.128/build/maven
[INFO] Building tar: /home/jenkins/workspace/copy-of-sw/target/docker/staffrostering/shiftwork/1.0.128/tmp/docker-build.tar
[INFO] F8: [staffrostering/shiftwork:1.0.128]: Created docker source tar /home/jenkins/workspace/copy-of-sw/target/docker/staffrostering/shiftwork/1.0.128/tmp/docker-build.tar
[INFO] F8: Creating BuildConfig shiftwork-s2i for Source build
[INFO] F8: Adding to ImageStream shiftwork
[INFO] F8: Starting Build shiftwork-s2i
sh-4.2# exit
exit
[INFO] F8: Waiting for build shiftwork-s2i-1 to complete...
[INFO] F8: warning: Image sha256:1a13e31efd4b230e2c061ed8d07e479f0dfd38f9ebe6f380e8db3d111d1ec577 does not contain a value for the io.openshift.s2i.scripts-url label
[INFO] F8: Receiving source from STDIN as archive ...
[INFO] F8: error: build error: failed to install [assemble run]
[INFO] WebSocket successfully opened
[INFO] F8: Build shiftwork-s2i-1 status: Failed
[ERROR] F8: OpenShift Build shiftwork-s2i-1 Failed
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:33 min
[INFO] Finished at: 2017-02-11T11:29:18+00:00
[INFO] Final Memory: 102M/1744M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal io.fabric8:fabric8-maven-plugin:3.2.20:build (default) on project shiftwork: OpenShift Build shiftwork-s2i-1 Failed -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal io.fabric8:fabric8-maven-plugin:3.2.20:build (default) on project shiftwork: OpenShift Build shiftwork-s2i-1 Failed
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:212)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
    at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
    at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
    at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
    at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoExecutionException: OpenShift Build shiftwork-s2i-1 Failed
    at io.fabric8.maven.core.openshift.OpenShiftBuildService.waitForOpenShiftBuildToComplete(OpenShiftBuildService.java:80)
    at io.fabric8.maven.plugin.mojo.build.BuildMojo.executeOpenShiftBuild(BuildMojo.java:333)
    at io.fabric8.maven.plugin.mojo.build.BuildMojo.buildAndTag(BuildMojo.java:247)
    at io.fabric8.maven.docker.BuildMojoNoFork.executeInternal(BuildMojoNoFork.java:46)
    at io.fabric8.maven.plugin.mojo.build.BuildMojo.executeInternal(BuildMojo.java:228)
    at io.fabric8.maven.docker.AbstractDockerMojo.execute(AbstractDockerMojo.java:208)
    at io.fabric8.maven.plugin.mojo.build.BuildMojo.execute(BuildMojo.java:211)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207)
    ... 20 more
[INFO] WebSocket close received. code: 1000, reason: 
[ERROR] 
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[WARNING] Ignoring onClose for already closed/closing websocket
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] End of Pipeline
ERROR: script returned exit code 1
Finished: FAILURE

Build pod

$ oc describe pod shiftwork-s2i-2-build
Name:           shiftwork-s2i-2-build
Namespace:      default
Security Policy:    jenkins
Node:           176.9.36.15/176.9.36.15
Start Time:     Sat, 11 Feb 2017 13:12:10 +0100
Labels:         openshift.io/build.name=shiftwork-s2i-2
Status:         Failed
IP:         172.17.0.14
Controllers:        <none>
Containers:
  sti-build:
    Container ID:   docker://49d0df757a58d220a5f0f1e485d284773975dcd00d4adb3d7c4561ee1443d376
    Image:      openshift/origin-sti-builder:v1.4.1
    Image ID:       docker-pullable://docker.io/openshift/origin-sti-builder@sha256:c46d9a24e59019032d21acdbb45fcdfce2eed2ac111a5961f28f5023b7f7aaab
    Port:       
    Args:
      --loglevel=0
    State:      Terminated
      Reason:       Error
      Exit Code:    1
      Started:      Sat, 11 Feb 2017 13:12:16 +0100
      Finished:     Sat, 11 Feb 2017 13:12:21 +0100
    Ready:      False
    Restart Count:  0
    Volume Mounts:
      /var/run/docker.sock from docker-socket (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from builder-token-1mmaj (ro)
      /var/run/secrets/openshift.io/push from builder-dockercfg-x84j4-push (ro)
    Environment Variables:
      BUILD:    {"kind":"Build","apiVersion":"v1","metadata":{"name":"shiftwork-s2i-2","namespace":"default","selfLink":"/oapi/v1/namespaces/default/builds/shiftwork-s2i-2","uid":"50334373-f053-11e6-a274-406186becd9d","resourceVersion":"660382","creationTimestamp":"2017-02-11T12:12:09Z","labels":{"buildconfig":"shiftwork-s2i","group":"com.teammachine.staffrostering","openshift.io/build-config.name":"shiftwork-s2i","openshift.io/build.start-policy":"Serial","project":"shiftwork","provider":"fabric8","version":"1.0.128"},"annotations":{"openshift.io/build-config.name":"shiftwork-s2i","openshift.io/build.number":"2"}},"spec":{"serviceAccount":"builder","source":{"type":"Binary","binary":{}},"strategy":{"type":"Source","sourceStrategy":{"from":{"kind":"DockerImage","name":"docker.io/fabric8/java-jboss-openjdk8-jdk:1.0.10"}}},"output":{"to":{"kind":"DockerImage","name":"172.30.139.137:5000/default/shiftwork:1.0.129"},"pushSecret":{"name":"builder-dockercfg-x84j4"}},"resources":{},"postCommit":{},"nodeSelector":{},"triggeredBy":null},"status":{"phase":"New","outputDockerImageReference":"172.30.139.137:5000/default/shiftwork:1.0.129","config":{"kind":"BuildConfig","namespace":"default","name":"shiftwork-s2i"}}}

      ORIGIN_VERSION:       v1.4.1+3f9807a
      PUSH_DOCKERCFG_PATH:  /var/run/secrets/openshift.io/push
Conditions:
  Type      Status
  Initialized   True 
  Ready     False 
  PodScheduled  True 
Volumes:
  docker-socket:
    Type:   HostPath (bare host directory volume)
    Path:   /var/run/docker.sock
  builder-dockercfg-x84j4-push:
    Type:   Secret (a volume populated by a Secret)
    SecretName: builder-dockercfg-x84j4
  builder-token-1mmaj:
    Type:   Secret (a volume populated by a Secret)
    SecretName: builder-token-1mmaj
QoS Class:  BestEffort
Tolerations:    <none>
Events:
  FirstSeen LastSeen    Count   From            SubobjectPath           Type        Reason      Message
  --------- --------    -----   ----            -------------           --------    ------      -------
  42m       42m     1   {default-scheduler }                    Normal      Scheduled   Successfully assigned shiftwork-s2i-2-build to 176.9.36.15
  42m       42m     1   {kubelet 176.9.36.15}   spec.containers{sti-build}  Normal      Pulled      Container image "openshift/origin-sti-builder:v1.4.1" already present on machine
  42m       42m     1   {kubelet 176.9.36.15}   spec.containers{sti-build}  Normal      Created     Created container with docker id 49d0df757a58; Security:[seccomp=unconfined]
  42m       42m     1   {kubelet 176.9.36.15}   spec.containers{sti-build}  Normal      Started     Started container with docker id 49d0df757a58
magick93 commented 7 years ago

I think the remaining issue is similar to https://bugzilla.redhat.com/show_bug.cgi?id=1324194

magick93 commented 7 years ago

I managed to get past the previous error by changing

<docker.from>docker.io/fabric8/java-jboss-openjdk8-jdk:1.0.10</docker.from>

to

<docker.from>fabric8/s2i-java</docker.from>

Now, however, I'm once again getting "Cannot connect to the Docker daemon. Is the docker daemon running on this host?":

[INFO] F8: Starting S2I Java Build .....
[INFO] F8: S2I binary build from fabric8-maven-plugin detected
[INFO] F8: Copying binaries from /tmp/src/maven to /deployments ...
[INFO] F8: ... done
[INFO] F8: 
[INFO] F8: 
[INFO] F8: Pushing image 172.30.139.137:5000/default/shiftwork:1.0.134 ...
[INFO] F8: Pushed 0/23 layers, 0% complete
[INFO] F8: Pushed 1/23 layers, 4% complete
[INFO] F8: Pushed 2/23 layers, 9% complete
[INFO] F8: Pushed 3/23 layers, 13% complete
[INFO] F8: Pushed 4/23 layers, 18% complete
[INFO] F8: Pushed 5/23 layers, 22% complete
[INFO] F8: Pushed 6/23 layers, 27% complete
[INFO] F8: Pushed 7/23 layers, 32% complete
[INFO] F8: Pushed 8/23 layers, 36% complete
[INFO] F8: Pushed 9/23 layers, 40% complete
[INFO] F8: Pushed 10/23 layers, 45% complete
[INFO] F8: Pushed 11/23 layers, 50% complete
[INFO] F8: Pushed 12/23 layers, 55% complete
[INFO] F8: Pushed 13/23 layers, 58% complete
[INFO] F8: Pushed 14/23 layers, 64% complete
[INFO] F8: Pushed 15/23 layers, 70% complete
[INFO] F8: Pushed 16/23 layers, 73% complete
[INFO] F8: Pushed 17/23 layers, 76% complete
[INFO] F8: Pushed 18/23 layers, 80% complete
[INFO] F8: Pushed 19/23 layers, 85% complete
[INFO] F8: Pushed 20/23 layers, 90% complete
[INFO] F8: Pushed 21/23 layers, 96% complete
[INFO] F8: Pushed 22/23 layers, 99% complete
[INFO] F8: Pushed 23/23 layers, 100% complete
[INFO] F8: Push successful
[INFO] F8: Build shiftwork-s2i-6 status: Complete
[INFO] F8: Build shiftwork-s2i-6 Complete
[INFO] WebSocket close received. code: 1000, reason: 
[WARNING] Ignoring onClose for already closed/closing websocket
[INFO] F8: Found tag on ImageStream shiftwork tag: sha256:6fefcf8a224cdebde0b4e07539fc15721a6de2c2e062a9b66afa51d9c74d9b44
[INFO] F8: ImageStream shiftwork written to /home/jenkins/workspace/copy-of-sw/target/shiftwork-is.yml
[INFO] 
[INFO] --- sortpom-maven-plugin:2.5.0:sort (default) @ shiftwork ---
[INFO] Sorting file /home/jenkins/workspace/copy-of-sw/pom.xml
[INFO] Saved backup of /home/jenkins/workspace/copy-of-sw/pom.xml to /home/jenkins/workspace/copy-of-sw/pom.xml.bak
[INFO] Saved sorted pom file to /home/jenkins/workspace/copy-of-sw/pom.xml
[INFO] 
[INFO] --- maven-install-plugin:2.5.2:install (default-install) @ shiftwork ---
[INFO] Installing /home/jenkins/workspace/copy-of-sw/target/shiftwork-1.0.134.war to /root/.mvnrepository/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134.war
[INFO] Installing /home/jenkins/workspace/copy-of-sw/pom.xml to /root/.mvnrepository/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134.pom
[INFO] Installing /home/jenkins/workspace/copy-of-sw/target/classes/META-INF/fabric8/openshift.yml to /root/.mvnrepository/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134-openshift.yml
[INFO] Installing /home/jenkins/workspace/copy-of-sw/target/classes/META-INF/fabric8/openshift.json to /root/.mvnrepository/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134-openshift.json
[INFO] Installing /home/jenkins/workspace/copy-of-sw/target/classes/META-INF/fabric8/kubernetes.yml to /root/.mvnrepository/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134-kubernetes.yml
[INFO] Installing /home/jenkins/workspace/copy-of-sw/target/classes/META-INF/fabric8/kubernetes.json to /root/.mvnrepository/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134-kubernetes.json
[INFO] 
[INFO] --- maven-deploy-plugin:2.8.2:deploy (default-deploy) @ shiftwork ---
[INFO] Using alternate deployment repository local-nexus::default::http://nexus/content/repositories/staging/
[INFO] Uploading: http://nexus/content/repositories/staging/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134.war
[INFO] Uploaded: http://nexus/content/repositories/staging/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134.war (75394 KB at 30462.2 KB/sec)
[INFO] Uploading: http://nexus/content/repositories/staging/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134.pom
[INFO] Uploaded: http://nexus/content/repositories/staging/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134.pom (38 KB at 418.2 KB/sec)
[INFO] Downloading: http://nexus/content/repositories/staging/com/teammachine/staffrostering/shiftwork/maven-metadata.xml
[INFO] Downloaded: http://nexus/content/repositories/staging/com/teammachine/staffrostering/shiftwork/maven-metadata.xml (993 B at 3.1 KB/sec)
[INFO] Uploading: http://nexus/content/repositories/staging/com/teammachine/staffrostering/shiftwork/maven-metadata.xml
[INFO] Uploaded: http://nexus/content/repositories/staging/com/teammachine/staffrostering/shiftwork/maven-metadata.xml (2 KB at 26.4 KB/sec)
[INFO] Uploading: http://nexus/content/repositories/staging/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134-openshift.yml
[INFO] Uploaded: http://nexus/content/repositories/staging/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134-openshift.yml (7 KB at 59.7 KB/sec)
[INFO] Uploading: http://nexus/content/repositories/staging/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134-openshift.json
[INFO] Uploaded: http://nexus/content/repositories/staging/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134-openshift.json (9 KB at 487.0 KB/sec)
[INFO] Uploading: http://nexus/content/repositories/staging/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134-kubernetes.yml
[INFO] Uploaded: http://nexus/content/repositories/staging/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134-kubernetes.yml (7 KB at 318.4 KB/sec)
[INFO] Uploading: http://nexus/content/repositories/staging/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134-kubernetes.json
[INFO] Uploaded: http://nexus/content/repositories/staging/com/teammachine/staffrostering/shiftwork/1.0.134/shiftwork-1.0.134-kubernetes.json (9 KB at 126.1 KB/sec)
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:39 min
[INFO] Finished at: 2017-02-11T17:15:46+00:00
sh-4.2# exit
exit
[INFO] Final Memory: 110M/2300M
[INFO] ------------------------------------------------------------------------
[Pipeline] echo
Running on a single node, skipping docker push as not needed
[Pipeline] sh
[copy-of-sw] Running shell script
Executing shell script inside container [jhipster] of pod [kubernetes-080210addb894c2eaa11f873084d57cc-ffca95872e09]
Executing command: sh -c echo $$ > '/home/jenkins/workspace/copy-of-sw@tmp/durable-70d741ab/pid'; jsc=durable-e271a92b5b252a3996e9c5847d57b90c; JENKINS_SERVER_COOKIE=$jsc '/home/jenkins/workspace/copy-of-sw@tmp/durable-70d741ab/script.sh' > '/home/jenkins/workspace/copy-of-sw@tmp/durable-70d741ab/jenkins-log.txt' 2>&1; echo $? > '/home/jenkins/workspace/copy-of-sw@tmp/durable-70d741ab/jenkins-result.txt' 
sh-4.2# cd /home/jenkins/workspace/copy-of-sw
sh-4.2# sh -c echo $$ > '/home/jenkins/workspace/copy-of-sw@tmp/durable-70d741ab/pid'; jsc=durable-e271a92b5b252a3996e9c5847d57b90c; JENKINS_SERVER_COOKIE=$jsc '/home/jenkins/workspace/copy-of-sw@tmp/durable-70d741ab/script.sh' > '/home/jenkins/workspace/copy-of-sw@tmp/durable-70d741ab/jenkins-log.txt' 2>&1; echo $? > '/home/jenkins/workspace/copy-of-sw@tmp/durable-70d741ab/jenkins-result.txt' 
+ docker tag staffrostering/shiftwork:1.0.134 172.30.254.212:80/staffrostering/shiftwork:1.0.134
sh-4.2# exit
exit
Cannot connect to the Docker daemon. Is the docker daemon running on this host?
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] End of Pipeline
ERROR: script returned exit code 1
Finished: FAILURE
magick93 commented 7 years ago

Attempt

Trying to get a clearer understanding of the execution flow: is it using S2I?

Result

I'm a little confused.

I have added:

def s2iMode = flow.isOpenShiftS2I()
echo "s2i mode: ${s2iMode}"

Which displays s2i mode: false

But I also see: "Starting S2I Java Build ....." and "S2I binary build from fabric8-maven-plugin detected".

magick93 commented 7 years ago

@rawlingsj - I have been trying to apply some of your suggestions from your replies to my thread on https://groups.google.com/forum/#!msg/fabric8/IxpDpNLKBKo/g3c-t7m0GgAJ, as we are again getting this error. However, I cannot seem to get past it.

I have tried modifying the Jenkinsfile - and probably made a complete ballsup of it - https://github.com/rawlingsj/shiftwork/commit/c5cee732de88ed76d231610c983159ad7d23b569 - in an attempt to set the required values so a connection to the Docker daemon can be made.

magick93 commented 7 years ago

Attempt

Confirm that env vars are being set using the fabric8 syntax.

Result

This is strange. In https://github.com/rawlingsj/shiftwork/commit/e24021e643f25e0b47db11d8ac114a050c29c518 I added echo "DOCKER_CONFIG is :${env.DOCKER_CONFIG}" but it outputs DOCKER_CONFIG is :null.
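
One way to narrow this down might be to compare what the Groovy context sees with what the shell inside the container sees, since containerEnvVar values are injected into the container process and may not be reflected in env on the Groovy side. A sketch, assuming the existing container(name: 'maven') block from the Jenkinsfile above:

container(name: 'maven') {
  // Value as resolved by the pipeline's Groovy context (reported null above).
  echo "Groovy view: ${env.DOCKER_CONFIG}"
  // Value as seen by a shell running inside the container itself.
  sh 'echo "shell view: $DOCKER_CONFIG"; env | grep -i docker || true'
}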