eclipse-che / che

Kubernetes-based Cloud Development Environments for Enterprise Teams
http://eclipse.org/che
Eclipse Public License 2.0

Dashboard on AKS not working (too many redirects) #20761

Closed · sebastiankomander closed this issue 2 years ago

sebastiankomander commented 2 years ago

Describe the bug

Infrastructure:

Tried multiple variants:

  chectl server:deploy --installer=operator --platform=k8s --domain=
  chectl server:deploy --installer=operator --platform=k8s --domain= --multiuser
  chectl server:deploy --installer=helm --platform=k8s --domain=
  chectl server:deploy --installer=helm --platform=k8s --domain= --multiuser

Deployment is successful, and the plugin registry, devfile registry, and Keycloak are working and reachable. But if I navigate to the dashboard (mydomain.com/dashboard), every browser (tested: Chrome and Firefox, in different versions) shows "Too many redirects".

Request URL: https://che-eclipse-che.vwtg.cloud.suedleasing-dev.com/dashboard/
Response header: Location: /dashboard/

Tried with:

  chectl/0.0.20211104-next.393086a linux-x64 node-v12.22.7
  chectl/7.38.1 linux-x64 node-v12.22.7

Same result
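For illustration (not part of the original report): a response whose Location header points back at the requested path, as in the header above, is exactly what makes browsers give up with "Too many redirects". A minimal Python sketch, where `respond` is a hypothetical stub standing in for the misconfigured dashboard service:

```python
MAX_REDIRECTS = 20  # browsers cap redirect chains at roughly this depth

def respond(path):
    # Hypothetical stub for the misconfigured service: it answers
    # GET /dashboard/ with a redirect back to /dashboard/ itself.
    if path == "/dashboard/":
        return 302, {"Location": "/dashboard/"}
    return 200, {}

def fetch(path):
    """Follow redirects like a browser; fail once the cap is hit."""
    seen = []
    for _ in range(MAX_REDIRECTS):
        status, headers = respond(path)
        if status != 302:
            return status, path
        seen.append(path)
        path = headers["Location"]  # every hop lands on the same URL
    raise RuntimeError(f"too many redirects (loop through {set(seen)})")

try:
    fetch("/dashboard/")
except RuntimeError as e:
    print(e)
```

Each hop resolves to the same URL, so the chain never terminates and the client's redirect cap fires, which matches the behavior seen in both browsers.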

Che version

other (please specify in additional context)

Steps to reproduce

  1. Create DNS-Zone and AKS
  2. Deploy Cert-Manager (all steps according to https://www.eclipse.org/che/docs/che-7/installation-guide/installing-che-on-microsoft-azure/)
  3. Run one of:
     3.1 chectl server:deploy --installer=operator --platform=k8s --domain=
     3.2 chectl server:deploy --installer=operator --platform=k8s --domain= --multiuser
     3.3 chectl server:deploy --installer=helm --platform=k8s --domain=
     3.4 chectl server:deploy --installer=helm --platform=k8s --domain= --multiuser
  4. Navigate to /dashboard

Expected behavior

Dashboard up and running

Runtime

Kubernetes (vanilla)

Screenshots

No response

Installation method

chectl/latest

Environment

Azure

Eclipse Che Logs

Starting Dashboard backend server...
Static server's serving "/public" on 0.0.0.0:8080/dashboard/
Che Dashboard swagger is running on "dashboard/api/swagger".
└── / (GET)
    ├── dashboard (GET)
    │   ├── /
    │   │   ├── * (HEAD)
    │   │   │   * (GET)
    │   │   └── api
    │   │       ├── /swagger/static/* (HEAD)
    │   │       └── /
    │   │           ├── swagger (GET)
    │   │           │   └── / (GET)
    │   │           │       ├── uiConfig (GET)
    │   │           │       ├── initOAuth (GET)
    │   │           │       ├── json (GET)
    │   │           │       ├── yaml (GET)
    │   │           │       ├── static/* (GET)
    │   │           │       └── * (GET)
    │   │           ├── namespace/:/d
    │   │           │   ├── evworkspaces (GET)
    │   │           │   │   └── /:workspaceName (GET)
    │   │           │   └── ockerconfig (GET)
    │   │           └── websocket (GET)
    │   └── /api/namespace/:
    │       ├── /devworkspace
    │       │   ├── s (POST)
    │       │   └── templates (POST)
    │       ├── /devworkspaces/
    │       │   └── :workspaceName (PATCH)
    │       │       :workspaceName (DELETE)
    │       └── /dockerconfig (PUT)
    ├── /
    └── * (OPTIONS)

(node:6) [DEP0005] DeprecationWarning: Buffer() is deprecated due to security and usability issues. Please use the Buffer.alloc(), Buffer.allocUnsafe(), or Buffer.from() methods instead.
Server listening at http://0.0.0.0:8080
Checking devfile /var/www/html/devfiles/quarkus-command-mode/devfile.yaml
Checking devfile /var/www/html/devfiles/scala-sbt/devfile.yaml
Checking devfile /var/www/html/devfiles/dotnet-asp.net/devfile.yaml
Checking devfile /var/www/html/devfiles/java-web-spring/devfile.yaml
Checking devfile /var/www/html/devfiles/python/devfile.yaml
Checking devfile /var/www/html/devfiles/dotnet/devfile.yaml
Checking devfile /var/www/html/devfiles/python-django/devfile.yaml
Checking devfile /var/www/html/devfiles/go/devfile.yaml
Checking devfile /var/www/html/devfiles/php-symfony/devfile.yaml
Checking devfile /var/www/html/devfiles/nodejs-yarn/devfile.yaml
Checking devfile /var/www/html/devfiles/nodejs-mongo/devfile.yaml
Checking devfile /var/www/html/devfiles/rust/devfile.yaml
Checking devfile /var/www/html/devfiles/bash/devfile.yaml
Checking devfile /var/www/html/devfiles/java-mongo/devfile.yaml
Checking devfile /var/www/html/devfiles/java-web-vertx/devfile.yaml
Checking devfile /var/www/html/devfiles/java-maven/devfile.yaml
Checking devfile /var/www/html/devfiles/nodejs-react/devfile.yaml
Checking devfile /var/www/html/devfiles/java-lombok/devfile.yaml
Checking devfile /var/www/html/devfiles/php-laravel/devfile.yaml
Checking devfile /var/www/html/devfiles/nodejs-angular/devfile.yaml
Checking devfile /var/www/html/devfiles/quarkus/devfile.yaml
Checking devfile /var/www/html/devfiles/php-mysql/devfile.yaml
Checking devfile /var/www/html/devfiles/php-web-simple/devfile.yaml
Checking devfile /var/www/html/devfiles/nodejs/devfile.yaml
Checking devfile /var/www/html/devfiles/apache-camel-k/devfile.yaml
Checking devfile /var/www/html/devfiles/java-mysql/devfile.yaml
Checking devfile /var/www/html/devfiles/cpp/devfile.yaml
Checking devfile /var/www/html/devfiles/java-gradle/devfile.yaml
Checking devfile /var/www/html/devfiles/apache-camel-springboot/devfile.yaml
Checking devfile /var/www/html/devfiles/che4z/devfile.yaml
Updating devfiles to point at internal project zip files
[Wed Nov 10 22:20:59.909815 2021] [unixd:alert] [pid 17:tid 139653893041480] (1)Operation not permitted: AH02156: setgid: unable to set group id to Group 2
[Wed Nov 10 22:20:59.911550 2021] [unixd:alert] [pid 16:tid 139653893041480] (1)Operation not permitted: AH02156: setgid: unable to set group id to Group 2
[Wed Nov 10 22:20:59.927886 2021] [mpm_event:notice] [pid 1:tid 139653893041480] AH00489: Apache/2.4.43 (Unix) configured -- resuming normal operations
[Wed Nov 10 22:20:59.927918 2021] [core:notice] [pid 1:tid 139653893041480] AH00094: Command line: 'httpd -D FOREGROUND'
[Wed Nov 10 22:20:59.927951 2021] [unixd:alert] [pid 18:tid 139653893041480] (1)Operation not permitted: AH02156: setgid: unable to set group id to Group 2
10.230.13.4 - - [10/Nov/2021:22:21:11 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:21:21 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:21:31 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:21:31 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:21:41 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:21:41 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:21:51 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:21:51 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:22:01 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:22:01 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:22:11 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:22:11 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:22:21 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:22:21 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:22:31 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:22:31 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:22:41 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:22:41 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:22:51 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:22:51 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:23:01 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
10.230.13.4 - - [10/Nov/2021:22:23:01 +0000] "GET /devfiles/ HTTP/1.1" 200 14226
ssl_required WAS UPDATED for master realm.
Certificate was added to keystore
Importing keystore /etc/pki/ca-trust/extracted/java/cacerts to /scripts/openshift.jks...
Entry for alias digicertassuredidrootca successfully imported.
Entry for alias anfsecureserverrootca successfully imported.
Entry for alias affirmtrustcommercial successfully imported.
Entry for alias trustwaveglobaleccp256certificationauthority successfully imported.
Entry for alias t-telesecglobalrootclass3 successfully imported.
Entry for alias t-telesecglobalrootclass2 successfully imported.
Entry for alias comodoecccertificationauthority successfully imported.
Entry for alias swisssignsilverca-g2 successfully imported.
Entry for alias cadisigrootr2 successfully imported.
Entry for alias securetrustca successfully imported.
Entry for alias accvraiz1 successfully imported.
Entry for alias entrustrootcertificationauthority successfully imported.
Entry for alias identrustpublicsectorrootca1 successfully imported.
Entry for alias entrust.netpremium2048secureserverca successfully imported.
Entry for alias secureglobalca successfully imported.
Entry for alias netlockarany(classgold)ftanstvny successfully imported.
Entry for alias teliasonerarootcav1 successfully imported.
Entry for alias autoridaddecertificacionfirmaprofesionalcifa62634068 successfully imported.
Entry for alias acraizfnmt-rcm successfully imported.
Entry for alias gdcatrustauthr5root successfully imported.
Entry for alias izenpe.com successfully imported.
Entry for alias oistewisekeyglobalrootgcca successfully imported.
Entry for alias e-tugracertificationauthority successfully imported.
Entry for alias quovadisrootca3 successfully imported.
Entry for alias quovadisrootca2 successfully imported.
Entry for alias entrustrootcertificationauthority-ec1 successfully imported.
Entry for alias oistewisekeyglobalrootgbca successfully imported.
Entry for alias naverglobalrootcertificationauthority successfully imported.
Entry for alias gtsrootr4 successfully imported.
Entry for alias gtsrootr3 successfully imported.
Entry for alias digicertglobalrootg3 successfully imported.
Entry for alias gtsrootr2 successfully imported.
Entry for alias swisssigngoldca-g2 successfully imported.
Entry for alias comodoaaaservicesroot successfully imported.
Entry for alias digicertglobalrootg2 successfully imported.
Entry for alias gtsrootr1 successfully imported.
Entry for alias dstrootcax3 successfully imported.
Entry for alias certigna successfully imported.
Entry for alias digicerthighassuranceevrootca successfully imported.
Entry for alias usertrustrsacertificationauthority successfully imported.
Entry for alias certsignrootca successfully imported.
Entry for alias amazonrootca4 successfully imported.
Entry for alias certsignrootcag2 successfully imported.
Entry for alias amazonrootca3 successfully imported.
Entry for alias amazonrootca2 successfully imported.
Entry for alias trustcorrootcertca-2 successfully imported.
Entry for alias amazonrootca1 successfully imported.
Entry for alias trustcorrootcertca-1 successfully imported.
Entry for alias ssl.comrootcertificationauthorityecc successfully imported.
Entry for alias ssl.comrootcertificationauthorityrsa successfully imported.
Entry for alias d-trustrootclass3ca2ev2009 successfully imported.
Entry for alias networksolutionscertificateauthority successfully imported.
Entry for alias affirmtrustnetworking successfully imported.
Entry for alias globalsignrootca-r6 successfully imported.
Entry for alias globalsigneccrootca-r5 successfully imported.
Entry for alias globalsigneccrootca-r4 successfully imported.
Entry for alias szafirrootca2 successfully imported.
Entry for alias globalsignrootca-r3 successfully imported.
Entry for alias globalsignrootca-r2 successfully imported.
Entry for alias emsignrootca-c1 successfully imported.
Entry for alias emsigneccrootca-c3 successfully imported.
Entry for alias globaltrust2020 successfully imported.
Entry for alias buypassclass3rootca successfully imported.
Entry for alias comodorsacertificationauthority successfully imported.
Entry for alias certumec-384ca successfully imported.
Entry for alias securitycommunicationrootca2 successfully imported.
Entry for alias starfieldclass2ca successfully imported.
Entry for alias actalisauthenticationrootca successfully imported.
Entry for alias trustwaveglobalcertificationauthority successfully imported.
Entry for alias cfcaevroot successfully imported.
Entry for alias digicerttrustedrootg4 successfully imported.
Entry for alias entrustrootcertificationauthority-g4 successfully imported.
Entry for alias certumtrustednetworkca2 successfully imported.
Entry for alias entrustrootcertificationauthority-g2 successfully imported.
Entry for alias hellenicacademicandresearchinstitutionseccrootca2015 successfully imported.
Entry for alias twcarootcertificationauthority successfully imported.
Entry for alias twcaglobalrootca successfully imported.
Entry for alias globalsignrootr46 successfully imported.
Entry for alias baltimorecybertrustroot successfully imported.
Entry for alias buypassclass2rootca successfully imported.
Entry for alias digicertassuredidrootg3 successfully imported.
Entry for alias certumtrustednetworkca successfully imported.
Entry for alias digicertassuredidrootg2 successfully imported.
Entry for alias isrgrootx1 successfully imported.
Entry for alias ucaextendedvalidationroot successfully imported.
Entry for alias ec-acc successfully imported.
Entry for alias ssl.comevrootcertificationauthorityecc successfully imported.
Entry for alias digicertglobalrootca successfully imported.
Entry for alias d-trustrootclass3ca22009 successfully imported.
Entry for alias starfieldservicesrootcertificateauthority-g2 successfully imported.
Entry for alias certignarootca successfully imported.
Entry for alias atostrustedroot2011 successfully imported.
Entry for alias certumtrustedrootca successfully imported.
Entry for alias identrustcommercialrootca1 successfully imported.
Entry for alias staatdernederlandenevrootca successfully imported.
Entry for alias tubitakkamusmsslkoksertifikasi-surum1 successfully imported.
Entry for alias trustcoreca-1 successfully imported.
Entry for alias emsignrootca-g1 successfully imported.
Entry for alias ucaglobalg2root successfully imported.
Entry for alias emsigneccrootca-g3 successfully imported.
Entry for alias securitycommunicationrootca successfully imported.
Entry for alias comodocertificationauthority successfully imported.
Entry for alias xrampglobalcaroot successfully imported.
Entry for alias quovadisrootca3g3 successfully imported.
Entry for alias securesignrootca11 successfully imported.
Entry for alias affirmtrustpremium successfully imported.
Entry for alias globalsignrootca successfully imported.
Entry for alias quovadisrootca2g3 successfully imported.
Entry for alias affirmtrustpremiumecc successfully imported.
Entry for alias hongkongpostrootca3 successfully imported.
Entry for alias e-szignorootca2017 successfully imported.
Entry for alias acraizfnmt-rcmservidoresseguros successfully imported.
Entry for alias quovadisrootca1g3 successfully imported.
Entry for alias hongkongpostrootca1 successfully imported.
Entry for alias usertrustecccertificationauthority successfully imported.
Entry for alias cybertrustglobalroot successfully imported.
Entry for alias microsoftrsarootcertificateauthority2017 successfully imported.
Entry for alias godaddyclass2ca successfully imported.
Entry for alias microsece-szignorootca2009 successfully imported.
Entry for alias hellenicacademicandresearchinstitutionsrootca2015 successfully imported.
Entry for alias microsofteccrootcertificateauthority2017 successfully imported.
Entry for alias hellenicacademicandresearchinstitutionsrootca2011 successfully imported.
Entry for alias godaddyrootcertificateauthority-g2 successfully imported.
Entry for alias epkirootcertificationauthority successfully imported.
Entry for alias trustwaveglobaleccp384certificationauthority successfully imported.
Entry for alias globalsignroote46 successfully imported.
Entry for alias starfieldrootcertificateauthority-g2 successfully imported.
Entry for alias ssl.comevrootcertificationauthorityrsar2 successfully imported.
Import command completed:  128 entries successfully imported, 0 entries failed or cancelled

Warning:
<digicertassuredidrootca> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<swisssignsilverca-g2> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<securetrustca> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<accvraiz1> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<entrustrootcertificationauthority> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<entrust.netpremium2048secureserverca> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<secureglobalca> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<teliasonerarootcav1> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<autoridaddecertificacionfirmaprofesionalcifa62634068> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<quovadisrootca3> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<quovadisrootca2> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<swisssigngoldca-g2> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<comodoaaaservicesroot> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<dstrootcax3> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<certigna> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<digicerthighassuranceevrootca> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<certsignrootca> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<networksolutionscertificateauthority> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<affirmtrustnetworking> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<globalsignrootca-r2> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<starfieldclass2ca> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<twcarootcertificationauthority> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<baltimorecybertrustroot> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<certumtrustednetworkca> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<ec-acc> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<digicertglobalrootca> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<securitycommunicationrootca> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<comodocertificationauthority> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<xrampglobalcaroot> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<securesignrootca11> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<globalsignrootca> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<hongkongpostrootca1> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<cybertrustglobalroot> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<godaddyclass2ca> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<hellenicacademicandresearchinstitutionsrootca2011> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
<epkirootcertificationauthority> uses the SHA1withRSA signature algorithm which is considered a security risk. This algorithm will be disabled in a future update.
Installing certificates into Keycloak
22:21:04,191 INFO  [org.jboss.modules] (CLI command executor) JBoss Modules version 1.11.0.Final
22:21:04,269 INFO  [org.jboss.msc] (CLI command executor) JBoss MSC version 1.4.12.Final
22:21:04,276 INFO  [org.jboss.threads] (CLI command executor) JBoss Threads version 2.4.0.Final
22:21:04,384 INFO  [org.jboss.as] (MSC service thread 1-2) WFLYSRV0049: Keycloak 15.0.2 (WildFly Core 15.0.1.Final) starting
22:21:05,245 INFO  [org.wildfly.security] (ServerService Thread Pool -- 20) ELY00001: WildFly Elytron version 1.15.3.Final
22:21:05,621 INFO  [org.jboss.as.controller.management-deprecated] (ServerService Thread Pool -- 16) WFLYCTL0033: Extension 'security' is deprecated and may not be supported in future versions
22:21:05,980 INFO  [org.jboss.as.controller.management-deprecated] (Controller Boot Thread) WFLYCTL0028: Attribute 'security-realm' in the resource at address '/core-service=management/management-interface=http-interface' is deprecated, and may be removed in a future version. See the attribute description in the output of the read-resource-description operation to learn more about the deprecation.
22:21:06,110 INFO  [org.jboss.as.controller.management-deprecated] (Controller Boot Thread) WFLYCTL0028: Attribute 'security-realm' in the resource at address '/subsystem=undertow/server=default-server/https-listener=https' is deprecated, and may be removed in a future version. See the attribute description in the output of the read-resource-description operation to learn more about the deprecation.
22:21:06,310 WARN  [org.wildfly.extension.elytron] (MSC service thread 1-4) WFLYELY00023: KeyStore file '/opt/jboss/keycloak/standalone/configuration/application.keystore' does not exist. Used blank.
22:21:06,326 WARN  [org.wildfly.extension.elytron] (MSC service thread 1-2) WFLYELY01084: KeyStore /opt/jboss/keycloak/standalone/configuration/application.keystore not found, it will be auto generated on first use with a self-signed certificate for host localhost
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.wildfly.extension.elytron.SSLDefinitions (jar:file:/opt/jboss/keycloak/modules/system/layers/base/org/wildfly/extension/elytron/main/wildfly-elytron-integration-15.0.1.Final.jar!/) to method com.sun.net.ssl.internal.ssl.Provider.isFIPS()
WARNING: Please consider reporting this to the maintainers of org.wildfly.extension.elytron.SSLDefinitions
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
22:21:06,392 INFO  [org.jboss.as.patching] (MSC service thread 1-1) WFLYPAT0050: Keycloak cumulative patch ID is: base, one-off patches include: none
22:21:06,414 WARN  [org.jboss.as.domain.management.security] (MSC service thread 1-4) WFLYDM0111: Keystore /opt/jboss/keycloak/standalone/configuration/application.keystore not found, it will be auto generated on first use with a self signed certificate for host localhost
22:21:06,564 INFO  [org.jboss.as.server] (Controller Boot Thread) WFLYSRV0212: Resuming server
22:21:06,575 INFO  [org.jboss.as] (Controller Boot Thread) WFLYSRV0025: Keycloak 15.0.2 (WildFly Core 15.0.1.Final) started in 2365ms - Started 59 of 82 services (32 services are lazy, passive or on-demand)
{"outcome" => "success"}
{"outcome" => "success"}
22:21:06,935 INFO  [org.jboss.as] (MSC service thread 1-2) WFLYSRV0050: Keycloak 15.0.2 (WildFly Core 15.0.1.Final) stopped in 30ms
Added 'admin' to '/opt/jboss/keycloak/standalone/configuration/keycloak-add-user.json', restart server to load user
=========================================================================

  Using PostgreSQL database

=========================================================================

22:21:09,836 INFO  [org.jboss.modules] (CLI command executor) JBoss Modules version 1.11.0.Final
22:21:09,912 INFO  [org.jboss.msc] (CLI command executor) JBoss MSC version 1.4.12.Final
22:21:09,920 INFO  [org.jboss.threads] (CLI command executor) JBoss Threads version 2.4.0.Final
22:21:10,028 INFO  [org.jboss.as] (MSC service thread 1-1) WFLYSRV0049: Keycloak 15.0.2 (WildFly Core 15.0.1.Final) starting
22:21:10,863 INFO  [org.wildfly.security] (ServerService Thread Pool -- 19) ELY00001: WildFly Elytron version 1.15.3.Final
22:21:11,200 INFO  [org.jboss.as.controller.management-deprecated] (ServerService Thread Pool -- 14) WFLYCTL0033: Extension 'security' is deprecated and may not be supported in future versions
22:21:11,688 INFO  [org.jboss.as.controller.management-deprecated] (Controller Boot Thread) WFLYCTL0028: Attribute 'security-realm' in the resource at address '/core-service=management/management-interface=http-interface' is deprecated, and may be removed in a future version. See the attribute description in the output of the read-resource-description operation to learn more about the deprecation.
22:21:11,778 INFO  [org.jboss.as.controller.management-deprecated] (Controller Boot Thread) WFLYCTL0028: Attribute 'security-realm' in the resource at address '/subsystem=undertow/server=default-server/https-listener=https' is deprecated, and may be removed in a future version. See the attribute description in the output of the read-resource-description operation to learn more about the deprecation.
22:21:11,944 WARN  [org.wildfly.extension.elytron] (MSC service thread 1-4) WFLYELY00023: KeyStore file '/opt/jboss/keycloak/standalone/configuration/application.keystore' does not exist. Used blank.
22:21:11,963 WARN  [org.wildfly.extension.elytron] (MSC service thread 1-2) WFLYELY01084: KeyStore /opt/jboss/keycloak/standalone/configuration/application.keystore not found, it will be auto generated on first use with a self-signed certificate for host localhost
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.wildfly.extension.elytron.SSLDefinitions (jar:file:/opt/jboss/keycloak/modules/system/layers/base/org/wildfly/extension/elytron/main/wildfly-elytron-integration-15.0.1.Final.jar!/) to method com.sun.net.ssl.internal.ssl.Provider.isFIPS()
WARNING: Please consider reporting this to the maintainers of org.wildfly.extension.elytron.SSLDefinitions
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
22:21:11,997 INFO  [org.jboss.as.patching] (MSC service thread 1-2) WFLYPAT0050: Keycloak cumulative patch ID is: base, one-off patches include: none
22:21:12,112 WARN  [org.jboss.as.domain.management.security] (MSC service thread 1-2) WFLYDM0111: Keystore /opt/jboss/keycloak/standalone/configuration/application.keystore not found, it will be auto generated on first use with a self signed certificate for host localhost
22:21:12,219 INFO  [org.jboss.as.server] (Controller Boot Thread) WFLYSRV0212: Resuming server
22:21:12,230 INFO  [org.jboss.as] (Controller Boot Thread) WFLYSRV0025: Keycloak 15.0.2 (WildFly Core 15.0.1.Final) started in 2380ms - Started 59 of 82 services (32 services are lazy, passive or on-demand)
The batch executed successfully
22:21:12,509 INFO  [org.jboss.as] (MSC service thread 1-3) WFLYSRV0050: Keycloak 15.0.2 (WildFly Core 15.0.1.Final) stopped in 18ms
22:21:14,088 INFO  [org.jboss.modules] (CLI command executor) JBoss Modules version 1.11.0.Final
22:21:14,150 INFO  [org.jboss.msc] (CLI command executor) JBoss MSC version 1.4.12.Final
22:21:14,161 INFO  [org.jboss.threads] (CLI command executor) JBoss Threads version 2.4.0.Final
22:21:14,315 INFO  [org.jboss.as] (MSC service thread 1-1) WFLYSRV0049: Keycloak 15.0.2 (WildFly Core 15.0.1.Final) starting
22:21:15,378 INFO  [org.wildfly.security] (ServerService Thread Pool -- 22) ELY00001: WildFly Elytron version 1.15.3.Final
22:21:15,677 INFO  [org.jboss.as.controller.management-deprecated] (ServerService Thread Pool -- 18) WFLYCTL0033: Extension 'security' is deprecated and may not be supported in future versions
22:21:16,079 INFO  [org.jboss.as.controller.management-deprecated] (Controller Boot Thread) WFLYCTL0028: Attribute 'security-realm' in the resource at address '/core-service=management/management-interface=http-interface' is deprecated, and may be removed in a future version. See the attribute description in the output of the read-resource-description operation to learn more about the deprecation.
22:21:16,245 INFO  [org.jboss.as.controller.management-deprecated] (Controller Boot Thread) WFLYCTL0028: Attribute 'security-realm' in the resource at address '/subsystem=undertow/server=default-server/https-listener=https' is deprecated, and may be removed in a future version. See the attribute description in the output of the read-resource-description operation to learn more about the deprecation.
22:21:16,444 WARN  [org.wildfly.extension.elytron] (MSC service thread 1-3) WFLYELY00023: KeyStore file '/opt/jboss/keycloak/standalone/configuration/application.keystore' does not exist. Used blank.
22:21:16,458 WARN  [org.wildfly.extension.elytron] (MSC service thread 1-4) WFLYELY01084: KeyStore /opt/jboss/keycloak/standalone/configuration/application.keystore not found, it will be auto generated on first use with a self-signed certificate for host localhost
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.wildfly.extension.elytron.SSLDefinitions (jar:file:/opt/jboss/keycloak/modules/system/layers/base/org/wildfly/extension/elytron/main/wildfly-elytron-integration-15.0.1.Final.jar!/) to method com.sun.net.ssl.internal.ssl.Provider.isFIPS()
WARNING: Please consider reporting this to the maintainers of org.wildfly.extension.elytron.SSLDefinitions
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
22:21:16,486 INFO  [org.jboss.as.patching] (MSC service thread 1-3) WFLYPAT0050: Keycloak cumulative patch ID is: base, one-off patches include: none
22:21:16,521 WARN  [org.jboss.as.domain.management.security] (MSC service thread 1-1) WFLYDM0111: Keystore /opt/jboss/keycloak/standalone/configuration/application.keystore not found, it will be auto generated on first use with a self signed certificate for host localhost
22:21:16,642 INFO  [org.jboss.as.server] (Controller Boot Thread) WFLYSRV0212: Resuming server
22:21:16,653 INFO  [org.jboss.as] (Controller Boot Thread) WFLYSRV0025: Keycloak 15.0.2 (WildFly Core 15.0.1.Final) started in 2550ms - Started 59 of 89 services (39 services are lazy, passive or on-demand)
The batch executed successfully
22:21:16,928 INFO  [org.jboss.as] (MSC service thread 1-3) WFLYSRV0050: Keycloak 15.0.2 (WildFly Core 15.0.1.Final) stopped in 8ms
=========================================================================

  JBoss Bootstrap Environment

  JBOSS_HOME: /opt/jboss/keycloak

  JAVA: java

  JAVA_OPTS:  -server -Xms64m -Xmx512m -XX:MetaspaceSize=96M -XX:MaxMetaspaceSize=256m -Djava.net.preferIPv4Stack=true -Djboss.modules.system.pkgs=org.jboss.byteman -Djava.awt.headless=true   --add-exports=java.base/sun.nio.ch=ALL-UNNAMED --add-exports=jdk.unsupported/sun.misc=ALL-UNNAMED --add-exports=jdk.unsupported/sun.reflect=ALL-UNNAMED

=========================================================================

22:21:17,595 INFO  [org.jboss.modules] (main) JBoss Modules version 1.11.0.Final
22:21:18,065 INFO  [org.jboss.msc] (main) JBoss MSC version 1.4.12.Final
22:21:18,072 INFO  [org.jboss.threads] (main) JBoss Threads version 2.4.0.Final
22:21:18,176 INFO  [org.jboss.as] (MSC service thread 1-1) WFLYSRV0049: Keycloak 15.0.2 (WildFly Core 15.0.1.Final) starting
22:21:19,215 INFO  [org.wildfly.security] (ServerService Thread Pool -- 19) ELY00001: WildFly Elytron version 1.15.3.Final
22:21:19,641 INFO  [org.jboss.as.controller.management-deprecated] (ServerService Thread Pool -- 11) WFLYCTL0033: Extension 'security' is deprecated and may not be supported in future versions
22:21:20,027 INFO  [org.jboss.as.controller.management-deprecated] (Controller Boot Thread) WFLYCTL0028: Attribute 'security-realm' in the resource at address '/core-service=management/management-interface=http-interface' is deprecated, and may be removed in a future version. See the attribute description in the output of the read-resource-description operation to learn more about the deprecation.
22:21:20,125 INFO  [org.jboss.as.controller.management-deprecated] (ServerService Thread Pool -- 26) WFLYCTL0028: Attribute 'security-realm' in the resource at address '/subsystem=undertow/server=default-server/https-listener=https' is deprecated, and may be removed in a future version. See the attribute description in the output of the read-resource-description operation to learn more about the deprecation.
22:21:20,254 INFO  [org.jboss.as.server] (Controller Boot Thread) WFLYSRV0039: Creating http management service using socket-binding (management-http)
22:21:20,273 INFO  [org.xnio] (MSC service thread 1-3) XNIO version 3.8.4.Final
22:21:20,283 INFO  [org.xnio.nio] (MSC service thread 1-3) XNIO NIO Implementation Version 3.8.4.Final
22:21:20,369 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 37) WFLYCLINF0001: Activating Infinispan subsystem.
22:21:20,384 INFO  [org.jboss.remoting] (MSC service thread 1-1) JBoss Remoting version 5.0.20.Final
22:21:20,386 INFO  [org.jboss.as.security] (ServerService Thread Pool -- 49) WFLYSEC0002: Activating Security Subsystem
22:21:20,391 INFO  [org.jboss.as.naming] (ServerService Thread Pool -- 46) WFLYNAM0001: Activating Naming Subsystem
22:21:20,389 WARN  [org.jboss.as.txn] (ServerService Thread Pool -- 51) WFLYTX0013: The node-identifier attribute on the /subsystem=transactions is set to the default value. This is a danger for environments running multiple servers. Please make sure the attribute value is unique.
22:21:20,427 INFO  [org.wildfly.extension.io] (ServerService Thread Pool -- 38) WFLYIO001: Worker 'default' has auto-configured to 4 IO threads with 32 max task threads based on your 2 available processors
22:21:20,432 INFO  [org.wildfly.extension.health] (ServerService Thread Pool -- 36) WFLYHEALTH0001: Activating Base Health Subsystem
22:21:20,428 INFO  [org.jboss.as.security] (MSC service thread 1-2) WFLYSEC0001: Current PicketBox version=5.0.3.Final-redhat-00007
22:21:20,468 INFO  [org.jboss.as.mail.extension] (MSC service thread 1-1) WFLYMAIL0001: Bound mail session [java:jboss/mail/Default]
22:21:20,465 INFO  [org.jboss.as.naming] (MSC service thread 1-4) WFLYNAM0003: Starting Naming Service
22:21:20,500 INFO  [org.wildfly.extension.metrics] (ServerService Thread Pool -- 45) WFLYMETRICS0001: Activating Base Metrics Subsystem
22:21:20,503 INFO  [org.jboss.as.jaxrs] (ServerService Thread Pool -- 39) WFLYRS0016: RESTEasy version 3.15.1.Final
22:21:20,546 INFO  [org.jboss.as.connector.subsystems.datasources] (ServerService Thread Pool -- 31) WFLYJCA0004: Deploying JDBC-compliant driver class org.h2.Driver (version 1.4)
22:21:20,598 INFO  [org.jboss.as.connector] (MSC service thread 1-4) WFLYJCA0009: Starting Jakarta Connectors Subsystem (WildFly/IronJacamar 1.4.27.Final)
22:21:20,676 INFO  [org.wildfly.extension.undertow] (MSC service thread 1-4) WFLYUT0003: Undertow 2.2.5.Final starting
22:21:20,680 INFO  [org.jboss.as.connector.subsystems.datasources] (ServerService Thread Pool -- 31) WFLYJCA0005: Deploying non-JDBC-compliant driver class org.postgresql.Driver (version 42.2)
22:21:20,713 WARN  [org.wildfly.clustering.web.undertow] (ServerService Thread Pool -- 52) WFLYCLWEBUT0007: No routing provider found for default-server; using legacy provider based on static configuration
22:21:20,734 INFO  [org.jboss.as.connector.deployers.jdbc] (MSC service thread 1-1) WFLYJCA0018: Started Driver service with driver-name = h2
22:21:20,739 INFO  [org.jboss.as.connector.deployers.jdbc] (MSC service thread 1-4) WFLYJCA0018: Started Driver service with driver-name = postgresql
22:21:20,776 INFO  [org.jboss.as.ejb3] (MSC service thread 1-1) WFLYEJB0482: Strict pool mdb-strict-max-pool is using a max instance size of 8 (per class), which is derived from the number of CPUs on this host.
22:21:20,784 INFO  [org.jboss.as.ejb3] (MSC service thread 1-4) WFLYEJB0481: Strict pool slsb-strict-max-pool is using a max instance size of 32 (per class), which is derived from thread worker pool sizing.
22:21:20,840 INFO  [org.wildfly.extension.undertow] (ServerService Thread Pool -- 52) WFLYUT0014: Creating file handler for path '/opt/jboss/keycloak/welcome-content' with options [directory-listing: 'false', follow-symlink: 'false', case-sensitive: 'true', safe-symlink-paths: '[]']
22:21:20,925 WARN  [org.wildfly.extension.elytron] (MSC service thread 1-3) WFLYELY00023: KeyStore file '/opt/jboss/keycloak/standalone/configuration/application.keystore' does not exist. Used blank.
22:21:20,986 WARN  [org.wildfly.extension.elytron] (MSC service thread 1-3) WFLYELY01084: KeyStore /opt/jboss/keycloak/standalone/configuration/application.keystore not found, it will be auto generated on first use with a self-signed certificate for host localhost
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.wildfly.extension.elytron.SSLDefinitions (jar:file:/opt/jboss/keycloak/modules/system/layers/base/org/wildfly/extension/elytron/main/wildfly-elytron-integration-15.0.1.Final.jar!/) to method com.sun.net.ssl.internal.ssl.Provider.isFIPS()
WARNING: Please consider reporting this to the maintainers of org.wildfly.extension.elytron.SSLDefinitions
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
22:21:21,026 INFO  [org.wildfly.extension.undertow] (MSC service thread 1-2) WFLYUT0012: Started server default-server.
22:21:21,172 INFO  [org.wildfly.extension.undertow] (MSC service thread 1-1) Queuing requests.
22:21:21,176 INFO  [org.wildfly.extension.undertow] (MSC service thread 1-1) WFLYUT0018: Host default-host starting
22:21:21,191 INFO  [org.wildfly.extension.undertow] (MSC service thread 1-2) WFLYUT0006: Undertow HTTP listener default listening on 0.0.0.0:8080
22:21:21,305 INFO  [org.jboss.as.ejb3] (MSC service thread 1-1) WFLYEJB0493: Jakarta Enterprise Beans subsystem suspension complete
22:21:21,397 INFO  [org.jboss.as.connector.subsystems.datasources] (MSC service thread 1-1) WFLYJCA0001: Bound data source [java:jboss/datasources/ExampleDS]
22:21:21,403 INFO  [org.jboss.as.connector.subsystems.datasources] (MSC service thread 1-2) WFLYJCA0001: Bound data source [java:jboss/datasources/KeycloakDS]
22:21:21,577 INFO  [org.jboss.as.patching] (MSC service thread 1-1) WFLYPAT0050: Keycloak cumulative patch ID is: base, one-off patches include: none
22:21:21,611 WARN  [org.jboss.as.domain.management.security] (MSC service thread 1-4) WFLYDM0111: Keystore /opt/jboss/keycloak/standalone/configuration/application.keystore not found, it will be auto generated on first use with a self signed certificate for host localhost
22:21:21,635 INFO  [org.jboss.as.server.deployment] (MSC service thread 1-3) WFLYSRV0027: Starting deployment of "keycloak-server.war" (runtime-name: "keycloak-server.war")
22:21:21,649 INFO  [org.jboss.as.server.deployment.scanner] (MSC service thread 1-4) WFLYDS0013: Started FileSystemDeploymentService for directory /opt/jboss/keycloak/standalone/deployments
22:21:21,665 INFO  [org.wildfly.extension.undertow] (MSC service thread 1-1) WFLYUT0006: Undertow HTTPS listener https listening on 0.0.0.0:8443
22:21:22,455 INFO  [org.infinispan.CONTAINER] (ServerService Thread Pool -- 54) ISPN000128: Infinispan version: Infinispan 'Corona Extra' 11.0.9.Final
22:21:22,517 INFO  [org.infinispan.CONFIG] (MSC service thread 1-2) ISPN000152: Passivation configured without an eviction policy being selected. Only manually evicted entities will be passivated.
22:21:22,521 INFO  [org.infinispan.CONFIG] (MSC service thread 1-2) ISPN000152: Passivation configured without an eviction policy being selected. Only manually evicted entities will be passivated.
22:21:22,584 INFO  [org.infinispan.PERSISTENCE] (ServerService Thread Pool -- 54) ISPN000556: Starting user marshaller 'org.wildfly.clustering.infinispan.spi.marshalling.InfinispanProtoStreamMarshaller'
22:21:22,589 INFO  [org.infinispan.PERSISTENCE] (ServerService Thread Pool -- 55) ISPN000556: Starting user marshaller 'org.wildfly.clustering.infinispan.marshalling.jboss.JBossMarshaller'
22:21:22,822 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 54) WFLYCLINF0002: Started http-remoting-connector cache from ejb container
22:21:22,858 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 56) WFLYCLINF0002: Started loginFailures cache from keycloak container
22:21:22,867 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 64) WFLYCLINF0002: Started clientSessions cache from keycloak container
22:21:22,875 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 59) WFLYCLINF0002: Started authenticationSessions cache from keycloak container
22:21:22,864 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 63) WFLYCLINF0002: Started sessions cache from keycloak container
22:21:22,860 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 60) WFLYCLINF0002: Started actionTokens cache from keycloak container
22:21:22,858 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 61) WFLYCLINF0002: Started offlineClientSessions cache from keycloak container
22:21:22,877 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 58) WFLYCLINF0002: Started work cache from keycloak container
22:21:22,894 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 66) WFLYCLINF0002: Started offlineSessions cache from keycloak container
22:21:22,902 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 65) WFLYCLINF0002: Started authorization cache from keycloak container
22:21:22,903 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 57) WFLYCLINF0002: Started realms cache from keycloak container
22:21:22,910 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 62) WFLYCLINF0002: Started users cache from keycloak container
22:21:22,912 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 55) WFLYCLINF0002: Started keys cache from keycloak container
22:21:23,012 WARN  [org.jboss.as.server.deployment] (MSC service thread 1-1) WFLYSRV0273: Excluded subsystem webservices via jboss-deployment-structure.xml does not exist.
22:21:23,630 INFO  [org.keycloak.services] (ServerService Thread Pool -- 55) KC-SERVICES0001: Loading config from standalone.xml or domain.xml
22:21:23,680 INFO  [org.keycloak.common.Profile] (ServerService Thread Pool -- 55) Preview feature enabled: admin_fine_grained_authz
22:21:23,681 INFO  [org.keycloak.common.Profile] (ServerService Thread Pool -- 55) Preview feature enabled: token_exchange
22:21:23,760 INFO  [org.keycloak.url.DefaultHostnameProviderFactory] (ServerService Thread Pool -- 55) Frontend: https://keycloak-eclipse-che.vwtg.cloud.suedleasing-dev.com/auth, Admin: <frontend>, Backend: <request>
22:21:24,653 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 55) WFLYCLINF0002: Started realmRevisions cache from keycloak container
22:21:24,661 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 55) WFLYCLINF0002: Started userRevisions cache from keycloak container
22:21:24,674 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 55) WFLYCLINF0002: Started authorizationRevisions cache from keycloak container
22:21:24,676 INFO  [org.keycloak.connections.infinispan.DefaultInfinispanConnectionProviderFactory] (ServerService Thread Pool -- 55) Node name: keycloak-65777d7db7-fpl22, Site name: null
22:21:25,711 INFO  [org.keycloak.connections.jpa.DefaultJpaConnectionProviderFactory] (ServerService Thread Pool -- 55) Database info: {databaseUrl=jdbc:postgresql://postgres:5432/keycloak, databaseUser=keycloak, databaseProduct=PostgreSQL 13.3, databaseDriver=PostgreSQL JDBC Driver 42.2.5}
22:21:26,999 INFO  [org.hibernate.jpa.internal.util.LogHelper] (ServerService Thread Pool -- 55) HHH000204: Processing PersistenceUnitInfo [
    name: keycloak-default
    ...]
22:21:27,048 INFO  [org.hibernate.Version] (ServerService Thread Pool -- 55) HHH000412: Hibernate Core {5.3.20.Final}
22:21:27,049 INFO  [org.hibernate.cfg.Environment] (ServerService Thread Pool -- 55) HHH000206: hibernate.properties not found
22:21:27,165 INFO  [org.hibernate.annotations.common.Version] (ServerService Thread Pool -- 55) HCANN000001: Hibernate Commons Annotations {5.0.5.Final}
22:21:27,347 INFO  [org.hibernate.dialect.Dialect] (ServerService Thread Pool -- 55) HHH000400: Using dialect: org.hibernate.dialect.PostgreSQL95Dialect
22:21:27,541 INFO  [org.hibernate.engine.jdbc.env.internal.LobCreatorBuilderImpl] (ServerService Thread Pool -- 55) HHH000424: Disabling contextual LOB creation as createClob() method threw error : java.lang.reflect.InvocationTargetException
22:21:27,545 INFO  [org.hibernate.type.BasicTypeRegistry] (ServerService Thread Pool -- 55) HHH000270: Type registration [java.util.UUID] overrides previous : org.hibernate.type.UUIDBinaryType@37fc99e8
22:21:27,550 INFO  [org.hibernate.envers.boot.internal.EnversServiceImpl] (ServerService Thread Pool -- 55) Envers integration enabled? : true
22:21:27,885 INFO  [org.hibernate.orm.beans] (ServerService Thread Pool -- 55) HHH10005002: No explicit CDI BeanManager reference was passed to Hibernate, but CDI is available on the Hibernate ClassLoader.
22:21:27,944 INFO  [org.hibernate.validator.internal.util.Version] (ServerService Thread Pool -- 55) HV000001: Hibernate Validator 6.0.22.Final
22:21:29,075 INFO  [org.hibernate.hql.internal.QueryTranslatorFactoryInitiator] (ServerService Thread Pool -- 55) HHH000397: Using ASTQueryTranslatorFactory
22:21:29,912 INFO  [org.keycloak.services] (ServerService Thread Pool -- 55) KC-SERVICES0006: Importing users from '/opt/jboss/keycloak/standalone/configuration/keycloak-add-user.json'
22:21:30,093 WARN  [org.keycloak.services] (ServerService Thread Pool -- 55) KC-SERVICES0104: Not creating user admin. It already exists.
22:21:30,145 INFO  [org.jboss.resteasy.resteasy_jaxrs.i18n] (ServerService Thread Pool -- 55) RESTEASY002225: Deploying javax.ws.rs.core.Application: class org.keycloak.services.resources.KeycloakApplication
22:21:30,147 INFO  [org.jboss.resteasy.resteasy_jaxrs.i18n] (ServerService Thread Pool -- 55) RESTEASY002200: Adding class resource org.keycloak.services.resources.ThemeResource from Application class org.keycloak.services.resources.KeycloakApplication
22:21:30,148 INFO  [org.jboss.resteasy.resteasy_jaxrs.i18n] (ServerService Thread Pool -- 55) RESTEASY002200: Adding class resource org.keycloak.services.resources.JsResource from Application class org.keycloak.services.resources.KeycloakApplication
22:21:30,149 INFO  [org.jboss.resteasy.resteasy_jaxrs.i18n] (ServerService Thread Pool -- 55) RESTEASY002205: Adding provider class org.keycloak.services.filters.KeycloakSecurityHeadersFilter from Application class org.keycloak.services.resources.KeycloakApplication
22:21:30,149 INFO  [org.jboss.resteasy.resteasy_jaxrs.i18n] (ServerService Thread Pool -- 55) RESTEASY002205: Adding provider class org.keycloak.services.error.KeycloakErrorHandler from Application class org.keycloak.services.resources.KeycloakApplication
22:21:30,151 INFO  [org.jboss.resteasy.resteasy_jaxrs.i18n] (ServerService Thread Pool -- 55) RESTEASY002220: Adding singleton resource org.keycloak.services.resources.RealmsResource from Application class org.keycloak.services.resources.KeycloakApplication
22:21:30,151 INFO  [org.jboss.resteasy.resteasy_jaxrs.i18n] (ServerService Thread Pool -- 55) RESTEASY002210: Adding provider singleton org.keycloak.services.util.ObjectMapperResolver from Application class org.keycloak.services.resources.KeycloakApplication
22:21:30,153 INFO  [org.jboss.resteasy.resteasy_jaxrs.i18n] (ServerService Thread Pool -- 55) RESTEASY002220: Adding singleton resource org.keycloak.services.resources.admin.AdminRoot from Application class org.keycloak.services.resources.KeycloakApplication
22:21:30,154 INFO  [org.jboss.resteasy.resteasy_jaxrs.i18n] (ServerService Thread Pool -- 55) RESTEASY002220: Adding singleton resource org.keycloak.services.resources.RobotsResource from Application class org.keycloak.services.resources.KeycloakApplication
22:21:30,155 INFO  [org.jboss.resteasy.resteasy_jaxrs.i18n] (ServerService Thread Pool -- 55) RESTEASY002220: Adding singleton resource org.keycloak.services.resources.WelcomeResource from Application class org.keycloak.services.resources.KeycloakApplication
22:21:30,251 INFO  [org.wildfly.extension.undertow] (ServerService Thread Pool -- 55) WFLYUT0021: Registered web context: '/auth' for server 'default-server'
22:21:30,320 INFO  [org.jboss.as.server] (ServerService Thread Pool -- 43) WFLYSRV0010: Deployed "keycloak-server.war" (runtime-name : "keycloak-server.war")
22:21:30,349 INFO  [org.jboss.as.server] (Controller Boot Thread) WFLYSRV0212: Resuming server
22:21:30,351 INFO  [org.jboss.as] (Controller Boot Thread) WFLYSRV0025: Keycloak 15.0.2 (WildFly Core 15.0.1.Final) started in 13091ms - Started 595 of 873 services (584 services are lazy, passive or on-demand)
22:21:30,352 INFO  [org.jboss.as] (Controller Boot Thread) WFLYSRV0060: Http management interface listening on http://127.0.0.1:9990/management
22:21:30,352 INFO  [org.jboss.as] (Controller Boot Thread) WFLYSRV0051: Admin console listening on http://127.0.0.1:9990
Checking meta /var/www/html/v3/plugins/eamodio/vscode-gitlens/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/eamodio/vscode-gitlens/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/eamodio/gitlens/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/eamodio/gitlens/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/ms-python/python/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/ms-python/python/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/atlassian/atlascode/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/atlassian/atlascode/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/timonwong/shellcheck/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/timonwong/shellcheck/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/che-incubator/intellij-community/latest/devfile.yaml
Checking meta /var/www/html/v3/plugins/che-incubator/intellij-community/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/che-incubator/pycharm/latest/devfile.yaml
Checking meta /var/www/html/v3/plugins/che-incubator/pycharm/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/che-incubator/theia-dev/latest/devfile.yaml
Checking meta /var/www/html/v3/plugins/che-incubator/theia-dev/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/che-incubator/typescript/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/che-incubator/typescript/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/zxh404/protobuf/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/zxh404/protobuf/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/zxh404/vscode-proto3/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/zxh404/vscode-proto3/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/dirigiblelabs/dirigible/latest/devfile.yaml
Checking meta /var/www/html/v3/plugins/dirigiblelabs/dirigible/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/cdr/code-server/latest/devfile.yaml
Checking meta /var/www/html/v3/plugins/cdr/code-server/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/sonarsource/sonarlint-vscode/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/sonarsource/sonarlint-vscode/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/broadcommfd/debugger-for-mainframe/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/broadcommfd/debugger-for-mainframe/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/broadcommfd/hlasm-language-support/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/broadcommfd/hlasm-language-support/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/broadcommfd/ccf/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/broadcommfd/ccf/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/broadcommfd/cobol-language-support/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/broadcommfd/cobol-language-support/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/broadcommfd/explorer-for-endevor/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/broadcommfd/explorer-for-endevor/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/microprofile-community/mp-starter-vscode-ext/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/microprofile-community/mp-starter-vscode-ext/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/stylelint/vscode-stylelint/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/stylelint/vscode-stylelint/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/ms-vscode/js-debug/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/ms-vscode/js-debug/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/ms-vscode/node-debug2/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/ms-vscode/node-debug2/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/ms-vscode/node-debug/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/ms-vscode/node-debug/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/ms-vscode/vscode-github-pullrequest/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/ms-vscode/vscode-github-pullrequest/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/bierner/markdown-mermaid/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/bierner/markdown-mermaid/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/scalameta/metals/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/scalameta/metals/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/vuejs/vetur/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/vuejs/vetur/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/dart-code/flutter/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/dart-code/flutter/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/dart-code/dart-code/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/dart-code/dart-code/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/castwide/solargraph/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/castwide/solargraph/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/ivory-lab/jenkinsfile-support/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/ivory-lab/jenkinsfile-support/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/4ops/terraform/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/4ops/terraform/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/errata-ai/vale-server/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/errata-ai/vale-server/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/dbaeumer/vscode-eslint/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/dbaeumer/vscode-eslint/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/gattytto/dart-code/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/gattytto/dart-code/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/donjayamanne/githistory/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/donjayamanne/githistory/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/github/vscode-pull-request-github/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/github/vscode-pull-request-github/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/java8/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/java8/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/mta-vscode-extension/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/mta-vscode-extension/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-quarkus/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-quarkus/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-xml/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-xml/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/java/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/java/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-camelk/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-camelk/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-wsdl2rest/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-wsdl2rest/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-openshift-connector/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-openshift-connector/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-microprofile/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-microprofile/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/fabric8-analytics/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/fabric8-analytics/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-didact/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-didact/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-yaml/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-yaml/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/project-initializer/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/project-initializer/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-tekton-pipelines/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-tekton-pipelines/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-apache-camel/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-apache-camel/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/php-debugger/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/php-debugger/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/java11/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/java11/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-commons/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-commons/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/mta/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/mta/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/dependency-analytics/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/dependency-analytics/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-tekton/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/vscode-tekton/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/quarkus-java8/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/quarkus-java8/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat/quarkus-java11/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat/quarkus-java11/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/rogalmic/bash-debug/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/rogalmic/bash-debug/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/golang/go/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/golang/go/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/sdkbox/vscode-libra-move/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/sdkbox/vscode-libra-move/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/bpruitt-goddard/mermaid-markdown-syntax-highlighting/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/bpruitt-goddard/mermaid-markdown-syntax-highlighting/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/bmewburn/vscode-intelephense-client/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/bmewburn/vscode-intelephense-client/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/scala-lang/scala/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/scala-lang/scala/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/eclipse/che-theia/next/devfile.yaml
Checking meta /var/www/html/v3/plugins/eclipse/che-theia/next/meta.yaml
Checking meta /var/www/html/v3/plugins/eclipse/che-theia/latest/devfile.yaml
Checking meta /var/www/html/v3/plugins/eclipse/che-theia/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/eclipse/che-async-pv-plugin/latest/devfile.yaml
Checking meta /var/www/html/v3/plugins/eclipse/che-async-pv-plugin/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/eclipse/che-async-pv-plugin/nightly/devfile.yaml
Checking meta /var/www/html/v3/plugins/eclipse/che-async-pv-plugin/nightly/meta.yaml
Checking meta /var/www/html/v3/plugins/esbenp/prettier-vscode/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/esbenp/prettier-vscode/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/octref/vetur/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/octref/vetur/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/zowe/vscode-extension-for-zowe/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/zowe/vscode-extension-for-zowe/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/containers/buildah/latest/devfile.yaml
Checking meta /var/www/html/v3/plugins/containers/buildah/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/vscodevim/vim/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/vscodevim/vim/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/jebbs/plantuml/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/jebbs/plantuml/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/mads-hartmann/bash-ide-vscode/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/mads-hartmann/bash-ide-vscode/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/mads-hartmann/bash-ide/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/mads-hartmann/bash-ide/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/vscode/typescript-language-features/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/vscode/typescript-language-features/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/randomchance/logstash/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/randomchance/logstash/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/llvm-vs-code-extensions/vscode-clangd/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/llvm-vs-code-extensions/vscode-clangd/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/bazelbuild/vscode-bazel/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/bazelbuild/vscode-bazel/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/eclipse-cdt/cdt-gdb-vscode/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/eclipse-cdt/cdt-gdb-vscode/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/eclipse-cdt/cdt-vscode/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/eclipse-cdt/cdt-vscode/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/ms-kubernetes-tools/vscode-kubernetes-tools/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/ms-kubernetes-tools/vscode-kubernetes-tools/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/microshed/mp-starter-vscode-ext/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/microshed/mp-starter-vscode-ext/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat-developer/netcoredbg-theia-plugin/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat-developer/netcoredbg-theia-plugin/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/redhat-developer/che-omnisharp-plugin/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/redhat-developer/che-omnisharp-plugin/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/felixfbecker/php-debug/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/felixfbecker/php-debug/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/felixfbecker/vscode-php-debug/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/felixfbecker/vscode-php-debug/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/asciidoctor/asciidoctor-vscode/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/asciidoctor/asciidoctor-vscode/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/che-theia/che-openshift-authentication-plugin/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/che-theia/che-openshift-authentication-plugin/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/fbaligand/vscode-logstash-editor/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/fbaligand/vscode-logstash-editor/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/lfs/vscode-emacs-friendly/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/lfs/vscode-emacs-friendly/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/joaompinto/asciidoctor-vscode/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/joaompinto/asciidoctor-vscode/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/moby/buildkit/latest/devfile.yaml
Checking meta /var/www/html/v3/plugins/moby/buildkit/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/vscjava/vscode-java-debug/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/vscjava/vscode-java-debug/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/vscjava/vscode-java-test/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/vscjava/vscode-java-test/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/pizzafactory/vscode-libra-move/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/pizzafactory/vscode-libra-move/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/pizzafactory/xtend-lang/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/pizzafactory/xtend-lang/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/pizzafactory/xtext-lang/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/pizzafactory/xtext-lang/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/ws-skeleton/jupyter/latest/devfile.yaml
Checking meta /var/www/html/v3/plugins/ws-skeleton/jupyter/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/ws-skeleton/eclipseide/latest/devfile.yaml
Checking meta /var/www/html/v3/plugins/ws-skeleton/eclipseide/latest/meta.yaml
Checking meta /var/www/html/v3/plugins/rust-lang/rust/latest/che-theia-plugin.yaml
Checking meta /var/www/html/v3/plugins/rust-lang/rust/latest/meta.yaml
[Wed Nov 10 22:21:41.147908 2021] [mpm_event:notice] [pid 1:tid 139708038937416] AH00489: Apache/2.4.46 (Unix) configured -- resuming normal operations
[Wed Nov 10 22:21:41.147954 2021] [core:notice] [pid 1:tid 139708038937416] AH00094: Command line: 'httpd -D FOREGROUND'
[Wed Nov 10 22:21:41.148065 2021] [unixd:alert] [pid 14:tid 139708038937416] (1)Operation not permitted: AH02156: setgid: unable to set group id to Group 2
[Wed Nov 10 22:21:41.155715 2021] [unixd:alert] [pid 15:tid 139708038937416] (1)Operation not permitted: AH02156: setgid: unable to set group id to Group 2
[Wed Nov 10 22:21:41.155932 2021] [unixd:alert] [pid 13:tid 139708038937416] (1)Operation not permitted: AH02156: setgid: unable to set group id to Group 2
10.230.13.4 - - [10/Nov/2021:22:21:44 +0000] "GET /plugins/ HTTP/1.1" 302 220
10.230.13.4 - - [10/Nov/2021:22:21:44 +0000] "GET /v3/plugins/ HTTP/1.1" 200 44970
10.230.13.4 - - [10/Nov/2021:22:21:54 +0000] "GET /plugins/ HTTP/1.1" 302 220
10.230.13.4 - - [10/Nov/2021:22:21:54 +0000] "GET /v3/plugins/ HTTP/1.1" 200 44970
10.230.13.4 - - [10/Nov/2021:22:22:04 +0000] "GET /plugins/ HTTP/1.1" 302 220
10.230.13.4 - - [10/Nov/2021:22:22:04 +0000] "GET /v3/plugins/ HTTP/1.1" 200 44970
10.230.13.4 - - [10/Nov/2021:22:22:14 +0000] "GET /v3/plugins/ HTTP/1.1" 200 44970
10.230.13.4 - - [10/Nov/2021:22:22:14 +0000] "GET /plugins/ HTTP/1.1" 302 220
10.230.13.4 - - [10/Nov/2021:22:22:16 +0000] "GET /plugins/ HTTP/1.1" 302 220
10.230.13.4 - - [10/Nov/2021:22:22:16 +0000] "GET /v3/plugins/ HTTP/1.1" 200 44970
10.230.13.4 - - [10/Nov/2021:22:22:24 +0000] "GET /plugins/ HTTP/1.1" 302 220
10.230.13.4 - - [10/Nov/2021:22:22:24 +0000] "GET /v3/plugins/ HTTP/1.1" 200 44970
10.230.13.4 - - [10/Nov/2021:22:22:26 +0000] "GET /plugins/ HTTP/1.1" 302 220
10.230.13.4 - - [10/Nov/2021:22:22:26 +0000] "GET /v3/plugins/ HTTP/1.1" 200 44970
10.230.13.4 - - [10/Nov/2021:22:22:34 +0000] "GET /plugins/ HTTP/1.1" 302 220
10.230.13.4 - - [10/Nov/2021:22:22:34 +0000] "GET /v3/plugins/ HTTP/1.1" 200 44970
10.230.13.4 - - [10/Nov/2021:22:22:36 +0000] "GET /plugins/ HTTP/1.1" 302 220
10.230.13.4 - - [10/Nov/2021:22:22:36 +0000] "GET /v3/plugins/ HTTP/1.1" 200 44970
10.230.13.4 - - [10/Nov/2021:22:22:44 +0000] "GET /plugins/ HTTP/1.1" 302 220
10.230.13.4 - - [10/Nov/2021:22:22:44 +0000] "GET /v3/plugins/ HTTP/1.1" 200 44970
10.230.13.4 - - [10/Nov/2021:22:22:46 +0000] "GET /plugins/ HTTP/1.1" 302 220
10.230.13.4 - - [10/Nov/2021:22:22:46 +0000] "GET /v3/plugins/ HTTP/1.1" 200 44970
10.230.13.4 - - [10/Nov/2021:22:22:54 +0000] "GET /v3/plugins/ HTTP/1.1" 200 44970
10.230.13.4 - - [10/Nov/2021:22:22:54 +0000] "GET /plugins/ HTTP/1.1" 302 220
10.230.13.4 - - [10/Nov/2021:22:22:56 +0000] "GET /plugins/ HTTP/1.1" 302 220
10.230.13.4 - - [10/Nov/2021:22:22:56 +0000] "GET /v3/plugins/ HTTP/1.1" 200 44970
10.230.13.4 - - [10/Nov/2021:22:23:04 +0000] "GET /plugins/ HTTP/1.1" 302 220
10.230.13.4 - - [10/Nov/2021:22:23:04 +0000] "GET /v3/plugins/ HTTP/1.1" 200 44970
10.230.13.4 - - [10/Nov/2021:22:23:06 +0000] "GET /plugins/ HTTP/1.1" 302 220
10.230.13.4 - - [10/Nov/2021:22:23:06 +0000] "GET /v3/plugins/ HTTP/1.1" 200 44970
2021-11-10T22:18:14.356Z    INFO    Binary info     {"Go version": "go1.15.14"}
2021-11-10T22:18:14.356Z    INFO    Binary info     {"OS": "linux", "Arch": "amd64"}
2021-11-10T22:18:14.356Z    INFO    Address     {"Metrics": ":60000"}
2021-11-10T22:18:14.356Z    INFO    Address     {"Probe": ":6789"}
2021-11-10T22:18:14.356Z    INFO    Operator is running on  {"Infrastructure": "Kubernetes"}
I1110 22:18:15.408291       1 request.go:668] Waited for 1.042505968s due to client-side throttling, not priority and fairness, request: GET:https://10.0.0.1:443/apis/org.eclipse.che/v1?timeout=32s
2021-11-10T22:18:16.568Z    INFO    controller-runtime.metrics  metrics server is starting to listen    {"addr": ":60000"}
time="2021-11-10T22:18:17Z" level=info msg="Use 'terminationGracePeriodSeconds' 20 sec. from operator deployment."
time="2021-11-10T22:18:17Z" level=info msg="Set up process signal handler"
2021-11-10T22:18:18.795Z    INFO    setup   DevWorkspace support disabled. Will initiate restart when CheCluster with devworkspaces enabled will appear
2021-11-10T22:18:18.795Z    INFO    setup   starting manager
I1110 22:18:18.795209       1 leaderelection.go:243] attempting to acquire leader lease eclipse-che/e79b08a4.org.eclipse.che...
2021-11-10T22:18:18.795Z    INFO    controller-runtime.manager  starting metrics server {"path": "/metrics"}
I1110 22:18:58.996587       1 leaderelection.go:253] successfully acquired lease eclipse-che/e79b08a4.org.eclipse.che
2021-11-10T22:18:58.996Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:58.997Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:58.997Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:58.997Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:58.997Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:58.997Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:58.997Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:58.997Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:58.997Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:58.997Z    INFO    controller-runtime.manager.controller.checluster    Starting Controller {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster"}
2021-11-10T22:18:58.997Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:58.997Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:58.998Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:58.999Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:59.000Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:59.000Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:59.000Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:59.000Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:59.000Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:59.000Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:59.000Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:59.000Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:59.000Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:59.000Z    INFO    controller-runtime.manager.controller.checluster    Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "source": "kind source: /, Kind="}
2021-11-10T22:18:59.000Z    INFO    controller-runtime.manager.controller.checluster    Starting Controller {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster"}
2021-11-10T22:18:58.998Z    INFO    controller-runtime.manager.controller.checlusterrestore-controller  Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheClusterRestore", "source": "kind source: /, Kind="}
2021-11-10T22:18:59.000Z    INFO    controller-runtime.manager.controller.checlusterrestore-controller  Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheClusterRestore", "source": "kind source: /, Kind="}
2021-11-10T22:18:59.000Z    INFO    controller-runtime.manager.controller.checlusterrestore-controller  Starting Controller {"reconciler group": "org.eclipse.che", "reconciler kind": "CheClusterRestore"}
2021-11-10T22:18:58.998Z    INFO    controller-runtime.manager.controller.checlusterbackup-controller   Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheClusterBackup", "source": "kind source: /, Kind="}
2021-11-10T22:18:59.000Z    INFO    controller-runtime.manager.controller.checlusterbackup-controller   Starting EventSource    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheClusterBackup", "source": "kind source: /, Kind="}
2021-11-10T22:18:59.000Z    INFO    controller-runtime.manager.controller.checlusterbackup-controller   Starting Controller {"reconciler group": "org.eclipse.che", "reconciler kind": "CheClusterBackup"}
2021-11-10T22:18:59.099Z    INFO    controller-runtime.manager.controller.checluster    Starting workers    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "worker count": 1}
2021-11-10T22:18:59.115Z    INFO    controller-runtime.manager.controller.checlusterrestore-controller  Starting workers    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheClusterRestore", "worker count": 1}
2021-11-10T22:18:59.115Z    INFO    controller-runtime.manager.controller.checluster    Starting workers    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "worker count": 1}
2021-11-10T22:18:59.114Z    INFO    controller-runtime.manager.controller.checlusterbackup-controller   Starting workers    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheClusterBackup", "worker count": 1}
I1110 22:19:00.166088       1 request.go:668] Waited for 1.035823223s due to client-side throttling, not priority and fairness, request: GET:https://10.0.0.1:443/apis/scheduling.k8s.io/v1?timeout=32s
time="2021-11-10T22:19:00Z" level=info msg="Creating a new object: v1.ConfigMap, name: ca-certs-merged"
time="2021-11-10T22:19:00Z" level=info msg="Creating a new object: v1.ServiceAccount, name: che"
time="2021-11-10T22:19:00Z" level=info msg="Creating a new object: v1.ClusterRole, name: eclipse-che-cheworkspaces-clusterrole"
time="2021-11-10T22:19:00Z" level=info msg="Creating a new object: v1.ClusterRoleBinding, name: eclipse-che-cheworkspaces-clusterrole"
time="2021-11-10T22:19:00Z" level=info msg="Added finalizer: cheWorkspaces.clusterpermissions.finalizers.che.eclipse.org"
time="2021-11-10T22:19:00Z" level=info msg="Creating a new object: v1.ClusterRole, name: eclipse-che-cheworkspaces-namespaces-clusterrole"
time="2021-11-10T22:19:00Z" level=info msg="Creating a new object: v1.ClusterRoleBinding, name: eclipse-che-cheworkspaces-namespaces-clusterrole"
time="2021-11-10T22:19:00Z" level=info msg="Added finalizer: namespaces-editor.permissions.finalizers.che.eclipse.org"
time="2021-11-10T22:19:00Z" level=info msg="Creating a new object: v1.ClusterRole, name: eclipse-che-cheworkspaces-devworkspace-clusterrole"
time="2021-11-10T22:19:00Z" level=info msg="Creating a new object: v1.ClusterRoleBinding, name: eclipse-che-cheworkspaces-devworkspace-clusterrole"
time="2021-11-10T22:19:00Z" level=info msg="Added finalizer: devWorkspace.permissions.finalizers.che.eclipse.org"
time="2021-11-10T22:19:00Z" level=info msg="Custom resource spec eclipse-che updated with installation flavor: che"
time="2021-11-10T22:19:00Z" level=info msg="Creating a new object: v1.Secret, name: che-postgres-secret"
time="2021-11-10T22:19:00Z" level=info msg="Custom resource spec eclipse-che updated with Postgres Secret: che-postgres-secret"
time="2021-11-10T22:19:00Z" level=info msg="Creating a new object: v1.Secret, name: che-identity-postgres-secret"
time="2021-11-10T22:19:00Z" level=info msg="Custom resource spec eclipse-che updated with Identity Provider Postgres Secret: che-identity-postgres-secret"
time="2021-11-10T22:19:00Z" level=info msg="Creating a new object: v1.Secret, name: che-identity-secret"
time="2021-11-10T22:19:00Z" level=info msg="Custom resource spec eclipse-che updated with Identity Provider Secret: che-identity-secret"
time="2021-11-10T22:19:00Z" level=info msg="Custom resource spec eclipse-che updated with Postgres DB: dbche"
time="2021-11-10T22:19:00Z" level=info msg="Custom resource spec eclipse-che updated with Postgres hostname: postgres"
time="2021-11-10T22:19:00Z" level=info msg="Custom resource spec eclipse-che updated with Postgres port: 5432"
time="2021-11-10T22:19:00Z" level=info msg="Custom resource spec eclipse-che updated with Keycloak realm: che"
time="2021-11-10T22:19:00Z" level=info msg="Custom resource spec eclipse-che updated with Keycloak client ID: che-public"
time="2021-11-10T22:19:00Z" level=info msg="Custom resource spec eclipse-che updated with log level: INFO"
time="2021-11-10T22:19:00Z" level=info msg="Custom resource spec eclipse-che updated with serverExposureStrategy: multi-host"
time="2021-11-10T22:19:00Z" level=info msg="Creating a new object: v1.Service, name: postgres"
time="2021-11-10T22:19:00Z" level=info msg="Creating a new object: v1.PersistentVolumeClaim, name: postgres-data"
time="2021-11-10T22:19:00Z" level=info msg="Creating a new object: v1.Deployment, name: postgres"
I1110 22:19:54.384571       1 request.go:668] Waited for 1.041946204s due to client-side throttling, not priority and fairness, request: GET:https://10.0.0.1:443/apis/appprotect.f5.com/v1beta1?timeout=32s
time="2021-11-10T22:19:54Z" level=info msg="Running exec for 'create Keycloak DB, user, privileges' in the pod 'postgres-8fdd8d497-jbz28'"
time="2021-11-10T22:19:55Z" level=info msg="Exec successfully completed."
time="2021-11-10T22:19:55Z" level=info msg="Custom resource status eclipse-che updated with status: provisioned with DB and user: true"
time="2021-11-10T22:19:55Z" level=info msg="Running exec for 'get PostgreSQL version' in the pod 'postgres-8fdd8d497-jbz28'"
time="2021-11-10T22:19:55Z" level=info msg="Exec successfully completed."
time="2021-11-10T22:19:55Z" level=info msg="Custom resource spec eclipse-che updated with database.postgresVersion: 13.3"
time="2021-11-10T22:19:55Z" level=info msg="Creating a new object: v1.Service, name: che-host"
time="2021-11-10T22:19:55Z" level=info msg="Creating a new object: v1.Ingress, name: che"
time="2021-11-10T22:19:57Z" level=info msg="Custom resource spec eclipse-che updated with CheHost URL: che-eclipse-che.vwtg.cloud.suedleasing-dev.com"
time="2021-11-10T22:19:57Z" level=info msg="Creating a new object: v1.Service, name: keycloak"
time="2021-11-10T22:19:57Z" level=info msg="Creating a new object: v1.Ingress, name: keycloak"
time="2021-11-10T22:19:57Z" level=info msg="Custom resource spec eclipse-che updated with Keycloak URL: https://keycloak-eclipse-che.vwtg.cloud.suedleasing-dev.com/auth"
time="2021-11-10T22:19:57Z" level=info msg="Custom resource status eclipse-che updated with Keycloak URL: https://keycloak-eclipse-che.vwtg.cloud.suedleasing-dev.com/auth"
time="2021-11-10T22:19:57Z" level=info msg="Creating a new object: v1.Deployment, name: keycloak"
I1110 22:20:41.032621       1 request.go:668] Waited for 1.043851166s due to client-side throttling, not priority and fairness, request: GET:https://10.0.0.1:443/apis/networking.k8s.io/v1beta1?timeout=32s
time="2021-11-10T22:20:41Z" level=info msg="Running exec for 'Update ssl_required to NONE' in the pod 'postgres-8fdd8d497-jbz28'"
time="2021-11-10T22:20:42Z" level=info msg="Exec successfully completed."
time="2021-11-10T22:20:42Z" level=info msg="Running exec for 'create realm, client and user' in the pod 'keycloak-74c47757d8-7kmzf'"
time="2021-11-10T22:20:55Z" level=info msg="Exec successfully completed."
time="2021-11-10T22:20:55Z" level=info msg="Custom resource status eclipse-che updated with status: provisioned with Keycloak: true"
time="2021-11-10T22:20:55Z" level=info msg="Running exec for 'Update redirect URI-s and webOrigins' in the pod 'keycloak-74c47757d8-7kmzf'"
time="2021-11-10T22:20:58Z" level=info msg="Exec successfully completed."
time="2021-11-10T22:20:58Z" level=info msg="Creating a new object: v1.Service, name: devfile-registry"
time="2021-11-10T22:20:58Z" level=info msg="Creating a new object: v1.Ingress, name: devfile-registry"
time="2021-11-10T22:20:58Z" level=info msg="Custom resource status eclipse-che updated with status: Devfile Registry URL: https://devfile-registry-eclipse-che.vwtg.cloud.suedleasing-dev.com/"
time="2021-11-10T22:20:58Z" level=info msg="Creating a new object: v1.ConfigMap, name: devfile-registry"
time="2021-11-10T22:20:58Z" level=info msg="Creating a new object: v1.Deployment, name: devfile-registry"
I1110 22:20:59.446253       1 request.go:668] Waited for 1.042929104s due to client-side throttling, not priority and fairness, request: GET:https://10.0.0.1:443/apis/appprotect.f5.com/v1beta1?timeout=32s
time="2021-11-10T22:20:59Z" level=info msg="Updating existing object: v1.Deployment, name: keycloak"
Difference:
  &v1.Deployment{
    ... // 2 ignored fields
    Spec: v1.DeploymentSpec{
        ... // 1 ignored field
        Selector: &{MatchLabels: {"app": "che", "component": "keycloak"}},
        Template: v1.PodTemplateSpec{
            ObjectMeta: {Labels: {"app": "che", "app.kubernetes.io/component": "keycloak", "app.kubernetes.io/instance": "che", "app.kubernetes.io/managed-by": "che-operator", ...}},
            Spec: v1.PodSpec{
                Volumes:        {{Name: "che-public-certs", VolumeSource: {ConfigMap: &{LocalObjectReference: {Name: "ca-certs-merged"}}}}},
                InitContainers: nil,
                Containers: []v1.Container{
                    {
                        Name:    "keycloak",
                        Image:   "quay.io/eclipse/che-keycloak:next",
                        Command: {"/bin/sh"},
                        Args: []string{
                            "-c",
                            strings.Join({
+                               `echo "ssl_required WAS UPDATED for master realm." && `,
                                "\n\tfunction jks_import_ca_bundle {\n\t\tCA_FILE=$1\n\t\tKEYSTORE_PATH=$",
                                "2\n\t\tKEYSTORE_PASSWORD=$3\n\n\t\tif [ ! -f $CA_FILE ]; then\n\t\t\t# CA b",
                                ... // 3504 identical bytes
                            }, ""),
                        },
                        WorkingDir: "",
                        Ports:      {{Name: "keycloak", ContainerPort: 8080, Protocol: "TCP"}},
                        ... // 3 ignored and 13 identical fields
                    },
                },
                EphemeralContainers: nil,
                RestartPolicy:       "Always",
                ... // 4 ignored and 26 identical fields
            },
        },
        Strategy:        {Type: "RollingUpdate"},
        MinReadySeconds: 0,
        ... // 2 ignored and 1 identical fields
    },
    ... // 1 ignored field
  }
time="2021-11-10T22:21:02Z" level=info msg="Deployment keycloak is in the rolling update state."
time="2021-11-10T22:21:04Z" level=info msg="Deployment keycloak is in the rolling update state."
I1110 22:21:12.361117       1 request.go:668] Waited for 1.022916051s due to client-side throttling, not priority and fairness, request: GET:https://10.0.0.1:443/apis/appprotect.f5.com/v1beta1?timeout=32s
time="2021-11-10T22:21:12Z" level=info msg="Deployment keycloak is in the rolling update state."
I1110 22:21:39.573216       1 request.go:668] Waited for 1.034982181s due to client-side throttling, not priority and fairness, request: GET:https://10.0.0.1:443/apis/admissionregistration.k8s.io/v1beta1?timeout=32s
time="2021-11-10T22:21:39Z" level=info msg="Creating a new object: v1.Service, name: plugin-registry"
time="2021-11-10T22:21:39Z" level=info msg="Creating a new object: v1.Ingress, name: plugin-registry"
time="2021-11-10T22:21:39Z" level=info msg="Custom resource status eclipse-che updated with status: Plugin Registry URL: https://plugin-registry-eclipse-che.vwtg.cloud.suedleasing-dev.com/v3"
time="2021-11-10T22:21:39Z" level=info msg="Creating a new object: v1.ConfigMap, name: plugin-registry"
time="2021-11-10T22:21:39Z" level=info msg="Creating a new object: v1.Deployment, name: plugin-registry"
time="2021-11-10T22:21:44Z" level=info msg="Creating a new object: v1.Service, name: che-dashboard"
time="2021-11-10T22:21:44Z" level=info msg="Creating a new object: v1.Ingress, name: che-dashboard"
time="2021-11-10T22:21:44Z" level=info msg="Creating a new object: v1.ServiceAccount, name: che-dashboard"
time="2021-11-10T22:21:44Z" level=info msg="Creating a new object: v1.ClusterRole, name: eclipse-che-che-dashboard"
time="2021-11-10T22:21:44Z" level=info msg="Creating a new object: v1.ClusterRoleBinding, name: eclipse-che-che-dashboard"
time="2021-11-10T22:21:44Z" level=info msg="Added finalizer: dashboard.clusterpermissions.finalizers.che.eclipse.org"
time="2021-11-10T22:21:44Z" level=info msg="Creating a new object: v1.Deployment, name: che-dashboard"
I1110 22:21:49.623279       1 request.go:668] Waited for 2.09684203s due to client-side throttling, not priority and fairness, request: GET:https://10.0.0.1:443/apis/k8s.nginx.org/v1alpha1?timeout=32s
time="2021-11-10T22:21:56Z" level=info msg="Creating a new object: v1.ConfigMap, name: che"
time="2021-11-10T22:21:58Z" level=info msg="Creating a new object: v1.Deployment, name: che"
I1110 22:21:59.888510       1 request.go:668] Waited for 1.046623828s due to client-side throttling, not priority and fairness, request: GET:https://10.0.0.1:443/apis/extensions/v1beta1?timeout=32s
I1110 22:22:41.145352       1 request.go:668] Waited for 1.04230459s due to client-side throttling, not priority and fairness, request: GET:https://10.0.0.1:443/apis/org.eclipse.che/v1?timeout=32s
time="2021-11-10T22:22:41Z" level=info msg="Eclipse Che is now available at: "
time="2021-11-10T22:22:41Z" level=info msg="Custom resource status eclipse-che updated with status: Che API: Available"
time="2021-11-10T22:22:41Z" level=info msg="Custom resource status eclipse-che updated with che server URL: https://che-eclipse-che.vwtg.cloud.suedleasing-dev.com"
time="2021-11-10T22:22:41Z" level=info msg="Custom resource status eclipse-che updated with version: next"
time="2021-11-10T22:22:41Z" level=info msg="Successfully reconciled."
time="2021-11-10T22:22:43Z" level=error msg="Operation cannot be fulfilled on checlusters.org.eclipse.che \"eclipse-che\": the object has been modified; please apply your changes to the latest version and try again"
2021-11-10T22:22:43.874Z    ERROR   controller-runtime.manager.controller.checluster    Reconciler error    {"reconciler group": "org.eclipse.che", "reconciler kind": "CheCluster", "name": "eclipse-che", "namespace": "eclipse-che", "error": "Operation cannot be fulfilled on checlusters.org.eclipse.che \"eclipse-che\": the object has been modified; please apply your changes to the latest version and try again"}
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem
    /che-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:253
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2
    /che-operator/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:214
time="2021-11-10T22:22:46Z" level=info msg="Successfully reconciled."
time="2021-11-10T22:22:49Z" level=info msg="Successfully reconciled."
LAST SEEN   TYPE      REASON                      OBJECT                                          MESSAGE
38m         Normal    Scheduled                   pod/che-598f5cbfd6-8j4fp                        Successfully assigned eclipse-che/che-598f5cbfd6-8j4fp to aks-worker-16068719-vmss000000
38m         Normal    Pulled                      pod/che-598f5cbfd6-8j4fp                        Container image "quay.io/eclipse/che-endpoint-watcher:next" already present on machine
38m         Normal    Created                     pod/che-598f5cbfd6-8j4fp                        Created container wait-for-postgres
38m         Normal    Started                     pod/che-598f5cbfd6-8j4fp                        Started container wait-for-postgres
37m         Normal    Pulled                      pod/che-598f5cbfd6-8j4fp                        Container image "quay.io/eclipse/che-endpoint-watcher:next" already present on machine
37m         Normal    Created                     pod/che-598f5cbfd6-8j4fp                        Created container wait-for-keycloak
37m         Normal    Started                     pod/che-598f5cbfd6-8j4fp                        Started container wait-for-keycloak
36m         Normal    Pulling                     pod/che-598f5cbfd6-8j4fp                        Pulling image "quay.io/eclipse/che-server:7.38.1"
36m         Normal    Pulled                      pod/che-598f5cbfd6-8j4fp                        Successfully pulled image "quay.io/eclipse/che-server:7.38.1" in 456.8245ms
36m         Normal    Created                     pod/che-598f5cbfd6-8j4fp                        Created container che
36m         Normal    Started                     pod/che-598f5cbfd6-8j4fp                        Started container che
28m         Normal    Killing                     pod/che-598f5cbfd6-8j4fp                        Stopping container che
28m         Warning   Unhealthy                   pod/che-598f5cbfd6-8j4fp                        Readiness probe failed: Get "http://10.230.13.6:8080/api/system/state": read tcp 10.230.13.4:53304->10.230.13.6:8080: read: connection reset by peer
28m         Warning   Unhealthy                   pod/che-598f5cbfd6-8j4fp                        Liveness probe failed: Get "http://10.230.13.6:8080/api/system/state": dial tcp 10.230.13.6:8080: connect: connection refused
26m         Normal    Scheduled                   pod/che-598f5cbfd6-tb4wk                        Successfully assigned eclipse-che/che-598f5cbfd6-tb4wk to aks-worker-16068719-vmss000000
26m         Normal    Pulled                      pod/che-598f5cbfd6-tb4wk                        Container image "quay.io/eclipse/che-endpoint-watcher:next" already present on machine
26m         Normal    Created                     pod/che-598f5cbfd6-tb4wk                        Created container wait-for-postgres
26m         Normal    Started                     pod/che-598f5cbfd6-tb4wk                        Started container wait-for-postgres
24m         Normal    Pulled                      pod/che-598f5cbfd6-tb4wk                        Container image "quay.io/eclipse/che-endpoint-watcher:next" already present on machine
24m         Normal    Created                     pod/che-598f5cbfd6-tb4wk                        Created container wait-for-keycloak
24m         Normal    Started                     pod/che-598f5cbfd6-tb4wk                        Started container wait-for-keycloak
23m         Normal    Pulling                     pod/che-598f5cbfd6-tb4wk                        Pulling image "quay.io/eclipse/che-server:7.38.1"
23m         Normal    Pulled                      pod/che-598f5cbfd6-tb4wk                        Successfully pulled image "quay.io/eclipse/che-server:7.38.1" in 440.078544ms
23m         Normal    Created                     pod/che-598f5cbfd6-tb4wk                        Created container che
23m         Normal    Started                     pod/che-598f5cbfd6-tb4wk                        Started container che
17m         Normal    Killing                     pod/che-598f5cbfd6-tb4wk                        Stopping container che
17m         Warning   Unhealthy                   pod/che-598f5cbfd6-tb4wk                        Liveness probe failed: Get "http://10.230.13.23:8080/api/system/state": dial tcp 10.230.13.23:8080: connect: connection refused
17m         Warning   FailedKillPod               pod/che-598f5cbfd6-tb4wk                        error killing pod: failed to "KillPodSandbox" for "707bc9ae-3028-42d6-b8a3-95513fcc80e5" with KillPodSandboxError: "rpc error: code = Unknown desc = failed to destroy network for sandbox \"50159fcd4c9f983f9860649225ef347021d44093948e8e3b7cf08965b2bffe78\": could not teardown ipv6 dnat: running [/sbin/ip6tables -t nat -X CNI-DN-dbe135b5249137920b175 --wait]: exit status 1: ip6tables: No chain/target/match by that name.\n"
38m         Normal    SuccessfulCreate            replicaset/che-598f5cbfd6                       Created pod: che-598f5cbfd6-8j4fp
27m         Normal    SuccessfulCreate            replicaset/che-598f5cbfd6                       Created pod: che-598f5cbfd6-tb4wk
11m         Normal    Scheduled                   pod/che-8bc6d5749-rzqwt                         Successfully assigned eclipse-che/che-8bc6d5749-rzqwt to aks-worker-16068719-vmss000000
11m         Normal    Pulling                     pod/che-8bc6d5749-rzqwt                         Pulling image "quay.io/eclipse/che-server:next"
11m         Normal    Pulled                      pod/che-8bc6d5749-rzqwt                         Successfully pulled image "quay.io/eclipse/che-server:next" in 7.426714598s
11m         Normal    Created                     pod/che-8bc6d5749-rzqwt                         Created container che
11m         Normal    Started                     pod/che-8bc6d5749-rzqwt                         Started container che
6m42s       Warning   Unhealthy                   pod/che-8bc6d5749-rzqwt                         Readiness probe failed: Get "http://10.230.13.18:8080/api/system/state": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
6m52s       Normal    Killing                     pod/che-8bc6d5749-rzqwt                         Stopping container che
11m         Normal    SuccessfulCreate            replicaset/che-8bc6d5749                        Created pod: che-8bc6d5749-rzqwt
69s         Normal    Scheduled                   pod/che-957c65554-2nbx2                         Successfully assigned eclipse-che/che-957c65554-2nbx2 to aks-worker-16068719-vmss000000
69s         Normal    Pulling                     pod/che-957c65554-2nbx2                         Pulling image "quay.io/eclipse/che-server:next"
69s         Normal    Pulled                      pod/che-957c65554-2nbx2                         Successfully pulled image "quay.io/eclipse/che-server:next" in 455.844156ms
68s         Normal    Created                     pod/che-957c65554-2nbx2                         Created container che
68s         Normal    Started                     pod/che-957c65554-2nbx2                         Started container che
38s         Warning   Unhealthy                   pod/che-957c65554-2nbx2                         Readiness probe failed: Get "http://10.230.13.23:8080/api/system/state": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
70s         Normal    SuccessfulCreate            replicaset/che-957c65554                        Created pod: che-957c65554-2nbx2
38m         Normal    Scheduled                   pod/che-dashboard-7cbd954c6f-lnc5h              Successfully assigned eclipse-che/che-dashboard-7cbd954c6f-lnc5h to aks-worker-16068719-vmss000000
38m         Normal    Pulling                     pod/che-dashboard-7cbd954c6f-lnc5h              Pulling image "quay.io/eclipse/che-dashboard:7.38.1"
38m         Normal    Pulled                      pod/che-dashboard-7cbd954c6f-lnc5h              Successfully pulled image "quay.io/eclipse/che-dashboard:7.38.1" in 1.154581429s
38m         Normal    Created                     pod/che-dashboard-7cbd954c6f-lnc5h              Created container che-dashboard
38m         Normal    Started                     pod/che-dashboard-7cbd954c6f-lnc5h              Started container che-dashboard
28m         Normal    Killing                     pod/che-dashboard-7cbd954c6f-lnc5h              Stopping container che-dashboard
26m         Normal    Scheduled                   pod/che-dashboard-7cbd954c6f-sd5c4              Successfully assigned eclipse-che/che-dashboard-7cbd954c6f-sd5c4 to aks-worker-16068719-vmss000000
26m         Normal    Pulling                     pod/che-dashboard-7cbd954c6f-sd5c4              Pulling image "quay.io/eclipse/che-dashboard:7.38.1"
26m         Normal    Pulled                      pod/che-dashboard-7cbd954c6f-sd5c4              Successfully pulled image "quay.io/eclipse/che-dashboard:7.38.1" in 863.899208ms
26m         Normal    Created                     pod/che-dashboard-7cbd954c6f-sd5c4              Created container che-dashboard
26m         Normal    Started                     pod/che-dashboard-7cbd954c6f-sd5c4              Started container che-dashboard
17m         Normal    Killing                     pod/che-dashboard-7cbd954c6f-sd5c4              Stopping container che-dashboard
38m         Normal    SuccessfulCreate            replicaset/che-dashboard-7cbd954c6f             Created pod: che-dashboard-7cbd954c6f-lnc5h
27m         Normal    SuccessfulCreate            replicaset/che-dashboard-7cbd954c6f             Created pod: che-dashboard-7cbd954c6f-sd5c4
83s         Normal    Scheduled                   pod/che-dashboard-f5f4cc944-5vkcz               Successfully assigned eclipse-che/che-dashboard-f5f4cc944-5vkcz to aks-worker-16068719-vmss000000
83s         Normal    Pulling                     pod/che-dashboard-f5f4cc944-5vkcz               Pulling image "quay.io/eclipse/che-dashboard:next"
82s         Normal    Pulled                      pod/che-dashboard-f5f4cc944-5vkcz               Successfully pulled image "quay.io/eclipse/che-dashboard:next" in 528.803401ms
82s         Normal    Created                     pod/che-dashboard-f5f4cc944-5vkcz               Created container che-dashboard
82s         Normal    Started                     pod/che-dashboard-f5f4cc944-5vkcz               Started container che-dashboard
11m         Normal    Scheduled                   pod/che-dashboard-f5f4cc944-9wzvv               Successfully assigned eclipse-che/che-dashboard-f5f4cc944-9wzvv to aks-worker-16068719-vmss000000
11m         Normal    Pulling                     pod/che-dashboard-f5f4cc944-9wzvv               Pulling image "quay.io/eclipse/che-dashboard:next"
11m         Normal    Pulled                      pod/che-dashboard-f5f4cc944-9wzvv               Successfully pulled image "quay.io/eclipse/che-dashboard:next" in 4.195542097s
11m         Normal    Created                     pod/che-dashboard-f5f4cc944-9wzvv               Created container che-dashboard
11m         Normal    Started                     pod/che-dashboard-f5f4cc944-9wzvv               Started container che-dashboard
6m52s       Normal    Killing                     pod/che-dashboard-f5f4cc944-9wzvv               Stopping container che-dashboard
11m         Normal    SuccessfulCreate            replicaset/che-dashboard-f5f4cc944              Created pod: che-dashboard-f5f4cc944-9wzvv
83s         Normal    SuccessfulCreate            replicaset/che-dashboard-f5f4cc944              Created pod: che-dashboard-f5f4cc944-5vkcz
38m         Normal    AddedOrUpdated              ingress/che-dashboard-ingress                   Configuration for eclipse-che/che-dashboard-ingress was added or updated
38m         Normal    AddedOrUpdated              ingress/che-dashboard-ingress                   Configuration for eclipse-che/che-dashboard-ingress was added or updated
17m         Normal    AddedOrUpdated              ingress/che-dashboard-ingress                   Configuration for eclipse-che/che-dashboard-ingress was added or updated
17m         Normal    AddedOrUpdated              ingress/che-dashboard-ingress                   Configuration for eclipse-che/che-dashboard-ingress was added or updated
38m         Normal    ScalingReplicaSet           deployment/che-dashboard                        Scaled up replica set che-dashboard-7cbd954c6f to 1
27m         Normal    ScalingReplicaSet           deployment/che-dashboard                        Scaled up replica set che-dashboard-7cbd954c6f to 1
11m         Warning   Rejected                    ingress/che-dashboard                           All hosts are taken by other resources
11m         Warning   Rejected                    ingress/che-dashboard                           All hosts are taken by other resources
11m         Normal    ScalingReplicaSet           deployment/che-dashboard                        Scaled up replica set che-dashboard-f5f4cc944 to 1
6m53s       Warning   AddedOrUpdatedWithWarning   ingress/che-dashboard                           Configuration for eclipse-che/che-dashboard was added or updated ; with warning(s): TLS secret che-tls is invalid: secret doesn't exist or of an unsupported type
6m53s       Warning   AddedOrUpdatedWithWarning   ingress/che-dashboard                           Configuration for eclipse-che/che-dashboard was added or updated ; with warning(s): TLS secret che-tls is invalid: secret doesn't exist or of an unsupported type
84s         Warning   Rejected                    ingress/che-dashboard                           All hosts are taken by other resources
84s         Warning   Rejected                    ingress/che-dashboard                           All hosts are taken by other resources
83s         Normal    ScalingReplicaSet           deployment/che-dashboard                        Scaled up replica set che-dashboard-f5f4cc944 to 1
38m         Warning   Rejected                    ingress/che-ingress                             All hosts are taken by other resources
38m         Warning   Rejected                    ingress/che-ingress                             All hosts are taken by other resources
28m         Normal    AddedOrUpdated              ingress/che-ingress                             Configuration for eclipse-che/che-ingress was added or updated
28m         Normal    AddedOrUpdated              ingress/che-ingress                             Configuration for eclipse-che/che-ingress was added or updated
26m         Warning   Rejected                    ingress/che-ingress                             All hosts are taken by other resources
26m         Warning   Rejected                    ingress/che-ingress                             All hosts are taken by other resources
17m         Normal    AddedOrUpdated              ingress/che-ingress                             Configuration for eclipse-che/che-ingress was added or updated
17m         Normal    AddedOrUpdated              ingress/che-ingress                             Configuration for eclipse-che/che-ingress was added or updated
15m         Normal    Scheduled                   pod/che-operator-59458894bb-2c6kx               Successfully assigned eclipse-che/che-operator-59458894bb-2c6kx to aks-worker-16068719-vmss000000
15m         Normal    Pulling                     pod/che-operator-59458894bb-2c6kx               Pulling image "quay.io/eclipse/che-operator:next"
15m         Normal    Pulled                      pod/che-operator-59458894bb-2c6kx               Successfully pulled image "quay.io/eclipse/che-operator:next" in 9.862416593s
15m         Normal    Created                     pod/che-operator-59458894bb-2c6kx               Created container che-operator
15m         Normal    Started                     pod/che-operator-59458894bb-2c6kx               Started container che-operator
6m52s       Normal    Killing                     pod/che-operator-59458894bb-2c6kx               Stopping container che-operator
6m26s       Warning   Unhealthy                   pod/che-operator-59458894bb-2c6kx               Liveness probe failed: Get "http://10.230.13.33:6789/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
4m55s       Normal    Scheduled                   pod/che-operator-59458894bb-tgbld               Successfully assigned eclipse-che/che-operator-59458894bb-tgbld to aks-worker-16068719-vmss000000
4m55s       Normal    Pulling                     pod/che-operator-59458894bb-tgbld               Pulling image "quay.io/eclipse/che-operator:next"
4m54s       Normal    Pulled                      pod/che-operator-59458894bb-tgbld               Successfully pulled image "quay.io/eclipse/che-operator:next" in 457.525071ms
4m54s       Normal    Created                     pod/che-operator-59458894bb-tgbld               Created container che-operator
4m54s       Normal    Started                     pod/che-operator-59458894bb-tgbld               Started container che-operator
15m         Normal    SuccessfulCreate            replicaset/che-operator-59458894bb              Created pod: che-operator-59458894bb-2c6kx
4m55s       Normal    SuccessfulCreate            replicaset/che-operator-59458894bb              Created pod: che-operator-59458894bb-tgbld
15m         Normal    ScalingReplicaSet           deployment/che-operator                         Scaled up replica set che-operator-59458894bb to 1
4m56s       Normal    ScalingReplicaSet           deployment/che-operator                         Scaled up replica set che-operator-59458894bb to 1
6m50s       Normal    Complete                    order/che-tls-gqwj5-1629466457                  Order completed successfully
6m54s       Normal    OrderCreated                certificaterequest/che-tls-gqwj5                Created Order resource eclipse-che/che-tls-gqwj5-1629466457
6m54s       Normal    cert-manager.io             certificaterequest/che-tls-gqwj5                Certificate request has been approved by cert-manager.io
6m54s       Normal    OrderPending                certificaterequest/che-tls-gqwj5                Waiting on certificate issuance from order eclipse-che/che-tls-gqwj5-1629466457: ""
6m50s       Normal    CertificateIssued           certificaterequest/che-tls-gqwj5                Certificate fetched from issuer successfully
47m         Normal    Started                     challenge/che-tls-rnvmz-1629466457-2934519269   Challenge scheduled for processing
47m         Normal    Presented                   challenge/che-tls-rnvmz-1629466457-2934519269   Presented challenge using DNS-01 challenge mechanism
46m         Normal    DomainVerified              challenge/che-tls-rnvmz-1629466457-2934519269   Domain "vwtg.cloud.suedleasing-dev.com" verified with "DNS-01" validation
47m         Normal    Created                     order/che-tls-rnvmz-1629466457                  Created Challenge resource "che-tls-rnvmz-1629466457-2934519269" for domain "vwtg.cloud.suedleasing-dev.com"
46m         Normal    Complete                    order/che-tls-rnvmz-1629466457                  Order completed successfully
47m         Normal    IssuerNotReady              certificaterequest/che-tls-rnvmz                Referenced issuer does not have a Ready status condition
47m         Normal    cert-manager.io             certificaterequest/che-tls-rnvmz                Certificate request has been approved by cert-manager.io
46m         Normal    CertificateIssued           certificaterequest/che-tls-rnvmz                Certificate fetched from issuer successfully
6m55s       Normal    Issuing                     certificate/che-tls                             Issuing certificate as Secret does not exist
6m49s       Normal    Issuing                     certificate/che-tls                             The certificate has been successfully issued
47m         Normal    Issuing                     certificate/che-tls                             Issuing certificate as Secret was previously issued by ClusterIssuer.cert-manager.io/letsencrypt-staging
47m         Normal    Reused                      certificate/che-tls                             Reusing private key stored in existing Secret resource "che-tls"
47m         Normal    Requested                   certificate/che-tls                             Created new CertificateRequest resource "che-tls-rnvmz"
6m54s       Normal    Generated                   certificate/che-tls                             Stored new private key in temporary Secret resource "che-tls-vhq8s"
6m54s       Normal    Requested                   certificate/che-tls                             Created new CertificateRequest resource "che-tls-gqwj5"
38m         Normal    ScalingReplicaSet           deployment/che                                  Scaled up replica set che-598f5cbfd6 to 1
27m         Normal    ScalingReplicaSet           deployment/che                                  Scaled up replica set che-598f5cbfd6 to 1
13m         Normal    AddedOrUpdated              ingress/che                                     Configuration for eclipse-che/che was added or updated
13m         Normal    AddedOrUpdated              ingress/che                                     Configuration for eclipse-che/che was added or updated
11m         Normal    ScalingReplicaSet           deployment/che                                  Scaled up replica set che-8bc6d5749 to 1
6m54s       Warning   AddedOrUpdatedWithWarning   ingress/che                                     Configuration for eclipse-che/che was added or updated ; with warning(s): TLS secret che-tls is invalid: secret doesn't exist or of an unsupported type
6m54s       Warning   AddedOrUpdatedWithWarning   ingress/che                                     Configuration for eclipse-che/che was added or updated ; with warning(s): TLS secret che-tls is invalid: secret doesn't exist or of an unsupported type
3m13s       Normal    AddedOrUpdated              ingress/che                                     Configuration for eclipse-che/che was added or updated
3m13s       Normal    AddedOrUpdated              ingress/che                                     Configuration for eclipse-che/che was added or updated
70s         Normal    ScalingReplicaSet           deployment/che                                  Scaled up replica set che-957c65554 to 1
38m         Normal    Scheduled                   pod/devfile-registry-5f96d5bd8d-2zwkd           Successfully assigned eclipse-che/devfile-registry-5f96d5bd8d-2zwkd to aks-worker-16068719-vmss000000
38m         Normal    Pulling                     pod/devfile-registry-5f96d5bd8d-2zwkd           Pulling image "quay.io/eclipse/che-devfile-registry:7.38.1"
38m         Normal    Pulled                      pod/devfile-registry-5f96d5bd8d-2zwkd           Successfully pulled image "quay.io/eclipse/che-devfile-registry:7.38.1" in 513.30667ms
38m         Normal    Created                     pod/devfile-registry-5f96d5bd8d-2zwkd           Created container che-devfile-registry
38m         Normal    Started                     pod/devfile-registry-5f96d5bd8d-2zwkd           Started container che-devfile-registry
28m         Normal    Killing                     pod/devfile-registry-5f96d5bd8d-2zwkd           Stopping container che-devfile-registry
28m         Warning   Unhealthy                   pod/devfile-registry-5f96d5bd8d-2zwkd           Liveness probe failed: Get "http://10.230.13.7:8080/devfiles/": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
28m         Warning   Unhealthy                   pod/devfile-registry-5f96d5bd8d-2zwkd           Readiness probe failed: Get "http://10.230.13.7:8080/devfiles/": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
26m         Normal    Scheduled                   pod/devfile-registry-5f96d5bd8d-gxtvp           Successfully assigned eclipse-che/devfile-registry-5f96d5bd8d-gxtvp to aks-worker-16068719-vmss000000
26m         Normal    Pulling                     pod/devfile-registry-5f96d5bd8d-gxtvp           Pulling image "quay.io/eclipse/che-devfile-registry:7.38.1"
26m         Normal    Pulled                      pod/devfile-registry-5f96d5bd8d-gxtvp           Successfully pulled image "quay.io/eclipse/che-devfile-registry:7.38.1" in 637.235169ms
26m         Normal    Created                     pod/devfile-registry-5f96d5bd8d-gxtvp           Created container che-devfile-registry
26m         Normal    Started                     pod/devfile-registry-5f96d5bd8d-gxtvp           Started container che-devfile-registry
17m         Normal    Killing                     pod/devfile-registry-5f96d5bd8d-gxtvp           Stopping container che-devfile-registry
17m         Warning   Unhealthy                   pod/devfile-registry-5f96d5bd8d-gxtvp           Liveness probe failed: Get "http://10.230.13.11:8080/devfiles/": dial tcp 10.230.13.11:8080: connect: connection refused
38m         Normal    SuccessfulCreate            replicaset/devfile-registry-5f96d5bd8d          Created pod: devfile-registry-5f96d5bd8d-2zwkd
27m         Normal    SuccessfulCreate            replicaset/devfile-registry-5f96d5bd8d          Created pod: devfile-registry-5f96d5bd8d-gxtvp
12m         Normal    Scheduled                   pod/devfile-registry-867947dc65-ht2b5           Successfully assigned eclipse-che/devfile-registry-867947dc65-ht2b5 to aks-worker-16068719-vmss000000
12m         Normal    Pulling                     pod/devfile-registry-867947dc65-ht2b5           Pulling image "quay.io/eclipse/che-devfile-registry:next"
12m         Normal    Pulled                      pod/devfile-registry-867947dc65-ht2b5           Successfully pulled image "quay.io/eclipse/che-devfile-registry:next" in 4.239761099s
12m         Normal    Created                     pod/devfile-registry-867947dc65-ht2b5           Created container che-devfile-registry
12m         Normal    Started                     pod/devfile-registry-867947dc65-ht2b5           Started container che-devfile-registry
6m52s       Normal    Killing                     pod/devfile-registry-867947dc65-ht2b5           Stopping container che-devfile-registry
2m9s        Normal    Scheduled                   pod/devfile-registry-867947dc65-slbfd           Successfully assigned eclipse-che/devfile-registry-867947dc65-slbfd to aks-worker-16068719-vmss000000
2m9s        Normal    Pulling                     pod/devfile-registry-867947dc65-slbfd           Pulling image "quay.io/eclipse/che-devfile-registry:next"
2m9s        Normal    Pulled                      pod/devfile-registry-867947dc65-slbfd           Successfully pulled image "quay.io/eclipse/che-devfile-registry:next" in 485.317857ms
2m9s        Normal    Created                     pod/devfile-registry-867947dc65-slbfd           Created container che-devfile-registry
2m9s        Normal    Started                     pod/devfile-registry-867947dc65-slbfd           Started container che-devfile-registry
12m         Normal    SuccessfulCreate            replicaset/devfile-registry-867947dc65          Created pod: devfile-registry-867947dc65-ht2b5
2m10s       Normal    SuccessfulCreate            replicaset/devfile-registry-867947dc65          Created pod: devfile-registry-867947dc65-slbfd
38m         Normal    ScalingReplicaSet           deployment/devfile-registry                     Scaled up replica set devfile-registry-5f96d5bd8d to 1
28m         Normal    AddedOrUpdated              ingress/devfile-registry                        Configuration for eclipse-che/devfile-registry was added or updated
28m         Normal    AddedOrUpdated              ingress/devfile-registry                        Configuration for eclipse-che/devfile-registry was added or updated
27m         Normal    ScalingReplicaSet           deployment/devfile-registry                     Scaled up replica set devfile-registry-5f96d5bd8d to 1
17m         Normal    AddedOrUpdated              ingress/devfile-registry                        Configuration for eclipse-che/devfile-registry was added or updated
17m         Normal    AddedOrUpdated              ingress/devfile-registry                        Configuration for eclipse-che/devfile-registry was added or updated
12m         Normal    ScalingReplicaSet           deployment/devfile-registry                     Scaled up replica set devfile-registry-867947dc65 to 1
12m         Normal    AddedOrUpdated              ingress/devfile-registry                        Configuration for eclipse-che/devfile-registry was added or updated
6m54s       Normal    AddedOrUpdated              ingress/devfile-registry                        Configuration for eclipse-che/devfile-registry was added or updated
6m54s       Warning   AddedOrUpdatedWithWarning   ingress/devfile-registry                        Configuration for eclipse-che/devfile-registry was added or updated ; with warning(s): TLS secret che-tls is invalid: secret doesn't exist or of an unsupported type
6m54s       Warning   AddedOrUpdatedWithWarning   ingress/devfile-registry                        Configuration for eclipse-che/devfile-registry was added or updated ; with warning(s): TLS secret che-tls is invalid: secret doesn't exist or of an unsupported type
2m10s       Normal    ScalingReplicaSet           deployment/devfile-registry                     Scaled up replica set devfile-registry-867947dc65 to 1
2m10s       Normal    AddedOrUpdated              ingress/devfile-registry                        Configuration for eclipse-che/devfile-registry was added or updated
2m10s       Normal    AddedOrUpdated              ingress/devfile-registry                        Configuration for eclipse-che/devfile-registry was added or updated
15m         Normal    LeaderElection              configmap/e79b08a4.org.eclipse.che              che-operator-59458894bb-2c6kx_c3defcb7-2515-4d51-9f5a-37f362eebe4e became leader
15m         Normal    LeaderElection              lease/e79b08a4.org.eclipse.che                  che-operator-59458894bb-2c6kx_c3defcb7-2515-4d51-9f5a-37f362eebe4e became leader
4m10s       Normal    LeaderElection              configmap/e79b08a4.org.eclipse.che              che-operator-59458894bb-tgbld_dc22c981-d2d4-4e01-abde-6faed9eca76f became leader
4m10s       Normal    LeaderElection              lease/e79b08a4.org.eclipse.che                  che-operator-59458894bb-tgbld_dc22c981-d2d4-4e01-abde-6faed9eca76f became leader
2m8s        Normal    Scheduled                   pod/keycloak-65777d7db7-fpl22                   Successfully assigned eclipse-che/keycloak-65777d7db7-fpl22 to aks-worker-16068719-vmss000000
2m8s        Normal    Pulling                     pod/keycloak-65777d7db7-fpl22                   Pulling image "quay.io/eclipse/che-keycloak:next"
2m7s        Normal    Pulled                      pod/keycloak-65777d7db7-fpl22                   Successfully pulled image "quay.io/eclipse/che-keycloak:next" in 469.174926ms
2m7s        Normal    Created                     pod/keycloak-65777d7db7-fpl22                   Created container keycloak
2m7s        Normal    Started                     pod/keycloak-65777d7db7-fpl22                   Started container keycloak
2m8s        Normal    SuccessfulCreate            replicaset/keycloak-65777d7db7                  Created pod: keycloak-65777d7db7-fpl22
38m         Normal    Scheduled                   pod/keycloak-69d7f47cd9-nfphw                   Successfully assigned eclipse-che/keycloak-69d7f47cd9-nfphw to aks-worker-16068719-vmss000000
38m         Normal    SuccessfulAttachVolume      pod/keycloak-69d7f47cd9-nfphw                   AttachVolume.Attach succeeded for volume "pvc-8f64f377-2d59-46fc-8c83-693fcac1c27f"
37m         Normal    SuccessfulAttachVolume      pod/keycloak-69d7f47cd9-nfphw                   AttachVolume.Attach succeeded for volume "pvc-ab4e489c-f9b2-4abd-b04f-527597d26c10"
37m         Normal    Pulled                      pod/keycloak-69d7f47cd9-nfphw                   Container image "quay.io/eclipse/che-endpoint-watcher:next" already present on machine
37m         Normal    Created                     pod/keycloak-69d7f47cd9-nfphw                   Created container wait-for-postgres
37m         Normal    Started                     pod/keycloak-69d7f47cd9-nfphw                   Started container wait-for-postgres
37m         Normal    Pulling                     pod/keycloak-69d7f47cd9-nfphw                   Pulling image "quay.io/eclipse/che-keycloak:7.38.1"
37m         Normal    Pulled                      pod/keycloak-69d7f47cd9-nfphw                   Successfully pulled image "quay.io/eclipse/che-keycloak:7.38.1" in 440.498426ms
37m         Normal    Created                     pod/keycloak-69d7f47cd9-nfphw                   Created container keycloak
37m         Normal    Started                     pod/keycloak-69d7f47cd9-nfphw                   Started container keycloak
37m         Warning   Unhealthy                   pod/keycloak-69d7f47cd9-nfphw                   Liveness probe failed: dial tcp 10.230.13.32:8080: connect: connection refused
37m         Warning   Unhealthy                   pod/keycloak-69d7f47cd9-nfphw                   Readiness probe failed: Get "http://10.230.13.32:8080/auth/js/keycloak.js": dial tcp 10.230.13.32:8080: connect: connection refused
36m         Warning   Unhealthy                   pod/keycloak-69d7f47cd9-nfphw                   Readiness probe failed: Get "http://10.230.13.32:8080/auth/js/keycloak.js": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
28m         Normal    Killing                     pod/keycloak-69d7f47cd9-nfphw                   Stopping container keycloak
38m         Normal    SuccessfulCreate            replicaset/keycloak-69d7f47cd9                  Created pod: keycloak-69d7f47cd9-nfphw
12m         Normal    Scheduled                   pod/keycloak-6fcfc8d74b-rcs56                   Successfully assigned eclipse-che/keycloak-6fcfc8d74b-rcs56 to aks-worker-16068719-vmss000000
12m         Normal    Pulling                     pod/keycloak-6fcfc8d74b-rcs56                   Pulling image "quay.io/eclipse/che-keycloak:next"
12m         Normal    Pulled                      pod/keycloak-6fcfc8d74b-rcs56                   Successfully pulled image "quay.io/eclipse/che-keycloak:next" in 3.244602314s
12m         Normal    Created                     pod/keycloak-6fcfc8d74b-rcs56                   Created container keycloak
12m         Normal    Started                     pod/keycloak-6fcfc8d74b-rcs56                   Started container keycloak
6m52s       Normal    Killing                     pod/keycloak-6fcfc8d74b-rcs56                   Stopping container keycloak
12m         Normal    SuccessfulCreate            replicaset/keycloak-6fcfc8d74b                  Created pod: keycloak-6fcfc8d74b-rcs56
3m10s       Normal    Scheduled                   pod/keycloak-74c47757d8-7kmzf                   Successfully assigned eclipse-che/keycloak-74c47757d8-7kmzf to aks-worker-16068719-vmss000000
3m11s       Normal    Pulling                     pod/keycloak-74c47757d8-7kmzf                   Pulling image "quay.io/eclipse/che-keycloak:next"
3m10s       Normal    Pulled                      pod/keycloak-74c47757d8-7kmzf                   Successfully pulled image "quay.io/eclipse/che-keycloak:next" in 460.916219ms
3m10s       Normal    Created                     pod/keycloak-74c47757d8-7kmzf                   Created container keycloak
3m10s       Normal    Started                     pod/keycloak-74c47757d8-7kmzf                   Started container keycloak
84s         Warning   Unhealthy                   pod/keycloak-74c47757d8-7kmzf                   Readiness probe failed: Get "http://10.230.13.9:8080/auth/js/keycloak.js": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
90s         Normal    Killing                     pod/keycloak-74c47757d8-7kmzf                   Stopping container keycloak
3m11s       Normal    SuccessfulCreate            replicaset/keycloak-74c47757d8                  Created pod: keycloak-74c47757d8-7kmzf
90s         Normal    SuccessfulDelete            replicaset/keycloak-74c47757d8                  Deleted pod: keycloak-74c47757d8-7kmzf
13m         Normal    Scheduled                   pod/keycloak-849cddc78f-gpfjp                   Successfully assigned eclipse-che/keycloak-849cddc78f-gpfjp to aks-worker-16068719-vmss000000
13m         Normal    Pulling                     pod/keycloak-849cddc78f-gpfjp                   Pulling image "quay.io/eclipse/che-keycloak:next"
13m         Normal    Pulled                      pod/keycloak-849cddc78f-gpfjp                   Successfully pulled image "quay.io/eclipse/che-keycloak:next" in 3.690199632s
13m         Normal    Created                     pod/keycloak-849cddc78f-gpfjp                   Created container keycloak
13m         Normal    Started                     pod/keycloak-849cddc78f-gpfjp                   Started container keycloak
12m         Normal    Killing                     pod/keycloak-849cddc78f-gpfjp                   Stopping container keycloak
13m         Normal    SuccessfulCreate            replicaset/keycloak-849cddc78f                  Created pod: keycloak-849cddc78f-gpfjp
12m         Normal    SuccessfulDelete            replicaset/keycloak-849cddc78f                  Deleted pod: keycloak-849cddc78f-gpfjp
38m         Normal    WaitForFirstConsumer        persistentvolumeclaim/keycloak-data             waiting for first consumer to be created before binding
38m         Normal    ProvisioningSucceeded       persistentvolumeclaim/keycloak-data             Successfully provisioned volume pvc-ab4e489c-f9b2-4abd-b04f-527597d26c10 using kubernetes.io/azure-disk
27m         Normal    WaitForFirstConsumer        persistentvolumeclaim/keycloak-data             waiting for first consumer to be created before binding
26m         Normal    ProvisioningSucceeded       persistentvolumeclaim/keycloak-data             Successfully provisioned volume pvc-80badf1c-4d31-4538-aaf8-996ba06049e2 using kubernetes.io/azure-disk
26m         Normal    Scheduled                   pod/keycloak-fbdb664dd-94krw                    Successfully assigned eclipse-che/keycloak-fbdb664dd-94krw to aks-worker-16068719-vmss000000
26m         Normal    SuccessfulAttachVolume      pod/keycloak-fbdb664dd-94krw                    AttachVolume.Attach succeeded for volume "pvc-80badf1c-4d31-4538-aaf8-996ba06049e2"
26m         Normal    SuccessfulAttachVolume      pod/keycloak-fbdb664dd-94krw                    AttachVolume.Attach succeeded for volume "pvc-5cfea3bc-e2fd-4ed6-8268-ac58425add11"
25m         Normal    Pulled                      pod/keycloak-fbdb664dd-94krw                    Container image "quay.io/eclipse/che-endpoint-watcher:next" already present on machine
25m         Normal    Created                     pod/keycloak-fbdb664dd-94krw                    Created container wait-for-postgres
25m         Normal    Started                     pod/keycloak-fbdb664dd-94krw                    Started container wait-for-postgres
24m         Normal    Pulling                     pod/keycloak-fbdb664dd-94krw                    Pulling image "quay.io/eclipse/che-keycloak:7.38.1"
24m         Normal    Pulled                      pod/keycloak-fbdb664dd-94krw                    Successfully pulled image "quay.io/eclipse/che-keycloak:7.38.1" in 442.064897ms
24m         Normal    Created                     pod/keycloak-fbdb664dd-94krw                    Created container keycloak
24m         Normal    Started                     pod/keycloak-fbdb664dd-94krw                    Started container keycloak
24m         Warning   Unhealthy                   pod/keycloak-fbdb664dd-94krw                    Liveness probe failed: dial tcp 10.230.13.13:8080: connect: connection refused
17m         Warning   Unhealthy                   pod/keycloak-fbdb664dd-94krw                    Readiness probe failed: Get "http://10.230.13.13:8080/auth/js/keycloak.js": dial tcp 10.230.13.13:8080: connect: connection refused
23m         Warning   Unhealthy                   pod/keycloak-fbdb664dd-94krw                    Readiness probe failed: Get "http://10.230.13.13:8080/auth/js/keycloak.js": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
17m         Normal    Killing                     pod/keycloak-fbdb664dd-94krw                    Stopping container keycloak
27m         Normal    SuccessfulCreate            replicaset/keycloak-fbdb664dd                   Created pod: keycloak-fbdb664dd-94krw
28m         Normal    AddedOrUpdated              ingress/keycloak-ingress                        Configuration for eclipse-che/keycloak-ingress was added or updated
28m         Normal    AddedOrUpdated              ingress/keycloak-ingress                        Configuration for eclipse-che/keycloak-ingress was added or updated
17m         Normal    AddedOrUpdated              ingress/keycloak-ingress                        Configuration for eclipse-che/keycloak-ingress was added or updated
17m         Normal    AddedOrUpdated              ingress/keycloak-ingress                        Configuration for eclipse-che/keycloak-ingress was added or updated
38m         Normal    WaitForFirstConsumer        persistentvolumeclaim/keycloak-log              waiting for first consumer to be created before binding
38m         Normal    ProvisioningSucceeded       persistentvolumeclaim/keycloak-log              Successfully provisioned volume pvc-8f64f377-2d59-46fc-8c83-693fcac1c27f using kubernetes.io/azure-disk
27m         Normal    WaitForFirstConsumer        persistentvolumeclaim/keycloak-log              waiting for first consumer to be created before binding
26m         Normal    ProvisioningSucceeded       persistentvolumeclaim/keycloak-log              Successfully provisioned volume pvc-5cfea3bc-e2fd-4ed6-8268-ac58425add11 using kubernetes.io/azure-disk
38m         Normal    ScalingReplicaSet           deployment/keycloak                             Scaled up replica set keycloak-69d7f47cd9 to 1
27m         Normal    ScalingReplicaSet           deployment/keycloak                             Scaled up replica set keycloak-fbdb664dd to 1
13m         Normal    ScalingReplicaSet           deployment/keycloak                             Scaled up replica set keycloak-849cddc78f to 1
6m55s       Normal    AddedOrUpdated              ingress/keycloak                                Configuration for eclipse-che/keycloak was added or updated
13m         Normal    AddedOrUpdated              ingress/keycloak                                Configuration for eclipse-che/keycloak was added or updated
12m         Normal    ScalingReplicaSet           deployment/keycloak                             Scaled up replica set keycloak-6fcfc8d74b to 1
12m         Normal    ScalingReplicaSet           deployment/keycloak                             Scaled down replica set keycloak-849cddc78f to 0
6m55s       Warning   AddedOrUpdatedWithWarning   ingress/keycloak                                Configuration for eclipse-che/keycloak was added or updated ; with warning(s): TLS secret che-tls is invalid: secret doesn't exist or of an unsupported type
6m54s       Warning   AddedOrUpdatedWithWarning   ingress/keycloak                                Configuration for eclipse-che/keycloak was added or updated ; with warning(s): TLS secret che-tls is invalid: secret doesn't exist or of an unsupported type
3m11s       Normal    ScalingReplicaSet           deployment/keycloak                             Scaled up replica set keycloak-74c47757d8 to 1
3m11s       Normal    AddedOrUpdated              ingress/keycloak                                Configuration for eclipse-che/keycloak was added or updated
3m11s       Normal    AddedOrUpdated              ingress/keycloak                                Configuration for eclipse-che/keycloak was added or updated
2m8s        Normal    ScalingReplicaSet           deployment/keycloak                             Scaled up replica set keycloak-65777d7db7 to 1
90s         Normal    ScalingReplicaSet           deployment/keycloak                             Scaled down replica set keycloak-74c47757d8 to 0
26m         Normal    Scheduled                   pod/plugin-registry-58c95878f6-25qnz            Successfully assigned eclipse-che/plugin-registry-58c95878f6-25qnz to aks-worker-16068719-vmss000000
26m         Normal    Pulling                     pod/plugin-registry-58c95878f6-25qnz            Pulling image "quay.io/eclipse/che-plugin-registry:7.38.1"
26m         Normal    Pulled                      pod/plugin-registry-58c95878f6-25qnz            Successfully pulled image "quay.io/eclipse/che-plugin-registry:7.38.1" in 1.034175288s
26m         Normal    Created                     pod/plugin-registry-58c95878f6-25qnz            Created container che-plugin-registry
26m         Normal    Started                     pod/plugin-registry-58c95878f6-25qnz            Started container che-plugin-registry
17m         Normal    Killing                     pod/plugin-registry-58c95878f6-25qnz            Stopping container che-plugin-registry
17m         Warning   Unhealthy                   pod/plugin-registry-58c95878f6-25qnz            Liveness probe failed: Get "http://10.230.13.7:8080/v3/plugins/": dial tcp 10.230.13.7:8080: connect: connection refused
38m         Normal    Scheduled                   pod/plugin-registry-58c95878f6-lg5z7            Successfully assigned eclipse-che/plugin-registry-58c95878f6-lg5z7 to aks-worker-16068719-vmss000000
38m         Normal    Pulling                     pod/plugin-registry-58c95878f6-lg5z7            Pulling image "quay.io/eclipse/che-plugin-registry:7.38.1"
38m         Normal    Pulled                      pod/plugin-registry-58c95878f6-lg5z7            Successfully pulled image "quay.io/eclipse/che-plugin-registry:7.38.1" in 846.140323ms
38m         Normal    Created                     pod/plugin-registry-58c95878f6-lg5z7            Created container che-plugin-registry
38m         Normal    Started                     pod/plugin-registry-58c95878f6-lg5z7            Started container che-plugin-registry
28m         Normal    Killing                     pod/plugin-registry-58c95878f6-lg5z7            Stopping container che-plugin-registry
28m         Warning   Unhealthy                   pod/plugin-registry-58c95878f6-lg5z7            Liveness probe failed: Get "http://10.230.13.13:8080/v3/plugins/": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
38m         Normal    SuccessfulCreate            replicaset/plugin-registry-58c95878f6           Created pod: plugin-registry-58c95878f6-lg5z7
27m         Normal    SuccessfulCreate            replicaset/plugin-registry-58c95878f6           Created pod: plugin-registry-58c95878f6-25qnz
12m         Normal    Scheduled                   pod/plugin-registry-66c9df7b5c-445dw            Successfully assigned eclipse-che/plugin-registry-66c9df7b5c-445dw to aks-worker-16068719-vmss000000
12m         Normal    Pulling                     pod/plugin-registry-66c9df7b5c-445dw            Pulling image "quay.io/eclipse/che-plugin-registry:next"
11m         Normal    Pulled                      pod/plugin-registry-66c9df7b5c-445dw            Successfully pulled image "quay.io/eclipse/che-plugin-registry:next" in 3.928885635s
11m         Normal    Created                     pod/plugin-registry-66c9df7b5c-445dw            Created container che-plugin-registry
11m         Normal    Started                     pod/plugin-registry-66c9df7b5c-445dw            Started container che-plugin-registry
6m52s       Normal    Killing                     pod/plugin-registry-66c9df7b5c-445dw            Stopping container che-plugin-registry
6m50s       Warning   Unhealthy                   pod/plugin-registry-66c9df7b5c-445dw            Liveness probe failed: Get "http://10.230.13.25:8080/plugins/": dial tcp 10.230.13.25:8080: connect: connection refused
88s         Normal    Scheduled                   pod/plugin-registry-66c9df7b5c-hzx2x            Successfully assigned eclipse-che/plugin-registry-66c9df7b5c-hzx2x to aks-worker-16068719-vmss000000
88s         Normal    Pulling                     pod/plugin-registry-66c9df7b5c-hzx2x            Pulling image "quay.io/eclipse/che-plugin-registry:next"
88s         Normal    Pulled                      pod/plugin-registry-66c9df7b5c-hzx2x            Successfully pulled image "quay.io/eclipse/che-plugin-registry:next" in 443.045016ms
87s         Normal    Created                     pod/plugin-registry-66c9df7b5c-hzx2x            Created container che-plugin-registry
87s         Normal    Started                     pod/plugin-registry-66c9df7b5c-hzx2x            Started container che-plugin-registry
12m         Normal    SuccessfulCreate            replicaset/plugin-registry-66c9df7b5c           Created pod: plugin-registry-66c9df7b5c-445dw
89s         Normal    SuccessfulCreate            replicaset/plugin-registry-66c9df7b5c           Created pod: plugin-registry-66c9df7b5c-hzx2x
38m         Normal    ScalingReplicaSet           deployment/plugin-registry                      Scaled up replica set plugin-registry-58c95878f6 to 1
28m         Normal    AddedOrUpdated              ingress/plugin-registry                         Configuration for eclipse-che/plugin-registry was added or updated
28m         Normal    AddedOrUpdated              ingress/plugin-registry                         Configuration for eclipse-che/plugin-registry was added or updated
27m         Normal    ScalingReplicaSet           deployment/plugin-registry                      Scaled up replica set plugin-registry-58c95878f6 to 1
17m         Normal    AddedOrUpdated              ingress/plugin-registry                         Configuration for eclipse-che/plugin-registry was added or updated
17m         Normal    AddedOrUpdated              ingress/plugin-registry                         Configuration for eclipse-che/plugin-registry was added or updated
12m         Normal    ScalingReplicaSet           deployment/plugin-registry                      Scaled up replica set plugin-registry-66c9df7b5c to 1
12m         Normal    AddedOrUpdated              ingress/plugin-registry                         Configuration for eclipse-che/plugin-registry was added or updated
6m55s       Normal    AddedOrUpdated              ingress/plugin-registry                         Configuration for eclipse-che/plugin-registry was added or updated
6m54s       Warning   AddedOrUpdatedWithWarning   ingress/plugin-registry                         Configuration for eclipse-che/plugin-registry was added or updated ; with warning(s): TLS secret che-tls is invalid: secret doesn't exist or of an unsupported type
6m54s       Warning   AddedOrUpdatedWithWarning   ingress/plugin-registry                         Configuration for eclipse-che/plugin-registry was added or updated ; with warning(s): TLS secret che-tls is invalid: secret doesn't exist or of an unsupported type
89s         Normal    ScalingReplicaSet           deployment/plugin-registry                      Scaled up replica set plugin-registry-66c9df7b5c to 1
89s         Normal    AddedOrUpdated              ingress/plugin-registry                         Configuration for eclipse-che/plugin-registry was added or updated
89s         Normal    AddedOrUpdated              ingress/plugin-registry                         Configuration for eclipse-che/plugin-registry was added or updated
38m         Normal    Scheduled                   pod/postgres-5689df4d6f-8mthm                   Successfully assigned eclipse-che/postgres-5689df4d6f-8mthm to aks-worker-16068719-vmss000000
37m         Normal    SuccessfulAttachVolume      pod/postgres-5689df4d6f-8mthm                   AttachVolume.Attach succeeded for volume "pvc-ea5aa790-6627-498c-8dc4-141ce997a307"
37m         Normal    Pulling                     pod/postgres-5689df4d6f-8mthm                   Pulling image "quay.io/eclipse/che-postgres:7.38.1"
37m         Normal    Pulled                      pod/postgres-5689df4d6f-8mthm                   Successfully pulled image "quay.io/eclipse/che-postgres:7.38.1" in 458.100875ms
37m         Normal    Created                     pod/postgres-5689df4d6f-8mthm                   Created container postgres
37m         Normal    Started                     pod/postgres-5689df4d6f-8mthm                   Started container postgres
37m         Warning   Unhealthy                   pod/postgres-5689df4d6f-8mthm                   Readiness probe failed: psql: could not connect to server: Connection refused
            Is the server running on host "127.0.0.1" and accepting
            TCP/IP connections on port 5432?
28m         Normal    Killing                     pod/postgres-5689df4d6f-8mthm                   Stopping container postgres
28m         Warning   Unhealthy                   pod/postgres-5689df4d6f-8mthm                   Readiness probe errored: rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state
26m         Normal    Scheduled                   pod/postgres-5689df4d6f-t7s4k                   Successfully assigned eclipse-che/postgres-5689df4d6f-t7s4k to aks-worker-16068719-vmss000000
25m         Normal    SuccessfulAttachVolume      pod/postgres-5689df4d6f-t7s4k                   AttachVolume.Attach succeeded for volume "pvc-8847bbdb-1faf-4b1b-8a92-bc68d99ef2a3"
24m         Warning   FailedMount                 pod/postgres-5689df4d6f-t7s4k                   Unable to attach or mount volumes: unmounted volumes=[postgres-data], unattached volumes=[default-token-b6kcx postgres-data]: timed out waiting for the condition
24m         Normal    Pulling                     pod/postgres-5689df4d6f-t7s4k                   Pulling image "quay.io/eclipse/che-postgres:7.38.1"
24m         Normal    Pulled                      pod/postgres-5689df4d6f-t7s4k                   Successfully pulled image "quay.io/eclipse/che-postgres:7.38.1" in 492.861826ms
24m         Normal    Created                     pod/postgres-5689df4d6f-t7s4k                   Created container postgres
24m         Normal    Started                     pod/postgres-5689df4d6f-t7s4k                   Started container postgres
24m         Warning   Unhealthy                   pod/postgres-5689df4d6f-t7s4k                   Readiness probe failed: psql: could not connect to server: Connection refused
            Is the server running on host "127.0.0.1" and accepting
            TCP/IP connections on port 5432?
17m         Normal    Killing                     pod/postgres-5689df4d6f-t7s4k                   Stopping container postgres
17m         Warning   Unhealthy                   pod/postgres-5689df4d6f-t7s4k                   Readiness probe failed: psql: FATAL:  the database system is shutting down
38m         Normal    SuccessfulCreate            replicaset/postgres-5689df4d6f                  Created pod: postgres-5689df4d6f-8mthm
27m         Normal    SuccessfulCreate            replicaset/postgres-5689df4d6f                  Created pod: postgres-5689df4d6f-t7s4k
15m         Normal    Scheduled                   pod/postgres-56f4d74c5c-zhnmr                   Successfully assigned eclipse-che/postgres-56f4d74c5c-zhnmr to aks-worker-16068719-vmss000000
14m         Normal    SuccessfulAttachVolume      pod/postgres-56f4d74c5c-zhnmr                   AttachVolume.Attach succeeded for volume "pvc-0eae5b25-1933-457d-b98b-545f3b4b12b3"
14m         Normal    Pulling                     pod/postgres-56f4d74c5c-zhnmr                   Pulling image "quay.io/eclipse/che--centos--postgresql-13-centos7:1-71b24684d64da46f960682cc4216222a7e4ed8b1a31dd5a865b3e71afdea20d2"
14m         Normal    Pulled                      pod/postgres-56f4d74c5c-zhnmr                   Successfully pulled image "quay.io/eclipse/che--centos--postgresql-13-centos7:1-71b24684d64da46f960682cc4216222a7e4ed8b1a31dd5a865b3e71afdea20d2" in 11.077773818s
14m         Normal    Created                     pod/postgres-56f4d74c5c-zhnmr                   Created container postgres
14m         Normal    Started                     pod/postgres-56f4d74c5c-zhnmr                   Started container postgres
6m52s       Normal    Killing                     pod/postgres-56f4d74c5c-zhnmr                   Stopping container postgres
6m50s       Warning   Unhealthy                   pod/postgres-56f4d74c5c-zhnmr                   Readiness probe errored: rpc error: code = Unknown desc = failed to exec in container: failed to create exec "6bc006dfc0c301f710fe5cb35a500ffa6457d1c2b863ada8de5e8919d12bf9bb": cannot exec in a stopped state: unknown
6m44s       Warning   Unhealthy                   pod/postgres-56f4d74c5c-zhnmr                   Liveness probe failed: dial tcp 10.230.13.6:5432: i/o timeout
15m         Normal    SuccessfulCreate            replicaset/postgres-56f4d74c5c                  Created pod: postgres-56f4d74c5c-zhnmr
4m4s        Normal    Scheduled                   pod/postgres-8fdd8d497-jbz28                    Successfully assigned eclipse-che/postgres-8fdd8d497-jbz28 to aks-worker-16068719-vmss000000
3m44s       Normal    SuccessfulAttachVolume      pod/postgres-8fdd8d497-jbz28                    AttachVolume.Attach succeeded for volume "pvc-11826995-1fb2-4861-b5a4-a42a5113136a"
3m31s       Normal    Pulled                      pod/postgres-8fdd8d497-jbz28                    Container image "quay.io/eclipse/che--centos--postgresql-13-centos7:1-71b24684d64da46f960682cc4216222a7e4ed8b1a31dd5a865b3e71afdea20d2" already present on machine
3m31s       Normal    Created                     pod/postgres-8fdd8d497-jbz28                    Created container postgres
3m31s       Normal    Started                     pod/postgres-8fdd8d497-jbz28                    Started container postgres
4m8s        Normal    SuccessfulCreate            replicaset/postgres-8fdd8d497                   Created pod: postgres-8fdd8d497-jbz28
38m         Normal    WaitForFirstConsumer        persistentvolumeclaim/postgres-data             waiting for first consumer to be created before binding
38m         Normal    ProvisioningSucceeded       persistentvolumeclaim/postgres-data             Successfully provisioned volume pvc-ea5aa790-6627-498c-8dc4-141ce997a307 using kubernetes.io/azure-disk
27m         Normal    WaitForFirstConsumer        persistentvolumeclaim/postgres-data             waiting for first consumer to be created before binding
26m         Normal    ProvisioningSucceeded       persistentvolumeclaim/postgres-data             Successfully provisioned volume pvc-8847bbdb-1faf-4b1b-8a92-bc68d99ef2a3 using kubernetes.io/azure-disk
15m         Normal    WaitForFirstConsumer        persistentvolumeclaim/postgres-data             waiting for first consumer to be created before binding
15m         Normal    ProvisioningSucceeded       persistentvolumeclaim/postgres-data             Successfully provisioned volume pvc-0eae5b25-1933-457d-b98b-545f3b4b12b3 using kubernetes.io/azure-disk
4m8s        Normal    WaitForFirstConsumer        persistentvolumeclaim/postgres-data             waiting for first consumer to be created before binding
4m5s        Normal    ProvisioningSucceeded       persistentvolumeclaim/postgres-data             Successfully provisioned volume pvc-11826995-1fb2-4861-b5a4-a42a5113136a using kubernetes.io/azure-disk
38m         Normal    ScalingReplicaSet           deployment/postgres                             Scaled up replica set postgres-5689df4d6f to 1
27m         Normal    ScalingReplicaSet           deployment/postgres                             Scaled up replica set postgres-5689df4d6f to 1
15m         Normal    ScalingReplicaSet           deployment/postgres                             Scaled up replica set postgres-56f4d74c5c to 1
4m8s        Normal    ScalingReplicaSet           deployment/postgres                             Scaled up replica set postgres-8fdd8d497 to 1
Using embedded assembly in /home/user/eclipse-che.
NOTE: Picked up JDK_JAVA_OPTIONS:  --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.rmi/sun.rmi.transport=ALL-UNNAMED
10-Nov-2021 22:22:00.943 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server version name:   Apache Tomcat/10.0.11
10-Nov-2021 22:22:00.946 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server built:          Sep 6 2021 16:22:12 UTC
10-Nov-2021 22:22:00.946 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server version number: 10.0.11.0
10-Nov-2021 22:22:00.946 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log OS Name:               Linux
10-Nov-2021 22:22:00.946 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log OS Version:            5.4.0-1061-azure
10-Nov-2021 22:22:00.946 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Architecture:          amd64
10-Nov-2021 22:22:00.946 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Java Home:             /opt/java/openjdk
10-Nov-2021 22:22:00.946 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log JVM Version:           11.0.11+9
10-Nov-2021 22:22:00.946 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log JVM Vendor:            AdoptOpenJDK
10-Nov-2021 22:22:00.946 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log CATALINA_BASE:         /home/user/eclipse-che/tomcat
10-Nov-2021 22:22:00.947 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log CATALINA_HOME:         /home/user/eclipse-che/tomcat
10-Nov-2021 22:22:00.970 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: --add-opens=java.base/java.lang=ALL-UNNAMED
10-Nov-2021 22:22:00.970 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: --add-opens=java.base/java.io=ALL-UNNAMED
10-Nov-2021 22:22:00.970 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: --add-opens=java.base/java.util=ALL-UNNAMED
10-Nov-2021 22:22:00.971 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: --add-opens=java.base/java.util.concurrent=ALL-UNNAMED
10-Nov-2021 22:22:00.971 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: --add-opens=java.rmi/sun.rmi.transport=ALL-UNNAMED
10-Nov-2021 22:22:00.971 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.util.logging.config.file=/home/user/eclipse-che/tomcat/conf/logging.properties
10-Nov-2021 22:22:00.971 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager
10-Nov-2021 22:22:00.971 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -XX:MaxRAMPercentage=85.0
10-Nov-2021 22:22:00.971 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dche.docker.network=bridge
10-Nov-2021 22:22:00.973 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djavax.net.ssl.trustStore=/home/user/cacerts
10-Nov-2021 22:22:00.973 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djavax.net.ssl.trustStorePassword=changeit
10-Nov-2021 22:22:00.973 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dport.http=8080
10-Nov-2021 22:22:00.974 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dche.home=/home/user/eclipse-che
10-Nov-2021 22:22:00.974 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dche.logs.dir=/logs/
10-Nov-2021 22:22:00.974 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dche.logs.level=INFO
10-Nov-2021 22:22:00.974 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djuli-logback.configurationFile=file:/home/user/eclipse-che/tomcat/conf/tomcat-logger.xml
10-Nov-2021 22:22:00.974 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djdk.tls.ephemeralDHKeySize=2048
10-Nov-2021 22:22:00.974 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.protocol.handler.pkgs=org.apache.catalina.webresources
10-Nov-2021 22:22:00.974 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dorg.apache.catalina.security.SecurityListener.UMASK=0022
10-Nov-2021 22:22:00.974 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dche.local.conf.dir=/home/user/eclipse-che/tomcat/conf/
10-Nov-2021 22:22:00.974 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dignore.endorsed.dirs=
10-Nov-2021 22:22:00.974 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dcatalina.base=/home/user/eclipse-che/tomcat
10-Nov-2021 22:22:00.974 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dcatalina.home=/home/user/eclipse-che/tomcat
10-Nov-2021 22:22:00.974 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.io.tmpdir=/home/user/eclipse-che/tomcat/temp
10-Nov-2021 22:22:01.644 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["http-nio-8080"]
10-Nov-2021 22:22:01.720 INFO [main] org.apache.catalina.startup.Catalina.load Server initialization in [1185] milliseconds
10-Nov-2021 22:22:01.778 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]
10-Nov-2021 22:22:01.778 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet engine: [Apache Tomcat/10.0.11]
10-Nov-2021 22:22:01.847 INFO [main] org.apache.catalina.startup.HostConfig.deployWAR Deploying web application archive [/home/user/eclipse-che/tomcat/webapps/api.war]
2021-11-10 22:22:10,821[main]             [INFO ] [.e.c.c.d.JNDIDataSourceFactory 63]   - This=org.eclipse.che.core.db.postgresql.PostgreSQLJndiDataSourceFactory@4a707c45 obj=ResourceRef[className=javax.sql.DataSource,factoryClassLocation=null,factoryClassName=org.apache.naming.factory.ResourceFactory,{type=scope,content=Shareable},{type=auth,content=Container},{type=singleton,content=true},{type=factory,content=org.eclipse.che.core.db.postgresql.PostgreSQLJndiDataSourceFactory}] name=che Context=org.apache.naming.NamingContext@18067295 environment={}
2021-11-10 22:22:11,656[main]             [INFO ] [.e.c.a.d.WsMasterServletModule 53]   - Running in classic multi-user mode ...
2021-11-10 22:22:14,674[main]             [INFO ] [o.e.c.m.k.s.OIDCInfoProvider 72]     - Retrieving OpenId configuration from endpoint: http://keycloak.eclipse-che.svc:8080/auth/realms/che/.well-known/openid-configuration
2021-11-10 22:22:15,416[main]             [INFO ] [o.e.c.m.k.s.OIDCInfoProvider 81]     - openid configuration = {issuer=https://keycloak-eclipse-che.vwtg.cloud.suedleasing-dev.com/auth/realms/che, authorization_endpoint=https://keycloak-eclipse-che.vwtg.cloud.suedleasing-dev.com/auth/realms/che/protocol/openid-connect/auth, token_endpoint=http://keycloak.eclipse-che.svc:8080/auth/realms/che/protocol/openid-connect/token, introspection_endpoint=http://keycloak.eclipse-che.svc:8080/auth/realms/che/protocol/openid-connect/token/introspect, userinfo_endpoint=http://keycloak.eclipse-che.svc:8080/auth/realms/che/protocol/openid-connect/userinfo, end_session_endpoint=https://keycloak-eclipse-che.vwtg.cloud.suedleasing-dev.com/auth/realms/che/protocol/openid-connect/logout, jwks_uri=http://keycloak.eclipse-che.svc:8080/auth/realms/che/protocol/openid-connect/certs, check_session_iframe=https://keycloak-eclipse-che.vwtg.cloud.suedleasing-dev.com/auth/realms/che/protocol/openid-connect/login-status-iframe.html, grant_types_supported=[authorization_code, implicit, refresh_token, password, client_credentials, urn:ietf:params:oauth:grant-type:device_code, urn:openid:params:grant-type:ciba], response_types_supported=[code, none, id_token, token, id_token token, code id_token, code token, code id_token token], subject_types_supported=[public, pairwise], id_token_signing_alg_values_supported=[PS384, ES384, RS384, HS256, HS512, ES256, RS256, HS384, ES512, PS256, PS512, RS512], id_token_encryption_alg_values_supported=[RSA-OAEP, RSA-OAEP-256, RSA1_5], id_token_encryption_enc_values_supported=[A256GCM, A192GCM, A128GCM, A128CBC-HS256, A192CBC-HS384, A256CBC-HS512], userinfo_signing_alg_values_supported=[PS384, ES384, RS384, HS256, HS512, ES256, RS256, HS384, ES512, PS256, PS512, RS512, none], request_object_signing_alg_values_supported=[PS384, ES384, RS384, HS256, HS512, ES256, RS256, HS384, ES512, PS256, PS512, RS512, none], request_object_encryption_alg_values_supported=[RSA-OAEP, 
RSA-OAEP-256, RSA1_5], request_object_encryption_enc_values_supported=[A256GCM, A192GCM, A128GCM, A128CBC-HS256, A192CBC-HS384, A256CBC-HS512], response_modes_supported=[query, fragment, form_post, query.jwt, fragment.jwt, form_post.jwt, jwt], registration_endpoint=http://keycloak.eclipse-che.svc:8080/auth/realms/che/clients-registrations/openid-connect, token_endpoint_auth_methods_supported=[private_key_jwt, client_secret_basic, client_secret_post, tls_client_auth, client_secret_jwt], token_endpoint_auth_signing_alg_values_supported=[PS384, ES384, RS384, HS256, HS512, ES256, RS256, HS384, ES512, PS256, PS512, RS512], introspection_endpoint_auth_methods_supported=[private_key_jwt, client_secret_basic, client_secret_post, tls_client_auth, client_secret_jwt], introspection_endpoint_auth_signing_alg_values_supported=[PS384, ES384, RS384, HS256, HS512, ES256, RS256, HS384, ES512, PS256, PS512, RS512], authorization_signing_alg_values_supported=[PS384, ES384, RS384, HS256, HS512, ES256, RS256, HS384, ES512, PS256, PS512, RS512], authorization_encryption_alg_values_supported=[RSA-OAEP, RSA-OAEP-256, RSA1_5], authorization_encryption_enc_values_supported=[A256GCM, A192GCM, A128GCM, A128CBC-HS256, A192CBC-HS384, A256CBC-HS512], claims_supported=[aud, sub, iss, auth_time, name, given_name, family_name, preferred_username, email, acr], claim_types_supported=[normal], claims_parameter_supported=true, scopes_supported=[openid, profile, address, email, offline_access, roles, phone, web-origins, microprofile-jwt], request_parameter_supported=true, request_uri_parameter_supported=true, require_request_uri_registration=true, code_challenge_methods_supported=[plain, S256], tls_client_certificate_bound_access_tokens=true, revocation_endpoint=https://keycloak-eclipse-che.vwtg.cloud.suedleasing-dev.com/auth/realms/che/protocol/openid-connect/revoke, revocation_endpoint_auth_methods_supported=[private_key_jwt, client_secret_basic, client_secret_post, tls_client_auth, 
client_secret_jwt], revocation_endpoint_auth_signing_alg_values_supported=[PS384, ES384, RS384, HS256, HS512, ES256, RS256, HS384, ES512, PS256, PS512, RS512], backchannel_logout_supported=true, backchannel_logout_session_supported=true, device_authorization_endpoint=https://keycloak-eclipse-che.vwtg.cloud.suedleasing-dev.com/auth/realms/che/protocol/openid-connect/auth/device, backchannel_token_delivery_modes_supported=[poll, ping], backchannel_authentication_endpoint=http://keycloak.eclipse-che.svc:8080/auth/realms/che/protocol/openid-connect/ext/ciba/auth, backchannel_authentication_request_signing_alg_values_supported=[PS384, ES384, RS384, ES256, RS256, ES512, PS256, PS512, RS512], require_pushed_authorization_requests=false, pushed_authorization_request_endpoint=http://keycloak.eclipse-che.svc:8080/auth/realms/che/protocol/openid-connect/ext/par/request, mtls_endpoint_aliases={token_endpoint=http://keycloak.eclipse-che.svc:8080/auth/realms/che/protocol/openid-connect/token, revocation_endpoint=https://keycloak-eclipse-che.vwtg.cloud.suedleasing-dev.com/auth/realms/che/protocol/openid-connect/revoke, introspection_endpoint=http://keycloak.eclipse-che.svc:8080/auth/realms/che/protocol/openid-connect/token/introspect, device_authorization_endpoint=https://keycloak-eclipse-che.vwtg.cloud.suedleasing-dev.com/auth/realms/che/protocol/openid-connect/auth/device, registration_endpoint=http://keycloak.eclipse-che.svc:8080/auth/realms/che/clients-registrations/openid-connect, userinfo_endpoint=http://keycloak.eclipse-che.svc:8080/auth/realms/che/protocol/openid-connect/userinfo, pushed_authorization_request_endpoint=http://keycloak.eclipse-che.svc:8080/auth/realms/che/protocol/openid-connect/ext/par/request, backchannel_authentication_endpoint=http://keycloak.eclipse-che.svc:8080/auth/realms/che/protocol/openid-connect/ext/ciba/auth}}
2021-11-10 22:22:17,562[main]             [INFO ] [o.j.p.kubernetes.KUBE_PING 131]      - namespace eclipse-che set; clustering enabled

-------------------------------------------------------------------
GMS: address=che-957c65554-2nbx2-33097, cluster=RemoteSubscriptionChannel, physical address=10.230.13.23:7800
-------------------------------------------------------------------
2021-11-10 22:22:20,861[main]             [INFO ] [o.jgroups.protocols.pbcast.GMS 125]  - che-957c65554-2nbx2-33097: no members discovered after 3230 ms: creating cluster as coordinator
2021-11-10 22:22:20,892[main]             [INFO ] [o.j.p.kubernetes.KUBE_PING 131]      - namespace eclipse-che set; clustering enabled

-------------------------------------------------------------------
GMS: address=che-957c65554-2nbx2-49306, cluster=WorkspaceLocks, physical address=10.230.13.23:7801
-------------------------------------------------------------------
2021-11-10 22:22:23,967[main]             [INFO ] [o.jgroups.protocols.pbcast.GMS 125]  - che-957c65554-2nbx2-49306: no members discovered after 3064 ms: creating cluster as coordinator
2021-11-10 22:22:23,977[main]             [INFO ] [o.j.p.kubernetes.KUBE_PING 131]      - namespace eclipse-che set; clustering enabled

-------------------------------------------------------------------
GMS: address=che-957c65554-2nbx2-49109, cluster=WorkspaceStateCache, physical address=10.230.13.23:7802
-------------------------------------------------------------------
2021-11-10 22:22:27,035[main]             [INFO ] [o.jgroups.protocols.pbcast.GMS 125]  - che-957c65554-2nbx2-49109: no members discovered after 3049 ms: creating cluster as coordinator
2021-11-10 22:22:27,249[main]             [INFO ] [o.f.c.i.d.DbSupportFactory 44]       - Database: jdbc:postgresql://postgres:5432/dbche (PostgreSQL 13.3)
2021-11-10 22:22:27,276[main]             [INFO ] [o.f.c.i.util.VersionPrinter 44]      - Flyway 4.2.0 by Boxfuse
2021-11-10 22:22:27,281[main]             [INFO ] [o.f.c.i.d.DbSupportFactory 44]       - Database: jdbc:postgresql://postgres:5432/dbche (PostgreSQL 13.3)
2021-11-10 22:22:27,338[main]             [INFO ] [i.f.CustomSqlMigrationResolver 158]  - Searching for SQL scripts in locations [classpath:che-schema]
2021-11-10 22:22:27,427[main]             [INFO ] [o.f.c.i.command.DbValidate 44]       - Successfully validated 63 migrations (execution time 00:00.089s)
2021-11-10 22:22:27,441[main]             [INFO ] [o.f.c.i.m.MetaDataTableImpl 44]      - Creating Metadata table: "public"."schema_version"
2021-11-10 22:22:27,517[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Current version of schema "public": << Empty Schema >>
2021-11-10 22:22:27,629[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 5.0.0.8.1 - 1__init.sql
2021-11-10 22:22:28,207[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 5.0.0.9.1 - 1__add_index_on_workspace_temporary.sql
2021-11-10 22:22:28,237[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 5.0.0.9.2 - 2__update_local_links_in_environments.sql
2021-11-10 22:22:28,254[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 5.2.0.1 - 1__increase_project_attributes_values_length.sql
2021-11-10 22:22:28,315[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 5.4.0.1 - 1__drop_user_to_account_relation.sql
2021-11-10 22:22:28,347[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 5.4.0.2 - 2__create_missed_account_indexes.sql
2021-11-10 22:22:28,381[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 5.6.0.1 - 1__add_exec_agent_where_terminal_agent_is_present.sql
2021-11-10 22:22:28,427[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 5.7.0.1 - 1__add_factory.sql
2021-11-10 22:22:28,616[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 5.7.0.2 - 2__remove_match_policy.sql
2021-11-10 22:22:28,632[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 5.8.0.1 - 1__add_foreigh_key_indexes.sql
2021-11-10 22:22:28,947[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 5.11.0.1 - 1__optimize_user_search.sql
2021-11-10 22:22:29,007[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 5.19.0.0.1 - 0.1__add_permissions.sql
2021-11-10 22:22:29,252[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 5.19.0.0.2 - 0.2__add_resources.sql
2021-11-10 22:22:29,303[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 5.19.0.0.3 - 0.3__add_organization.sql
2021-11-10 22:22:29,421[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.0.0.1 - 1__add_path_to_serverconf.sql
2021-11-10 22:22:29,430[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.0.0.2 - 2__rename_agents_to_installers.sql
2021-11-10 22:22:29,444[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.0.0.3 - 3__add_installer.sql
2021-11-10 22:22:29,552[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.0.0.3.1 - 3.1__remove_old_recipe_permissions.sql
2021-11-10 22:22:29,566[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.0.0.4 - 4__remove_old_recipe.sql
2021-11-10 22:22:29,578[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.0.0.5 - 5__add_machine_env.sql
2021-11-10 22:22:29,620[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.0.0.6 - 6__remove_snapshots.sql
2021-11-10 22:22:29,631[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.0.0.7 - 7__add_machine_volumes.sql
2021-11-10 22:22:29,679[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.0.0.8 - 8__add_serverconf_attributes.sql
2021-11-10 22:22:29,724[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.0.0.9 - 9__increase_externalmachine_env_value_length.sql
2021-11-10 22:22:29,734[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.0.0.10 - 10__move_dockerimage_recipe_location_to_content.sql
2021-11-10 22:22:29,741[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.0.0.11 - 11__increase_workspace_attributes_values_length.sql
2021-11-10 22:22:29,751[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.0.0.12 - 12__remove_stack_sources.sql
2021-11-10 22:22:29,765[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.3.0.1 - 1__add_fk_indexes.sql
2021-11-10 22:22:29,807[main]             [WARN ] [o.f.c.i.dbsupport.JdbcTemplate 48]   - DB: identifier "che_index_factory_on_projects_loaded_action_value_action_entity_id" will be truncated to "che_index_factory_on_projects_loaded_action_value_action_entity" (SQL State: 42622 - Error Code: 0)
2021-11-10 22:22:29,814[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.3.0.1.1 - 1.1__add_fk_indexes.sql
2021-11-10 22:22:29,884[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.4.0.1 - 1__add_workspace_expirations.sql
2021-11-10 22:22:29,919[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.4.0.2 - 2__add_signature_key.sql
2021-11-10 22:22:29,975[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.4.0.3 - 3__add_k8s_runtimes.sql
2021-11-10 22:22:30,094[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.10.0.1 - 1__add_workspace_cfg_attributes.sql
2021-11-10 22:22:30,120[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.10.0.2 - 2__change_signature_key_pair_id.sql
2021-11-10 22:22:30,144[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.11.0.1 - 1__add_signature_key_constraints.sql
2021-11-10 22:22:30,181[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.12.0.1 - 1__rename_project_attributes_values_field.sql
2021-11-10 22:22:30,189[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.15.0.1 - 1__remove_not_null_constraint_from_env_name_fields.sql
2021-11-10 22:22:30,210[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.15.0.2 - 2__add_commands_to_k8s_runtime.sql
2021-11-10 22:22:30,266[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.16.0.1 - 1__increase_workspace_config_attributes_values_length.sql
2021-11-10 22:22:30,297[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.16.0.2 - 2__create_workspace_activity_table.sql
2021-11-10 22:22:30,382[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.16.0.3 - 3__bootstrap_ws_activity_data.sql
2021-11-10 22:22:30,398[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 6.17.0.1 - 1__convert_enums_to_strings.sql
2021-11-10 22:22:30,414[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.0.0.4.0.1 - 1__add_devfile.sql
2021-11-10 22:22:30,832[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.0.0.5.0.1 - 1__devfile_command_reference.sql
2021-11-10 22:22:30,846[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.0.0.6.0.1 - 1__add_devfile_component_prefs.sql
2021-11-10 22:22:30,897[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.0.0.7.0.1 - 1__add_registry_url_to_devfile_component.sql
2021-11-10 22:22:30,908[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.0.0.8.0.2.0.1 - 1__devfile_metadata.sql
2021-11-10 22:22:30,923[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.0.0.8.0.2.0.2 - 2__devfile_make_some_fields_optional.sql
2021-11-10 22:22:30,934[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.1.0.1 - 1__change_devfile_component_preferences_type.sql
2021-11-10 22:22:30,943[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.1.0.1.1 - 1.1__remove_stack_permissions.sql
2021-11-10 22:22:30,955[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.1.0.2 - 2__remove_stacks.sql
2021-11-10 22:22:30,967[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.2.0.1 - 1__remove_installers.sql
2021-11-10 22:22:30,979[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.4.0.1 - 1__add_devfile_source_sparse_checkout_dir.sql
2021-11-10 22:22:30,987[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.4.0.2 - 2__add_preview_url_to_devfile_command.sql
2021-11-10 22:22:31,007[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.6.0.1 - 1__drop_che_workspace_expiration.sql
2021-11-10 22:22:31,015[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.10.0.1 - 1__add_devfile_plugin_editor_component_cpu_limit_request.sql
2021-11-10 22:22:31,028[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.10.0.2 - 2__add_devfile_plugin_editor_component_ram_request.sql
2021-11-10 22:22:31,035[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.11.0.1 - 1__update_inconsistent_stopped_workspace_activities.sql
2021-11-10 22:22:31,042[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.16.0.1 - 1__add_devfile_component_automount_workspace_secrets.sql
2021-11-10 22:22:31,050[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.20.0.1 - 1__userdevfile.sql
2021-11-10 22:22:31,123[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.20.0.1.1 - 1.1__add_userdevfile_permissions.sql
2021-11-10 22:22:31,194[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.21.0.1 - 1__remove_installers.sql
2021-11-10 22:22:31,203[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Migrating schema "public" to version 7.26.0.1 - 1__remove_factory_button_and_image.sql
2021-11-10 22:22:31,220[main]             [INFO ] [o.f.c.i.command.DbMigrate 44]        - Successfully applied 63 migrations to schema "public" (execution time 00:03.779s).
2021-11-10 22:22:33,374[main]             [INFO ] [o.j.p.kubernetes.KUBE_PING 131]      - namespace eclipse-che set; clustering enabled

-------------------------------------------------------------------
GMS: address=che-957c65554-2nbx2-51363, cluster=EclipseLinkCommandChannel, physical address=10.230.13.23:7803
-------------------------------------------------------------------
2021-11-10 22:22:36,470[main]             [INFO ] [o.jgroups.protocols.pbcast.GMS 125]  - che-957c65554-2nbx2-51363: no members discovered after 3051 ms: creating cluster as coordinator
2021-11-10 22:22:36,536[main]             [INFO ] [o.e.c.a.w.s.WorkspaceRuntimes 182]   - Configured factories for environments: '[kubernetes, no-environment]'
2021-11-10 22:22:36,539[main]             [INFO ] [o.e.c.a.w.s.WorkspaceRuntimes 183]   - Registered infrastructure 'kubernetes'
2021-11-10 22:22:36,614[main]             [INFO ] [o.e.c.a.w.s.WorkspaceRuntimes 760]   - Infrastructure is tracking 0 active runtimes
2021-11-10 22:22:36,733[main]             [INFO ] [o.e.c.a.c.u.ApiInfoLogInformer 36]   - Eclipse Che Api Core: Build info '7.40.0-SNAPSHOT' scmRevision 'e71e5cff0ddd12995399dfd1f032de9cd60f2e06' implementationVersion '7.40.0-SNAPSHOT'
2021-11-10 22:22:36,772[main]             [WARN ] [p.s.AdminPermissionInitializer 69]   - Admin admin not found yet.
2021-11-10 22:22:36,833[main]             [INFO ] [o.e.c.c.metrics.MetricsServer 46]    - Metrics server started at port 8087 successfully 
10-Nov-2021 22:22:37.259 INFO [main] org.apache.catalina.startup.HostConfig.deployWAR Deployment of web application archive [/home/user/eclipse-che/tomcat/webapps/api.war] has finished in [35,403] ms
10-Nov-2021 22:22:37.260 INFO [main] org.apache.catalina.startup.HostConfig.deployWAR Deploying web application archive [/home/user/eclipse-che/tomcat/webapps/ROOT.war]
10-Nov-2021 22:22:39.544 INFO [main] org.apache.catalina.startup.HostConfig.deployWAR Deployment of web application archive [/home/user/eclipse-che/tomcat/webapps/ROOT.war] has finished in [2,284] ms
10-Nov-2021 22:22:39.546 INFO [main] org.apache.catalina.startup.HostConfig.deployWAR Deploying web application archive [/home/user/eclipse-che/tomcat/webapps/swagger.war]
10-Nov-2021 22:22:39.724 INFO [main] org.apache.catalina.startup.HostConfig.deployWAR Deployment of web application archive [/home/user/eclipse-che/tomcat/webapps/swagger.war] has finished in [178] ms
10-Nov-2021 22:22:39.732 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["http-nio-8080"]
10-Nov-2021 22:22:39.741 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in [38020] milliseconds
The files belonging to this database system will be owned by user "postgres".
This user must also own the server process.

The database cluster will be initialized with locale "en_US.utf8".
The default database encoding has accordingly been set to "UTF8".
The default text search configuration will be set to "english".

Data page checksums are disabled.

fixing permissions on existing directory /var/lib/pgsql/data/userdata ... ok
creating subdirectories ... ok
selecting dynamic shared memory implementation ... posix
selecting default max_connections ... 100
selecting default shared_buffers ... 128MB
selecting default time zone ... UTC
creating configuration files ... ok
running bootstrap script ... ok
performing post-bootstrap initialization ... ok
initdb: warning: enabling "trust" authentication for local connections
You can change this by editing pg_hba.conf or using the option -A, or
--auth-local and --auth-host, the next time you run initdb.
syncing data to disk ... ok

Success. You can now start the database server using:

    pg_ctl -D /var/lib/pgsql/data/userdata -l logfile start

waiting for server to start....2021-11-10 22:19:40.158 UTC [35] LOG:  redirecting log output to logging collector process
2021-11-10 22:19:40.158 UTC [35] HINT:  Future log output will appear in directory "log".
 done
server started
/var/run/postgresql:5432 - accepting connections
=> sourcing /usr/share/container-scripts/postgresql/start/set_passwords.sh ...
ALTER ROLE
ALTER ROLE
waiting for server to shut down.... done
server stopped
Starting server...
2021-11-10 22:19:41.047 UTC [1] LOG:  redirecting log output to logging collector process
2021-11-10 22:19:41.047 UTC [1] HINT:  Future log output will appear in directory "log".

Additional context

No response

tolusha commented 2 years ago

It might have the same root cause as https://github.com/eclipse/che/issues/20726

sebastiankomander commented 2 years ago

I tried patching the che-dashboard Ingress:

Name:             che-dashboard
Namespace:        eclipse-che
Address:          
Default backend:  default-http-backend:80
TLS:
  che-tls terminates che-eclipse-che.mydomain.com
Rules:
  Host                                            Path  Backends
  ----                                            ----  --------
  che-eclipse-che.mydomain.com  
                                                  /dashboard/*   che-dashboard:8080 (1.2.3.4:1234)
Annotations:                                      che.eclipse.org/managed-annotations-digest: NpIhR7gmyYZJ3KW2WvSw4lBBh-vqSAsF5n83BvCrMeE=
                                                  kubernetes.io/ingress.class: nginx
                                                  nginx.ingress.kubernetes.io/proxy-connect-timeout: 3600
                                                  nginx.ingress.kubernetes.io/proxy-read-timeout: 3600
                                                  nginx.ingress.kubernetes.io/ssl-redirect: true
                                                  nginx.ingress.kubernetes.io/use-regex: true

Same problem again.

sebastiankomander commented 2 years ago

I tried --installer=operator, but the che-dashboard ingress looks wrong

kubectl describe ingress -n eclipse-che che-dashboard

----     ------    ----  ----                      -------
  Warning  Rejected  41s   nginx-ingress-controller  All hosts are taken by other resources
  Warning  Rejected  41s   nginx-ingress-controller  All hosts are taken by other resources

It looks like the problem is located in the che ingress. That ingress catches everything from "/", so the che-dashboard ingress never gets activated and, as far as I can see when trying with curl, every request gets routed to the che ingress and then redirected to "/dashboard". But /dashboard is caught by the che ingress again, and the redirect loop starts over from the beginning. I'll keep trying with the helm installer...
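The shadowing described above can be sketched with two minimal Ingress manifests. This is a hedged reconstruction from the describe output in this thread, not the operator's exact generated resources: the host, service names, ports, and Ingress names are assumptions.

```yaml
# Hypothetical sketch of the two competing Ingress resources.
# Names, host, and backend ports are reconstructed from the
# "kubectl describe" output above and may differ from the real manifests.
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: che                           # main che ingress: catch-all
  namespace: eclipse-che
  annotations:
    kubernetes.io/ingress.class: nginx
spec:
  rules:
    - host: che-eclipse-che.mydomain.com
      http:
        paths:
          - path: /                   # matches everything, including /dashboard/
            pathType: Prefix
            backend:
              service:
                name: che-host
                port:
                  number: 8080
---
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: che-dashboard                 # rejected: "All hosts are taken by other resources"
  namespace: eclipse-che
  annotations:
    kubernetes.io/ingress.class: nginx
    nginx.ingress.kubernetes.io/use-regex: "true"
spec:
  rules:
    - host: che-eclipse-che.mydomain.com
      http:
        paths:
          - path: /dashboard/.*       # never served while the controller rejects this Ingress
            pathType: ImplementationSpecific
            backend:
              service:
                name: che-dashboard
                port:
                  number: 8080
```

On controllers that reject the second Ingress, the catch-all "/" rule is the only one left, which matches the redirect loop seen with curl.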

sebastiankomander commented 2 years ago

OK, the problem lies within the ingress installation. If you use an old version of nginx-ingress (v0.41.0), everything works fine. If you use the latest stable version, it does not work; see my comment above.

martinelli-francesco commented 2 years ago

I've just installed Eclipse Che 7.41.2 and still have the same issue even though I followed the instructions (https://www.eclipse.org/che/docs/che-7/installation-guide/installing-che-on-microsoft-azure/) that install nginx-ingress (v0.41.0): kubectl apply -f https://raw.githubusercontent.com/kubernetes/ingress-nginx/controller-**v0.41.0**/deploy/static/provider/cloud/deploy.yaml

I get redirected to /dashboard/, which returns a 200, but all the resources (branding.css, loader.svg, ...) return a 404.

tolusha commented 2 years ago

Could you try configuring Eclipse Che to use the single-host exposure strategy?

kubectl patch checluster/eclipse-che -n eclipse-che --type=json -p '[{"op": "replace", "path": "/spec/server/serverExposureStrategy", "value": "single-host"}]'
kubectl patch checluster/eclipse-che -n eclipse-che --type=json -p '[{"op": "replace", "path": "/spec/k8s/singleHostExposureType", "value": "gateway"}]'
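If both patches apply, the relevant excerpt of the CheCluster resource should look roughly like this (a sketch only; all other spec fields are omitted):

```yaml
# Expected excerpt of the patched CheCluster resource (sketch;
# surrounding fields omitted).
apiVersion: org.eclipse.che/v1
kind: CheCluster
metadata:
  name: eclipse-che
  namespace: eclipse-che
spec:
  server:
    serverExposureStrategy: single-host
  k8s:
    singleHostExposureType: gateway
```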
csa-nathan commented 2 years ago

I've just installed Eclipse Che 7.41.2 and still have the same issue even though I followed the instructions (https://www.eclipse.org/che/docs/che-7/installation-guide/installing-che-on-microsoft-azure/) that install nginx-ingress (v0.41.0): kubectl apply -f https://raw.githubusercontent.com/kubernetes/ingress-nginx/controller-**v0.41.0**/deploy/static/provider/cloud/deploy.yaml

I get redirected to /dashboard/, which returns a 200, but all the resources (branding.css, loader.svg, ...) return a 404.

Try changing the Che Dashboard Ingress path from /dashboard/ to just /dashboard.

It should work.
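That suggestion could be applied as a JSON patch along these lines. This is a sketch only: the JSON pointer assumes the dashboard rule is the first rule and first path in the generated Ingress, so verify with `kubectl get ingress che-dashboard -n eclipse-che -o yaml` before applying, and note the operator may revert manual Ingress edits.

```shell
# Sketch: change the dashboard Ingress path to /dashboard.
# The pointer /spec/rules/0/http/paths/0/path is an assumption; check the
# actual rule position in the generated Ingress first.
PATCH='[{"op": "replace", "path": "/spec/rules/0/http/paths/0/path", "value": "/dashboard"}]'

# Validate that the patch is well-formed JSON before handing it to kubectl.
echo "$PATCH" | python3 -c 'import json,sys; json.load(sys.stdin); print("patch ok")'

# Apply it (requires cluster access):
# kubectl patch ingress che-dashboard -n eclipse-che --type=json -p "$PATCH"
```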

che-bot commented 2 years ago

Issues go stale after 180 days of inactivity. lifecycle/stale issues rot after an additional 7 days of inactivity and eventually close.

Mark the issue as fresh with /remove-lifecycle stale in a new comment.

If this issue is safe to close now please do so.

Moderators: Add lifecycle/frozen label to avoid stale mode.