Just as preliminary work, I would try to spin up CRC on Windows or Mac and then spin up a container within Podman (having the oc client and just trying to oc login from inside).
Target Windows or Mac, as I am more in doubt about the usage of host network mode on the podman machine (this is my first thought on how the communication could be done).
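A minimal sketch of that preliminary check in shell, assuming CRC is already installed; the image name is an assumption (any image that ships the oc client would do):
# start the CRC cluster on the host
crc start
# from a Podman container that has oc, try logging in to the CRC API on the host
podman run --rm --network host quay.io/openshift/origin-cli:latest \
  oc login -u developer -p developer https://localhost:6443 --insecure-skip-tls-verify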
Unfortunately, I don't have access to a Windows or Mac machine. I tried running a container within Podman to run the JKube tests and was able to access CRC from within the Podman container.
Here is the Dockerfile I used (I am not sure what adjustments would be needed to make it work on Mac and Windows):
FROM registry.access.redhat.com/ubi9/openjdk-17:1.20-2.1726695177
LABEL org.opencontainers.image.authors="CRCQE <devtools-crc-qe@redhat.com>"
USER root
# Install oc
RUN curl -o oc.tar https://downloads-openshift-console.apps.sandbox-m4.g2pi.p1.openshiftapps.com/amd64/linux/oc.tar
RUN tar -xf oc.tar
RUN mv oc /usr/local/bin/oc
# Install Git
RUN microdnf install -y git
USER default
RUN git clone https://github.com/eclipse-jkube/jkube-integration-tests.git
COPY <<-EOT /script.sh
oc login -u developer -p developer https://localhost:6443 --insecure-skip-tls-verify
oc new-project jkube-spring-boot-app-deploy
cd jkube-integration-tests
mvn -B -DskipTests clean install
# Run existing JKube Quarkus OpenShift Integration test to test deploy flow
mvn -B verify -Dit.test=QuarkusOcITCase -POpenShift,quarkus -Djkube.version=1.17.0
EOT
ENTRYPOINT ["sh", "/script.sh"]
Hi @rohanKanojia, so this is running an image providing Java, installing the oc CLI, and connecting to the current host (api.crc.testing, right?) to run the Java application.
@albfan:
If we use localhost and forward a port we can check this?
Sorry, I don't understand. Could you please elaborate on your question?
Shouldn't we use a simpler Java application than jkube-integration-tests?
I'm only running a single test case, QuarkusOcITCase. You can find its source code here: QuarkusOcITCase.java. In the Dockerfile script, the test runs a simple Quarkus application (located in jkube-integration-tests/projects-to-be-tested/). Earlier I was trying to do everything in a bash script, but then I realized we can reuse the JKube integration tests to run just a single test as per our requirements.
About localhost: should we use localhost instead of api.crc.testing in your script?
About the Java project: if it just reuses something you know well, that's enough for now.
@albfan: I tested by changing api.crc.testing to localhost in the Dockerfile. It seems to work on my machine.
podman build . --tag redhat-ubi-openjdk17-with-oc:latest
podman run --network host redhat-ubi-openjdk17-with-oc:latest
[INFO]
[INFO] -------------------------------------------------------
[INFO] T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.eclipse.jkube.integrationtests.quarkus.rest.QuarkusOcITCase
[ForkJoinPool-1-worker-1] INFO org.eclipse.jetty.websocket.client.WebSocketClient - Shutdown WebSocketClient@ae5d5bd7[coreClient=WebSocketCoreClient@1c9a3e14{STARTED},openSessions.size=0]
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 141.2 s -- in org.eclipse.jkube.integrationtests.quarkus.rest.QuarkusOcITCase
[INFO]
[INFO] Results:
[INFO]
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0
[INFO]
[INFO]
[INFO] --- maven-failsafe-plugin:3.1.2:verify (default) @ integration-tests ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for JKube :: Integration Tests :: Parent 0.0.0-SNAPSHOT:
[INFO]
[INFO] JKube :: Integration Tests :: Parent ............... SUCCESS [ 0.001 s]
[INFO] JKube :: Integration Tests :: Quarkus :: Rest ...... SUCCESS [02:58 min]
[INFO] JKube :: Integration Tests :: Quarkus :: Rest :: Trace Logging Enabled SUCCESS [ 5.440 s]
[INFO] JKube :: Integration Tests :: Tests ................ SUCCESS [02:28 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 11:15 min
[INFO] Finished at: 2024-10-10T10:39:20Z
[INFO] ------------------------------------------------------------------------
@albfan Can you try this exact same thing on a Windows or Mac machine, as @rohanKanojia does not have access to any of them?
I added some preliminary acceptance criteria, but I would like to see some constraints and result-based outcomes there.
Tests on Windows and Mac were successful:
> podman run --network host redhat-ubi-openjdk17-with-oc:latest
WARNING: Using insecure TLS client config. Setting this option is not supported!
Login successful.
but due to some limitations on Windows/Mac, the tests do not finish:
https://github.com/crc-org/crc/wiki/Podman-support#limitations
Ports are not automatically exposed on the host.
Workaround when using vsock network mode, expose a port:
curl --unix-socket ~/.crc/crc-http.sock http:/unix/network/services/forwarder/expose -X POST -d '{"local":":8080","remote":"192.168.127.3:8080"}'
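For reference, a hedged sketch of that workaround applied to the NodePort the failing run below tries to reach (30633); the remote address mirrors the wiki example and may differ per setup:
curl --unix-socket ~/.crc/crc-http.sock http:/unix/network/services/forwarder/expose \
  -X POST -d '{"local":":30633","remote":"192.168.127.3:30633"}'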
Results from running:
[INFO] -------------------------------------------------------
[INFO] T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.eclipse.jkube.integrationtests.quarkus.rest.QuarkusOcITCase
[pool-3-thread-2] WARN org.eclipse.jkube.integrationtests.assertions.KubernetesClientAssertion - Connection to http://localhost:30633/ failed, retrying
[pool-3-thread-2] WARN org.eclipse.jkube.integrationtests.assertions.KubernetesClientAssertion - Connection to http://localhost:30633/ failed, retrying
[pool-3-thread-2] WARN org.eclipse.jkube.integrationtests.assertions.KubernetesClientAssertion - Connection to http://localhost:30633/ failed, retrying
[pool-3-thread-2] WARN org.eclipse.jkube.integrationtests.assertions.KubernetesClientAssertion - Connection to http://localhost:30633/ failed, retrying
[pool-3-thread-2] WARN org.eclipse.jkube.integrationtests.assertions.KubernetesClientAssertion - Connection to http://localhost:30633/ failed, retrying
[pool-3-thread-2] WARN org.eclipse.jkube.integrationtests.assertions.KubernetesClientAssertion - Connection to http://localhost:30633/ failed, retrying
[pool-3-thread-2] WARN org.eclipse.jkube.integrationtests.assertions.KubernetesClientAssertion - Connection to http://localhost:30633/ failed, retrying
[pool-3-thread-2] WARN org.eclipse.jkube.integrationtests.assertions.KubernetesClientAssertion - Connection to http://localhost:30633/ failed, retrying
[pool-3-thread-2] WARN org.eclipse.jkube.integrationtests.assertions.KubernetesClientAssertion - Connection to http://localhost:30633/ failed, retrying
[pool-3-thread-2] WARN org.eclipse.jkube.integrationtests.assertions.KubernetesClientAssertion - Connection to http://localhost:30633/ failed, retrying
[pool-3-thread-2] WARN org.eclipse.jkube.integrationtests.assertions.KubernetesClientAssertion - Connection to http://localhost:30633/ failed, retrying
[ForkJoinPool-1-worker-1] INFO org.eclipse.jetty.websocket.client.WebSocketClient - Shutdown WebSocketClient@63f1c585[coreClient=WebSocketCoreClient@4886affc{STARTED},openSessions.size=0]
[ERROR] Tests run: 5, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 186.8 s <<< FAILURE! -- in org.eclipse.jkube.integrationtests.quarkus.rest.QuarkusOcITCase
[ERROR] org.eclipse.jkube.integrationtests.quarkus.rest.QuarkusOcITCase.ocApply -- Time elapsed: 102.3 s <<< ERROR!
java.util.concurrent.TimeoutException
at java.base/java.util.concurrent.CompletableFuture.timedGet(CompletableFuture.java:1960)
at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2095)
at org.eclipse.jkube.integrationtests.assertions.ServiceAssertion.assertNodePortResponse(ServiceAssertion.java:109)
at org.eclipse.jkube.integrationtests.quarkus.rest.Quarkus.assertThatShouldApplyResources(Quarkus.java:68)
at org.eclipse.jkube.integrationtests.quarkus.rest.QuarkusOcITCase.ocApply(QuarkusOcITCase.java:103)
at java.base/java.lang.reflect.Method.invoke(Method.java:569)
at java.base/java.util.concurrent.RecursiveAction.exec(RecursiveAction.java:194)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165)
[INFO]
[INFO] Results:
[INFO]
[ERROR] Errors:
[ERROR] QuarkusOcITCase.ocApply:103->Quarkus.assertThatShouldApplyResources:68 » Timeout
[INFO]
[ERROR] Tests run: 5, Failures: 0, Errors: 1, Skipped: 0
[INFO]
[INFO]
[INFO] --- maven-failsafe-plugin:3.1.2:verify (default) @ integration-tests ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for JKube :: Integration Tests :: Parent 0.0.0-SNAPSHOT:
[INFO]
[INFO] JKube :: Integration Tests :: Parent ............... SUCCESS [ 0.002 s]
[INFO] JKube :: Integration Tests :: Quarkus :: Rest ...... SUCCESS [ 16.386 s]
[INFO] JKube :: Integration Tests :: Quarkus :: Rest :: Trace Logging Enabled SUCCESS [ 3.390 s]
[INFO] JKube :: Integration Tests :: Tests ................ FAILURE [03:08 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:37 min
[INFO] Finished at: 2024-10-15T10:53:37Z
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:3.1.2:verify (default) on project integration-tests:
[ERROR]
[ERROR] Please refer to /home/default/jkube-integration-tests/it/target/failsafe-reports for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date].dump, [date]-jvmRun[N].dump and [date].dumpstream.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <args> -rf :integration-tests
@rohanKanojia Is it possible to fix that port and share?
We can decide to move some of the criteria to a new issue. That way we can move forward and merge this, adding the additional tests as a follow-up. WDYT?
Just as preliminary work, I would try to spin up CRC on Windows or Mac and then spin up a container within Podman (having the oc client and just trying to oc login from inside)
Target Windows or Mac, as I am more in doubt about the usage of host network mode on the podman machine (this is my first thought on how the communication could be done)
@adrianriobo: Hello. After getting help from @albfan, I managed to test the Dockerfile on Windows and macOS. As you and Alberto pointed out, --network=host mode does not work on Windows and macOS.
In order to make it work on Windows/macOS, I used the host.containers.internal DNS name (host.docker.internal in Docker). This seems to work for Linux, Windows, and macOS. For oc login to work we also need to add a custom host mapping for oauth-openshift.apps-crc.testing.
I modified the Dockerfile like this (omitting some lines related to installing git and running the Maven build using JKube):
FROM registry.access.redhat.com/ubi9/openjdk-17:1.20-2.1726695177
USER root
# Install oc
RUN curl -o oc.tar https://downloads-openshift-console.apps.sandbox-m4.g2pi.p1.openshiftapps.com/amd64/linux/oc.tar
RUN tar -xf oc.tar
RUN mv oc /usr/local/bin/oc
USER default
COPY <<-EOT /script.sh
oc login -u developer -p developer https://host.containers.internal:6443 --insecure-skip-tls-verify
# Application Deployment using JKube to crc cluster
# ...
EOT
ENTRYPOINT ["sh", "/script.sh"]
The container is built and started like this:
# Expose CRC podman daemon
eval $(crc podman-env)
# Build Image
podman build . --tag localhost/redhat-ubi-openjdk17-with-oc:latest
# Start container
podman run --rm \
--add-host=oauth-openshift.apps-crc.testing:host-gateway \
localhost/redhat-ubi-openjdk17-with-oc:latest
How do you think we should proceed with the test implementation? I have these options in mind (see the sketch after this list):
1. Do everything in a script.sh added inside the Dockerfile in plain bash. The E2E test code would be responsible for building the container image and starting the container. It would wait for the container to finish during test execution and check whether it exited successfully once finished.
2. Only perform the application deployment in a script.sh that depends on Java and Maven (JKube requirements). Once the application has been deployed to the CRC cluster using JKube, the rest of the verifications for application health inside the CRC cluster would be performed in the E2E test.
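A minimal sketch of option 1's E2E wrapper in plain bash, reusing the image and flags shown earlier in the thread; the container's exit code is the test verdict:
set -euo pipefail
# build the image containing oc and script.sh
podman build . --tag localhost/redhat-ubi-openjdk17-with-oc:latest
# run it; with set -e, a non-zero exit from the container fails this E2E step
podman run --rm \
  --add-host=oauth-openshift.apps-crc.testing:host-gateway \
  localhost/redhat-ubi-openjdk17-with-oc:latest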
where is this script.sh file? I am not sure if this is coming from jkube directly (tests/script), but if that is the case then it's better to have this container file as part of the same repo. Also, can the api url be passed using an argument so it becomes more generic?
@praveenkumar:
where is this script.sh file?
I am creating the script.sh file within the Dockerfile during the image build process. It could also be created as a standalone script kept in the CRC test codebase; we could mount it inside the container and then execute it.
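A minimal sketch of that mount-based alternative (paths illustrative): the bind mount shadows the /script.sh baked into the image, so the existing ENTRYPOINT still runs it.
podman run --rm \
  --add-host=oauth-openshift.apps-crc.testing:host-gateway \
  -v ./script.sh:/script.sh:ro,Z \
  localhost/redhat-ubi-openjdk17-with-oc:latest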
Also, can the api url be passed using an argument so it becomes more generic?
Sorry, I don't understand your comment. Are you talking about the CRC cluster URL https://host.containers.internal:6443? Should it be passed as a build argument?
I am creating the script.sh file within the Dockerfile during the image build process. It could also be created as a standalone script kept in the CRC test codebase; we could mount it inside the container and then execute it.
Alright, so this is specific to testing it on the CRC side.
Also, can the api url be passed using an argument so it becomes more generic?
Sorry, I don't understand your comment. Are you talking about the CRC cluster URL https://host.containers.internal:6443? Should it be passed as a build argument?
Not a build argument but a run argument; you can have a default as part of the build, but if it is very specific to CRC then I am OK with hardcoding it.
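A hedged sketch of that run-argument approach (the API_URL variable name is illustrative): bake the CRC default into the image but allow an override at run time.
# in script.sh, read the URL from the environment with the CRC default as fallback:
#   oc login -u developer -p developer "${API_URL:-https://host.containers.internal:6443}" --insecure-skip-tls-verify
podman run --rm \
  --add-host=oauth-openshift.apps-crc.testing:host-gateway \
  -e API_URL=https://some-other-cluster:6443 \
  localhost/redhat-ubi-openjdk17-with-oc:latest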
Description
As we discussed internally during the F2F meeting, CRC can use the JKube project to test the deployment workflow of a simple Java application.
Once CRC has been set up in the E2E tests, we can add one more step to build and deploy a Java application into the installed cluster and verify whether it's accessible from the CRC cluster. JKube performs an S2I binary build to build a container image and then deploys it to the OpenShift cluster.
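For context, a minimal sketch of the JKube flow this exercises, assuming a project with the OpenShift Maven Plugin configured (oc:build runs the S2I binary build, oc:resource generates the OpenShift manifests, oc:apply deploys them to the cluster):
mvn oc:build oc:resource oc:apply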
I discussed this with @adrianriobo; we can try adding this step in a container.
originally posted by @adrianriobo in internal chat:
Expected Behavior
CRC pipelines are improved to verify the application deployment workflow for a simple Java application. We can use a simple Quarkus application and deploy it to the created CRC cluster using Eclipse JKube to achieve this.
Acceptance Criteria