openzipkin-attic / apache-release-verification

Apache License 2.0

"Executing build-and-test for maven" fails due to using JRE (not JDK) #40

Open shakuzen opened 5 years ago

shakuzen commented 5 years ago

For example, we have an ongoing zipkin-dependencies release vote, which can be checked with the following command:

NO_CLEANUP=1 ./check.sh --module zipkin-dependencies --version 2.2.0 --gpg-key DA805D02 --git-hash 7d40f7be565116da3117f8ac24db3023a2183bd2 --github-reponame-template '{incubator_dash}{module}' --zipname-template 'apache-{module}{dash_incubating}-{version}-source-release'

This results in the "Executing build-and-test for maven" step failing (after spending quite a while on that step; I'm not sure why it takes so long).

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.1:testCompile (default-testCompile) on project zipkin-dependencies-cassandra: Fatal error compiling: Error while executing the external compiler. Error while executing process. Cannot run program "javac" (in directory "/tmp/tmpcav5c0pz/unzipped/zipkin-dependencies-2.2.0/cassandra"): error=2, No such file or directory

The JAVA_HOME that is set points to a JRE installation, but a JDK is needed to compile the projects (a JRE ships `java` but not `javac`).
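The distinction can be checked mechanically: a JDK has `javac` under its `bin/` directory, a JRE does not. A minimal sketch (the `is_jdk` helper is hypothetical, not part of check.sh):

```shell
# Hedged sketch: decide whether a JAVA_HOME-style path is a JDK (has javac)
# or only a JRE. The helper name is an illustration, not existing tooling.
is_jdk() {
  [ -x "${1:?usage: is_jdk JAVA_HOME}/bin/javac" ]
}

# Demo using temp directories standing in for real installations
jre=$(mktemp -d); mkdir -p "$jre/bin"                                  # JRE: no javac
jdk=$(mktemp -d); mkdir -p "$jdk/bin"
touch "$jdk/bin/javac"; chmod +x "$jdk/bin/javac"                      # JDK: javac present

is_jdk "$jre" && echo "jre: is a jdk" || echo "jre: not a jdk"
is_jdk "$jdk" && echo "jdk: is a jdk" || echo "jdk: not a jdk"
```

A check like this could fail fast with a clear message instead of letting Maven die mid-build with "Cannot run program \"javac\"".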

codefromthecrypt commented 5 years ago

as a note: most projects can use any recent JDK. zipkin-dependencies actually can only be built on JDK 1.8. There's a maven-enforcer-plugin configuration that encodes this version range.

It might be possible to first run a Maven command to see if it fails due to the enforcer, then fall back to JDK 1.8, to avoid needing another property in the args. For example, use JDK 11 by default and fall back when a probe command like "mvn enforcer:enforce" fails. Except that command doesn't actually seem to fail :P
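The fallback idea above could be sketched roughly like this. Everything here is hypothetical: `run_build` stands in for the real Maven invocation, and `JDK11_HOME` / `JDK8_HOME` are assumed install paths, not variables check.sh defines:

```shell
# Hedged sketch of the "try newer JDK, fall back to 1.8" idea.
# run_build is a hypothetical stand-in for e.g. "mvn clean verify";
# JDK11_HOME and JDK8_HOME are assumed install locations.
build_with_fallback() {
  if JAVA_HOME="$JDK11_HOME" run_build; then
    echo "built with JDK 11"
  else
    echo "JDK 11 failed (e.g. enforcer version range); retrying with JDK 1.8"
    JAVA_HOME="$JDK8_HOME" run_build
  fi
}
```

The catch, as noted above, is finding a cheap probe that reliably fails under the wrong JDK; `mvn enforcer:enforce` apparently does not.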

shakuzen commented 5 years ago

I looked into this a little. Since the JDK version depends on the project, we either need a new arg to control which one to install (or which Docker tag to use, building different images for different versions), or we need to install a version dynamically as a step during execution. Also, some projects don't need a JDK at all. Especially if we want to support projects other than just Zipkin, I'm not sure how smart we can be about dynamically determining a JDK version to use.
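The "new arg" option might look something like the mapping below. This is purely a sketch: check.sh has no such argument today, and the `maven:3-jdk-*` tag names are assumptions about available images:

```shell
# Hypothetical sketch: map a --jdk-version value to a Docker image tag for the
# build step. Neither the argument nor these exact tags exist in check.sh.
jdk_image() {
  case "$1" in
    8)    echo "maven:3-jdk-8" ;;
    11)   echo "maven:3-jdk-11" ;;
    none) echo "" ;;                 # some projects need no JDK at all
    *)    echo "unsupported JDK version: $1" >&2; return 1 ;;
  esac
}
```

The downside is exactly the one raised below: someone has to build and maintain an image per supported version.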

I wonder if there is any prior art for dealing with this kind of thing. It seems like it can get pretty messy quickly. Downloading a JDK every run seems less than ideal, but it would be most flexible on the versions we can handle. Any other thoughts/ideas, @abesto?

abesto commented 5 years ago

Whoops, turns out I forgot to "Watch" this repo so missed the last two issues. That's fixed now, sorry.

The general version of this seems to be "build environments vary a lot between projects". My best hammer for that problem is "package up the environment in a Docker image". Building directly on a developer's machine was always going to be fragile; I'm not sure how far we should go extending that workflow. It might be worth tweaking things a bit more, or maybe the right move is to "just" go for the configurable Docker build environment for the build-and-test step right away.

Agree with that framing? If so, thoughts on approaching this?

shakuzen commented 5 years ago

> My best hammer for that problem is "package up the environment in a Docker image".

I think the difficulty with this is that we would have to prepare a Docker image (or use an existing one, I suppose) for each combination of build dependencies. Or maybe make an uber-image that has all foreseeable build dependencies and can be switched between via environment variables - but that is of course not lightweight at all.

If this tool is used via the script, it is easier to switch between different Docker images, but right now the tool also supports being run as a Docker image directly, and I think some people use it that way.

> or maybe the right move is to "just" go for the configurable Docker build environment for the build-and-test step right away.

How would this work? Would it launch a Docker image from within the Docker image?

abesto commented 5 years ago

> How would this work? Would it launch a Docker image from within the Docker image?

Correct. Last time I checked (~a year ago) the way to do that was to mount the host's Docker socket into the container. Super unsafe, you never want to do it in prod, but we're not in prod :)
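For concreteness, the socket-mount approach looks roughly like the sketch below: mounting `/var/run/docker.sock` lets the inner `docker run` talk to the host daemon, so build containers started this way are siblings of the verification container rather than children. The image names here are illustrative assumptions:

```shell
# Hedged sketch of docker-in-docker via socket mounting. The image names
# (verification image, maven:3-jdk-8) are examples, not the repo's actual ones.
run_nested() {
  docker run --rm \
    -v /var/run/docker.sock:/var/run/docker.sock \
    "$1" \
    docker run --rm "$2" mvn -B verify
}

# Example (requires a running Docker daemon):
# run_nested apache-release-verification maven:3-jdk-8
```

Worth noting the security caveat above applies literally: anything holding the socket effectively has root on the host.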

shakuzen commented 5 years ago

I think that makes sense then: keep the rest of the checks in one image with minimal dependencies, and put the build/test environment in a separate image that can be switched depending on a project's needs and launched from the main image.