game-ci / docker

Series of CI-specialised docker images for Unity.
https://hub.docker.com/u/unityci
MIT License

Allow to install multiple modules in the editor image #116

Closed: shr-project closed this pull request 3 years ago

shr-project commented 3 years ago

Changes

Checklist

shr-project commented 3 years ago

Please also note that we generally discourage our users from installing all modules in one single image, as it goes against parallelisation in many cases. Pulling the (compressed) image directly is often just as fast as caching; however, when trying to cache an image that holds all modules, it will actually slow down each build (even when cached, depending on the mechanism).

Understood. I was going to use this for building SVL Simulator (https://github.com/lgsvl/simulator/blob/master/Jenkins/Jenkinsfile), where the initial checkout stage takes quite long, so instead of triggering 3 jobs for 3 different platforms in parallel it builds all 3 platforms in 1 job. The Unity version isn't changed very often, so the bigger editor docker image can stay cached on the builders for a relatively long time.

Currently it uses UnitySetup instead of Unity Hub for installation (with Linux, Windows and macOS support all included): https://github.com/lgsvl/simulator/blob/master/Jenkins/Dockerfile#L26, but I haven't found a way to make UnitySetup install childModules as well (which seems to be mostly useful for Android).
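
For context, the editor images install modules through Unity Hub's headless CLI (via the unity-hub wrapper the base image provides), which is what makes childModules reachable in the first place. Below is a minimal sketch of both layers, assuming this PR's space-separated module build argument; the <version>/<changeset> values are placeholders and the ./editor build context is only illustrative:

# Inside the image: Unity Hub installs each requested module plus its child modules
# (for android that pulls in the SDK, NDK and OpenJDK).
$ unity-hub install-modules --version <version> --module android --module webgl --childModules

# Hypothetical build of an editor image that carries more than one module:
$ docker build ./editor -t editor:multi --build-arg version=<version> --build-arg changeset=<changeset> --build-arg module="android webgl"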

GabLeRoux commented 3 years ago

I see that all Android checks are failing above; this will most likely need to be fixed before it can be merged 👀 I haven't looked into why it's currently failing, though.

All in all, I think this contribution will be helpful in cases where a single image must be used for multiple platforms. Thanks so much for that contribution! :D

shr-project commented 3 years ago

I see that all Android checks are failing above; this will most likely need to be fixed before it can be merged. I haven't looked into why it's currently failing, though.

Has there been any successful check since https://github.com/game-ci/docker/pull/97 was merged? The "Validate Android Utils" step fails for me even without any changes from this PR, because it depends on JAVA_HOME from ~/.bashrc while the step also passes -e HOME, which breaks it unless your HOME is /root (it's '/home/runner' in your CI pipeline).

2021-05-29T12:08:56.0092844Z ##[command]/usr/bin/docker run --name a33c1a032c225baa74958bdb83e761f1e8692_15bc0a --label 8a33c1 --workdir /github/workspace --rm -e UNITY_LICENSE -e CHANGESET -e MODULE -e INPUT_IMAGE -e INPUT_RUN -e INPUT_OPTIONS -e INPUT_SHELL -e INPUT_REGISTRY -e INPUT_USERNAME -e INPUT_PASSWORD -e INPUT_DOCKER_NETWORK -e HOME -e GITHUB_JOB -e GITHUB_REF -e GITHUB_SHA -e GITHUB_REPOSITORY -e GITHUB_REPOSITORY_OWNER -e GITHUB_RUN_ID -e GITHUB_RUN_NUMBER -e GITHUB_RETENTION_DAYS -e GITHUB_ACTOR -e GITHUB_WORKFLOW -e GITHUB_HEAD_REF -e GITHUB_BASE_REF -e GITHUB_EVENT_NAME -e GITHUB_SERVER_URL -e GITHUB_API_URL -e GITHUB_GRAPHQL_URL -e GITHUB_WORKSPACE -e GITHUB_ACTION -e GITHUB_EVENT_PATH -e GITHUB_ACTION_REPOSITORY -e GITHUB_ACTION_REF -e GITHUB_PATH -e GITHUB_ENV -e RUNNER_OS -e RUNNER_TOOL_CACHE -e RUNNER_TEMP -e RUNNER_WORKSPACE -e ACTIONS_RUNTIME_URL -e ACTIONS_RUNTIME_TOKEN -e ACTIONS_CACHE_URL -e GITHUB_ACTIONS=true -e CI=true -v "/var/run/docker.sock":"/var/run/docker.sock" -v "/home/runner/work/_temp/_github_home":"/github/home" -v "/home/runner/work/_temp/_github_workflow":"/github/workflow" -v "/home/runner/work/_temp/_runner_file_commands":"/github/file_commands" -v "/home/runner/work/docker/docker":"/github/workspace" 8a33c1:a032c225baa74958bdb83e761f1e8692
2021-05-29T12:08:57.2402601Z mesg: ttyname failed: Inappropriate ioctl for device
2021-05-29T12:08:57.2470282Z bash: java: command not found
2021-05-29T12:08:57.2601392Z Post job cleanup.

It fails the same way with a docker image built from the current main branch:

$ docker run -e HOME --rm -it unityci/jansa:editor-android-main /bin/bash -l -c 'java -version'
/bin/bash: java: command not found

$ docker run --rm -it unityci/jansa:editor-android-main /bin/bash -l -c 'java -version'
openjdk version "1.8.0-adoptopenjdk"
OpenJDK Runtime Environment (build 1.8.0-adoptopenjdk-jenkins_2018_05_19_01_00-b00)
OpenJDK 64-Bit Server VM (build 25.71-b00, mixed mode)

$ HOME=/root docker run -e HOME --rm -it unityci/jansa:editor-android-main /bin/bash -l -c 'java -version'
WARNING: Error loading config file: /root/.docker/config.json: open /root/.docker/config.json: permission denied
openjdk version "1.8.0-adoptopenjdk"
OpenJDK Runtime Environment (build 1.8.0-adoptopenjdk-jenkins_2018_05_19_01_00-b00)
OpenJDK 64-Bit Server VM (build 25.71-b00, mixed mode)

$ docker run -e HOME=/root --rm -it unityci/jansa:editor-android-main /bin/bash -c 'java -version'
openjdk version "1.8.0-adoptopenjdk"
OpenJDK Runtime Environment (build 1.8.0-adoptopenjdk-jenkins_2018_05_19_01_00-b00)
OpenJDK 64-Bit Server VM (build 25.71-b00, mixed mode)

$ docker run -e HOME=/foo --rm -it unityci/jansa:editor-android-main /bin/bash -l -c 'java -version'
/bin/bash: java: command not found
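
Since bash login shells read /etc/profile (and with it /etc/profile.d/*.sh) no matter what HOME is set to, one way to make java visible regardless of -e HOME would be to export the variables system-wide instead of only in /root/.bashrc. This is just a sketch, not what the image currently does, and the OpenJDK path is an assumption about where the Android module's bundled JDK ends up:

# Hypothetical /etc/profile.d/android-java.sh baked into the android editor image,
# so 'bash -l -c "java -version"' works even when the action overrides HOME.
export JAVA_HOME=/opt/unity/Editor/Data/PlaybackEngines/AndroidPlayer/OpenJDK
export PATH="$JAVA_HOME/bin:$PATH"

Declaring the same variables with ENV in the Dockerfile would cover non-login shells as well, since Docker injects them independently of HOME.
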
webbertakken commented 3 years ago

Sure, the latest workflows on main were not failing (except for 1 Windows run).

(screenshot of recent workflow runs on main)

The workflows are public and can be found here.

I'm running a test to verify the integrity of main right now.

webbertakken commented 3 years ago

The integrity of main looks to be fine. Only 1 iOS build failed because of an activation failure.

shr-project commented 3 years ago

I have no idea why so many CI jobs failed this time. I see logs for the successful tests, but only "This check failed" for the failed ones, like in https://github.com/game-ci/docker/pull/116/checks?check_run_id=3185307822

webbertakken commented 3 years ago

I have no idea why so many CI jobs failed this time. I see logs for the successful tests, but only "This check failed" for the failed ones, like in https://github.com/game-ci/docker/pull/116/checks?check_run_id=3185307822

Likely a GitHub error. Let's rerun them.

webbertakken commented 3 years ago

Reported the failure here: https://github.community/t/many-runners-failing-on-different-stages-seemingly-at-random/193479