lineageos4microg / docker-lineage-cicd

Docker microservice for LineageOS Continuous Integration and Continuous Deployment
https://hub.docker.com/r/lineageos4microg/docker-lineage-cicd
GNU General Public License v3.0

21.0 lemonadep failures. #645

Closed ShadwDrgn closed 3 weeks ago

ShadwDrgn commented 3 weeks ago

lineage-21.0-20240623-user-lemonadep.log I've tried compiling for lemonadep a few times in the last week and have gotten the same error every time, I think. Not sure what's going on. I tried deleting the ccache and lineage dirs so it repo syncs from scratch; no such luck. Command to build:

docker run --rm --name android -e "BRANCH_NAME=lineage-21.0" -e "DEVICE_LIST=lemonadep" -e "SIGN_BUILDS=true" -e "RELEASE_TYPE=user" -e "WITH_GMS=true" -e "ZIP_SUBDIR=false" -e "OTA_URL=https://lineage.1ez.us/api" -v "/mnt/ds16_backups/android/lineage:/srv/src" -v "/mnt/ds16_backups/android/keys:/srv/keys" -v "/mnt/ds16_backups/android/zips:/srv/zips" -v "/mnt/ds16_backups/android/logs:/srv/logs" -v "/mnt/ds16_backups/android/ccache:/srv/ccache" -v "/mnt/ds16_backups/android/local_manifests:/srv/local_manifests" lineageos4microg/docker-lineage-cicd | tee /home/user/android_`date +'%Y%m%d'`.log

The local manifests folder only contains one file, with this content:

<?xml version="1.0" encoding="UTF-8"?>
<manifest>
    <project path="vendor/partner_gms" name="lineageos4microg/android_vendor_partner_gms" remote="github" revision="master" />
</manifest>
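
In case it helps, here's roughly how I check that the manifest is actually being picked up after a sync (I'm assuming the container copies anything in /srv/local_manifests into the source tree's .repo/local_manifests/ before running repo sync):

    # Assumption: the container copies /srv/local_manifests into the
    # source tree's .repo/local_manifests/ before repo sync runs.
    ls /mnt/ds16_backups/android/lineage/.repo/local_manifests/
    # After a successful sync the extra project should exist in the tree:
    ls /mnt/ds16_backups/android/lineage/vendor/partner_gms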

This is the first time I've tried a build in some time, and I've never tried 21.0 before this. I don't know if this is something I'm doing wrong, if the AOSP repos are just busted for this hardware right now, or what.

Any assistance would be greatly appreciated.

petefoth commented 3 weeks ago

A couple of things:

  1. We make userdebug builds, not user. Try changing the -e "RELEASE_TYPE=user" to -e "RELEASE_TYPE=userdebug".
  2. The error messages in the log indicate a problem building the tv HAL from https://github.com/LineageOS/android_hardware_interfaces/tree/lineage-21.0/tv. It's possible this is caused by having a 'dirty' source tree. Worth trying a `rm -rf *` in `/mnt/ds16_backups/android/lineage` (mounted as `/srv/src`); see the sketch after the quoted error below.
    
    ERROR: hardware/interfaces/tv/tuner/aidl/aidl_api/android.hardware.tv.tuner/current/android/hardware/tv/tuner/RecordSettings.aidl:41.52-99: Bad internal state: !resolved_: Should be resolved first: android.hardware.tv.tuner.DataFormat.UNDEFINED
    Aborted (core dumped)
    ###############################################################################
    # ERROR: AIDL API change detected on frozen interface                         #
    ###############################################################################
    Above AIDL file(s) has changed. The AIDL API is marked `frozen: true` so it
    cannot be modified. Change this to `frozen: false`, then run `m android.hardware.tv.tuner-update-api`
    to reflect the changes to the current version so that it is reviewed by
    android-aidl-api-council@google.com.
    And then you need to change dependency on android.hardware.tv.tuner-V(n)-* to android.hardware.tv.tuner-V(n+1)-* to use
    new APIs.
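
Concretely, something like this is what I mean (just a sketch, using the host paths from your docker run command; adjust to taste):

    # Sketch only, using the host paths from the docker run command above.
    # 1. Wipe the (possibly dirty) source tree mounted at /srv/src
    #    (rm -rf * leaves the hidden .repo directory in place):
    rm -rf /mnt/ds16_backups/android/lineage/*
    # 2. Re-run the same docker run command, but replace
    #      -e "RELEASE_TYPE=user"
    #    with
    #      -e "RELEASE_TYPE=userdebug"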
petefoth commented 3 weeks ago

FWIW, there were no problems at all making the latest 21.0 build for this device.

ShadwDrgn commented 3 weeks ago

I will try userdebug, but I have already tried deleting all of lineage/ and ccache/ multiple times and still get this error. Will reply here with the results of userdebug, and will close the issue if it succeeds. Thx.

ShadwDrgn commented 3 weeks ago

repo-20240623.log This is the repo sync log, in case that somehow helps determine the issue.

petefoth commented 3 weeks ago

Another thought: have you done a `docker pull lineageos4microg/docker-lineage-cicd` recently? The problem could be caused by having an out-of-date docker image on your build machine.
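
For example (just a sketch; on an up-to-date image the created date should be recent):

    docker pull lineageos4microg/docker-lineage-cicd
    # Check when the local image was built:
    docker image inspect --format '{{.Created}}' lineageos4microg/docker-lineage-cicd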

ShadwDrgn commented 3 weeks ago

I have not, though I recently did a `docker system prune -a`. It's possible the container was somehow still running when I did it, though. I will pull again if this build fails. Edit: there was indeed an update; hopefully this resolves it!

ShadwDrgn commented 3 weeks ago

This failed again after pulling the latest image, running rm -rf lineage/ ccache/, and setting to "userdebug". Is it possible for this to somehow be due to the keys/ folder being populated from lineage 20 builds? At this point I've freed up 450G on my primary NVMe so I can rule out the NFS share being janky somehow, but I don't know what to do anymore. It would feel awful to know my entire, extremely expensive NAS is just broken or something, but we'll see, I suppose. If this fails now (I've even deleted the keys/ files in case the old lineage 20 keys are somehow breaking something), I'm defeated.

I guess I'll close this in defeat. :~( Thanks for trying to help, and thanks for providing all of this and the support to the community. Thanks to you I can at least just go get your builds, since I'm somehow incapable of building it anymore.

petefoth commented 3 weeks ago

This failed again after pulling the latest image, running rm -rf lineage/ ccache/, and setting to "userdebug". Is it possible for this to somehow be due to the keys/ folder being populated from lineage 20 builds?

Sorry it's still not working. Please will you post the log from the most recent failure, and I'll have another look when I get some time?

ShadwDrgn commented 3 weeks ago

I'm starting to think this is a repo sync problem, because there are a LOT of errors at the top of that repo sync log, and I actually NEARLY had success compiling it after moving it off of NFS (which concerns me as well), before a power failure interrupted the build (it was literally WAY past where it was normally failing). So I'm still in the middle of cleaning up the aftermath of that, plugging gaps in my battery backup coverage, and doing some storage upgrades. Once that's all resolved and I'm back up and running, I'll do a new build and see what happens.

For now you can safely assume this was all because of something to do with either my storage or my network, and I won't take up any more of your time. Thanks so much for all your help! If I discover something that may be related to the container, I'll reopen this or make a new issue.