lpenz / ghaction-cmake

cmake swiss army knife github docker action
MIT License

Support sonarcloud.io #6

Open jayvdb opened 3 years ago

jayvdb commented 3 years ago

https://SonarCloud.io requires that coverage data be processed by a Sonar-specific tool.

SonarCloud is the only code coverage reporting SaaS that is based on self-hostable OSS software, while the backends of codecov / coveralls are closed source.

An example of sonarcloud CI is https://github.com/swift-nav/gnss-converters/blob/master/.github/workflows/ci.yaml#L60 / https://github.com/swift-nav/gnss-converters/blob/master/ci-build.sh#L37

The build wrapper isn't too large or complicated:

$ unzip build-wrapper-linux-x86.zip
Archive:  build-wrapper-linux-x86.zip
   creating: build-wrapper-linux-x86/
  inflating: build-wrapper-linux-x86/libinterceptor-haswell.so
  inflating: build-wrapper-linux-x86/build-wrapper-linux-x86-64
  inflating: build-wrapper-linux-x86/libinterceptor-x86_64.so
  inflating: build-wrapper-linux-x86/libinterceptor-i686.so
$ du -s build-wrapper-linux-x86/
4072    build-wrapper-linux-x86/
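
For context, it is used by prefixing the actual build command, as in the snippets further down:

build-wrapper-linux-x86-64 --out-dir sonar-out make VERBOSE=1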

@swift-nav is using this action at https://github.com/swift-nav/libsettings/blob/master/.github/workflows/cmake.yml , but we are deploying sonarcloud into all of our repos; cf. https://sonarcloud.io/organizations/swift-nav/projects

lpenz commented 3 years ago

ghaction-cmake doesn't actually include any codecov/coveralls handling. It just runs cmake with the coverage flag and uses lcov to generate an output file. Our examples show how to use that output with coveralls/codecov by using other actions.

I'd expect the same thing to be available for sonarcloud, from what you're saying. Are you aware of any action that can upload lcov data to their server? We can put that in the examples too.
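
For comparison, the coverage pattern looks roughly like this (a sketch, not copied verbatim from the README - the coverallsapp/github-action uploader and the lcov.info path are assumptions here):

- uses: docker://lpenz/ghaction-cmake:latest
  with:
    preset: coverage
- uses: coverallsapp/github-action@v1
  with:
    github-token: ${{ secrets.GITHUB_TOKEN }}
    path-to-lcov: lcov.info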

jayvdb commented 3 years ago

OK, I'll try using the sonarcloud uploader with our libsettings after this action has generated the coverage data.

jayvdb commented 2 years ago

I have managed to get sonarcloud working. It needs the build wrapper and a post-build scanner to reside on the same image, so they had to be added to this image, which I have done - it is still a mess at https://github.com/jayvdb/ghaction-cmake/blob/0.16-poco-gst-interpipe-sonarwrapper-gcovr/Dockerfile#L60-L89 .

The reason why they need to be on the same image is not yet clear - https://community.sonarsource.com/t/missing-a-temporary-subprocess-executable-in-a-c-sonar-scan/51924 asks the question, but there is no reply yet.

Here is another thread which I found to be useful reading: https://community.sonarsource.com/t/sonarcloud-code-coverage-of-c-in-github-actions/46278 . I ended up going with gcovr because, although the gcov ingestion was clearly picking up the gcov data, the sonarcloud UI didn't show it until I ran it through gcovr.

In addition, I needed to do the following (with env.REPOSITORY_NAME being filled in an earlier step or manually):

          build_command: build-wrapper-linux-x86-64 --out-dir sonar-out make VERBOSE=1
          preset: coverage
          post_command: >
            gcovr --sonarqube -o sonarqube.cov && sonar-scanner -Dsonar.host.url=https://sonarcloud.io \
              -Dsonar.organization=${{ github.repository_owner }} \
              -Dsonar.projectKey=${{ github.repository_owner }}_${{ env.REPOSITORY_NAME }} \
              -Dsonar.cfamily.build-wrapper-output=sonar-out \
              -Dsonar.typescript.file.suffixes=- \
              -Dsonar.cfamily.gcov.reportsPath=. \
              -Dsonar.coverageReportPaths=sonarqube.cov
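
For reference, env.REPOSITORY_NAME can be filled in an earlier step like this (the same step appears in the fuller example further down):

      - name: Set env.REPOSITORY_NAME
        shell: bash
        run: echo "REPOSITORY_NAME=$(echo '${{ github.repository }}' | awk -F '/' '{print $2}')" >> $GITHUB_ENV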

As I expect you wouldn't want to incorporate those into your image, my intention is to create a separate action which is a permanent fork with these extra items in the base image, and a "sonarqube" preset which does all the dirty work transparently.

It looks like the poco fixes will be in bullseye on the 18th (per https://github.com/pocoproject/poco/issues/3244), so I can use your image as a base layer after that.

I'll wait until then before creating my own action. If you'd like to incorporate sonarcloud into your project, somehow, just let me know.

lpenz commented 2 years ago

It looks like the poco fixes will be in bullseye on the 18th (per pocoproject/poco#3244), so I can use your image as a base layer after that.

That's good news! Let me know when that's in so that I can rebuild the image

As I expect you wouldn't want to incorporate those into your image, my intention is to create a separate action which is a permanent fork with these extra items in the base image, and a "sonarqube" preset which does all the dirty work transparently. I'll wait until then before creating my own action. If you'd like to incorporate sonarcloud into your project, somehow, just let me know.

I'd like to incorporate it, yes. I still have to look into it, and into sonarcloud itself. I do think that making this action just run the build with whatever changes are needed, and having a second action that just sends the results to sonarcloud, could be better than having everything bundled up - and it's more similar to what we have for code coverage.

jayvdb commented 2 years ago

I did some playing with splitting it up into two actions, and succeeded at https://github.com/jayvdb/libsettings/blob/sonarcloud/.github/workflows/cmake.yml , results at https://sonarcloud.io/summary/new_code?branch=sonarcloud&id=jayvdb_test

cmake.yml here for easy reading (the dependencies_debian & cmakeflags in the first action are specific to that repo's build):

jobs:
  coverage:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 0
          submodules: recursive
      - uses: docker://jayvdb/ghaction-cmake:0.16-sonarcloud-buildwrapper
        with:
          dependencies_debian: python3-pip python3-setuptools python3-wheel libpython3-dev
          cmakeflags: -DPYTHON=python3 -DCMAKE_POSITION_INDEPENDENT_CODE=1
          build_command: build-wrapper-linux-x86-64 --out-dir sonar-out make VERBOSE=1
          preset: coverage
      - name: Check sonar-out/
        shell: bash
        run: ls sonar-out/
      - uses: docker://jayvdb/ghaction-cmake:0.16-sonarcloud-scanner-reduced2
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
        with:
          build_command: gcovr --sonarqube -o sonarqube.cov
          test_command: 'true'
          post_command: >
            sonar-scanner -Dsonar.host.url=https://sonarcloud.io \
              -Dsonar.organization=${{ github.repository_owner }} \
              -Dsonar.projectKey=jayvdb_test \
              -Dsonar.cfamily.build-wrapper-output=sonar-out \
              -Dsonar.typescript.file.suffixes=- \
              -Dsonar.cfamily.gcov.reportsPath=. \
              -Dsonar.coverageReportPaths=sonarqube.cov

The first action image only needs the build-wrapper added. https://github.com/jayvdb/ghaction-cmake/commit/c4180371dc2a821a42f55de41d2f989435f16d3a has a few extras that were not needed. The remaining challenge is that this only works on x86 - I need to work out which other architectures are supported by sonarcloud.

The second action needs gcovr and the sonar-scanner, and doesn't need a lot of extra stuff. I needed to hack cmake_cmd to be "true .." to remove the dependency on cmake.

I tried moving the gcovr step outside the sonar-scanner action, thus removing the image dependency on gcc, but that produced a lot of warnings because expected header files were missing/different, and the sonar-scanner failed while producing warnings like:

The compiler probe 'stdout' is expected to contain at least one '#define' directive:

I tried re-adding only the prominent deps of gcc (cpp, binutils, libc6-dev-i386, libgcc-10-dev) to the sonar-scanner image, but still had the same problem until I re-added gcc itself; then the scanner didn't exit with a failure code, but the coverage % was 0. So either gcc or one of its dependencies is needed. If the build uses clang, the scanner image likely needs clang as well.

When I moved the gcovr command invocation back into the docker action, coverage in the sonarcloud UI returned to the 38-ish% it should be.

jayvdb commented 2 years ago

That's good news! Let me know when that's in so that I can rebuild the image

libpoco-dev is now fixed in bullseye.

I would like to get started on polishing the sonarcloud integration now. I'll figure out the multi-arch aspect of the build-wrapper next, and will create a PR soon. Let me know if any of the above analysis has given you ideas on how you want it integrated.

lpenz commented 2 years ago

Awesome! Would you mind checking if lpenz/ghaction-cmake:latest is ok? That's what's currently in the main branch, including the new pre_command argument (issue #11). If it's fine, I'll make the release.

jayvdb commented 2 years ago

I can do a round of testing of the new bits within 24hrs - maybe sooner, but can't promise. IMO, just ship it. ;-)

jayvdb commented 2 years ago

I've done some decent testing of :latest, especially pre_command - specifically, using it to install packages from custom repositories and to interact with sonarcloud.

Back on topic: if the build wrapper and scanner are not updated at the same time, the scanner emits this warning:

  WARN: 
  File
    /github/workspace/sonar-out/build-wrapper-dump.json
  was generated using 6.28 build-wrapper version,
  which does not match analyzer 6.29.0.41127 version.

Presumably, this will cause bigger problems if one is a major version behind the other.

Here are some steps to integrate sonarcloud and add the nvidia deb repo:

    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 0
          submodules: recursive
      - name: Set env.REPOSITORY_NAME
        shell: bash
        run: echo "REPOSITORY_NAME=$(echo '${{ github.repository }}' | awk -F '/' '{print $2}')" >> $GITHUB_ENV
      - uses: docker://lpenz/ghaction-cmake:latest
        name: Build
        with:
          cmakeflags: -DCMAKE_POSITION_INDEPENDENT_CODE=1
          pre_command: |
            apt-get install -y gnupg2 software-properties-common unzip wget
            apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/7fa2af80.pub
            add-apt-repository "deb https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/ /"
            add-apt-repository contrib
            mkdir /root/tmp
            cd /root/tmp
            wget --continue https://sonarcloud.io/static/cpp/build-wrapper-linux-x86.zip
            unzip build-wrapper-linux-x86.zip
            cd build-wrapper-linux-x86
            mv build-wrapper-linux-x86-64 /usr/bin
            mv libinterceptor-haswell.so libinterceptor-i686.so libinterceptor-x86_64.so /usr/bin/
            cd ..
            rmdir build-wrapper-linux-x86
          dependencies_debian: librdkafka-dev cuda-nvml-dev-11-4 libglib2.0-dev libgstrtspserver-1.0-dev rapidjson-dev libpoco-dev libpocofoundation70
          build_command: build-wrapper-linux-x86-64 --out-dir sonar-out make VERBOSE=1
      # The following step is a horrible hack for adding sonarcloud
      # support to this GitHub action.  A new 'preset' needs to be created
      # in the github action to support this.
      # See https://github.com/lpenz/ghaction-cmake/issues/6
      - name: Run sonarcloud sonar-scanner
        if: ${{ matrix.compiler == 'clang' }}
        uses: docker://lpenz/ghaction-cmake:latest
        with:
          cmakeflags: -DCMAKE_POSITION_INDEPENDENT_CODE=1
          pre_command: |
            apt-get install -y gcovr unzip
            echo "#!/usr/bin/env bash" > /usr/bin/cmake
            chmod a+x /usr/bin/cmake
            mkdir $HOME/.sonar/
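            # note: SONAR_SCANNER_VERSION is assumed to be set in the environment; it is not defined in this snippet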
            curl -sSLo $HOME/.sonar/sonar-scanner.zip https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-${SONAR_SCANNER_VERSION}-linux.zip && \
            unzip -o $HOME/.sonar/sonar-scanner.zip -d $HOME/.sonar/
          build_command: gcovr --sonarqube -o sonarqube.cov
          test_command: 'true'
          post_command: >
            ${HOME}/.sonar/sonar-scanner-${SONAR_SCANNER_VERSION}-linux/bin/sonar-scanner -Dsonar.host.url=https://sonarcloud.io \
              -Dsonar.organization=${{ github.repository_owner }} \
              -Dsonar.projectKey=${{ github.repository_owner }}_${{ env.REPOSITORY_NAME }} \
              -Dsonar.cfamily.build-wrapper-output=sonar-out \
              -Dsonar.sources=src \
              -Dsonar.typescript.file.suffixes=- \
              -Dsonar.cfamily.gcov.reportsPath=. \
              -Dsonar.coverageReportPaths=sonarqube.cov \
              -Dsonar.cfamily.cache.enabled=false \
              -Dsonar.cfamily.threads=1
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}  # Needed to get PR information, if any
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}

My next step is to add this voodoo into https://github.com/lpenz/ghaction-cmake/blob/main/entrypoint and do a PR.

lpenz commented 2 years ago

I'll ship it, thank you for testing

Back on topic, if the build wrapper and scanner are not updated at the same time, the scanner emits this warning

We can just make them one thing, it's not really a problem.

I'm actually surprised that such a big pre_command worked :) All commands are executed with subprocess.Popen(..., shell=True), and I was not sure it would work well with newlines. We could, as an alternative for the future, write the command to a temporary file (with set -e -x on top) and tell bash to run it. That would allow us to see each command along with its own output - I'm guessing today we see all commands on top and then the whole output of all of them, together.
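
A minimal sketch of that idea (hypothetical - $USER_COMMAND stands for whatever command text the entrypoint receives):

# write the multi-line command to a temp script with set -e -x on top,
# so bash echoes each command right before its own output
tmp=$(mktemp)
printf 'set -e -x\n%s\n' "$USER_COMMAND" > "$tmp"
bash "$tmp"
rm -f "$tmp"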

jayvdb commented 2 years ago

We can just make them one thing, it's not really a problem.

Well, if they are fetched and installed on demand, adding sonarcloud doesn't increase the image size for non-sonarcloud users, and the two tools will always be roughly the same version - the only exception being fetches that happen exactly while new versions are being uploaded to the sonarcloud website, and even then only if they don't have a sane CD process that publishes both atomically.

Maybe when there are lots of sonarcloud users of this GHA we could revisit that and have these tools on the image; then we'll have more people interested in ironing out any problems related to that. (e.g. the scanner bundles its own jre, which I would want to de-vendor if building it onto our own image, but that adds maintenance burden)

I'm guessing today we see all commands on top and then the whole output of all of them, together.

Yeah, that is what is happening, and a set -ex bash script would be better - but later.