INTI-CMNB / KiBot

KiCad automation utility
GNU Affero General Public License v3.0

[HOW] Best practice setup CI/CD for Kibot #575

Closed Nudelsalad closed 6 months ago

Nudelsalad commented 8 months ago

I am trying to figure out the best approach to have a pipeline running for all my KiCad projects stored in GitLab.

I currently see three approaches for this:

Most general one:

Current Problem:

Advantages:

Disadvantages:

For me it is very important to always have fine-grained control over what gets generated; if nothing is specified or no kibot.yaml is found inside the project, a generous one should be used. I therefore came up with the following 2 approaches (the first might be impossible?):

First idea

  1. Defining the output jobs for every manufacturer with the same output **names**
  2. Putting a kibot.yaml file into every repository, which gets executed by the pipeline if it exists
  3. Importing the default config and then the needed config (e.g. JLCPCB) afterwards, so that the JLCPCB outputs overwrite the default outputs with the same names (see the sketch after this list)
  4. Executing KiBot in CI to run the output jobs
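
A minimal sketch of what this could look like, with placeholder file names; whether the later import really replaces same-named outputs is exactly the open question below:

```yaml
# main.kibot.yaml -- sketch only; file names are placeholders
kibot:
  version: 1

import:
  # generic fabrication outputs first ...
  - file: default.kibot.yaml
  # ... then the manufacturer-specific config, in the hope that it replaces
  # the outputs of the same name (this is exactly the question below)
  - file: jlcpcb.kibot.yaml
```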

Advantages:

Disadvantages:

So my question now:

Does KiBot only concatenate the YAML files, or does it also substitute already defined outputs with the imported ones when the output name is the same?

Second idea (workaround to this problem)

An option would be to define only certain jobs in one file and import the rest, with the fabrication outputs all having the same names.

A generous file in every repo (or, if nothing is found, always fall back to a generous one with the imported default.kibot.yaml fabrication outputs):

This needs to be specified inside every repository that wants fine-grained control:

main.kibot.yaml
- BOM, IBOM, 3D model, kiri
- import of the fabrication config # needs to be set by hand (no YAML substitution or preprocessing possible)

Fabrication output jobs whose output names are the same, so the CI can seamlessly call them all by the same name:

default.kibot.yaml
- gerber, drill, pick and place

jlcpcb.kibot.yaml
- gerber, drill, pick and place
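
As a rough illustration of that split (file names, output names and types are only placeholders):

```yaml
# default.kibot.yaml -- generic fabrication outputs (sketch; names are made up)
outputs:
  - name: fab_gerbers
    comment: "Gerbers, generic settings"
    type: gerber
  - name: fab_drill
    comment: "Drill files, generic settings"
    type: excellon

# jlcpcb.kibot.yaml would define outputs with exactly the same names
# (fab_gerbers, fab_drill, ...) but with JLCPCB-specific settings, so the
# CI can always call the same output names regardless of the manufacturer.
```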

Your internal templates all use different names for their outputs, which is why I ask this question. This could also serve as a kind of guideline for setting up a CI/CD workflow, for me and others.

It would also be interesting to know whether importing an anchor with the same name overwrites the settings.

I already have the first mentioned approach running, but I am somewhat unhappy that no fine-grained control is possible there. Looking forward to your expertise @set-soft, and to contributing this to the docs, if wished, to help future me and others.

set-soft commented 8 months ago

Hi @Nudelsalad !

Your description is somewhat abstract, so I'm not sure I get the full idea.

The first thing that comes to my mind: Why don't you just generate the fabrication files for all the manufacturers you could need, and then let the user choose the one that's needed for the current project?

Does KiBot only concatenate the YAML files, or does it also substitute already defined outputs with the imported ones when the output name is the same?

Outputs must have unique names. In order to "concatenate" outputs you must explicitly extend an output. If you import an output with an already defined name you'll get an error.
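
As a hedged illustration of that explicit extension mechanism (output names and directories here are made up):

```yaml
outputs:
  - name: gerbers_base
    comment: "Common Gerber settings"
    type: gerber
    dir: Gerbers

  # 'extends' copies the options of gerbers_base into this output,
  # which still needs its own unique name; only the differences are listed
  - name: gerbers_jlcpcb
    comment: "Gerbers with JLCPCB-specific settings"
    type: gerber
    extends: gerbers_base
    dir: Gerbers/JLCPCB
```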

It would also be interesting to know whether importing an anchor with the same name overwrites the settings.

YAML anchors are local to a file, they don't get propagated to other files.

From your description it looks like you need some extra logic. I don't fully understand what you think is simple and what isn't. You mention that defining an environment variable isn't simple ... but it looks like adding extra files is simple for you ...

Also note that you can use groups to simplify the KiBot invocation. So you can have groups like "sch_docs", "pcb_docs", "render", etc. And then also define things like "fab_manf1", "fab_manf2", etc. So then you call KiBot asking for the outputs you want, e.g. "pcb_docs render fab_manf2".
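
A small sketch of how such groups could be declared in the config (group and output names are placeholders; check the exact schema against the KiBot docs):

```yaml
groups:
  - name: pcb_docs
    outputs:
      - pcb_print_pdf
  - name: fab_manf2
    outputs:
      - fab_gerbers
      - fab_drill
      - fab_position
```

The CI then only needs to pass the group names as targets, e.g. `kibot -c main.kibot.yaml fab_manf2 pcb_docs`.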

oliv3r commented 8 months ago

@Nudelsalad You are in luck! I just released v0.2.0 of my gitlab pipeline component. It's so new, I haven't properly introduced it to @set-soft yet :D (I was about to actually :p, but have one tiny pdf scaling issue left to resolve).

If you want to know how to 'use' this component, I've also got you covered (besides the self-test and readme of the repo)!

I've been working on https://gitlab.com/riglol/lapod as my test repo-ish, to get this pipeline component working. I'm now in the process of adding the pipeline component to my other KiCad projects too ;) for literally years I was too lazy to do so.

The riglol/lapod repo hasn't been merged to master and tagged for release yet, so you don't see the 'release process' there yet (that should happen in a few days), but the component repo itself does actually tag and release itself, with a hello-world demo project.

P.S. Note that it currently relies on kibot:dev, though (after the pdf bug) we could hopefully release kibot v1.6.4 soon! (hint hint).

Feel free to reach out if things are not clear (and thus the documentation needs improvement). Yes, it is a bit tricky to first grasp and set it up. But once it runs, it's amazing!

The really dumb 'TL;DR' version is

# SPDX-License-Identifier: AGPL-3.0-or-later
#
# Copyright (C) 2024 Olliver Schinagl <oliver@schinagl.nl>

include:
  - component: gitlab.com/ci-includes/kici/gitlab-component@v0.2.0
    inputs:
      stage_lint: 'lint'
      stage_build: 'build'
      attach_diff: 'true'
      projects: 'breakout_hdmi breakout_hdmi_mini pod_lmh7322 pod_lmh7324 pod_sn65lvds'
  - component: gitlab.com/ci-includes/pipeline-deployers/release-announcer@v0.1.0
    inputs:
      release_assets: 'assets/release.json'
      release_dependency: 'kici_deploy'
  - component: gitlab.com/ci-includes/pipeline-deployers/release-brancher@v0.1.0
  - project: 'ci-includes/masslinter'
    ref: 'master'
    file: 'all-linters.yml'

workflow:
  rules:
    - if: '$CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH'
    - if: '$CI_COMMIT_BRANCH =~ /^release\/v\d+.\d+$/'
    - if: '$CI_COMMIT_TAG'
    - if: '$CI_MERGE_REQUEST_IID'
    - if: '$CI_PIPELINE_SOURCE == "web"'

stages:
  - lint
  - build
  - branch
  - deploy
  - complete

default:
  tags:
    - docker

as your .gitlab-ci.yaml file, and submodule/clone/copy/create a .kibot folder in the root of your repo that holds your (properly named) kibot.yaml file. This is the hard part to get right ... as it relies on the proper naming and/or symlinks
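
Purely as an illustration of that layout (the actual file names the component expects are defined in its README, so everything below is a placeholder):

```
my-board/
├── .gitlab-ci.yml          # the include block shown above
├── .kibot/                 # submodule/subtree/copy holding the kibot job files
│   └── <job>.kibot.yaml    # properly named per the component's conventions
├── my-board.kicad_pro
├── my-board.kicad_sch
└── my-board.kicad_pcb
```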

Nudelsalad commented 8 months ago

Hi @oliv3r , thank you very much for the valuable insight into your CI project. Amazing work, I gave it a whirl and I noticed several things.

First of all, the GitLab components are a nice new feature of GitLab. Sadly they don't support "remote components" yet. Since I want to implement this on my very own GitLab instance, I needed to fork your projects and release them in my own CI/CD Catalog.

The second thing I noticed was that I don't want to have submodules in there, and linking this again inside a project's directory seemed pretty redundant to me. So I created a separate repository for the configs only, added them to the KiBot container image, and updated the "clone" in every one of your defined stages. (That sucks, but I don't know another workaround for now; same for footprints, 3D models and symbols. Maybe caching would help.) So I had to rewrite your template a bit to work with the "preinstalled" repos because:

  1. rules:exists can only look inside the CI working directory to check whether a file exists
  2. Changed official kibot image to mine
  3. Set '${KICI_PROJECTS}/kici_conf.env' to have a separate config for every project inside a repo
  4. There seems to be an issue with deploying when using your template for only one project not inside a subdirectory. The default is set like this: default: './', and this works throughout the project, but in the deploy stage the request URL gets built like this:
for _proj in ${KICI_PROJECTS}; do
        if [ ! -d "${CI_PROJECT_DIR}/outputs/${_proj}" ]; then
          continue
        fi
        echo "Generating packages for '${_proj}' ..."

...

_url="${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/generic/${_proj}/${CI_COMMIT_TAG#v}/${_file_name}"

There _proj gets replaced, so the URL contains packages/generic/./hello-world..., which then results in an HTTP 400 error when curling.

TBH I am not a CI/CD expert, and your project is amazing, but I want to avoid having a monorepo for my projects and having the submodules everywhere, e.g. for 3D models, footprints, etc. This is why I took the special approach of cloning them into the container. I see two possible enhancements here:

  1. Pulling symbols and footprints also from kicad_url, so the KiBot image can be used directly without a separate install step @set-soft
  2. Putting the configurations you made into the KiBot image as well, so neither the submodules nor the linking inside the project is necessary. I also didn't understand why you linked them again inside the project.

set-soft commented 8 months ago
  1. Pulling symbols and footprints also from kicad_url, so the KiBot image can be used directly without a separate install step @set-soft

Not sure if I understand it. The official symbols and footprints are in the docker images. The 3D models are pulled from kicad_url because they take too much space (5 GB, when the full image is 3.4 GB and the regular image is under 1.5 GB).

You can also take a look at #453 which proposes a mechanism.

Also about "best practice": try to separate any variants/filters, etc. from the file generation. The best mechanism is to first modify the PCB/Sch, save it, and then apply all the outputs that generate something from it; in the generation step, avoid modifying the PCB/Sch. This is the ideal approach, but I think the code needs a lot of adjustments to make this work. In general: if you'll have PCB/Sch variants and/or panelization, it is best to first generate the variants and/or panels, and then run KiBot again using the generated files.
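
A hedged sketch of that two-pass flow (config names, output names and the generated file name are assumptions; the first pass would use something like a pcb_variant output to save the modified board):

```sh
# Pass 1: apply the variant (or build the panel) and save the resulting board
kibot -c variant.kibot.yaml -b board.kicad_pcb -d generated save_variant_pcb

# Pass 2: generate the fabrication outputs from the already-modified board,
# without touching the PCB/Sch again
kibot -c fabrication.kibot.yaml -b generated/board_variant.kicad_pcb -d output \
    fab_gerbers fab_drill fab_position
```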

oliv3r commented 8 months ago

Hi @oliv3r , thank you very much for the valuable insight into your CI project. Amazing work, I gave it a whirl and I noticed several things.

First of all, the GitLab components are a nice new feature of GitLab. Sadly they don't support "remote components" yet. Since I want to implement this on my very own GitLab instance, I needed to fork your projects and release them in my own CI/CD Catalog.

Yes, that is indeed very sad, and nothing I can help with unfortunately :) It's a limitation/feature of gitlab. Instead of forking, you could consider setting up a mirror? I'm actually also working on a mirror component, as mirroring is license restricted sigh, so that could make this easier as well.

I know you can import from URLs, but in the past this failed when using more than a single file.

The second thing I noticed was that I don't want to have submodules in there, and linking this again inside a project's directory seemed pretty redundant to me. So I created a separate repository for the configs only, added them to the KiBot container image, and updated the "clone" in every one of your defined stages. (That sucks, but I don't know another workaround for now; same for footprints, 3D models and symbols. Maybe caching would help.) So I had to rewrite your template a bit to work with the "preinstalled" repos because:

The choice, as the readme describes, is completely up to you. Submodules are just a way to distribute 'the common yaml files' (the opinionated part :p). You can just copy them and maintain them yourself. You could also try subtree, which 'copies' the files into your tree at a given point. But again, it's just a way to get my kibot config files into your repo. Submodules are the easiest, subtree the second, 'copy-paste' the third :). Your choice.
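
For example, a subtree-based copy could look roughly like this (the URL, branch and prefix are assumptions; point it at wherever the shared kibot job files actually live):

```sh
# Copy the shared kibot config files into .kibot/ as a squashed subtree
git subtree add --prefix=.kibot \
    https://gitlab.com/ci-includes/kici/gitlab-component.git master --squash

# Later, pull in upstream updates the same way
git subtree pull --prefix=.kibot \
    https://gitlab.com/ci-includes/kici/gitlab-component.git master --squash
```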

The symlinks are a 'hack' :p The pipeline checks the root for the .kibot/file bit using exists, to see which pipeline jobs to run, since the pipeline doesn't know where to look for files. I think you could also say exists: **/${JOB_NAME_CI} to go through all subdirectories and check that way. Not a bad improvement :) it would remove the need for the .kibot root directory. However! Most projects won't be nested and will have a single KiCad project in the root.

But let me contemplate this, and see what downsides this may trigger ... Though I admit I had overlooked the ** here :)
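
Roughly what that double glob could look like in the job rules (whether exists expands CI variables like this depends on the GitLab version, so treat it as a sketch):

```yaml
rules:
  # run the job when the kibot job file exists anywhere in the tree,
  # not only in a .kibot directory at the repository root
  - exists:
      - '**/${JOB_NAME_CI}'
```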

1. rules:exists can only look inside the CI working directory to check whether a file exists

gitlab-org/gitlab#438863 :)

This is also why the symlink hack exists. Initially, I just had the templates in the root of the submodule, so adding it as .kibot was great locally, but failed remotely. But even with ** we'd still need the symlinks to the submodule to work around this issue for single-KiCad projects. Multi-KiCad projects are fine, as we are symlinking from the individual projects already.

2. Changed official kibot image to mine

Obvious to your specific requirement

3. Set '${KICI_PROJECTS}/kici_conf.env' to have a separate config for every project inside a repo

It already does this? Right now, the for-loop goes through each listed project (be it . or by name) and it sources the env for each separate KiBot run. Not sure what the problem is here, unless you want all conf files in one location.

4. There seems to be an issue with deploying when using your template for only one project **not** inside a subdirectory. The default is set like this: `default: './'`, and this works throughout the project, but in the deploy stage the request URL gets built like this:
for _proj in ${KICI_PROJECTS}; do
        if [ ! -d "${CI_PROJECT_DIR}/outputs/${_proj}" ]; then
          continue
        fi
        echo "Generating packages for '${_proj}' ..."

...

_url="${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/generic/${_proj}/${CI_COMMIT_TAG#v}/${_file_name}"

There _proj gets replaced, so the URL contains packages/generic/./hello-world..., which then results in an HTTP 400 error when curling.

Hmm, that sounds like a bug indeed. I haven't deployed a single-project repo yet, so thanks for the heads up (I do have a project that runs the pipeline, but haven't tagged the result yet). So I'll have to bug-report/fix that. Feel free to file an issue/MR for this in the project though! I would much appreciate that. Probably have to use the shell to readlink the project to resolve it first ...
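
One possible shape of such a normalization, purely as an illustration (this is not the actual fix):

```sh
# Resolve './' (and any relative path) to a plain directory name
# before it ends up in the package URL
_proj_name="$(basename "$(readlink -f "${CI_PROJECT_DIR}/${_proj}")")"

_url="${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/generic/${_proj_name}/${CI_COMMIT_TAG#v}/${_file_name}"
```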

I got a fix, will push it to master, and then test it if I find a moment, as it also costs me some CI bandwidth, which is slow (takes like 30 minutes now)

TBH I am not a CI/CD expert, and your project is amazing, but I want to avoid having a monorepo for my projects and having the submodules everywhere, e.g. for 3D models, footprints, etc. This is why I took the special approach of cloning them into the container. I see two possible enhancements here:

Hmm, strange. As per the readme, the idea is to have a self-contained repository. KiCad does this for symbols and footprints already (they all get copied into the .sch and .pcb). Likewise, packages3D is where you keep a copy of all 3D models. This can be achieved via the archive3D plugin. See the test code of the kici project for examples. Everything is self-contained there as well, and the 3D models are in the repo itself. I don't think that's a bad design, because if you clone the repo, you want all the bits and parts that belong to your release to be in the repo. I think the difference with software design is that you don't 'version-lock' your symbols and footprints; you want the exact copy of what you used, and you update them when needed. A slightly different way of looking at dependencies. But that's a KiCad design choice.

1. Pulling symbols and footprints also from kicad_url, so the KiBot image can be used directly without a separate install step @set-soft

Don't think this is an issue :p but see my point above :)

2. Putting the configurations you made into the KiBot image as well, so neither the submodules nor the linking inside the project is necessary. I also didn't understand why you linked them again inside the project.

What do you mean here? The kibot templates are hand-crafted and not part of kibot per se. They are, as I wrote, 'opinionated'. @set-soft could decide my templates are pretty good for 'generic purposes' and ship them in the container as a 'contributed feature'. That would remove the need for the submodule/kibot files, which I could only agree with :) but it does couple the kici gitlab template to the kibot docker container of course.

oliv3r commented 7 months ago

@Nudelsalad I've just pushed quite a few updates and bugfixes that slightly help you in your use-case. I've also made the double glob on exists work, so thanks for that tip! And I think I solved all your reported issues.

I'm about to tag v0.4 which will include present and navigate targets for gitlab pages. While super cool, navigate could do with a stylish update :p

I've also linked some projects that are (about to) use v0.4 of kici, so can help to serve as an example. I still rely on the submodule trick of course, though again, it's just a means to avoid having to copy the kibot job files :)