Open migmartri opened 1 year ago
Hello!
On the topic of what to send exactly: if going all in, it's true we could potentially send both attestations and materials, but what do we want to cover with that? Do we want a third-party system to react to the intermediate steps of a flow or to its conclusion? Both can be implemented, and the decision of what to send could be left to the user's discretion when setting up the plugin, so each case is handled in a differentiated way.
When it comes to the lifecycle and more or less aligned with the statement above, it could be:
For an initial approach, and since it's the plugin that sends the messages, we can forget about the more complex concerns around event consumers; this still sets a good starting point for future improvements.
@Javirln that makes sense to me, some comments inline.
But what do we want to cover with that?
It's too early to know, but my current thinking is that we already have native mechanisms to store materials and attestations, so using Kafka as yet another storage option might not be as valuable as being able to use the attestation itself for processing.
By sending attestations to a Kafka topic, they could become part of a downstream data pipeline, opening up many possibilities.
Also, the attestation has information about the materials and their location, so technically users could download them (with the help of the Chainloop CLI/API) for now.
So I'd make this first version send attestations. We could send the attached materials in future versions of the plugin.
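To make that concrete, here is a rough sketch of what the publishing step could look like. It assumes the segmentio/kafka-go client, and the publishAttestation helper and its signature are hypothetical, not part of the actual fanOut SDK:

```go
package kafkaplugin

import (
	"context"
	"time"

	"github.com/segmentio/kafka-go"
)

// publishAttestation is a hypothetical helper that the plugin's execution
// hook could call with the attestation already marshaled to JSON.
func publishAttestation(ctx context.Context, brokers []string, topic, workflowID string, attestationJSON []byte) error {
	w := &kafka.Writer{
		Addr:         kafka.TCP(brokers...),
		Topic:        topic,
		Balancer:     &kafka.LeastBytes{},
		RequiredAcks: kafka.RequireAll, // assumption: durable writes by default
		WriteTimeout: 10 * time.Second,
	}
	defer w.Close()

	// Key messages by workflow so attestations of the same workflow land on
	// the same partition and keep their relative order.
	return w.WriteMessages(ctx, kafka.Message{
		Key:   []byte(workflowID),
		Value: attestationJSON,
	})
}
```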
When it comes to the lifecycle and more or less aligned with the statement above, it could be:
I am unfamiliar with Kafka, but my current rule of thumb in terms of configuration is to try to offer sensible defaults as much as possible, and if that can apply to the topic configuration during attachment, then great!
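As a purely illustrative sketch of that registration/attachment split with a sensible default topic (the field names below are assumptions, not the real plugin schema):

```go
package kafkaplugin

// Illustrative only: these are not the real Chainloop plugin input types.

// RegistrationInput would be provided once per organization, when the Kafka
// integration is registered.
type RegistrationInput struct {
	Brokers  []string `json:"brokers"`            // e.g. ["broker-1:9092", "broker-2:9092"]
	Username string   `json:"username,omitempty"` // optional SASL credentials
	Password string   `json:"password,omitempty"`
}

// AttachmentInput would be provided when attaching the integration to a
// specific workflow.
type AttachmentInput struct {
	// Topic to publish attestations to; empty means "use the default".
	Topic string `json:"topic,omitempty"`
}

// topicFor applies the sensible-default idea: use the configured topic if
// set, otherwise derive one from the workflow name.
func topicFor(att AttachmentInput, workflowName string) string {
	if att.Topic != "" {
		return att.Topic
	}
	return "chainloop-attestations-" + workflowName
}
```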
Let me know if this is something you want to take a crack at, and how I can help.
Thank you!!
Awesome, let me settle some thoughts and will send an implementation proposal so we can iterate!
Thanks Migue!
Chainloop has a plugin mechanism for fanOut integrations.
A fanOut plugin implements logic that will be executed when attestations or materials are received. This logic can be anything from sending a Slack message or uploading the attestation to a storage backend, to sending a Software Bill of Materials (SBOM) to Dependency-Track for analysis. You can find the current list of plugins here.
This pattern fits exceptionally well with message streams like Kafka. This task is about exploring what an integration with Kafka would look like.
To design an integration, a couple of questions need to be answered.
a) What kind of metadata do we want to send? Note that plugins can subscribe to attestation metadata or to any material. For example, the dependency-track plugin handles CYCLONEDX_JSON_SBOM while the Slack plugin handles ATTESTATIONS. For the record, we could do both.
b) What does it mean to configure Kafka in the context of both registration in a Chainloop org and attachment to a workflow? See the lifecycle of an integration. For example, in dependency-track, at registration users can configure a Dependency-Track instance, while at attachment they can configure the Dependency-Track project to send the SBOMs to. This could enable us, for example, to onboard the Kafka instance on registration and choose the topic during attachment (I am not even sure this is valid, but just as an example).
In addition to that, we also support annotations on the attestations, and those could be used in plugins too, as we did in dependency-track to make the project name dynamic (in any case, this is an option that could be explored in the future).
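Purely as an illustration of that future option, the topic could be resolved with a precedence of annotation > attachment configuration > derived default; the annotation key below is an assumption, not current Chainloop behavior:

```go
package kafkaplugin

// Purely illustrative: the annotation key and precedence are assumptions,
// not current Chainloop behavior.
const topicAnnotation = "kafka.topic"

// resolveTopic picks the destination topic with the precedence:
// attestation annotation > topic configured at attachment > derived default.
func resolveTopic(annotations map[string]string, attachmentTopic, workflowName string) string {
	if t, ok := annotations[topicAnnotation]; ok && t != "" {
		return t
	}
	if attachmentTopic != "" {
		return attachmentTopic
	}
	return "chainloop-attestations-" + workflowName
}
```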