Open · srisek opened this issue 1 year ago
This would be a nice feature and not so difficult to implement.
In the implementation on the feature/external-step-logs branch, you can define a template for the URL to the S3 bucket like this: externalLogsUrlTemplate: https://externalBaseUrl/$namespace/$taskRunPodName/$stepContainer. The variables $namespace, $taskRunPodName, and $stepContainer are interpolated at runtime.
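The interpolation described above could be sketched roughly like this (an illustrative sketch only; the function and variable names here are assumptions, not the actual plugin code):

```typescript
// Hypothetical helper: substitute the documented template variables.
const externalLogsUrlTemplate =
  "https://externalBaseUrl/$namespace/$taskRunPodName/$stepContainer";

function interpolateLogsUrl(
  template: string,
  vars: { namespace: string; taskRunPodName: string; stepContainer: string },
): string {
  return template
    .replace("$namespace", vars.namespace)
    .replace("$taskRunPodName", vars.taskRunPodName)
    .replace("$stepContainer", vars.stepContainer);
}
```

For example, with namespace "dev", pod "build-pod-abc123", and container "step-compile", the template above would resolve to https://externalBaseUrl/dev/build-pod-abc123/step-compile.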
Can you access the logs in the S3 bucket without authentication? Or are you using Authorization: AWS AWSAccessKeyId:Signature?
That's cool, thanks for looking into this. We have a service running in a pod that handles reading from S3 and is configured with the AWS-related settings. We point the Tekton dashboard's externalURL configuration at this internal service within our own cluster to get the logs from S3.
@srisek can you check if the implementation in v0.3.3 works for you? :)
@rannox, we did notice that the latest version of the plugin is moving away from the external logs feature. Plugin version 0.3.3 worked fine, but when updating the plugin to 1.1.0, it seems to fall back to reading the pod logs via the Kubernetes API. We have a policy of not retaining task run logs in the cluster for long, since the logs are sourced out to an external system, and we use the default Tekton pruner configuration to clean up old pods.
Is there a way the external logs feature could still be supported by the plugin, perhaps as an opt-in add-on?
Additionally, it would be good if the plugin could filter or limit the results displayed on the Backstage dashboard. For example, if there are 50 pipeline runs in a given namespace, could we display only the last 5 pipeline runs to make the user experience easier?
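A limit like this could presumably be applied client-side before rendering, e.g. by sorting the fetched PipelineRuns by creation time and keeping only the newest N. This is just a sketch under that assumption; the interface and function names are illustrative:

```typescript
// Minimal shape of a PipelineRun as returned by the Kubernetes API
// (only the fields this sketch needs).
interface PipelineRun {
  metadata: { name: string; creationTimestamp: string };
}

// Hypothetical helper: newest-first by creationTimestamp, capped at `limit`.
// ISO 8601 timestamps sort correctly with plain string comparison.
function latestPipelineRuns(runs: PipelineRun[], limit: number): PipelineRun[] {
  return [...runs]
    .sort((a, b) =>
      b.metadata.creationTimestamp.localeCompare(a.metadata.creationTimestamp),
    )
    .slice(0, limit);
}
```

With `limit` read from the plugin's configuration, the dashboard could then render only `latestPipelineRuns(allRuns, limit)` instead of the full list.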
Hello Tekton-Backstage-Plugin Team,
In our current Tekton ecosystem, we don't retain TaskRun pods for long within the cluster. As a workaround, to show logs in the Tekton dashboard we read them from S3 and display them there. Because the pods are short-lived, we eventually get a 404 error when we click the 'Show Log' button on the Task Runs. Is there a way you could provide an option for the task run logs feature to pull logs from an external URL source instead of fetching them from the pods via the Tekton API?
https://github.com/jquad-group/backstage-jquad/blob/7997dec3120e32f594b463c99dd4db8b6b36dc04/plugins/tekton-pipelines/src/components/StepRow/StepRow.tsx#L48
For example: it would be great if we could add an environment variable to pass in an external base URL and construct the request URI like
https://&lt;external-source-url&gt;/&lt;namespace&gt;/&lt;taskrun-pod-name&gt;/&lt;step-name&gt;
It would also be handy if this feature could be controlled with a flag: use the pod logs by default, and otherwise read them from an external source.
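The requested behavior (external base URL via an environment variable, with pod logs as the default fallback) might look something like the following. This is a hedged sketch; `buildStepLogUrl` and the environment variable name are hypothetical, not part of the plugin:

```typescript
// Hypothetical helper: build the external log URL from a configured base URL.
// Returning undefined signals the caller to fall back to pod logs via the API.
function buildStepLogUrl(
  baseUrl: string | undefined, // e.g. from process.env.EXTERNAL_LOGS_BASE_URL
  namespace: string,
  taskRunPodName: string,
  stepName: string,
): string | undefined {
  if (!baseUrl) return undefined; // flag unset: use pod logs by default
  return `${baseUrl}/${namespace}/${taskRunPodName}/${stepName}`;
}
```

For instance, `buildStepLogUrl("https://logs.example.com", "dev", "build-pod-abc123", "step-compile")` would yield https://logs.example.com/dev/build-pod-abc123/step-compile, while leaving the base URL unset would keep the current pod-log behavior.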
Thanks, and we appreciate your support in this regard.