camilaagw opened 1 month ago
Logs are written to an S3 bucket by the Spark job. You can define a unique path for each job: https://github.com/awslabs/data-on-eks/blob/9e52517badf3d3bca758544f09d1f905c07eec0f/analytics/terraform/spark-k8s-operator/examples/karpenter/nvme-ephemeral-storage/nvme-ephemeral-storage.yaml#L55
A question for my current use case: with this setup, is it possible to have a single S3 bucket with logs divided across subdirectories? For example:
s3://my-bucket/dir1/
s3://my-bucket/dir2/
s3://my-bucket/dir3/
etc.
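For reference, one way to get per-job subdirectories under a single bucket is to point `spark.eventLog.dir` at a different prefix in each SparkApplication spec. A minimal sketch (the bucket name, job name, and directory prefixes here are hypothetical, not taken from the linked example):

```yaml
# Hypothetical SparkApplication fragment: each job sets its own
# event-log prefix under the shared bucket (s3a:// is the Hadoop
# S3A filesystem scheme Spark uses for S3 paths).
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: job-one
spec:
  sparkConf:
    "spark.eventLog.enabled": "true"
    # A second job would use e.g. s3a://my-bucket/dir2/ here.
    "spark.eventLog.dir": "s3a://my-bucket/dir1/"
```

Since `sparkConf` is set per SparkApplication, each submitted job can write its logs to its own subdirectory without any change to the bucket itself.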