ceph / ceph-csi

CSI driver for Ceph
Apache License 2.0

Ceph-csi: enable deployments to multiple namespaces per cluster #4742

Closed jz543fm closed 1 month ago

jz543fm commented 3 months ago

Describe the feature you'd like to have

What if the K8s cluster admin wants to deploy to multiple namespaces in one cluster?

What new functionality do you want?

Add a bash script to generate manifests for multiple namespaces, or another option to achieve the same.

What is the value to the end user? (why is it a priority?)

Enable the K8s cluster admin to deploy the ceph-csi plugin to multiple namespaces without rewriting all the Kubernetes resources.

How would the end user gain value from having this feature?

Faster deployment to new namespaces: the ceph-csi plugin would gain a quick way to generate manifests for multiple namespaces, and port conflicts during deployments would be removed.

How will we know we have a good solution? (acceptance criteria)

The ability to deploy to multiple namespaces without port conflicts or deployment errors. We need fast, effective, and simple deployments to multiple namespaces, generating the Kubernetes resources for each namespace's manifests.

nixpanic commented 3 months ago

@Madhu-1 can ceph-csi-operator help with this?

Madhu-1 commented 3 months ago

@nixpanic yes, that's possible with ceph-csi-operator :)

jz543fm commented 3 months ago

I've checked it and, if I'm right, I can see there are CRDs; it looks interesting. Where are the docs for the ceph-csi operator? I prefer plain YAMLs. I tried it, and it is possible to use the plain manifests without the operator: I rewrote them for multiple namespaces, which requires avoiding port conflicts and using a naming convention. A bash script helped me generate the manifests for multiple namespaces, apply the naming convention, and rewrite just the port name, from the default to <826>{number_of_namespace_to_deploy}, with the resource names following the same pattern.
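A minimal sketch of the kind of script described above, assuming a manifest template with placeholder tokens: for each target namespace it substitutes the namespace name and a unique metrics port so that deployments in different namespaces do not collide. The template contents, the `__NAMESPACE__`/`__METRICS_PORT__` tokens, the port base, and the namespace names are all illustrative assumptions, not ceph-csi defaults.

```shell
#!/usr/bin/env sh
# Sketch: generate one ceph-csi manifest per namespace from a template,
# giving each namespace a distinct metrics port to avoid port conflicts.
set -eu

# Demo template with placeholder tokens (a real one would be the full
# csi-rbdplugin manifest with the tokens inserted at the right places).
cat > csi-rbdplugin-template.yaml <<'EOF'
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: csi-rbdplugin
  namespace: __NAMESPACE__
spec:
  template:
    spec:
      containers:
        - name: liveness-prometheus
          args:
            - "--metricsport=__METRICS_PORT__"
EOF

PORT_BASE=8260               # assumed base; namespace N gets PORT_BASE + N
NAMESPACES="team-a team-b"   # hypothetical target namespaces

i=0
for ns in $NAMESPACES; do
    i=$((i + 1))
    port=$((PORT_BASE + i))
    # Substitute the namespace and its unique port into a per-namespace copy.
    sed -e "s/__NAMESPACE__/${ns}/g" \
        -e "s/__METRICS_PORT__/${port}/g" \
        csi-rbdplugin-template.yaml > "csi-rbdplugin-${ns}.yaml"
    echo "generated csi-rbdplugin-${ns}.yaml (metrics port ${port})"
done
```

The generated files can then be applied with `kubectl apply -f csi-rbdplugin-<namespace>.yaml`; only the namespace and port differ between copies, which matches the naming-convention approach described above.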

github-actions[bot] commented 2 months ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed in a week if no further activity occurs. Thank you for your contributions.

github-actions[bot] commented 1 month ago

This issue has been automatically closed due to inactivity. Please re-open if this still requires investigation.