Closed lxbsz closed 3 years ago
[root@node01 ceph-iscsi]# gwcli ls
o- / ......................................................................................................................... [...]
o- cluster ......................................................................................................... [Clusters: 1]
| o- ceph .......................................................................................................... [HEALTH_WARN]
| o- pools .......................................................................................................... [Pools: 6]
| | o- cephfs.a.data ......................................................... [(x3), Commit: 0.00Y/29826642K (0%), Used: 0.00Y]
| | o- cephfs.a.meta ....................................................... [(x3), Commit: 0.00Y/29826642K (0%), Used: 133931b]
| | o- datapool ............................................................... [(x3), Commit: 2M/29826642K (0%), Used: 171996b]
| | o- device_health_metrics ................................................. [(x3), Commit: 0.00Y/29826642K (0%), Used: 0.00Y]
| | o- ecpool ................................................................. [(2+2), Commit: 0.00Y/59653284K (0%), Used: 24K]
| | o- rbd .................................................................. [(x3), Commit: 0.00Y/29826642K (0%), Used: 21263b]
| o- topology ................................................................................................ [OSDs: 3,MONs: 3]
o- disks .......................................................................................................... [2M, Disks: 2]
[root@node01 ceph-iscsi]# ceph osd erasure-code-profile get default
k=2
m=2
plugin=jerasure
technique=reed_sol_van
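A minimal sketch of the display logic this change implies: replicated pools keep the "(x<size>)" tag, while erasure-coded pools show the profile's data/coding chunks as "(<k>+<m>)" (the default profile above, k=2/m=2, matches the "(2+2)" shown for ecpool). The function name and dict keys here are illustrative, not the actual ceph-iscsi API.

```python
def pool_display_tag(pool_type, size=None, profile=None):
    """Return the short per-pool tag used in gwcli-style listings.

    pool_type: "replicated" or "erasure" (illustrative values).
    size:      replica count for replicated pools, e.g. 3 -> "(x3)".
    profile:   erasure-code profile dict with "k" and "m", e.g.
               {"k": 2, "m": 2} -> "(2+2)".
    """
    if pool_type == "erasure":
        # Show the profile (k+m) instead of size/min_size.
        return "({}+{})".format(profile["k"], profile["m"])
    return "(x{})".format(size)


print(pool_display_tag("replicated", size=3))                     # (x3)
print(pool_display_tag("erasure", profile={"k": 2, "m": 2}))      # (2+2)
```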
Have you tested the new version?
Yes, I did. It worked for me and shows the same info as above.
Show the pool profile instead of size/min_size.
Signed-off-by: Xiubo Li xiubli@redhat.com