./dcgm-exporter: /lib64/libc.so.6: version `GLIBC_2.32' not found (required by ./dcgm-exporter)
./dcgm-exporter: /lib64/libc.so.6: version `GLIBC_2.34' not found (required by ./dcgm-exporter)
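For reference, a mismatch like this can be confirmed by comparing the GLIBC symbol versions the binary references with the glibc installed on the server. A minimal sketch (using /bin/ls as a stand-in for the actual ./dcgm-exporter binary; objdump -T on the real binary gives the same symbol list):

```shell
# Highest GLIBC symbol version the binary references
# (grep -a scans the binary's embedded version strings)
grep -ao 'GLIBC_[0-9.]*' /bin/ls | sort -Vu | tail -n 1

# glibc version installed on this host
getconf GNU_LIBC_VERSION
```

If the first value is higher than the second, the binary was linked against a newer glibc than the target machine provides.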
Ask your question
I compile DCGM and dcgm-exporter locally, then copy the binaries directly to the GPU server and run them there.
Both sides are physical machines, not containers, and my local machine has no NVIDIA card.
DCGM itself runs fine, but dcgm-exporter fails with the glibc errors above. Out of caution I cannot install the missing glibc version on the running production server.
I also tried building dcgm-exporter as a statically linked binary, but it always hangs during startup, even though I can point it at nv-hostengine's host:port with
dcgm-exporter -r
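For context, the remote-hostengine setup referred to above looks roughly like this (a sketch, assuming dcgm-exporter's -r/--remote-hostengine-info flag and nv-hostengine's default TCP port 5555; check the flags against your DCGM version):

```shell
# On the GPU server: start the DCGM host engine listening on TCP
nv-hostengine -b ALL -p 5555

# Point dcgm-exporter at that host engine instead of having it
# start an embedded one
./dcgm-exporter -r localhost:5555
```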
How can I achieve my goal?