Open kingformatty opened 5 days ago

Hi, is anyone else facing the problem of the installation getting stuck at building the wheel?

Can you share more information on your configuration, especially which DL framework you're building with? Passing the `--verbose` flag to `pip install` would also provide more useful build logs.

A hang makes me suspect your system is over-parallelizing the build process. If it's hanging while building `transformer_engine_torch`, then it's a failure while building a PyTorch extension; try setting `MAX_JOBS=1` in the environment (see this note). If the hang is in the CMake build of the core library, try setting `CMAKE_BUILD_PARALLEL_LEVEL=1` in the environment. Note that building Flash Attention is especially resource-intensive and can experience problems even on relatively powerful systems.
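A minimal sketch of the parallelism workaround described above. The exact variable values are conservative guesses for a memory-constrained machine, and the `pip install` target (a local checkout, `.`) is illustrative:

```shell
# Cap build parallelism to avoid hangs/OOM during wheel builds.
export MAX_JOBS=1                    # limits parallel jobs when building PyTorch C++ extensions
export CMAKE_BUILD_PARALLEL_LEVEL=1  # limits parallel jobs for the CMake build of the core library

# Confirm the variables are set before launching the build.
echo "MAX_JOBS=$MAX_JOBS CMAKE_BUILD_PARALLEL_LEVEL=$CMAKE_BUILD_PARALLEL_LEVEL"

# Then reinstall with verbose logs to see exactly where the build stalls:
# pip install --verbose .
```

Once the build succeeds with serial jobs, the values can be raised incrementally (e.g. `MAX_JOBS=4`) to trade memory for build speed.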