open-ce / pytorch-feedstock

set USE_TBB and package tbb headers #63

Closed cdeepali closed 3 years ago

cdeepali commented 3 years ago

Description

As of now, PyTorch must be run with OMP_NUM_THREADS=1 to avoid the following warning and possible hangs:

OpenBLAS Warning : Detect OpenMP Loop and this application may hang. Please rebuild the library with USE_OPENMP=1 option.

This, however, makes training too slow.
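For illustration, the current workaround looks like this (a minimal sketch; `torch.__config__.parallel_info()` is PyTorch's standard way to inspect the threading backend):

```python
import os

# Current workaround: pin OpenMP to one thread so OpenBLAS's own
# threading does not collide with an enclosing OpenMP loop.
os.environ["OMP_NUM_THREADS"] = "1"  # must be set before importing torch

import torch

# Reports the ATen parallel backend (OpenMP / native / TBB) and the
# effective thread counts for this build.
print(torch.__config__.parallel_info())
print("intra-op threads:", torch.get_num_threads())
```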

This PR enables the use of TBB for intra-op parallelism; see https://pytorch.org/docs/stable/notes/cpu_threading_torchscript_inference.html.
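For reference, a rough sketch of how a TBB-enabled build behaves from Python; the build flags named in the comment follow the linked notes, and the exact `parallel_info()` output is an assumption:

```python
import torch

# In a build configured per the linked notes (e.g. USE_TBB=1 and
# ATEN_THREADING=TBB at compile time), parallel_info() reports TBB as
# the ATen parallel backend instead of OpenMP.
print(torch.__config__.parallel_info())

# Intra-op threads can then be tuned normally, without the
# OMP_NUM_THREADS=1 restriction described above.
torch.set_num_threads(8)
assert torch.get_num_threads() == 8
```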

Resolves #34

Review process to land

  1. All tests and other checks must succeed.
  2. At least one maintainer must review and approve.
  3. If any maintainer requests changes, they must be addressed.
cdeepali commented 3 years ago

PyTorch currently does not package the TBB headers, but it requires them at run time because the supplied Config.h sets AT_PARALLEL_NATIVE_TBB. Details: https://github.com/pytorch/pytorch/blob/b54072a6f9993e1d65836c6be095b30b42010065/.jenkins/pytorch/test.sh#L181-L184
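To make the failure concrete, a hedged sketch: JIT-compiling any C++ extension includes `<torch/extension.h>`, which pulls in ATen's Config.h; when that header defines AT_PARALLEL_NATIVE_TBB, the TBB headers must be present in the installed package for the compile to succeed. The module name and trivial function below are illustrative only:

```python
from torch.utils.cpp_extension import load_inline

# Building any extension transitively includes ATen's Config.h. When
# Config.h sets AT_PARALLEL_NATIVE_TBB, ATen's parallel headers in turn
# include the TBB headers; if those are not packaged, this compile
# fails with a missing-header error.
mod = load_inline(
    name="tbb_header_check",            # hypothetical module name
    cpp_sources="int answer() { return 42; }",
    functions=["answer"],
)
print(mod.answer())  # prints 42 once the headers resolve
```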

A patch is therefore added to package the TBB headers.