openxla / stablehlo

Backward compatible ML compute opset inspired by HLO/MHLO
Apache License 2.0

Seeking information on low-level TPU interaction and libtpu.so API #2471

Closed · notlober closed 4 months ago

notlober commented 4 months ago

I'm looking to build an automatic differentiation library for TPUs without going through high-level front-ends like TensorFlow, JAX, or PyTorch-XLA, but I'm finding that information about lower-level TPU usage is practically non-existent.

Specifically, I'm interested in:

  1. How to interact with TPUs at a lower level than what's typically exposed in TensorFlow
  2. Information about the libtpu.so library and its API
  3. Any resources or documentation on implementing custom TPU operations

Does anyone have insights or suggestions on how to approach this, particularly at the runtime level? Any pointers would be greatly appreciated.

I understand that some of this information might be proprietary, but any guidance on what is possible or available would be very helpful.
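
For reference, the closest thing to a public low-level entry point I've found is the PJRT C API: libtpu.so appears to be distributed as a PJRT plugin, and PJRT plugins export a `GetPjrtApi` symbol (declared in `xla/pjrt/c/pjrt_c_api.h` in openxla/xla). Below is a minimal sketch of resolving that entry point; the library path, and the assumption that a given libtpu build exports the symbol, both need verifying against your setup:

```c
/* Minimal sketch: load libtpu.so and resolve the PJRT C API entry
 * point. Assumptions (to verify): the library path, and that this
 * libtpu build exports GetPjrtApi, the standard PJRT plugin symbol
 * declared in xla/pjrt/c/pjrt_c_api.h. */
#include <dlfcn.h>
#include <stdio.h>

/* Opaque declaration; the full struct of function pointers is defined
 * in xla/pjrt/c/pjrt_c_api.h in the openxla/xla repo. */
typedef struct PJRT_Api PJRT_Api;
typedef const PJRT_Api* (*GetPjrtApiFn)(void);

int main(void) {
  /* The libtpu pip package installs the shared object under
   * site-packages/libtpu/; a bare soname works if it is on the
   * loader path. */
  void* handle = dlopen("libtpu.so", RTLD_NOW | RTLD_LOCAL);
  if (!handle) {
    fprintf(stderr, "dlopen failed: %s\n", dlerror());
    return 1;
  }

  GetPjrtApiFn get_api = (GetPjrtApiFn)dlsym(handle, "GetPjrtApi");
  if (!get_api) {
    fprintf(stderr, "GetPjrtApi not found: %s\n", dlerror());
    dlclose(handle);
    return 1;
  }

  const PJRT_Api* api = get_api();
  printf("GetPjrtApi returned %s\n",
         api ? "a non-null API table" : "NULL");

  /* From here, device enumeration, buffer transfers, and executable
   * compilation would all go through the function pointers in *api. */
  dlclose(handle);
  return 0;
}
```

If this is roughly the intended path, any pointers to documentation on driving the `PJRT_Api` table directly (client creation, buffer transfer, executable compilation) would be exactly what I'm after.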

abhigunj commented 4 months ago

Let's follow up on the corresponding XLA ticket, https://github.com/openxla/xla/issues/15657, for further discussion.