apache / mxnet

Lightweight, Portable, Flexible Distributed/Mobile Deep Learning with Dynamic, Mutation-aware Dataflow Dep Scheduler; for Python, R, Julia, Scala, Go, Javascript and more
https://mxnet.apache.org
Apache License 2.0

Support for XLA devices #16916

Open guoquan opened 4 years ago

guoquan commented 4 years ago

Description

XLA is an abstraction layer over computation graphs that its authors claim improves efficiency, consistency, and portability, among other things. Its most significant feature, however, is that it provides access to Google's TPUs and to any other custom accelerator hardware that targets the same abstraction.

Sample code using an XLA device could look like:

from mxnet import nd
from mxnet_xla import xla
x = nd.ones((4,5), ctx=xla.xla_device())
print(x)
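Note that the `mxnet_xla` package above does not exist yet; `xla_device()` is the proposed entry point. A minimal sketch of what such a helper might do, modeled on how `mx.cpu()` / `mx.gpu()` return contexts (the `Context` class here is a hypothetical stand-in, not MXNet's real `mxnet.context.Context`):

```python
class Context:
    # Hypothetical stand-in for mxnet.context.Context, which pairs a
    # device-type string with a device index.
    def __init__(self, device_type, device_id=0):
        self.device_type = device_type
        self.device_id = device_id

    def __repr__(self):
        return f"{self.device_type}({self.device_id})"


def xla_device(device_id=0):
    # Proposed helper: return a context naming the XLA backend device,
    # analogous to mx.gpu(device_id) for CUDA devices.
    return Context("xla", device_id)


print(xla_device())  # xla(0)
```

The point of the sketch is only that an XLA backend could slot into MXNet's existing context mechanism, so user code keeps passing `ctx=` the way it already does for CPU and GPU.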

References

pengzhao-intel commented 4 years ago

Good idea! We've been evaluating the possibility of XLA support recently :)

cjolivier01 commented 4 years ago

As a start, it would be good to simply be able to generate an HloModuleProto protobuf file.
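The real `HloModuleProto` schema lives in XLA's `hlo.proto` and would be emitted through generated protobuf bindings. As a toy illustration of the kind of information an MXNet exporter would have to collect per instruction (opcode, result shape, operand ids), here is a plain-Python stand-in, not the actual proto classes:

```python
from dataclasses import dataclass, field


@dataclass
class HloInstruction:
    # Toy stand-in for xla.HloInstructionProto:
    # an id, an opcode, a result shape, and operand ids.
    id: int
    opcode: str
    shape: tuple
    operand_ids: list = field(default_factory=list)


@dataclass
class HloModule:
    # Toy stand-in for xla.HloModuleProto: a flat list of instructions
    # referencing each other by id.
    name: str
    instructions: list = field(default_factory=list)

    def add(self, opcode, shape, operands=()):
        inst = HloInstruction(
            len(self.instructions), opcode, shape,
            [op.id for op in operands],
        )
        self.instructions.append(inst)
        return inst


# Translating something like `nd.ones((4, 5)) * 2` into the toy module:
m = HloModule("mxnet_export")
ones = m.add("broadcast", (4, 5), [m.add("constant", ())])
two = m.add("constant", ())
m.add("multiply", (4, 5), [ones, two])
print([i.opcode for i in m.instructions])
```

A real exporter would walk MXNet's dataflow graph in topological order and serialize the result with the protobuf bindings generated from `hlo.proto` instead of dataclasses, but the per-node bookkeeping would look much the same.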