tlc-pack / tvm-tensorir

[DISCUSS] Tensor Expr AutoDiff for TIR #46

Open tqchen opened 4 years ago

tqchen commented 4 years ago

This is some food for thought as possible future work, not necessarily actionable right now: https://discuss.tvm.ai/t/rfc-bring-in-tensor-expression-autodiff/5987

The linked RFC discusses how we can introduce tensor-expression-level AD for te.compute. It would be interesting to think about how we can generalize this to the TIR level. In particular, if we place restrictions, such as making sure all blocks are complete, would we be able to run autodiff on TIR written directly in hybrid script?

It would be useful to discuss and align on possible designs now so we are prepared for such a change, if it turns out to be possible.
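For reference, the te-level entry point from the linked RFC looks roughly like the sketch below. This is a minimal, hedged example of driving te.gradient on a te.compute stage; the tensor names are illustrative, and the exact default behavior (e.g. of the head argument) follows the RFC rather than this snippet.

import tvm
from tvm import te

# Forward computation at the te.compute level: B[i] = A[i] * A[i]
n = te.var("n")
A = te.placeholder((n,), name="A")
B = te.compute((n,), lambda i: A[i] * A[i], name="B")

# te-level autodiff from the linked RFC: te.gradient returns one
# derivative tensor per requested input.
[dA] = te.gradient(B, [A])

s = te.create_schedule(dA.op)
print(tvm.lower(s, [A, dA], simple_mode=True))

The open question above is whether an equivalent transformation can consume TIR blocks directly instead of te.compute stages.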

cc @yzhliu @Hzfengsy @spectrometerHBH

yzhliu commented 4 years ago

Does our new TIR allow users to modify index variables freely?

tqchen commented 4 years ago

@yzhliu can you elaborate on what you mean?

yzhliu commented 4 years ago

For example, can a user use %free_var as an index, like in the following?

for %i = 0 to 15 {
  block(%v0[0:15]=%i) [W:[B[%v0:(%v0 + 1)]], R:[A[%v0:(%v0 + 1)]]] {
    %free_var = %v0 > 10 ? 0 : %v0
    %B[%free_var] = %A[%v0] * 2;
  }
}

And does it support indices like %B[%A[%v0]]?
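For concreteness, a te-level analogue of the %B[%A[%v0]] pattern is a gather, where the index itself is read from a tensor. A minimal sketch (tensor names are made up for illustration):

from tvm import te

n = te.var("n")
m = te.var("m")

# Indirect indexing: the index comes from another tensor, the te-level
# analogue of %B[%A[%v0]] above.
idx = te.placeholder((n,), dtype="int32", name="idx")
vals = te.placeholder((m,), name="vals")
gathered = te.compute((n,), lambda i: vals[idx[i]], name="gathered")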

tqchen commented 4 years ago

I don't think we can differentiate w.r.t. indirect indices; that is a restriction. If a variable is bound once and its inputs are pure, I think it might be possible to differentiate w.r.t. that.
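As a hedged illustration of the "bound once, pure inputs" case: the %free_var example above can be written as a single pure index expression at the te level (a read-side analogue, since te.compute cannot express the scatter-style write directly). Whether autodiff accepts such a pattern is exactly the design question here.

from tvm import te, tir

A = te.placeholder((16,), name="A")

# Read-side analogue of the %free_var example: the index is a pure,
# single-binding function of the loop variable, so differentiation is
# w.r.t. the tensor values only, never w.r.t. the index itself.
B = te.compute(
    (16,),
    lambda i: A[tir.if_then_else(i > 10, 0, i)] * 2.0,
    name="B",
)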

tqchen commented 4 years ago

Had a discussion with @yzhliu about this. The high-level summary is that we can do it, as long as all blocks are complete and we ignore the loops.
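As a hedged sketch of why "complete blocks" is the interesting restriction: at the te level, a reduction such as a matrix-vector product lowers to a block with explicit read/write regions that is complete by construction, and te.gradient derives its derivatives from the te.compute description. A TIR-level autodiff would have to derive the equivalent adjoint blocks from the block signatures alone. Minimal example (names illustrative):

import tvm
from tvm import te

# A reduction (matrix-vector product): the kind of computation that lowers
# to a complete block with explicit read/write regions.
n, m = te.var("n"), te.var("m")
W = te.placeholder((n, m), name="W")
x = te.placeholder((m,), name="x")
k = te.reduce_axis((0, m), name="k")
y = te.compute((n,), lambda i: te.sum(W[i, k] * x[k], axis=k), name="y")

# te-level derivatives of y w.r.t. both inputs; a TIR-level pass would need
# to produce equivalent blocks directly from the block read/write regions.
dW, dx = te.gradient(y, [W, x])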