mitsuba-renderer / drjit

Dr.Jit — A Just-In-Time-Compiler for Differentiable Rendering
BSD 3-Clause "New" or "Revised" License

Tensor initialization creates opaques (regression?) #267

Closed · DoeringChristian closed 2 months ago

DoeringChristian commented 3 months ago

On the current master branch (commit 6e44e23), initializing a tensor with a single value creates an evaluated/opaque tensor.

In [1]: import mitsuba as mi

In [2]: import drjit as dr

In [3]: mi.set_variant("cuda_ad_rgb")

In [4]: dr.set_log_level(dr.LogLevel.Debug)

In [5]: a = mi.TensorXf(3)
jit_var_new(): float32 r23 = data(<0x302000800>)
jit_var_mem_copy(): float32 r23[1] = copy_from(host, <0x55a00ce41a90>)

In [6]: a.state
Out[6]: VarState.Evaluated
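
For comparison, the Evaluated state above matches what Dr.Jit produces when a value is made opaque on purpose. A minimal sketch, assuming the current Dr.Jit Python API (dr.opaque and the .state attribute) and a CUDA backend:

import drjit as dr
from drjit.cuda import Float  # drjit.llvm.Float should behave the same

# dr.opaque() deliberately creates an evaluated/opaque value backed by
# device memory, which is what TensorXf(3) appears to do here unintentionally.
x = dr.opaque(Float, 3)
print(x.state)  # VarState.Evaluated, by design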

In previous versions of Dr.Jit (v0.4.6), this created a literal.

In [1]: import mitsuba as mi

In [2]: import drjit as dr

In [3]: mi.set_variant("cuda_ad_rgb")

In [4]: dr.set_log_level(dr.LogLevel.Debug)

In [5]: a = mi.TensorXf(3)
jit_var_new(float32 r24): literal = 3

Note that here the jit_var_new call creates a literal.
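
A literal is also what dr.full still gives for a constant-valued tensor, with no jit_var_mem_copy involved. A small sketch under the same assumptions (dr.full and .state from the current API), contrasting it with the constructor from this report:

import drjit as dr
from drjit.cuda import TensorXf  # drjit.llvm.TensorXf should behave the same

# dr.full() builds a constant-valued (literal) tensor; no host-to-device
# copy is needed until something forces an evaluation.
a = dr.full(TensorXf, 3, (1,))
print(a.state)  # expected: VarState.Literal

# The scalar constructor discussed in this issue.
b = TensorXf(3)
print(b.state)  # Evaluated on current master, Literal in v0.4.6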

I wanted to ask whether this is intended or a regression from previous versions.

njroussel commented 2 months ago

Hi @DoeringChristian

I'm not sure this is intended; it seems like a regression to me. @wjakob, can you please confirm that this wasn't intentional?

wjakob commented 2 months ago

It's unintentional.

DoeringChristian commented 2 months ago

In that case, I could try to fix it and open a PR.

njroussel commented 2 months ago

Sure, go ahead :)

Ping me if you need any help