Kalkasas opened this issue 6 years ago
Short answer: No, I think that the current implementation of the `Unknown` datatype (as a Terra type) makes this impossible. But I was able to get your example to compile.

Long answer: there are actually two different questions in your question: (1) whether mixing index spaces within one energy is supported at all, and (2) why your example fails to compile.
Regarding 2: I was able to get your example to compile by commenting out everything inside the function `insert_dummy_energies_for_unused_unknown_index_spaces` in `API/src/o.t`.
Regarding 1: here is a slightly less complicated energy that throws the same error that you experienced:

```lua
U,V = Dim("U",1), Dim("V",2)
funcParams1 = Unknown("funcParams1", opt_float, {U}, 0)
funcParams2 = Unknown("funcParams2", opt_float, {V}, 1)
Energy(funcParams1(0))
Energy(funcParams2(0))
```
As a short-term fix, I think it should be possible to re-write my example problem with graphs, which should work because the graph code in the Opt compiler does not seem to be affected by the problem discussed here. The following at least compiles (together with the other hack from above):
```lua
U,V = Dim("U",0), Dim("V",1)
funcParams1 = Unknown("funcParams1", opt_float, {U}, 0)
funcParams2 = Unknown("funcParams2", opt_float, {V}, 1)
local G1 = Graph("G1", 2, "d", {U}, 2)
local G2 = Graph("G2", 3, "d", {V}, 3)
Energy(funcParams1(G1.d))
Energy(funcParams2(G2.d))
```
Note to developers:
Is this a "bug"? On the one hand, in `util.makeGPUFunctions` we loop over all index spaces, passing `ispace` as an argument to `delegate.CenterFunctions` and `delegate.GraphFunctions`. We also make sure to pass in the correct version of the automatically generated functions (e.g. `evalJTF`).
But once inside `delegate.CenterFunctions` (see e.g. `kernels.PCGInit1`), we pass `idx : Index` as an argument to e.g. `pd.delta`, which is of type `TUnknownType`. The metamethods of `TUnknownType`, however, require that all Unknowns (from the Lua code) live in the same index space.
These metamethods do not seem to be used by the graph code, which is why it works. So altogether it seems like someone already spotted the problem discussed here, but only fixed it for graph energies (which is acceptable, since any "pixel-data" problem can be re-written as a graph problem).
Basically, I think that we need to get rid of the metamethods (e.g. `__apply`) for `TUnknownType`, but keep them at the `Image` level. Maybe `TUnknownType.__apply` could be re-implemented as a macro that emits `ImageType.__apply` calls (only) for the Images (= `Unknown()` from Opt) that actually use it.
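One way that idea could look, as a rough sketch in plain Lua (all names here are hypothetical, not the actual Opt internals): keep one accessor per index space and dispatch on the index's space, so that only the Unknowns living in that space are ever touched:

```lua
-- Hypothetical sketch of the dispatch idea; the real TUnknownType is a
-- Terra type, and its __apply would emit ImageType.__apply calls at
-- compile time rather than dispatching at runtime like this.
local accessors = {}  -- maps an index-space key to that space's accessor

local function register_ispace(ispace, image_apply)
    accessors[ispace] = image_apply  -- one entry per index space
end

local function unknown_apply(idx)
    -- idx.ispace identifies the index space this index belongs to
    local apply = accessors[idx.ispace]
    assert(apply ~= nil, "no Unknown registered for this index space")
    return apply(idx)
end
```

The point of the sketch is the invariant: an index from index space `U` never reaches the images of index space `V`, which is exactly what the current single-ispace `__apply` cannot express.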
In the short term, it should be enough to somehow change `insert_dummy_energies_for_unused_unknown_index_spaces` so that it catches the errors discussed in #91, but only those and no more.
I think the `insert_dummy_energies_for_unused_unknown_index_spaces` problem is easiest to solve by solving the general problem: Opt should support Unknowns in multiple domains, as long as the only interaction between those domains is on a graph.
As you point out, the infrastructure is partially there; we just need to modify the `TUnknownType.__apply` call to take indices of its component index spaces instead of forcing everything into a single index space.
Just to make it clear: I am specifically looking for a feature where you can mix shapes within the same energy.
I think a good example of where this would be needed is when you also optimize the light-source parameters in the shape-from-shading example.
At the moment I stack all unknowns into one big index space, but I have no idea what the consequences are with respect to memory requirements and performance.
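For concreteness, the stacking workaround looks roughly like this (a hypothetical sketch reusing the names from this thread; the dimension sizes and the graph connectivity are set on the C++ side):

```lua
-- Hypothetical sketch of stacking everything into one index space N of
-- size |U| + |V| (e.g. 10k shape parameters followed by 2 light
-- parameters), instead of two separate Unknowns over U and V.
N = Dim("N", 0)
allParams = Unknown("allParams", opt_float, {N}, 0)
-- A graph (populated from the C++ side) then selects which entries of
-- allParams participate in each residual.
local G = Graph("G", 1, "d", {N}, 2)
Energy(allParams(G.d))
```

The open question above is whether this stacked layout costs anything in memory or performance compared to keeping the index spaces separate.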
@Mx7f Looking at the `TUnknownType.__apply` implementation, it actually looks like somebody already anticipated the index-space issue discussed here; but instead of e.g. overloading the definition of `__apply` for different index spaces, it is over-written each time a new index space is encountered.
@Kalkasas Could you please post a minimal example (Opt and C++ code)? As already discussed, you could hack the existing code, and the mixing of shapes will work automatically as long as the "mixing" only happens in graph-based energy terms (i.e. the example from your original question will work). You do not need a specific "feature" or "keyword" to achieve the mixing.
EDIT: Regarding your question about memory requirements and performance: this question is actually independent of the mixing-shapes issue, so you could simply try it out:

```lua
-- note that we use "U" both times
funcParams1 = Unknown("funcParams1", opt_float, {U}, 0)
funcParams2 = Unknown("funcParams2", opt_float, {U}, 1)
```

instead of

```lua
-- opt_float2 instead of 2x opt_float
funcParams = Unknown("funcParams", opt_float2, {U}, 0)
```
I attached the example to this post.
@Mx7f Let us assume U = 10k and V = 2. Would there not be a difference in this case?
@Kalkasas What do you mean by a difference? From a solver where all the parameters are put in the same index space? Once we fix the `TUnknownType.__apply` issue, the results will be the same; there will just be a small increase in overhead from having to launch kernels for each index space (those kernels are much, much cheaper than the kernels over the residuals induced by the graph, so the observable performance difference should be negligible).
@Mx7f that was my question. Thank you for the clarification. I am not yet familiar with the inner workings of the library.
I tried to modify the curveFitting example to use Unknowns with different shapes. In a real-world problem this could correspond to global variables and local variables.
This leads to the error:
Is such a feature already supported?