TomographicImaging / CIL

A versatile python framework for tomographic imaging
https://tomographicimaging.github.io/CIL/
Apache License 2.0

Sphinx autodocumentation does not show KullbackLeibler #1622

Open paskino opened 7 months ago

paskino commented 7 months ago

Because the KullbackLeibler class uses the factory method __new__, instantiating it returns an instance of another class, so Sphinx does not show the appropriate documentation.

https://github.com/TomographicImaging/CIL/blob/d198bef918a44f88870899dcb90bfd1c0c235040/Wrappers/Python/cil/optimisation/functions/KullbackLeibler.py#L91-L105
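
For context, the pattern looks roughly like this (a minimal sketch with illustrative class names and signatures, not the actual CIL code):

```python
class Function:
    """Stand-in for the CIL base class providing the default docstrings."""

    def __call__(self, x):
        """Evaluate the function at x."""
        raise NotImplementedError


class KullbackLeibler_numpy(Function):
    def __call__(self, x):
        ...  # NumPy implementation


class KullbackLeibler_numba(Function):
    def __call__(self, x):
        ...  # Numba implementation


class KullbackLeibler(Function):
    """The class users import and the class Sphinx documents."""

    def __new__(cls, backend="numba", **kwargs):
        # Dispatch to a backend-specific sibling class: the object returned by
        # KullbackLeibler(...) is therefore not a KullbackLeibler instance, and
        # autodoc never associates the backend docstrings with this class.
        if backend == "numba":
            return super().__new__(KullbackLeibler_numba)
        return super().__new__(KullbackLeibler_numpy)
```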

In GradientOperator we do a similar thing, but we store an instance of the operator we use (either from the C CIL library or from NumPy) and then delegate all the calls to it, for instance direct:

https://github.com/TomographicImaging/CIL/blob/d198bef918a44f88870899dcb90bfd1c0c235040/Wrappers/Python/cil/optimisation/operators/GradientOperator.py#L111-L114

https://github.com/TomographicImaging/CIL/blob/d198bef918a44f88870899dcb90bfd1c0c235040/Wrappers/Python/cil/optimisation/operators/GradientOperator.py#L135

In the case of the GradientOperator the documentation looks good.
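
The shape of that approach is roughly the following (a sketch with illustrative names and signatures, not the actual CIL implementation):

```python
class _GradientBackendNumpy:
    """Backend implementation (e.g. NumPy or the C CIL library)."""

    def direct(self, x, out=None):
        ...  # backend-specific forward operation


class GradientOperator:
    """The class users import; its own docstrings are what Sphinx renders."""

    def __init__(self, domain_geometry, backend="numpy", **kwargs):
        # Store the chosen backend instead of returning it from __new__.
        self.operator = _GradientBackendNumpy()

    def direct(self, x, out=None):
        """Apply the forward operator (this docstring appears in the docs)."""
        return self.operator.direct(x, out=out)
```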

In #1618 I am facing the same issue with the WeightedL1Norm.

Can Sphinx be made to pick up the documentation of the correct class? If not, how do we get the documentation to appear correctly?

Currently #1613 is also due to this issue.

lauramurgatroyd commented 7 months ago

Please can you explain what is wrong with the current rendered documentation for KullbackLeibler?

paskino commented 7 months ago

It's missing documentation for the methods gradient, __call__, proximal, proximal_conjugate and convex_conjugate.

MargaretDuff commented 7 months ago

I had a bit of a play and got it to show: [screenshots of the rendered documentation]. This is not a fix that handles the factory methods automatically, but it is one way of explaining it in the documentation.

paskino commented 6 months ago

Currently the documentation shows the docstrings from the parent class Function.

The solution is to use the same approach as in GradientOperator: store the backend implementation on the instance and delegate calls to it, so the public class's own docstrings are what autodoc renders.
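
One possible shape of that refactor, sketched with illustrative names and signatures (untested, not the actual CIL API):

```python
class Function:
    """Stand-in for cil.optimisation.functions.Function."""


class KullbackLeibler_numpy(Function):
    def __init__(self, b=None, **kwargs):
        self.b = b

    def __call__(self, x):
        ...  # NumPy implementation

    def gradient(self, x, out=None):
        ...  # NumPy implementation


class KullbackLeibler(Function):
    """Kullback-Leibler divergence; the class users import and Sphinx documents."""

    def __init__(self, b=None, backend="numpy", **kwargs):
        # Choose and store the backend once, as GradientOperator does,
        # instead of returning a different class from __new__.
        self._backend = KullbackLeibler_numpy(b=b, **kwargs)

    def __call__(self, x):
        """Evaluate the divergence at x (this docstring appears in the docs)."""
        return self._backend(x)

    def gradient(self, x, out=None):
        """Gradient of the divergence (this docstring appears in the docs)."""
        return self._backend.gradient(x, out=out)
```

With this layout autodoc documents KullbackLeibler directly, and the backend classes stay private implementation details.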