Closed: BatyLeo closed this pull request 1 year ago.
Patch coverage: 94.91%, and project coverage change: +1.13% :tada:

Comparison is base (`f9d8dab`) 85.34% compared to head (`aa22430`) 86.48%. Report is 2 commits behind head on main.
Current state of this pull request:

- Seems to work, both for learning by imitation (Fenchel-Young loss) and learning by experience (`PushForward`).
- Works, but should only really work with $\alpha = 2$. Indeed, the gradient and loss values are wrong in theory when $\alpha \neq 2$. It could easily be fixed for $\alpha \neq 1$, but we would need to be able to compute $\arg\max_y \theta^\top g(y)$ for $\alpha = 1$.
Now that #80 and #65 are merged, I think this PR is finally ready to be merged. I just need some feedback about the following:

- `GeneralizedMaximizer`: do you have any better suggestion for the name?
- `GeneralizedMaximizer` is only compatible with `PerturbedAdditive`, `PerturbedMultiplicative`, `SPOPlusLoss`, and `FenchelYoungLoss`. Do you think we should also add (if possible) support for `Regularized`/`SSVM`/`Interpolation`? I would say no, I think it's enough for what we need this feature for.
> I think support for Regularized would be nice. Does it also work with the latest Perturbed revamp?
The issue with `Regularized` is that I'm not sure if it's really compatible with `RegularizedFrankWolfe`: since $f(\theta) = \arg\max_y \theta^\top g(y) + h(y)$ is not a linear solver, we cannot use it in the Frank-Wolfe algorithm. I could probably make it compatible with `AbstractRegularized` though, even if the Frank-Wolfe forward pass won't work as expected.

It works with `PerturbedAdditive` and `PerturbedMultiplicative`, and is really useful when combined with a `FenchelYoungLoss`. It probably also works with a `PerturbedOracle`, but it's useless since we can directly give any oracle as input.

> I could probably make it compatible with AbstractRegularized tho, even if the Frank Wolfe forward pass won't work as expected.
Indeed FW will fail, but the whole idea of `AbstractRegularized` was to allow for custom solvers, so I think it's worth the adaptation.
This would need a way of getting the maximizer of an `AbstractRegularized`. Should we add a method to the `AbstractRegularized` interface? It would only be used when using `GeneralizedMaximizer`, that's a bit weird.
> It would only be used when using GeneralizedMaximizer, that's a bit weird
Is it a method that's easy to code?
Yes, it would only need the following interface:

```julia
function get_maximizer end
```

We just need to specify that it needs to be implemented only for `AbstractRegularized` types supporting `GeneralizedMaximizer`.
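To make the discussion concrete, here is a minimal self-contained sketch of the interface-stub pattern being proposed. The `MyRegularized` type and `toy_maximizer` below are hypothetical illustrations, not part of InferOpt.jl:

```julia
# Sketch of the proposed pattern: an empty generic function acts as an
# interface stub, and concrete subtypes opt in by defining a method for
# their own type.
abstract type AbstractRegularized end

"Interface stub: return the inner maximizer of a regularized layer."
function get_maximizer end

# Hypothetical concrete type wrapping a maximizer function.
struct MyRegularized{M} <: AbstractRegularized
    maximizer::M
end

get_maximizer(regularized::MyRegularized) = regularized.maximizer

# Toy maximizer over {0,1}^d: select each component where θ is nonnegative.
toy_maximizer(θ) = θ .>= 0

layer = MyRegularized(toy_maximizer)
y = get_maximizer(layer)([1.0, -2.0, 3.0])  # BitVector [1, 0, 1]
```

Calling `get_maximizer` on a subtype that has not defined a method simply raises a `MethodError`, which is exactly the kind of missing-method situation an interface-checking package like RequiredInterfaces.jl can test for.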
Your call. Btw you can also use RequiredInterfaces.jl to test the interfaces, it's really easy and convenient
I added a new abstract type `AbstractRegularizedGeneralizedMaximizer <: AbstractRegularized` with the additional needed interface, using RequiredInterfaces.jl. Now `GeneralizedMaximizer` can be regularized!
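For readers following along, here is a minimal sketch of what such a type hierarchy could look like. Everything besides the two abstract type names is hypothetical, and the actual RequiredInterfaces.jl declaration is omitted:

```julia
# Sketch of the hierarchy: only the narrower abstract type is expected to
# implement get_maximizer, so plain regularized layers are unaffected.
abstract type AbstractRegularized end
abstract type AbstractRegularizedGeneralizedMaximizer <: AbstractRegularized end

# Interface stub, only required for AbstractRegularizedGeneralizedMaximizer.
function get_maximizer end

# Hypothetical concrete subtype carrying a generalized maximizer.
struct MyGeneralizedRegularized{M} <: AbstractRegularizedGeneralizedMaximizer
    maximizer::M
end

get_maximizer(r::MyGeneralizedRegularized) = r.maximizer

# The subtype relation still holds, so any code written against
# AbstractRegularized keeps working on the new concrete type.
MyGeneralizedRegularized <: AbstractRegularized  # true
```

The design choice here is that the extra interface requirement lives on the intermediate abstract type rather than on `AbstractRegularized` itself, so existing regularized layers don't suddenly have an unimplemented method.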
I see no issue with this PR. Nice job!
See #25.

- `PerturbedAdditive`
- `PerturbedMultiplicative`
- `SPOPlusLoss`
- `RegularizedGeneric`
- `StructuredSVM`
- ~~`Interpolation`~~