google-deepmind / optax

Optax is a gradient processing and optimization library for JAX.
https://optax.readthedocs.io
Apache License 2.0

Add improved version of Hungarian algorithm. #1140

Status: Open · carlosgmartin opened this pull request 3 days ago

carlosgmartin commented 3 days ago

Follow-up to https://github.com/google-deepmind/optax/pull/1083#issuecomment-2487415673.
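For context, a minimal usage sketch of the assignment API introduced in #1083 that this PR improves on. It assumes `optax.assignment.hungarian_algorithm` takes a cost matrix and, like `scipy.optimize.linear_sum_assignment`, returns row and column indices of a minimum-cost matching; the exact name and return convention should be checked against the PR.

```python
import jax.numpy as jnp
import optax

# Small assignment problem: cost[i, j] is the cost of assigning worker i to task j.
cost = jnp.array([
    [8.0, 4.0, 7.0],
    [5.0, 2.0, 3.0],
    [9.0, 6.0, 7.0],
])

# Assumed return convention: row indices i and column indices j of an
# optimal (minimum-cost) one-to-one assignment.
i, j = optax.assignment.hungarian_algorithm(cost)
print("pairs:", list(zip(i.tolist(), j.tolist())))
print("total cost:", cost[i, j].sum())
```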

carlosgmartin commented 2 days ago

@vroulet Seems to me it’s cleaner to keep each implementation in a separate file, since

  1. It keeps the implementations less entangled and keeps their respective helper functions separate, avoiding potential confusion.
  2. It keeps the modules shorter/less crowded, especially if more implementations are added in the future.

WDYT?

vroulet commented 1 day ago

> It keeps the implementations less entangled and keeps their respective helper functions separate, avoiding potential confusion.

In any case, we should explain in the docstring why they differ.

> It keeps the modules shorter/less crowded, especially if more implementations are added in the future.

I would actually agree with you, but the alias file in optax typically carries an ever-growing list of optimizers, so let's keep the same logic here. (That is actually my main argument in favor of keeping them in a single file: it may be better for the package to follow a similar philosophy across folders and files.)
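For concreteness, a minimal sketch of what keeping both solvers in one module with cross-referencing docstrings could look like. The function names (`base_hungarian_algorithm`, `hungarian_algorithm`) and wording are illustrative, not the API settled in this PR.

```python
# assignment.py (sketch): both implementations live in one module, and each
# docstring says how it differs from the other. Names are illustrative only.

def base_hungarian_algorithm(costs):
  """Solves the linear assignment problem (original implementation).

  See also `hungarian_algorithm`; the docstring should state why the two
  implementations differ and when to prefer each.
  """
  ...


def hungarian_algorithm(costs):
  """Solves the linear assignment problem (improved implementation).

  See also `base_hungarian_algorithm`; document the difference here,
  as discussed in this thread.
  """
  ...
```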

carlosgmartin commented 7 hours ago

@vroulet Done.