PetrochukM / HParams

Configure Python functions explicitly and safely
MIT License

Ideas #8

Open PetrochukM opened 3 years ago

PetrochukM commented 3 years ago

In order to easily create a new process, the configuration should be exported from the old process and imported into the new process.
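
A sketch of what that round trip could look like, assuming hypothetical export_config / import_config helpers that serialize the configuration to a picklable dict:

import multiprocessing

def worker(exported):
    import_config(exported)  # hypothetical: rebuild the configuration in the child
    ...                      # the child now resolves the same configured defaults

if __name__ == "__main__":
    exported = export_config()  # hypothetical: dump the parent's configuration
    child = multiprocessing.Process(target=worker, args=(exported,))
    child.start()
    child.join()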

PetrochukM commented 3 years ago

Also, it'd be helpful to export the kwargs without get_configured_partial. Additionally, we could rename get_configured_partial to get_partial; a partial is, by definition, "configured".
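
Roughly, assuming a hypothetical get_config_kwargs helper alongside the rename:

import torch

partial = get_partial(torch.nn.LayerNorm)       # proposed rename of get_configured_partial
kwargs = get_config_kwargs(torch.nn.LayerNorm)  # hypothetical: just the configured kwargs
layer = torch.nn.LayerNorm(128, **kwargs)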

PetrochukM commented 3 years ago

The logging can be simplified with something like this... it'll ensure we don't log the same error multiple times.

import functools

@functools.lru_cache(maxsize=None)
def call_once(callable_, *args, **kwargs):
    """Call `callable_` only once with `args` and `kwargs` within the same process."""
    return callable_(*args, **kwargs)
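
For example, a repeated warning would then only be emitted once per process:

import warnings

call_once(warnings.warn, "@configurable: `suffix` was overwritten.")  # warns
call_once(warnings.warn, "@configurable: `suffix` was overwritten.")  # cache hit, silent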

PetrochukM commented 3 years ago

If a partial is exported, it shouldn't trigger overwrite warnings.

PetrochukM commented 3 years ago

We could name and talk about this package around the concept of dependency injection.

PetrochukM commented 3 years ago

For some reason, the configuration accepts a class as a key. The config object should only accept methods or functions.
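
A minimal sketch of the check, assuming keys are validated on the way into the config:

import inspect

def check_key(key):
    """Reject classes and anything else that isn't a function or method."""
    if not (inspect.isfunction(key) or inspect.ismethod(key)):
        raise TypeError(f"Config keys must be functions or methods, got {key!r}.")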

PetrochukM commented 3 years ago

Add a context manager for setting the configuration temporarily. This is especially useful for switching contexts, for example, evaluating vs. training.
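
A minimal sketch, assuming hypothetical get_config / set_config helpers that read and replace the global configuration:

import contextlib

@contextlib.contextmanager
def config_context(overrides):
    """Temporarily layer `overrides` on top of the current configuration."""
    original = get_config()                # hypothetical: snapshot the global config
    set_config({**original, **overrides})  # hypothetical: swap in the temporary config
    try:
        yield
    finally:
        set_config(original)               # always restore, even on error

So, for example, with config_context(eval_overrides): evaluate(model) would run evaluation without disturbing the training configuration.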

PetrochukM commented 3 years ago

Fix double configuration. When a function is configured twice, it causes all sorts of errors.
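
One simple guard is to mark wrapped functions and return them unchanged on a second pass; a simplified sketch, not the library's actual decorator:

import functools

def configurable(func):
    if getattr(func, "_is_configured", False):
        return func  # already wrapped; don't wrap a second time
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    wrapper._is_configured = True
    return wrapper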

PetrochukM commented 3 years ago

Allow setting the configuration later, so that you can do something like:

configurable(torch.optim.Adam.__init__)(params)

This prevents us from needing to set a global parameter.

PetrochukM commented 3 years ago

For some reason, after transferring the configuration to another process, torch.nn.LayerNorm.__init__ wasn't configured, and there was no warning.

Part of the reason was that, after transferring the configuration, LayerNorm was no longer configurable.

It's likely because _code_to_function requires configurable to be executed.

PetrochukM commented 3 years ago

Ensure all warnings and checks are configurable.

PetrochukM commented 3 years ago

Support generators.
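
One interpretation, assuming the problem is that a plain wrapper hides the fact that the wrapped function is a generator:

import functools
import inspect

def wrap(func):
    if inspect.isgeneratorfunction(func):
        @functools.wraps(func)
        def gen_wrapper(*args, **kwargs):
            # `yield from` keeps laziness and makes the wrapper itself a generator function
            yield from func(*args, **kwargs)
        return gen_wrapper
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper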

PetrochukM commented 3 years ago

Don't print warnings if the args are the same: UserWarning: @configurable: Overwriting configured argument `suffix=.wav` in module `run.data._loader.utils.is_normalized_audio_file` with `.wav`.
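
A sketch of the fix, assuming the config stores the already-configured kwargs per function:

import warnings

def set_argument(config, func, name, value):
    """Warn only when an existing, different value is actually overwritten."""
    existing = config.setdefault(func, {})
    if name in existing and existing[name] != value:
        warnings.warn(
            f"@configurable: Overwriting configured argument `{name}={existing[name]}` "
            f"in module `{func.__module__}.{func.__qualname__}` with `{value}`."
        )
    existing[name] = value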

PetrochukM commented 3 years ago

Remove setprofile, since it can be overwritten.
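
For context, sys.setprofile installs a single process-wide hook, so any later caller (a profiler, debugger, or coverage tool) silently replaces it:

import sys

def hook(frame, event, arg):
    ...  # the library's tracing logic would live here

sys.setprofile(hook)  # install the hook
sys.setprofile(None)  # any other code can remove or replace it at any time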

PetrochukM commented 3 years ago

We could introduce this package as: "Why use YAML for config files? Just use Python."

PetrochukM commented 3 years ago

Clear out skipped configs when clearing out the config.