FynnBe opened 5 years ago
export of default ModelZooEntry:
:crystal_ball:
```python
entry = ModelZooEntry(map_to=MapTo.STRING)
entry.export(Path("string.yaml"))
```
```yaml
log_config:
  model_n_saved: 2
  model_save_interval: 2
loss_config:
  loss_class: MSELoss
  loss_kwargs: {}
model_config:
  model_class: null
  model_kwargs: {}
optimizer_config:
  optimizer_class: Adam
  optimizer_kwargs: {}
```
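A string-mapped export like the one above has to replace class references with plain names before dumping, so that `yaml.safe_dump` can serialize the result. A minimal sketch of that idea (the helper `to_string_mapping` is hypothetical, not protozoo's actual implementation):

```python
def to_string_mapping(config: dict) -> dict:
    """Replace class references with their class names, recursing into
    nested dicts, so the result contains only safe_dump-able values.
    Hypothetical sketch; not protozoo's actual code."""
    out = {}
    for key, value in config.items():
        if isinstance(value, dict):
            out[key] = to_string_mapping(value)
        elif isinstance(value, type):  # a class reference, e.g. torch.nn.MSELoss
            out[key] = value.__name__
        else:
            out[key] = value
    return out


class MSELoss:  # stand-in for torch.nn.MSELoss in this sketch
    pass


cfg = {"loss_config": {"loss_class": MSELoss, "loss_kwargs": {}}}
print(to_string_mapping(cfg))
```

The resulting dict contains only strings and plain containers, which is exactly what distinguishes the `MapTo.STRING` output from the full PyTorch dump below.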
:mortar_board:
```python
entry.export(Path("pytorch.yaml"), map_to=MapTo.PYTORCH, export_hidden=True, safe_dump=False)
```
```yaml
evaluator_callbacks: []
log_config:
  checkpointer: !!python/object:ignite.handlers.checkpoint.ModelCheckpoint
    _atomic: true
    _dirname: C:/Users/fbeut/protozoo/checkpoints
    _fname_prefix: protozoo
    _iteration: 0
    _n_saved: 2
    _save_as_state_dict: true
    _save_interval: 2
    _saved: []
    _score_function: null
    _score_name: null
    dir: !!python/object/apply:pathlib.WindowsPath
    - C:\
    - Users
    - fbeut
    - protozoo
  model_n_saved: 2
  model_save_interval: 2
loss_config:
  loss_class: !!python/name:torch.nn.modules.loss.MSELoss ''
  loss_kwargs: {}
model_config:
  create_loss_input: !!python/name:protozoo.pytorch_config.default_create_loss_input ''
  create_model_input: !!python/name:protozoo.pytorch_config.default_create_model_input ''
  model_class: null
  model_kwargs: {}
  pretrained_source: null
optimizer_config:
  optimizer_class: !!python/name:torch.optim.adam.Adam ''
  optimizer_kwargs: {}
predictor_callbacks: []
trainer_callbacks: []
```
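The PyTorch dump keeps underscore-prefixed internals (`_atomic`, `_n_saved`, ...) because of `export_hidden=True`. A minimal sketch of what filtering those "hidden" fields could look like when `export_hidden=False` (the helper `strip_hidden` is illustrative, not protozoo's code):

```python
def strip_hidden(config: dict) -> dict:
    """Drop underscore-prefixed ("hidden") keys, recursing into nested
    dicts. Hypothetical sketch of export_hidden=False behavior."""
    return {
        key: strip_hidden(value) if isinstance(value, dict) else value
        for key, value in config.items()
        if not key.startswith("_")
    }


# Shape loosely mirrors the checkpointer entry in the dump above.
state = {
    "checkpointer": {"_atomic": True, "_n_saved": 2, "dir": "checkpoints"},
    "model_n_saved": 2,
}
print(strip_hidden(state))  # → {'checkpointer': {'dir': 'checkpoints'}, 'model_n_saved': 2}
```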
OK, so these "draft" implementations somehow went way overboard... Maybe we'll get back to the details once the higher-level questions are answered and those answers are still compatible with what I sketched out here.
the current string output:
```yaml
log_config:
  model_n_saved: 2
  model_save_interval: 2
loss_config:
  loss_class: MSELoss
  loss_kwargs: {}
model_config:
  model_class: NOT SPECIFIED
  model_kwargs: {}
optimizer_config:
  optimizer_class: Adam
  optimizer_kwargs: {}
origin: file:///C:/repos/protozoo/protozoo/entry.py
```
Base the ModelZooEntry and its nested configuration elements on a BaseConfig class.
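Such a shared BaseConfig could be sketched as nested dataclasses whose fields mirror the YAML above. This is only an illustration of the idea, with assumed names and defaults, not protozoo's actual class hierarchy:

```python
from dataclasses import asdict, dataclass, field


@dataclass
class BaseConfig:
    """Hypothetical common base: shared export behavior for all configs."""

    def as_dict(self) -> dict:
        # asdict() recurses into nested dataclass fields.
        return asdict(self)


@dataclass
class OptimizerConfig(BaseConfig):
    optimizer_class: str = "Adam"                      # mirrors the YAML default
    optimizer_kwargs: dict = field(default_factory=dict)


@dataclass
class ModelZooEntry(BaseConfig):
    optimizer_config: OptimizerConfig = field(default_factory=OptimizerConfig)


entry = ModelZooEntry()
print(entry.as_dict())  # → {'optimizer_config': {'optimizer_class': 'Adam', 'optimizer_kwargs': {}}}
```

Nesting dataclasses this way would let every config element inherit a single export/serialization path from BaseConfig rather than duplicating it per config class.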
Features:
This PR incorporates #1, #2, #3, and #5.