pyiron / pyiron_workflow

Graph-and-node based workflows
BSD 3-Clause "New" or "Revised" License

dynamic classes silently serializing wrong #196

Closed: liamhuber closed this issue 9 months ago

liamhuber commented 9 months ago

Discussed in detail here: when you dynamically create a node class with a wrapper (but not using it as a decorator), re-importing it on load gives you back the underlying function object instead of a node object. The rest of the state seems to come along for the ride, so when we do new_node_instance.__setstate__(loaded_instance.__getstate__()) during loading it can look like things went right, but you're actually getting the wrong object back, and it hurts when you try to (de)serialize a composite.

MWE:

import h5io

from pyiron_workflow import Workflow

def some_function(x):
    return x + 1

# Dynamically create the node class by calling the wrapper directly,
# i.e. not using it as a decorator
MyNodeClass = Workflow.wrap_as.single_value_node("out")(some_function)

node = MyNodeClass(label="save_it", x=1, run_after_init=True, overwrite_save=True)
node.save()

# Read the raw storage back with h5io to inspect what actually got serialized
hdf_load = h5io.read_hdf5(
    fname=node.storage._h5io_storage_file_path,
    title=node.label
)

hdf_load, hdf_load.__getstate__()

The state looks reasonable, but the reloaded object is the function, not the node:

(<function __main__.some_function(x)>,
 {'_inputs': <pyiron_workflow.io.Inputs at 0x134833a90>,
  '_label': 'save_it',
  '_output_labels': ['out'],
  '_outputs': <pyiron_workflow.io.Outputs at 0x134873510>,
  '_parent': None,
  '_type_hints': {},
  '_working_directory': DirectoryObject(directory='save_it')
  {'dir': [], 'file': ['save_it/h5io.h5'], 'mount': [], 'symlink': [], 'block_device': [], 'char_device': [], 'fifo': [], 'socket': []},
  'executor': None,
  'failed': 0,
  'future': None,
  'running': 0,
  'save_after_run': 0,
  'signals': <pyiron_workflow.io.Signals at 0x134872cd0>})
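
A quick sanity check makes the mismatch explicit. The sketch below reuses the names from the MWE above (hdf_load, MyNodeClass, some_function); the exact checks are illustrative assumptions on my part, not part of the original report:

# The reload resolved to the plain function rather than the node class
isinstance(hdf_load, MyNodeClass)  # False -- the wrong type came back
hdf_load                           # <function __main__.some_function(x)>

# The state transfer described above still "succeeds" without raising, which is
# why the wrong object can slip through until a composite tries to (de)serialize
fresh_node = MyNodeClass(label="fresh")
fresh_node.__setstate__(hdf_load.__getstate__())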
liamhuber commented 9 months ago

Autoclose didn't work for some reason, but both backends now fail nicely and early if you try to do this.
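
For contrast, the decorator route mentioned in the description binds the generated class to a module-level name, which is presumably what lets name-based re-import find the class again. A minimal sketch, assuming the same API as the MWE; whether save() round-trips cleanly here is an assumption, not something verified in this issue:

from pyiron_workflow import Workflow

# Applying the wrapper as a decorator rebinds the module-level name
# "some_function" to the generated node class itself
@Workflow.wrap_as.single_value_node("out")
def some_function(x):
    return x + 1

node = some_function(label="save_it", x=1, run_after_init=True)
node.save()  # assumed to reload as a node, since the class is importable by name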