Meta-optimization / L2L

Learning to Learn: Gradient-free Optimization framework
https://meta-optimization.github.io/L2L/
GNU General Public License v3.0

[Bug?] KeyError: 'paths_obj' in JUBE_runner.py #57

Closed: brenthuisman closed this issue 3 years ago

brenthuisman commented 4 years ago

We're running into an issue with L2L master (as of time of posting). We've written an optimizee, but an error is generated when we try to run the optimization:

Traceback (most recent call last):
  File "model.py", line 180, in <module>
    main()
  File "model.py", line 173, in main
    env.run(optimizee.simulate)
  File "/home/brent/.local/lib/python3.8/site-packages/Learning_to_Learn-1.0.0b0-py3.8.egg/l2l/utils/environment.py", line 45, in run
  File "/home/brent/.local/lib/python3.8/site-packages/Learning_to_Learn-1.0.0b0-py3.8.egg/l2l/utils/JUBE_runner.py", line 53, in __init__
KeyError: 'paths_obj'

Not 100% sure if this is the source of the error, but we set the paths as follows:

    root_dir_path = "."
    paths = Paths(name, dict(run_no='test'), root_dir_path=root_dir_path)
    traj.f_add_parameter_to_group("JUBE_params", "paths", paths)

This is basically copied from one of the examples. Any idea what's going wrong? There is an optimizee.bin created in the path you might expect, so it seems the paths object is successfully used somewhere.

Our code: https://github.com/brenthuisman/l2l-arbor We use L2L @ master and https://github.com/Meta-optimization/JUBE @ master.

alperyeg commented 4 years ago

Unfortunately this is a bug: you are using the old format in the run script (with the path.conf). There are now two ways to deal with it.

  1. (recommended) Change the run script in the bin folder to follow the latest template. To get an idea, have a look at my PR https://github.com/Meta-optimization/L2L/pull/56, which should be merged soon.
  2. As a quick monkey patch, change the dict key paths_obj to paths (see the sketch below).
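
On the run-script side, the equivalent workaround (untested sketch; the key name is taken from the traceback above) would be to register the Paths object under the key the runner actually looks up, instead of editing the installed JUBE_runner.py:

traj.f_add_parameter_to_group("JUBE_params", "paths_obj", paths)
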
alperyeg commented 4 years ago

The first solution would look like this more or less:

# import 
from l2l.utils.experiment import Experiment
# ... in def main(): 
fit, swc, ref = 'fit.json', 'cell.swc', 'nrn.csv'
name = 'ARBOR-FUN'
results_folder = '../results'
trajectory_name = 'ARBOR'
experiment = Experiment(results_folder)
traj, _ = experiment.prepare_experiment(trajectory_name=trajectory_name, 
                                        name=name)
optimizee = ArbSCOptimizee(traj, fit, swc, ref)
parameters = GeneticAlgorithmParameters(seed=0, popsize=50, CXPB=0.5, MUTPB=0.3,
                                        NGEN=100, indpb=0.02, tournsize=15,
                                        matepar=0.5, mutpar=1)
optimizer = GeneticAlgorithmOptimizer(traj,
                                      optimizee_create_individual=optimizee.create_individual,
                                      optimizee_fitness_weights=(-0.1,),
                                      parameters=parameters)
experiment.run_experiment(optimizee=optimizee, optimizer=optimizer,
                          optimizer_parameters=parameters)
experiment.end_experiment(optimizer)
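
In addition to the Experiment import above, the snippet assumes optimizer imports roughly along these lines (module path as it is usually used in the L2L examples; double-check against your checkout):

from l2l.optimizers.evolution import GeneticAlgorithmOptimizer, GeneticAlgorithmParameters
# plus your own optimizee class, e.g. (hypothetical module path):
# from l2l_arbor.optimizee import ArbSCOptimizee
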
thorstenhater commented 4 years ago

Hi, the faulty code was more or less copied verbatim from the bin directory of L2L. Also, the docs present this as the standard way of setting things up. A note about the deprecation would be helpful.

Best regards, T

thorstenhater commented 4 years ago

This gets me one step further:


Created a folder at /Users/hater/src/results
All output logs can be found in directory  /Users/hater/src/results/ARBOR-FUN/logs
Traceback (most recent call last):
  File "/Users/hater/src/L2L/../l2l-arbor/model.py", line 146, in <module>
    main()
  File "/Users/hater/src/L2L/../l2l-arbor/model.py", line 133, in main
    traj, _    = experiment.prepare_experiment(trajectory_name=trajectory_name, name=name)
  File "/Users/hater/src/L2L/.direnv/python-venv-3.9.0/lib/python3.9/site-packages/l2l/utils/experiment.py", line 139, in prepare_experiment
    if k not in kwargs.get('jube_parameter').keys():
AttributeError: 'NoneType' object has no attribute 'keys'
alperyeg commented 4 years ago

The error should be fixed by the PR I made as well. For the moment, if you do not have any JUBE parameters set, can you try:

experiment.prepare_experiment(trajectory_name=trajectory_name, name=name, jube_parameter={})
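
If you do have JUBE/scheduler settings to pass, the same argument takes them as a dict; for example (untested, key names as they appear in the "JUBE parameters used" log further down):

experiment.prepare_experiment(trajectory_name=trajectory_name, name=name,
                              jube_parameter={"submit_cmd": "sbatch",
                                              "nodes": "1",
                                              "walltime": "01:00:00",
                                              "ppn": "1"})
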

alperyeg commented 4 years ago

Yes, we need to update the docs as well.

thorstenhater commented 4 years ago

Yes, I figured that out. Now:


All output logs can be found in directory  /Users/hater/src/results/ARBOR-FUN/logs
JUBE parameters used: {'submit_cmd': 'sbatch', 'job_file': 'job.run', 'nodes': '1', 'walltime': '01:00:00', 'ppn': '1', 'cpu_pp': '1', 'threads_pp': '4', 'mail_mode': 'ALL', 'err_file': 'stderr', 'out_file': 'stdout', 'tasks_per_job': '1', 'exec': 'python /Users/hater/src/results/ARBOR-FUN/simulation/run_files/run_optimizee.py', 'ready_file': '/Users/hater/src/results/ARBOR-FUN/ready_files/ready_w_', 'work_path': '/Users/hater/src/results/ARBOR-FUN', 'paths_obj': <l2l.paths.Paths object at 0x1071a3a30>}
/Users/hater/src/L2L/.direnv/python-venv-3.9.0/lib/python3.9/site-packages/deap/creator.py:138: RuntimeWarning: A class named 'FitnessMax' has already been created and it will be overwritten. Consider deleting previous creation of that class or rename it.
  warnings.warn("A class named '{0}' has already been created and it "
/Users/hater/src/L2L/.direnv/python-venv-3.9.0/lib/python3.9/site-packages/deap/creator.py:138: RuntimeWarning: A class named 'Individual' has already been created and it will be overwritten. Consider deleting previous creation of that class or rename it.
  warnings.warn("A class named '{0}' has already been created and it "
Traceback (most recent call last):
  File "/Users/hater/src/L2L/../l2l-arbor/model.py", line 145, in <module>
    main()
  File "/Users/hater/src/L2L/../l2l-arbor/model.py", line 141, in main
    experiment.run_experiment(optimizee=optimizee, optimizer=optimizer, optimizer_parameters=parameters)
TypeError: run_experiment() missing 1 required positional argument: 'optimizee_parameters'
alperyeg commented 4 years ago

Will be fixed as well. For now, put optimizee_parameters=None in run_experiment.
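
i.e. roughly:

experiment.run_experiment(optimizee=optimizee, optimizer=optimizer,
                          optimizer_parameters=parameters,
                          optimizee_parameters=None)
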

thorstenhater commented 4 years ago

Again throwing in a dummy:


JUBE parameters used: {'submit_cmd': 'sbatch', 'job_file': 'job.run', 'nodes': '1', 'walltime': '01:00:00', 'ppn': '1', 'cpu_pp': '1', 'threads_pp': '4', 'mail_mode': 'ALL', 'err_file': 'stderr', 'out_file': 'stdout', 'tasks_per_job': '1', 'exec': 'python /Users/hater/src/results/ARBOR-FUN/simulation/run_files/run_optimizee.py', 'ready_file': '/Users/hater/src/results/ARBOR-FUN/ready_files/ready_w_', 'work_path': '/Users/hater/src/results/ARBOR-FUN', 'paths_obj': <l2l.paths.Paths object at 0x105beaa30>}
/Users/hater/src/L2L/.direnv/python-venv-3.9.0/lib/python3.9/site-packages/deap/creator.py:138: RuntimeWarning: A class named 'FitnessMax' has already been created and it will be overwritten. Consider deleting previous creation of that class or rename it.
  warnings.warn("A class named '{0}' has already been created and it "
/Users/hater/src/L2L/.direnv/python-venv-3.9.0/lib/python3.9/site-packages/deap/creator.py:138: RuntimeWarning: A class named 'Individual' has already been created and it will be overwritten. Consider deleting previous creation of that class or rename it.
  warnings.warn("A class named '{0}' has already been created and it "
MainProcess bin.l2l ash 65133 INFO    : Optimizee parameters: {}
MainProcess bin.l2l ash 65133 INFO    : Optimizer parameters: GeneticAlgorithmParameters(seed=0, popsize=50, CXPB=0.5, MUTPB=0.3, NGEN=100, indpb=0.02, tournsize=15, matepar=0.5, mutpar=1)
Traceback (most recent call last):
  File "/Users/hater/src/L2L/../l2l-arbor/model.py", line 145, in <module>
    main()
  File "/Users/hater/src/L2L/../l2l-arbor/model.py", line 141, in main
    experiment.run_experiment(optimizee=optimizee, optimizer=optimizer, optimizer_parameters=parameters, optimizee_parameters={})
  File "/Users/hater/src/L2L/.direnv/python-venv-3.9.0/lib/python3.9/site-packages/l2l/utils/experiment.py", line 163, in run_experiment
    jube.prepare_optimizee(optimizee, self.paths.simulation_path)
  File "/Users/hater/src/L2L/.direnv/python-venv-3.9.0/lib/python3.9/site-packages/l2l/utils/JUBE_runner.py", line 288, in prepare_optimizee
    pickle.dump(optimizee, f)
TypeError: cannot pickle 'arbor._arbor.morphology' object
alperyeg commented 4 years ago

I think this is also not documented very well. The optimizee is meant to be a stateless object, so this is a more complex problem: the optimizee gets pickled, and anything you create beforehand and then use in simulate() either cannot be pickled (as here) or is not available after unpickling.
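
A minimal sketch of what that means in practice (class and method layout taken from the snippets above, the rest is illustrative, not your actual code): keep only plain, picklable data in __init__ and build the arbor objects inside simulate():

from l2l.optimizees.optimizee import Optimizee

class ArbSCOptimizee(Optimizee):
    def __init__(self, traj, fit, swc, ref):
        super().__init__(traj)
        # store only picklable state: plain file paths / strings
        self.fit, self.swc, self.ref = fit, swc, ref

    def simulate(self, traj):
        # build the un-picklable arbor objects here, after unpickling,
        # so they never have to go through pickle.dump
        import arbor
        morph = arbor.load_swc_arbor(self.swc)  # or whichever loader you use
        # ... run the simulation, compare against self.ref, compute a fitness ...
        fitness = 0.0  # placeholder
        return (fitness,)
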

alperyeg commented 4 years ago

Another option would be to make an external call to arbor from the optimizee.
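
For example, roughly along these lines (the helper script name and the JSON result format are made up for illustration):

import json
import subprocess
import sys

def simulate(self, traj):
    # run the arbor part in a separate process; only plain data crosses the boundary
    result = subprocess.run(
        [sys.executable, "run_arbor_sim.py", self.fit, self.swc, self.ref],
        capture_output=True, text=True, check=True)
    fitness = json.loads(result.stdout)["fitness"]
    return (fitness,)
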

thorstenhater commented 4 years ago

I figured out how to do this (by relocating all un-pickleable entities to simulate), but now I have a conflict between Python versions:

  File "/Users/hater/src/results/ARBOR-FUN/simulation/run_files/run_optimizee.py", line 6, in <module>
    trajectory = pickle.load(handle_trajectory)
  File "/usr/local/Cellar/python@2/2.7.17_1/Frameworks/Python.framework/Versions/2.7/lib/python2.7/pickle.py", line 1384, in load
    return Unpickler(file).load()
  File "/usr/local/Cellar/python@2/2.7.17_1/Frameworks/Python.framework/Versions/2.7/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/Cellar/python@2/2.7.17_1/Frameworks/Python.framework/Versions/2.7/lib/python2.7/pickle.py", line 892, in load_proto
    raise ValueError, "unsupported pickle protocol: %d" % proto
ValueError: unsupported pickle protocol: 5

Seemingly the run script does not pick up the calling interpreter (Python 3.9) but uses whatever python happens to refer to on the PATH.

brenthuisman commented 4 years ago

Look in the ArborSCoptimizee file (which I can't see anymore) for a line with traj.f_add_parameter_to_group where python is added. Change it to python3 and see what happens.
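
i.e. something like this (I am guessing at the exact line since I can't see the file anymore; the relevant entry is the "exec" key visible in the "JUBE parameters used" log above, and run_optimizee_script stands for the path to run_optimizee.py):

# before: picks up whatever "python" happens to be on PATH (here Python 2.7)
traj.f_add_parameter_to_group("JUBE_params", "exec",
                              "python " + run_optimizee_script)
# after: pin the interpreter (or build the string from sys.executable)
traj.f_add_parameter_to_group("JUBE_params", "exec",
                              "python3 " + run_optimizee_script)
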

alperyeg commented 3 years ago

I believe the issue is solved, so I am closing it. If there is further need to discuss, please reopen.