maxpumperla / hyperas

Keras + Hyperopt: A very simple wrapper for convenient hyperparameter optimization
http://maxpumperla.com/hyperas/
MIT License

global variable issue #289

Open BITACC opened 2 years ago

BITACC commented 2 years ago

Hello there, I have the following code: a global variable num_gpu that indicates the number of GPUs available. If I call data() or create_model() separately, everything is fine. But when I call optim.minimize and pass the functions, it does not see the global variable and num_gpu becomes undefined. Am I doing something wrong? Thanks in advance. ...

from keras import backend as K
from livelossplot import PlotLossesKeras
from hyperas import optim
from hyperas.distributions import choice, uniform
from hyperopt import Trials, STATUS_OK, tpe

# Setup (multi) GPU usage with scalable VRAM
num_gpu = setup_multi_gpus()

def data():
    print(num_gpu)

    df, train = data_import(possible_columns)
    df = prepare_data(df)

    ...
    return X_train, y_train, X_test, y_test

def create_model(X_train, y_train, X_test, y_test):
    print(num_gpu)

if __name__ == '__main__':
    best_run, best_model = optim.minimize(model=create_model, 
                                          data=data,
                                          algo=tpe.suggest,
                                          max_evals=500,
                                          trials=Trials(),
                                          eval_space=True)

    X_train, y_train, X_test, y_test = data()

    create_model(X_train, y_train, X_test, y_test)
    print(num_gpu)
JonnoFTW commented 2 years ago

Can you just call setup_multi_gpus() inside your create_model function?

You can pass the keep_temp argument to optim.minimize and examine the python file produced to see what variables are being created.
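Roughly like this (untested sketch based on your snippet; the gpu_utils module name is a placeholder for wherever setup_multi_gpus lives, and data() stays as you have it):

from hyperas import optim
from hyperopt import Trials, STATUS_OK, tpe

def create_model(X_train, y_train, X_test, y_test):
    # Re-create num_gpu inside the function so it also exists in the
    # standalone script hyperas generates from the function sources.
    from gpu_utils import setup_multi_gpus  # placeholder module name
    num_gpu = setup_multi_gpus()
    print(num_gpu)
    # ... build, compile and fit the model here ...
    # return {'loss': -acc, 'status': STATUS_OK, 'model': model}

if __name__ == '__main__':
    best_run, best_model = optim.minimize(model=create_model,
                                          data=data,
                                          algo=tpe.suggest,
                                          max_evals=500,
                                          trials=Trials(),
                                          eval_space=True,
                                          keep_temp=True)  # keep the generated file so you can inspect it

If I remember right, optim.minimize also accepts a functions=[setup_multi_gpus] argument that copies the helper's source into the generated script, which would let you drop the in-function import.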

BITACC commented 2 years ago

Yeah, that's what one should do, but the issue of the global variable not being recognized remains.