Closed: find-happiness closed this issue 5 years ago
@find-happiness I need more info from you. Can you share your notebook? What Python version are you using? Are you specifying an encoding anywhere? Definitely some weird encoding issue, but I'd have to take a closer look.
@maxpumperla
Environment: Windows 7
Python: 3.6.2
import keras
import numpy as np
from hyperopt import STATUS_OK

def create_model(x_train, y_train, x_test, y_test):
    '''
    Create the model (创建模型).
    :return: dict with loss, status, and the keras model
    '''
    model = keras.models.Sequential()
    # Three hidden layers; hyperas picks the layer width and dropout rate per trial.
    model.add(keras.layers.Dense({{choice([16, 32, 64, 128])}}, activation='relu',
                                 kernel_regularizer=keras.regularizers.l1_l2(l1=0.001, l2=0.001)))
    model.add(keras.layers.Dropout({{uniform(0.1, 0.8)}}))
    model.add(keras.layers.Dense({{choice([16, 32, 64, 128])}}, activation='relu',
                                 kernel_regularizer=keras.regularizers.l1_l2(l1=0.001, l2=0.001)))
    model.add(keras.layers.Dropout({{uniform(0.1, 0.8)}}))
    model.add(keras.layers.Dense({{choice([16, 32, 64, 128])}}, activation='relu',
                                 kernel_regularizer=keras.regularizers.l1_l2(l1=0.001, l2=0.001)))
    model.add(keras.layers.Dropout({{uniform(0.1, 0.8)}}))
    model.add(keras.layers.Dense(1, activation='sigmoid'))
    model.compile(optimizer=keras.optimizers.RMSprop(),
                  loss=keras.losses.binary_crossentropy,
                  metrics=[keras.metrics.binary_accuracy])
    history = model.fit(x_train, y_train, batch_size=128,
                        validation_data=(x_test, y_test), epochs=20)
    # Maximize validation accuracy, so hand hyperopt its negative as the loss.
    validation_acc = np.amax(history.history['val_binary_accuracy'])
    return {'loss': -validation_acc, 'status': STATUS_OK, 'model': model}
Looks good. Maybe the special characters (创建模型, "create the model") get in the way?
Yes, comments in the program must be in English; Chinese and other languages are not allowed, but I don't know why.
I'm not an encoding guru, but you need to tell your Python environment what encoding you use. Remember, hyperas persists your experiment in a file and then tries to read it back again; it needs to know how. Alright, closing this for now. Feel free to open a "support other languages" ticket or so. Thanks.
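The failure mode is easy to reproduce without hyperas at all. A minimal sketch (the file name and source string are made up) of what happens when a UTF-8 notebook or exported script is read back with the platform-default codec, which is gbk on a Chinese-locale Windows install:

```python
import tempfile
from pathlib import Path

# A source file containing a non-ASCII comment, as in the notebook above.
source = "def create_model():\n    '''Create the model (创建模型)'''\n"

tmp = Path(tempfile.mkdtemp()) / "temp_model.py"
# Notebooks and exported scripts are UTF-8 on disk.
tmp.write_text(source, encoding="utf-8")

# Reading back with an explicit encoding round-trips correctly.
assert tmp.read_text(encoding="utf-8") == source

# Decoding the same bytes as GBK either raises UnicodeDecodeError
# or silently produces mojibake, depending on the exact byte sequence.
raw = tmp.read_bytes()
try:
    garbled = raw.decode("gbk")
    assert garbled != source  # mojibake: the Chinese characters are mangled
except UnicodeDecodeError:
    pass  # the crash reported here: 'gbk' codec can't decode byte ...
```

hyperas opens the notebook with a plain `open(path, 'r')`, which uses the locale-default codec, so the read fails before the decode is ever attempted with the right encoding.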
e:\anaconda3_5_0_0\envs\deeplearning\lib\site-packages\hyperas\optim.py in minimize(model, data, algo, max_evals, trials, functions, rseed, notebook_name, verbose, eval_space, return_space)
     65                           full_model_string=None,
     66                           notebook_name=notebook_name,
---> 67                           verbose=verbose)
     68
     69     best_model = None

e:\anaconda3_5_0_0\envs\deeplearning\lib\site-packages\hyperas\optim.py in base_minimizer(model, data, functions, algo, max_evals, trials, rseed, full_model_string, notebook_name, verbose, stack)
     94         model_str = full_model_string
     95     else:
---> 96         model_str = get_hyperopt_model_string(model, data, functions, notebook_name, verbose, stack)
     97     temp_file = './temp_model.py'
     98     write_temp_files(model_str, temp_file)

e:\anaconda3_5_0_0\envs\deeplearning\lib\site-packages\hyperas\optim.py in get_hyperopt_model_string(model, data, functions, notebook_name, verbose, stack)
    171     notebook_path = os.getcwd() + "/{}.ipynb".format(notebook_name)
    172     with open(notebook_path, 'r') as f:
--> 173         notebook = nbformat.reads(f.read(), nbformat.NO_CONVERT)
    174     exporter = PythonExporter()
    175     source, _ = exporter.from_notebook_node(notebook)

UnicodeDecodeError: 'gbk' codec can't decode byte 0xae in position 48667: illegal multibyte sequence
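Until non-English comments are supported, a quick workaround is to scan the notebook source for non-ASCII characters before hyperas rewrites it, so the offending comments can be found and removed. A small sketch (the helper name `find_non_ascii` is made up):

```python
def find_non_ascii(text):
    """Return (line_number, column, char) for every character outside ASCII."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for col, ch in enumerate(line):
            if ord(ch) > 127:
                hits.append((lineno, col, ch))
    return hits

# Example: locate the Chinese comment in a code cell.
sample = "def create_model():\n    # 创建模型\n    pass\n"
for lineno, col, ch in find_non_ascii(sample):
    print(f"line {lineno}, col {col}: {ch!r}")
```

An empty result means the source is pure ASCII and will survive a gbk round-trip; any hit points at a character that may trip the locale-default decode.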