kotritrona / osumapper

An automatic beatmap generator using Tensorflow / Deep Learning.
Apache License 2.0

Error on step 7 #8

Closed Azn9 closed 4 years ago

Azn9 commented 5 years ago

Hey! I'm getting this error on step 07, in the code right after "Now we can train the model!":

# of groups: 48

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-25-9041bcd63a5f> in <module>
    166             print("{},{},{},2,0,L|{}:{},1,{},0:0:0".format(int(ai[0]), int(ai[1]), int(timestamps[i]), int(round(ai[0] + ai[2] * slider_lengths[i])), int(round(ai[1] + ai[3] * slider_lengths[i])), int(slider_length_base[i] * slider_ticks[i])));
    167 
--> 168 osu_a = generate_map();
    169 # generate_test();

<ipython-input-25-9041bcd63a5f> in generate_map()
    145     print("# of groups: {}".format(timestamps.shape[0] // note_group_size));
    146     for i in range(timestamps.shape[0] // note_group_size):
--> 147         z = generate_set(begin = i * note_group_size, start_pos = pos, length_multiplier = dist_multiplier, group_id = i, plot_map=False) * np.array([512, 384, 1, 1, 512, 384]);
    148         pos = z[-1, 0:2];
    149         o.append(z);

<ipython-input-25-9041bcd63a5f> in generate_set(begin, start_pos, group_id, length_multiplier, plot_map)
     84     c_false_batch = GAN_PARAMS["c_false_batch"];
     85 
---> 86     gmodel = generative_model(g_input_size, note_group_size * 4, loss_function_for_generative_model);
     87 
     88     for i in range(max_epoch):

<ipython-input-25-9041bcd63a5f> in generative_model(in_params, out_params, loss_func)
     30     model.compile(loss=loss_func,
     31                 optimizer=optimizer,
---> 32                 metrics=[keras.metrics.mae])
     33     return model
     34 

c:\users\axel\appdata\local\programs\python\python37\lib\site-packages\tensorflow\python\training\checkpointable\base.py in _method_wrapper(self, *args, **kwargs)
    440     self._setattr_tracking = False  # pylint: disable=protected-access
    441     try:
--> 442       method(self, *args, **kwargs)
    443     finally:
    444       self._setattr_tracking = previous_value  # pylint: disable=protected-access

c:\users\axel\appdata\local\programs\python\python37\lib\site-packages\tensorflow\python\keras\engine\training.py in compile(self, optimizer, loss, metrics, loss_weights, sample_weight_mode, weighted_metrics, target_tensors, distribute, **kwargs)
    447             else:
    448               weighted_loss = training_utils.weighted_masked_objective(loss_fn)
--> 449               output_loss = weighted_loss(y_true, y_pred, sample_weight, mask)
    450 
    451           if len(self.outputs) > 1:

c:\users\axel\appdata\local\programs\python\python37\lib\site-packages\tensorflow\python\keras\engine\training_utils.py in weighted(y_true, y_pred, weights, mask)
    645     """
    646     # score_array has ndim >= 2
--> 647     score_array = fn(y_true, y_pred)
    648     if mask is not None:
    649       mask = math_ops.cast(mask, y_pred.dtype)

<ipython-input-25-9041bcd63a5f> in loss_function_for_generative_model(y_true, y_pred)
     68 
     69     def loss_function_for_generative_model(y_true, y_pred):
---> 70         return construct_map_and_calc_loss(y_pred, extvar);
     71 
     72 #     classifier_true_set_group = special_train_data[np.random.randint(0, special_train_data.shape[0], (500,))];

<ipython-input-24-993c3cadb30e> in construct_map_and_calc_loss(var_tensor, extvar)
    163     # first make a map from the outputs of generator, then ask the classifier (discriminator) to classify it
    164     classifier_model = extvar["classifier_model"]
--> 165     out = construct_map_with_sliders(var_tensor, extvar=extvar);
    166     cm = classifier_model(out);
    167     predmean = 1 - tf.reduce_mean(cm, axis=1);

<ipython-input-24-993c3cadb30e> in construct_map_with_sliders(var_tensor, extvar)
     45         begin_offset = 0;
     46     batch_size = var_tensor.shape[0];
---> 47     note_distances_now = length_multiplier * np.tile(np.expand_dims(note_distances[begin_offset:begin_offset+half_tensor], axis=0), (batch_size, 1));
     48     note_angles_now = np.tile(np.expand_dims(note_angles[begin_offset:begin_offset+half_tensor], axis=0), (batch_size, 1));
     49 

c:\users\axel\appdata\local\programs\python\python37\lib\site-packages\numpy\lib\shape_base.py in tile(A, reps)
   1241                 c = c.reshape(-1, n).repeat(nrep, 0)
   1242             n //= dim_in
-> 1243     return c.reshape(shape_out)

TypeError: __index__ returned non-int (type NoneType)
kotritrona commented 5 years ago

I'm glad someone still knows about this after I abandoned it for 6 months!! Maybe it's related to recent TensorFlow updates; let me test it.

jvyden commented 5 years ago

and i'm still being emailed about issues

kotritrona commented 5 years ago

It seems the problem is that they changed tf.losses.* to use classes instead of plain functions, probably since TensorFlow v1.13.
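
(For context, a minimal sketch of the two loss styles in tf.keras, assuming TF 2.x; the toy model and custom_loss below are made up for illustration and are not the notebook's code. A custom loss like loss_function_for_generative_model is a plain (y_true, y_pred) callable of the first kind.)

import tensorflow as tf
from tensorflow import keras

# Toy model, only here so compile() has something to work on.
model = keras.Sequential([keras.layers.Dense(1, input_shape=(4,))])

# Function-style loss: any callable taking (y_true, y_pred) and returning a tensor.
def custom_loss(y_true, y_pred):
    return tf.reduce_mean(tf.abs(y_true - y_pred), axis=-1)

model.compile(optimizer="adam", loss=custom_loss, metrics=[keras.metrics.mae])

# Class-style loss: an instance of a keras.losses.Loss subclass.
model.compile(optimizer="adam", loss=keras.losses.MeanAbsoluteError(),
              metrics=[keras.metrics.MeanAbsoluteError()])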

Azn9 commented 5 years ago

Yep, I've just tested with an old version of TensorFlow (1.10.0) and it's working!

(pip install tensorflow==1.10.0)
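
(A quick way to confirm the notebook kernel actually picked up the downgraded version, using standard TensorFlow version reporting:)

import tensorflow as tf
print(tf.__version__)  # should print 1.10.0 after the downgrade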

Azn9 commented 5 years ago

Thanks for your help; hopefully this will help people with the same issue!

kotritrona commented 5 years ago

Glad I could help.

VINXIS commented 5 years ago

Any chance of updating this? Older versions of TensorFlow aren't available on Python 3.7, and it would be nice to be able to use this with the latest version :0

Azn9 commented 5 years ago

Install Python 3.6? ¯\_(ツ)_/¯

kotritrona commented 5 years ago

I'm trying to update this right now. The new TensorFlow actually changed the algorithm a lot: it seems to convert all functions to a computational graph instead of calling them directly, which caused problems for this project.

Currently I have the code working in TF 2.0, but it has memory-leak problems: after a few groups it runs out of memory.
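
(A minimal sketch, not taken from the project, of the TF 2.x tracing behaviour described above and how it relates to the original traceback: inside a traced function the static batch size can be None, which is exactly what a NumPy call such as np.tile chokes on, while the dynamic tf.shape/tf.tile path works.)

import tensorflow as tf

@tf.function(input_signature=[tf.TensorSpec(shape=[None, 6], dtype=tf.float32)])
def tile_per_batch(x):
    # While tracing, x.shape[0] is None (the batch size is unknown), so feeding it
    # into np.tile raises "TypeError: __index__ returned non-int (type NoneType)".
    # Staying inside the graph with the dynamic shape avoids that:
    batch = tf.shape(x)[0]
    row = tf.ones([1, 6])
    return tf.tile(row, tf.stack([batch, 1]))

print(tile_per_batch(tf.zeros([3, 6])).shape)  # (3, 6)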

kotritrona commented 5 years ago

Try this: https://github.com/kotritrona/osumapper/tree/master/v6.2

VINXIS commented 5 years ago

it works!!!!11

kotritrona commented 4 years ago

Since it works, I'm closing the issue, a year and a half later.

jvyden commented 4 years ago

and i'm STILL getting emailed about issues