mhjabreel / CharCNN


problem about learning rate #13

Closed SmChester closed 5 years ago

SmChester commented 5 years ago

Hi, when I run training.py, I get the following error:

```
Traceback (most recent call last):
  File "training.py", line 63, in <module>
    learning_rate = tf.train.piecewise_constant(global_step, boundaries, values)
  File "/Users/chestersima/anaconda3/lib/python3.6/site-packages/tensorflow/python/training/learning_rate_decay.py", line 168, in piecewise_constant
    "The length of boundaries should be 1 less than the length of values")
```

The relevant code in training.py is:

```python
boundaries = []
br = config.training.base_rate
values = [br]
for i in range(1, 10):
    values.append(br / (2 * i))
    boundaries.append(15000 * i)
values.append(br / (2 ** (i + 1)))
print(values)
print(boundaries)
learning_rate = tf.train.piecewise_constant(global_step, boundaries, values)
```

I read the source code and found that the length of values is 11 while the length of boundaries is 9. How can this be fixed? values gets an initial item from config.py, and one more item is appended after the loop. What is the point of doing that?
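For context, `tf.train.piecewise_constant` requires exactly one more value than boundary, since N boundaries split the step axis into N + 1 intervals. A minimal sketch of a valid call (the boundary and rate numbers here are illustrative, not taken from the repo's config):

```python
import tensorflow as tf  # TF 1.x API

global_step = tf.Variable(0, trainable=False, name="global_step")

# 2 boundaries define 3 intervals, so 3 values are required:
#   step < 15000           -> 0.01
#   15000 <= step < 30000  -> 0.005
#   step >= 30000          -> 0.0025
boundaries = [15000, 30000]
values = [0.01, 0.005, 0.0025]

learning_rate = tf.train.piecewise_constant(global_step, boundaries, values)
```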

Markkunitomi commented 5 years ago

I noticed the same problem. You can either get rid of the initial value br or drop the last value appended outside of the loop; either change lets the script run. A sketch of the second option is below.
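A hedged sketch of that second option, assuming config and global_step are already defined in training.py as in the snippet above (it just mirrors the existing loop with the post-loop append removed):

```python
br = config.training.base_rate
boundaries = []
values = [br]
for i in range(1, 10):
    values.append(br / (2 * i))
    boundaries.append(15000 * i)
# No values.append(...) after the loop: len(values) == 10 and
# len(boundaries) == 9, which satisfies piecewise_constant's check.

learning_rate = tf.train.piecewise_constant(global_step, boundaries, values)
```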

mhjabreel commented 5 years ago

Hi @SmChester and @Markkunitomi, you can now use the new implementation in the textify_bases branch, where this problem is solved.

Thanks.