lvapeab / nmt-keras

Neural Machine Translation with Keras
http://nmt-keras.readthedocs.io
MIT License
533 stars 130 forks

Can't run main.py, an error in keras_wrapper #100

Closed RookieCXL closed 5 years ago

RookieCXL commented 5 years ago

I need some help: when I run main.py, an error occurs in keras_wrapper.

C:\Users\think>python C:\Users\think\Desktop\nmt-keras-master/main.py
Using TensorFlow backend.
[09/04/2019 11:48:58] <<< Cupy not available. Using numpy. >>>
[09/04/2019 11:48:59] Running training.
[09/04/2019 11:48:59] Building EuTrans_esen dataset
Traceback (most recent call last):
  File "C:\Users\think\Desktop\nmt-keras-master/main.py", line 49, in <module>
    train_model(parameters, args.dataset)
  File "C:\Users\think\Desktop\nmt-keras-master\nmt_keras\training.py", line 64, in train_model
    dataset = build_dataset(params)
  File "C:\Users\think\Desktop\nmt-keras-master\data_engine\prepare_data.py", line 151, in build_dataset
    label_smoothing=params.get('LABEL_SMOOTHING', 0.))
  File "c:\users\think\src\keras-wrapper\keras_wrapper\dataset.py", line 1270, in setOutput
    bpe_codes=bpe_codes, separator=separator, use_unk_class=use_unk_class)
  File "c:\users\think\src\keras-wrapper\keras_wrapper\dataset.py", line 1701, in preprocessTextFeatures
    'It currently is: %s' % (str(annotations_list)))
Exception: Wrong type for "annotations_list". It must be a path to a text file with the sentences or a list of sentences. It currently is: examples/EuTrans//training.en

lvapeab commented 5 years ago

Hi,

I think it's a path problem. You can either:

  1. cd to the nmt-keras folder and run main.py from there (cd C:\Users\think\Desktop\nmt-keras-master && python main.py — note that cmd.exe chains commands with &&, not ;)

  2. Use absolute paths in config.py. That means changing this line to something like:

DATA_ROOT_PATH = 'C:/Users/think/Desktop/nmt-keras-master/examples/%s/' % DATASET_NAME (use forward slashes or a raw string, since Python treats backslash sequences like \U and \t in a plain string as escapes).
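A more portable variant of option 2 is to build the path at runtime instead of hard-coding the drive letter. This is only a sketch, not the actual nmt-keras config: it assumes config.py sits in the project root next to the examples/ folder, and reuses the DATASET_NAME variable from the config; project_root is an illustrative name.

```python
import os

DATASET_NAME = 'EuTrans'

# Resolve paths relative to the project folder rather than the current
# working directory, so main.py can be launched from anywhere.
# (In config.py itself you could use os.path.dirname(os.path.abspath(__file__));
# here we fall back to the current directory for illustration.)
project_root = os.getcwd()

# os.path.join picks the right separator, sidestepping the
# backslash-escape problem in hard-coded Windows paths.
data_root = os.path.join(project_root, 'examples', DATASET_NAME)
training_file = os.path.join(data_root, 'training.en')
```

With an absolute data_root, the keras_wrapper file check no longer depends on where the interpreter was started.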

Hope this helps. Cheers

RookieCXL commented 5 years ago

Thanks! It has been solved. And what's your TensorFlow version?

lvapeab commented 5 years ago

1.12.0. I haven't tested yet with TF 2.0 (I hope to do so soon).