yahoo / lopq

Training of Locally Optimized Product Quantization (LOPQ) models for approximate nearest neighbor search of high dimensional data in Python and Spark.
Apache License 2.0
562 stars 130 forks

Exception occurred when copying model file to HDFS #23

Open xhappy opened 5 years ago

xhappy commented 5 years ago

```
Starting rotation fitting for split 1
Saving pickle to temp file...
Copying pickle file to hdfs... hdfs:///user/algo/lopq/model/
Traceback (most recent call last):
  File "train_model.py", line 426, in <module>
    save_hdfs_pickle(model, args.model_pkl)
  File "train_model.py", line 245, in save_hdfs_pickle
    copy_to_hdfs(f, pkl_path)
  File "train_model.py", line 266, in copy_to_hdfs
    subprocess.call(['hadoop', 'fs', '-put', f.name, hdfs_path])
  File "/usr/lib64/python2.7/subprocess.py", line 524, in call
    return Popen(*popenargs, **kwargs).wait()
  File "/usr/lib64/python2.7/subprocess.py", line 711, in __init__
    errread, errwrite)
  File "/usr/lib64/python2.7/subprocess.py", line 1327, in _execute_child
    raise child_exception
OSError: [Errno 2] No such file or directory
End of LogType:stdout
```

pumpikano commented 5 years ago

Sorry I can't be of much help here. To me it seems that there is some issue with the HDFS command executed in `subprocess.call(['hadoop', 'fs', '-put', f.name, hdfs_path])`.
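For what it's worth, `OSError: [Errno 2] No such file or directory` raised from inside `subprocess.Popen` usually means the executable itself (here, `hadoop`) was not found on the `PATH` of the process running the job, rather than anything being wrong with the HDFS destination path. A minimal sketch of a more defensive `copy_to_hdfs` (the `hadoop_bin` parameter is my own addition, not part of the original `train_model.py`) that turns this into a clearer error:

```python
import subprocess

def copy_to_hdfs(local_path, hdfs_path, hadoop_bin='hadoop'):
    """Copy a local file to HDFS via `hadoop fs -put`.

    Raises RuntimeError with a readable message if the hadoop
    binary cannot be executed (the usual cause of
    "OSError: [Errno 2] No such file or directory" from subprocess)
    or if the copy itself fails.
    """
    try:
        ret = subprocess.call([hadoop_bin, 'fs', '-put', local_path, hdfs_path])
    except OSError as e:
        # Popen raises OSError when the executable itself is missing.
        raise RuntimeError(
            "Could not execute %r; is Hadoop installed and on PATH "
            "for the user running this job? (%s)" % (hadoop_bin, e))
    if ret != 0:
        raise RuntimeError(
            "'%s fs -put' exited with status %d" % (hadoop_bin, ret))
```

Running the original script with `hadoop version` working in the same shell/user environment as the Spark job is a quick way to confirm whether the binary is visible.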