simleo opened 5 years ago
Some error messages from the HDFS extension are uninformative, e.g.:
>>> hdfs.rm("doesnotexist.txt") Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/usr/local/lib/python3.6/dist-packages/pydoop-2.0a4-py3.6-linux-x86_64.egg/pydoop/hdfs/__init__.py", line 277, in rm retval = fs.delete(path_, recursive=recursive) File "/usr/local/lib/python3.6/dist-packages/pydoop-2.0a4-py3.6-linux-x86_64.egg/pydoop/hdfs/fs.py", line 346, in delete return self.fs.delete(path, recursive) OSError: [Errno 5] Input/output error
>>> hdfs.rm("foo", recursive=False) hdfsDelete(path=/user/root/foo, recursive=0): FileSystem#delete error: (unable to get stack trace for org.apache.hadoop.fs.PathIsNotEmptyDirectoryException exception: ExceptionUtils::getStackTrace error.) Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/usr/local/lib/python3.6/dist-packages/pydoop-2.0a4-py3.6-linux-x86_64.egg/pydoop/hdfs/__init__.py", line 277, in rm retval = fs.delete(path_, recursive=recursive) File "/usr/local/lib/python3.6/dist-packages/pydoop-2.0a4-py3.6-linux-x86_64.egg/pydoop/hdfs/fs.py", line 346, in delete return self.fs.delete(path, recursive) OSError: [Errno 255] Unknown error 255
Is there anything we can do in the extension code (are we failing to reset `errno` or something) or is the problem in `libhdfs`? Needs investigation.
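For reference, a minimal sketch of the `errno` pattern in question, assuming the extension wraps libhdfs calls roughly like this (`delete_path` and its error reporting are hypothetical, not actual pydoop code):

```c
#include <errno.h>
#include <stdio.h>
#include <string.h>

#include "hdfs.h"  /* libhdfs C API */

/* Hypothetical wrapper: hdfsDelete() returns -1 on failure and reports
 * the cause via errno.  If errno is not cleared before the call, a
 * stale value from an earlier operation can leak into the reported
 * error, which would be one way a generic "Input/output error" could
 * surface for an unrelated failure. */
int delete_path(hdfsFS fs, const char *path, int recursive) {
    errno = 0;  /* reset so a failure reports this call's error, not a stale one */
    if (hdfsDelete(fs, path, recursive) == -1) {
        fprintf(stderr, "delete(%s): %s\n", path, strerror(errno));
        return -1;
    }
    return 0;
}
```

Note also that 255 is libhdfs's `EINTERNAL`, its fallback when it cannot map a Java exception to a POSIX errno, which would explain why the `PathIsNotEmptyDirectoryException` in the second example surfaces as "Unknown error 255".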
It deletes the file but still throws this error.
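One possible explanation for this, though it is an assumption rather than something confirmed in pydoop or libhdfs: if `errno` ends up nonzero as a side effect of internal calls even when the operation itself succeeds, a wrapper that tests `errno` instead of the return value would raise for a delete that actually went through. A hypothetical sketch of the distinction:

```c
#include <errno.h>

#include "hdfs.h"  /* libhdfs C API */

/* Hypothetical wrapper (not pydoop code): treat the return value, not
 * errno, as the success/failure indicator. */
int delete_path_checked(hdfsFS fs, const char *path, int recursive) {
    int ret = hdfsDelete(fs, path, recursive);
    if (ret == -1) {
        return -errno;  /* only on failure is errno meaningful for this call */
    }
    return 0;  /* success: ignore whatever errno happens to contain */
}
```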