pkasinathan opened 6 years ago
This is with the "privacy" setting, correct?
This is a known deficiency in hdfs3 (specifically, in libhdfs3/libgsasl). You could try again after updating hdfs3 and libhdfs3 to the latest versions, and then downloading and conda-installing the file https://anaconda.org/mdurant/libgsasl/1.8.1/download/linux-64/libgsasl-1.8.1-1.tar.bz2 .
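For reference, the suggested steps might look like the following (package names and the file URL come from the comment above; conda channels and exact invocation may differ in your environment):

```shell
# Update hdfs3 and its native library to the latest available versions
conda update hdfs3 libhdfs3

# Fetch the patched libgsasl build linked above and install it from the local file
wget https://anaconda.org/mdurant/libgsasl/1.8.1/download/linux-64/libgsasl-1.8.1-1.tar.bz2
conda install libgsasl-1.8.1-1.tar.bz2
```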
Hi,
We upgraded hdfs3 and libhdfs3 to the latest versions, but when we try to put anything, nothing is written:
hdfs = HDFileSystem(
    host="myhadoop",
    pars={
        "dfs.nameservices": "myhadoop",
        "dfs.ha.namenodes.myhadoop": "nn1,nn2",
        "dfs.namenode.rpc-address.myhadoop.nn1": "nn1.example.com:8020",
        "dfs.namenode.rpc-address.myhadoop.nn2": "nn2.example.com:8020",
        "hadoop.security.authentication": "kerberos",
    },
)
hdfs.put("/user/prabhu/examples/pi.py")
It still does not work. Am I still missing some configuration?
Unfortunately, getting all the security configurations working via libhdfs3 has proved very problematic. I now recommend that you switch to arrow's hdfs module instead. It deals much better with configuration and security, and doesn't miss much of what is provided by hdfs3.
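A minimal sketch of the suggested switch, using the legacy pyarrow.hdfs API that was current at the time this thread was written (newer pyarrow versions expose pyarrow.fs.HadoopFileSystem instead). The host, port, and ticket-cache path below are illustrative placeholders, not values confirmed by this thread, and a reachable Kerberized cluster with libhdfs installed is assumed:

```python
# Sketch only: requires libhdfs on the client and a valid Kerberos ticket.
import pyarrow as pa

fs = pa.hdfs.connect(
    host="myhadoop",                  # nameservice or namenode host (placeholder)
    port=8020,
    kerb_ticket="/tmp/krb5cc_1000",   # path to a Kerberos ticket cache (placeholder)
)

# Upload a local file, mirroring the hdfs3 put example above
with open("pi.py", "rb") as local, fs.open("/user/prabhu/examples/pi.py", "wb") as remote:
    remote.write(local.read())
```

Because pyarrow drives the Java libhdfs client, it picks up the cluster's own hdfs-site.xml (HA nameservices, Kerberos, wire encryption) rather than re-implementing that configuration, which is why it copes better with secured clusters.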
Is this issue resolved? I am facing the same issue while accessing encrypted (Transparent Data Encryption) HDFS with Kerberos.
No, this is not resolved, so the recommendation to use pyarrow stands.
Team,
After we enabled HDFS wire encryption on our secure cluster, hdfs3 cat/put commands stopped working, while hdfs3 ls commands still work. Basically, any command that reads or writes data from/to HDFS fails.
Log:
Can you let me know whether hdfs3 supports wire encryption, or am I missing some configuration?
Please let me know.
Thanks, Prabhu