Closed: suolemen closed this issue 9 years ago
It looks like RHive failed to upload the data to the HDFS path 'hdfs://maste.com/rhive/data/root/liyt_w3ww_1a326d158e099ba3ff421fb9fcae4c'.
Please check whether you can put a file into HDFS, either with the RHive function 'rhive.hdfs.put' or from the command line: hadoop dfs -put {file} hdfs://maste.com/rhive/data/root/{file}
If you cannot, please check your Hadoop configuration or installation.
hadoop dfs -put /data/rdata.txt hdfs://maste.com/rhive/data/root/rdata.txt
gives a warning:
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
Switching hadoop to hdfs works:
hdfs dfs -put /data/rdata.txt hdfs://maste.com/rhive/data/root/rdata.txt
But rhive.hdfs.put("/data/rdata.txt", "hdfs://maste.com/rhive/data/root/bbbccc2.txt") fails with: java.lang.IllegalArgumentException: Wrong FS: hdfs://hdmaster.infobird.com/rhive/data/root/bbbccc2.txt, expected: file:///
Is this a version problem?
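The "Wrong FS ..., expected: file:///" message is Hadoop rejecting an hdfs:// URI because the client was created for the local filesystem (fs.defaultFS = file:///). A minimal Python sketch of that scheme/authority check; check_path here is a hypothetical stand-in for Hadoop's FileSystem.checkPath(), not RHive code:

```python
from urllib.parse import urlparse

def check_path(default_fs: str, path: str) -> None:
    """Reject a path whose scheme/authority differ from the default
    filesystem, roughly like Hadoop's FileSystem.checkPath()."""
    fs = urlparse(default_fs)
    p = urlparse(path)
    # Paths without a scheme inherit the default filesystem, so they pass.
    if p.scheme and (p.scheme != fs.scheme or p.netloc != fs.netloc):
        raise ValueError(f"Wrong FS: {path}, expected: {default_fs}")

# fs is file:/// (as rhive.env() reports below), so a plain local path is fine...
check_path("file:///", "/data/rdata.txt")

# ...but an hdfs:// URI is rejected, which is exactly the exception seen here.
try:
    check_path("file:///", "hdfs://maste.com/rhive/data/root/bbbccc2.txt")
except ValueError as e:
    print(e)  # Wrong FS: hdfs://maste.com/..., expected: file:///
```

So the failure is not an HDFS outage: the R-side client simply never learned the hdfs:// default filesystem.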
Please let me know your RHive environment variables. You can check them with the RHive function "rhive.env()".
rhive.env()
hadoop home: /usr/lib/hadoop
fs: file:///
hive home: /usr/lib/hive
user name: root
user home: /root
Your defaultFS value is missing.
Try again with the following call:
rhive.connect(defaultFS = "{your defaultFS}")
Please let me know the result.
The Hive warehouse directory exists (hdfs dfs -ls /user/hive/warehouse). Which of these should I use?
rhive.connect("10.122.74.236", defaultFS="hdfs://maste.com/rhive/data/")
rhive.connect("10.122.74.236", defaultFS="hdfs://master.com/user/hive/warehouse")
rhive.connect("10.122.74.236", defaultFS="/rhive/data/")
rhive.connect("10.122.74.236", defaultFS="/user/hive/warehouse")
I took the defaultFS from Hadoop's core-site.xml and connected with:
rhive.connect("10.122.74.236", defaultFS="hdfs://master.com/")
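For reference, the value lives in the fs.defaultFS property (fs.default.name in older Hadoop releases) of core-site.xml. A small Python sketch that extracts it from such a file; the sample XML below is a hypothetical core-site.xml matching the value used here:

```python
import xml.etree.ElementTree as ET

def read_default_fs(core_site_xml: str):
    """Return fs.defaultFS (or the older fs.default.name) from a
    core-site.xml document, or None if neither is set."""
    root = ET.fromstring(core_site_xml)
    for prop in root.iter("property"):
        if prop.findtext("name") in ("fs.defaultFS", "fs.default.name"):
            return prop.findtext("value")
    return None

# Hypothetical core-site.xml with the value used in the connect call above:
sample = """<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://master.com/</value>
  </property>
</configuration>"""

print(read_default_fs(sample))  # hdfs://master.com/
```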
rhive.env()
hadoop home: /usr/lib/hadoop
fs: hdfs://master.com/
hive home: /usr/lib/hive
user name: root
user home: /root
temp dir: /tmp/root
rhive.write.table() works now. Thank you!
When I execute the command rhive.write.table(liyg_a, "liyt_w3ww", sep=",", naString=NULL, rowName=FALSE, rowNameColumn="rowname")
I get an error: java.sql.SQLException: Error while processing statement: FAILED: SemanticException Line 1:17 Invalid path ''/rhive/data/root/liyt_w3ww_1a326d158e099ba3ff421fb9fcae4c'': No files matching path hdfs://maste.com/rhive/data/root/liyt_w3ww_1a326d158e099ba3ff421fb9fcae4c
What is the problem? Can anyone help?