ed2ryo / hadoop-lda

Automatically exported from code.google.com/p/hadoop-lda

IO error when running the program #1

Open GoogleCodeExporter opened 8 years ago

GoogleCodeExporter commented 8 years ago
What steps will reproduce the problem?
1. build using ant
2. put datainput onto HDFS as /user/hadoop/lda-hadoop/datainput (steps 1 and 2 are sketched below, after the command)
3. hadoop jar hadooplda-hadoop.jar train \
--input=/user/hadoop/lda-hadoop/datainput \
--output=/user/hadoop/lda-hadoop/datainput.lda.model \
--working_dir=/user/hadoop/lda-hadoop/datainput.lda.training \
--num_topics=128 \
--num_iterations=100 \
--iterations_to_keep=2 \
--max_num_words=50000 \
--input_format=text \
--min_df=5 \
--alpha=0.45 \
--beta=0.01
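
For reference, steps 1 and 2 correspond roughly to the commands below (the local directory name and the default ant target are assumptions; the HDFS path is the one used in the --input flag of the command above):

# step 1: build the jar with ant (default build target assumed)
ant

# step 2: copy the local input data into HDFS
#         (HDFS path taken from the --input flag above)
hadoop fs -mkdir /user/hadoop/lda-hadoop
hadoop fs -put datainput /user/hadoop/lda-hadoop/datainput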

What is the expected output? What do you see instead?

The training is interrupted, and no model file is generated.
The error output I see looks like this:
------------------------------------------------------------
Exception in thread "main" org.apache.hadoop.ipc.RemoteException: 
java.io.IOException: failed to create file 
/user/hadoop/lda-hadoop/datainput.lda.training/likelihood on client 
10.255.253.14 either because the filename is invalid or the file exists
------------------------------------------------------------
What version of the product are you using? On what operating system?
hadooplda-re-1
OS info: Linux version 2.6.18-164.el5 (mockbuild@builder10.centos.org) (gcc version 4.1.2 20080704 (Red Hat 4.1.2-46)) #1 SMP Thu Sep 3 03:28:30 EDT 2009

Please provide any additional information below.
The data I use is Chinese text, encoded in UTF-8.
Hadoop version: Hadoop 0.20.2-cdh3u2
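
Since the exception says the file may already exist, one thing worth checking (not confirmed anywhere in this thread, just a possibility) is whether a likelihood file was left behind by an earlier failed run; clearing the working directory before retraining rules that out:

# inspect the working directory used by the failed run
hadoop fs -ls /user/hadoop/lda-hadoop/datainput.lda.training

# remove it before retrying
# (on Hadoop 0.20.x/1.x the old syntax is -rmr; newer releases use -rm -r)
hadoop fs -rmr /user/hadoop/lda-hadoop/datainput.lda.training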

Original issue reported on code.google.com by chengmin...@gmail.com on 11 May 2012 at 4:09

GoogleCodeExporter commented 8 years ago
+1 me too

Original comment by qingfeng...@gmail.com on 23 Aug 2013 at 7:31

GoogleCodeExporter commented 8 years ago
Same problem for me. Has anybody responded to this?

Original comment by invetat...@gmail.com on 4 Sep 2013 at 2:25

GoogleCodeExporter commented 8 years ago
I also have experienced this problem on Crunchbang 11 with Hadoop 1.2.1 and 
Oracle Java 7.

Original comment by gavinhac...@gmail.com on 5 Oct 2013 at 6:36

GoogleCodeExporter commented 8 years ago
Did you guys set the <dfs.support.append> parameter in your hdfs-site.xml to 
true?

I believe the code will only be able to do what it is supposed to do, APPEND, 
if you set that parameter.
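
For anyone trying this suggestion, the property would look roughly like the snippet below in hdfs-site.xml (a sketch of the suggested setting only; that hadoop-lda actually appends to the likelihood file is taken from this comment, not verified):

<!-- hdfs-site.xml: enable append support on HDFS                    -->
<!-- (setting suggested in this comment; restart HDFS after editing) -->
<property>
  <name>dfs.support.append</name>
  <value>true</value>
</property>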

Original comment by harish.v...@gmail.com on 5 Nov 2013 at 8:39

GoogleCodeExporter commented 8 years ago
I added the property as suggested and restarted all daemons after formatting 
the namenode. It doesn't help and I keep getting the same error.

Original comment by vinay...@gmail.com on 1 Dec 2013 at 5:59

GoogleCodeExporter commented 8 years ago
Has anyone found a solution to this problem yet? I am getting the same error. 
Vinay (#5) said the suggested fix did not work. Does anyone have any comments 
on this?

thanks

Original comment by aslicel...@gmail.com on 5 Jul 2014 at 1:22

GoogleCodeExporter commented 8 years ago
Same here; I am actually getting a FileAlreadyExistsException on the file "likelihood".

Original comment by emanuele...@gmail.com on 5 Jul 2014 at 10:09

GoogleCodeExporter commented 8 years ago
Same here 

org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.fs.FileAlreadyExistsException)

Original comment by waxkun on 8 Jul 2014 at 10:08

GoogleCodeExporter commented 8 years ago
I resolved this by downloading the source from here:
http://code.google.com/p/hadoop-lda/source/checkout
rather than the .tgz on the downloads page.
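
For reference, the project has since been mirrored on GitHub (the repository named in the header above), so cloning and building from source instead of using the .tgz would look roughly like this (repository URL inferred from the mirror name, and the default ant target assumed):

# clone the GitHub mirror of the Google Code project and build with ant
git clone https://github.com/ed2ryo/hadoop-lda.git
cd hadoop-lda
ant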

Original comment by emanuele...@gmail.com on 12 Jul 2014 at 8:39