winghc / hadoop2x-eclipse-plugin

eclipse plugin for hadoop 2.2.0, 2.4.1

eclipse runtime error: 'An internal error occurred during: "Connecting to DFS MyHadoop".' #18

Open lmcl90 opened 9 years ago

lmcl90 commented 9 years ago

When Eclipse tried to connect to HDFS, an error occurred. The log from Eclipse is:

log4j:WARN No appenders could be found for logger (org.apache.hadoop.security.authentication.util.KerberosName).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

!ENTRY org.eclipse.core.jobs 4 2 2014-12-10 22:03:25.098
!MESSAGE An internal error occurred during: "Connecting to DFS MyHadoop".
!STACK 0
java.lang.NoClassDefFoundError: org/htrace/Trace
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:214)
    at com.sun.proxy.$Proxy23.getListing(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:554)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy24.getListing(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1969)
    at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1952)
    at org.apache.hadoop.hdfs.DistributedFileSystem.listStatusInternal(DistributedFileSystem.java:693)
    at org.apache.hadoop.hdfs.DistributedFileSystem.access$600(DistributedFileSystem.java:105)
    at org.apache.hadoop.hdfs.DistributedFileSystem$15.doCall(DistributedFileSystem.java:755)
    at org.apache.hadoop.hdfs.DistributedFileSystem$15.doCall(DistributedFileSystem.java:751)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.listStatus(DistributedFileSystem.java:751)
    at org.apache.hadoop.eclipse.dfs.DFSFolder.loadDFSFolderChildren(DFSFolder.java:61)
    at org.apache.hadoop.eclipse.dfs.DFSFolder$1.run(DFSFolder.java:178)
    at org.eclipse.core.internal.jobs.Worker.run(Worker.java:54)
Caused by: java.lang.ClassNotFoundException: org.htrace.Trace cannot be found by org.apache.hadoop.eclipse_0.18.0
    at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:423)
    at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:336)
    at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:328)
    at org.eclipse.osgi.internal.loader.ModuleClassLoader.loadClass(ModuleClassLoader.java:160)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 21 more

!ENTRY org.eclipse.core.jobs 4 2 2014-12-10 22:03:25.507
!MESSAGE An internal error occurred during: "Connecting to DFS MyHadoop".
!STACK 0
java.lang.NoClassDefFoundError: org/htrace/Trace
    (same stack trace as above)
...

My OS is OS X Yosemite. The Hadoop version is 2.6.0 and Eclipse is Luna.

Does anyone have an idea about this problem?

jainmanoj commented 9 years ago

I also faced this issue. My setup uses Hadoop 2.6.0 (installed remotely on a Linux system), and I am setting up the Eclipse plugin on Windows.

How to fix

Change the Hadoop version in ivy/libraries.properties to hadoop.version=2.6.0.

Add htrace-core-3.0.4.jar by adding the corresponding lines to src/contrib/eclipse-plugin/build.xml (see the sketch below).

Rebuild and deploy the plugin.
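Roughly, the build.xml changes amount to the sketch below. The copy element mirrors the one quoted later in this thread; ${hadoop.home} and ${build.dir} are assumed to be the properties the file already uses, so double-check the jar name and location under ${hadoop.home}/share/hadoop/common/lib in your checkout:

<!-- src/contrib/eclipse-plugin/build.xml: next to the existing <copy> tasks, copy the htrace jar into the plugin's lib directory -->
<copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core-3.0.4.jar" todir="${build.dir}/lib" verbose="true"/>

<!-- and append the same file name to the Bundle-ClassPath attribute of the <manifest> element -->
lib/htrace-core-3.0.4.jar

The file name copied into lib and the name listed in Bundle-ClassPath must match exactly, or the class still cannot be resolved at runtime.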

Hope this solves your issue.

lmcl90 commented 9 years ago

Thanks, but it doesn't work!

I first changed the Hadoop version in ${hadoop2x-eclipse-plugin}/ivy/libraries.properties to 2.6.0. Then, in ${hadoop2x-eclipse-plugin}/src/contrib/eclipse-plugin/build.xml, I added a new line <copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core-3.0.4.jar" todir="${build.dir}/lib" verbose="true"/> and an entry lib/htrace-core.3.0.4.jar to the value of the <attribute name="Bundle-ClassPath" ...> node.

Rebuild and deploy again.

Is this right?

jainmanoj commented 9 years ago

Please add lib/htrace-core-3.0.4.jar to that manifest attribute as well, so that it ends with:

........ lib/netty-${netty.version}.jar, ..... lib/htrace-core-3.0.4.jar"/>

For me it works fine.

Thanks Manoj


lmcl90 commented 9 years ago

I added the path of htrace-core-3.0.4.jar in the last line; below is the attribute as it now reads:

<manifest>
   <attribute name="Bundle-ClassPath"
    value="classes/,
 lib/hadoop-mapreduce-client-core-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-common-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-jobclient-${hadoop.version}.jar,
 lib/hadoop-auth-${hadoop.version}.jar,
 lib/hadoop-common-${hadoop.version}.jar,
 lib/hadoop-hdfs-${hadoop.version}.jar,
 lib/protobuf-java-${protobuf.version}.jar,
 lib/log4j-${log4j.version}.jar,
 lib/commons-cli-${commons-cli.version}.jar,
 lib/commons-configuration-${commons-configuration.version}.jar,
 lib/commons-httpclient-${commons-httpclient.version}.jar,
 lib/commons-lang-${commons-lang.version}.jar,
 lib/commons-collections-${commons-collections.version}.jar,
 lib/jackson-core-asl-${jackson.version}.jar,
 lib/jackson-mapper-asl-${jackson.version}.jar,
 lib/slf4j-log4j12-${slf4j-log4j12.version}.jar,
 lib/slf4j-api-${slf4j-api.version}.jar,
 lib/guava-${guava.version}.jar,
 lib/netty-${netty.version}.jar,
 lib/htrace-core.3.0.4.jar"/>
</manifest>

jainmanoj commented 9 years ago

Now build and install the plugin. It should work.


lmcl90 commented 9 years ago

I tried, but it still didn't work... But thanks a lot!

jainmanoj commented 9 years ago

Hi Ming,

Did your compilation succeed? Please verify that your manifest file is correct after compilation.

Now deploy the plugin in Eclipse (%ECLIPSE_HOME%\plugins).

Restart Eclipse:

eclipse -clean -consolelog -debug

If it doesn't work, I can send you my compiled binaries (built for the Windows platform, for Hadoop 2.6.0).

Thanks Manoj


lmcl90 commented 9 years ago

Hi Manoj,

Ant said the compilation was successful, although there was a warning: [javac] /Users/lim/Documents/github/hadoop2x-eclipse-plugin/src/contrib/eclipse-plugin/build.xml:76: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds.

I extracted all files from the compiled binaries and saw that the classpath in the manifest file contained “lib/htrace-core.3.0.4.jar” and that there was an htrace-core.3.0.4.jar file under the lib directory. The jar file should be correct.

After I redeployed and started Eclipse again to connect to HDFS, the same error occurred.

I have tried what you said, but it still did not work. Please send me your binaries and I will try with them.

Thank you very much for your help!

Li Ming


jainmanoj commented 9 years ago

hadoop-eclipse-plugin-2.6.0.jar: https://docs.google.com/file/d/0B9Hvp2o6NunXYVQ1aFc2YUVsRjQ/edit?usp=drive_web

Hi Li,

Sorry, I was busy. Enclosing the Eclipse plugin that works on my system.

The size is around 39 MB, so I am sharing it through Google Drive.

Thanks Manoj Kumar Jain


lmcl90 commented 9 years ago

Hi Manoj,

The project has already been updated to Hadoop 2.6.0, and I used the plugin released by the owner of the project. That worked fine for me. Sorry, I forgot to tell you that.

Thanks so much!

Li Ming‍


jainmanoj commented 9 years ago

That's great!

Manoj


GeeketteDz commented 9 years ago

Hello guys, I am new to this and want to ask in more detail how to install it. Did you import the project as a whole, or what?

Thanks

dharmajaya commented 9 years ago

Hi Guys,

I have configured fully distributed mode on Linux 14.04, with both Hadoop and HBase set up for fully distributed operation. Now I want to connect a Spring project to the HBase database, but it throws the following error:

java.io.IOException: java.lang.reflect.InvocationTargetException at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240) at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218) at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119) at org.apache.hadoop.hbase.client.HBaseAdmin.checkHBaseAvailable(HBaseAdmin.java:2508) at net.top.app.antipoverty.web.helppoorvillagesandtownsproplan.HelpPoorVillagesandtownsProplanController.createForm(HelpPoorVillagesandtownsProplanController.java:90) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at org.springframework.web.method.support.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:213) at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:126) at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:96) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:617) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:578) at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:80) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:923) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:852) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:882) at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:778) at javax.servlet.http.HttpServlet.service(HttpServlet.java:621) at javax.servlet.http.HttpServlet.service(HttpServlet.java:728) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210) at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:51) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210) at org.springframework.orm.jpa.support.OpenEntityManagerInViewFilter.doFilterInternal(OpenEntityManagerInViewFilter.java:147) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:76) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210) at org.apache.shiro.web.servlet.ProxiedFilterChain.doFilter(ProxiedFilterChain.java:61) at org.apache.shiro.web.servlet.AdviceFilter.executeChain(AdviceFilter.java:108) at org.apache.shiro.web.servlet.AdviceFilter.doFilterInternal(AdviceFilter.java:137) at
org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:125) at org.apache.shiro.web.servlet.ProxiedFilterChain.doFilter(ProxiedFilterChain.java:66) at org.apache.shiro.web.servlet.AbstractShiroFilter.executeChain(AbstractShiroFilter.java:449) at org.apache.shiro.web.servlet.AbstractShiroFilter$1.call(AbstractShiroFilter.java:365) at org.apache.shiro.subject.support.SubjectCallable.doCall(SubjectCallable.java:90) at org.apache.shiro.subject.support.SubjectCallable.call(SubjectCallable.java:83) at org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:380) at org.apache.shiro.web.servlet.AbstractShiroFilter.doFilterInternal(AbstractShiroFilter.java:362) at org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:125) at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346) at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:88) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:76) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210) at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:222) at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:123) at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502) at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171) at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:100) at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:953) at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118) at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408) at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1041) at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:603) at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:312) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) at java.lang.Thread.run(Thread.java:745) Caused by: java.lang.reflect.InvocationTargetException at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:526) at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238) ... 
63 more Caused by: java.lang.NoClassDefFoundError: org/apache/htrace/Trace at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:218) at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:481) at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65) at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:86) at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:833) at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.(ConnectionManager.java:623) ... 68 more Caused by: java.lang.ClassNotFoundException: org.apache.htrace.Trace at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1702) at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1547) ... 74 more

I have added htrace-core-3.0.4.jar and hadoop-eclipse-plugin-2.6.0.jar to the project lib folder, but I don't know where the problem is. Please, anyone, help me. Thanks in advance.
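One possible explanation, offered only as a guess: the missing class here is org.apache.htrace.Trace, while htrace-core-3.0.4.jar ships that class under the older org.htrace package name, so the 3.0.4 jar cannot satisfy this reference. If the Spring project happens to be built with Maven (an assumption; only a lib folder is mentioned above), a dependency on the newer artifact would look roughly like this sketch:

<!-- hypothetical pom.xml snippet: pulls in the htrace release that uses the org.apache.htrace package -->
<dependency>
  <groupId>org.apache.htrace</groupId>
  <artifactId>htrace-core</artifactId>
  <version>3.1.0-incubating</version>
</dependency>

Otherwise, placing htrace-core-3.1.0-incubating.jar (rather than 3.0.4) in the project's lib folder may have the same effect.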

emailfeifan commented 8 years ago

Hi guys, I want to build hadoop-eclipse-plugin-2.7.1.jar, but it failed. The platform is Ubuntu 14.04 x86_64 and the Hadoop version is 2.7.1. I modified build.xml for "htrace-core-3.1.0-incubating.jar", and I suspect there is something wrong with "htrace-core-3.1.0-incubating.jar".

The error info is as follows: An internal error occurred during: "Connecting to DFS MyHadoop". org/apache/htrace/SamplerBuilder

Please, anyone, help me. Thanks in advance.
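By analogy with the 2.6.0 steps earlier in this thread, the build.xml edits for a 2.7.x build would be roughly the following sketch. The jar name and path are assumptions based on a stock Hadoop 2.7.1 distribution; verify the exact file name under ${hadoop.home}/share/hadoop/common/lib, and note that, as later comments here show, this alone may still not be sufficient:

<!-- src/contrib/eclipse-plugin/build.xml: copy the htrace jar shipped with Hadoop 2.7.x next to the other bundled jars -->
<copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core-3.1.0-incubating.jar" todir="${build.dir}/lib" verbose="true"/>

<!-- and add exactly the same name to the Bundle-ClassPath attribute of the <manifest> element -->
lib/htrace-core-3.1.0-incubating.jar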

kmc-github commented 8 years ago

same issue as "emailfeifan" however for hadoop 2.7.2 and Ubuntu 15.04. "htrace-core-3.1.0-incubating.jar" reflects in manifest.mf. "htrace-core-3.1.0-incubating.jar" is packaged inside lib of the plugin.

KennyD10 commented 8 years ago

Trying to build hadoop-eclipse-plugin-2.7.2.jar. Built with htrace-core-3.0.4.jar, I got this:

java.lang.NoClassDefFoundError: org/apache/htrace/SamplerBuilder
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:635)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:619)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:149)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2653)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:92)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2687)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2669)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:170)
    at org.apache.hadoop.eclipse.server.HadoopServer.getDFS(HadoopServer.java:478)
    at org.apache.hadoop.eclipse.dfs.DFSPath.getDFS(DFSPath.java:146)
    at org.apache.hadoop.eclipse.dfs.DFSFolder.loadDFSFolderChildren(DFSFolder.java:61)
    at org.apache.hadoop.eclipse.dfs.DFSFolder$1.run(DFSFolder.java:178)
    at org.eclipse.core.internal.jobs.Worker.run(Worker.java:55)

Built with htrace-core-3.1.0-incubating.jar, I got this:

java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hdfs.DFSConfigKeys
    at org.apache.hadoop.hdfs.DFSClient$Conf.<init>(DFSClient.java:509)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:638)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:619)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:149)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2653)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:92)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2687)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2669)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:170)
    at org.apache.hadoop.eclipse.server.HadoopServer.getDFS(HadoopServer.java:478)
    at org.apache.hadoop.eclipse.dfs.DFSPath.getDFS(DFSPath.java:146)
    at org.apache.hadoop.eclipse.dfs.DFSFolder.loadDFSFolderChildren(DFSFolder.java:61)
    at org.apache.hadoop.eclipse.dfs.DFSFolder$1.run(DFSFolder.java:178)
    at org.eclipse.core.internal.jobs.Worker.run(Worker.java:55)

Cannot figure out why... Perhaps htrace-core-3.1.0-incubating.jar does not work with Hadoop 2.7.2 yet?

hadoop-eclipse-plugin-2.6.0.jar works for me with hadoop 2.7.2 (as mentioned in #30).