Closed by @chip-factual 9 years ago
@chip-factual if you and/or @ahadrana can let me know exactly which hadoop-core version we want, i'll get you an updated Drake build
Hi Aaron,
I don't believe it is as simple as that. Drake uses clj-hdfs, which needs to be updated as well. And there are some issues related to Kerberos that we have to think about.
Ahad.
@ahadrana I can work on this, but I don't know a lot about the Hadoop ecosystem: how its libraries work, or how the pieces fit together. So I'd need some guidance on what it is I actually need to do. I can update version numbers in project.clj files as well as the next guy, but if it's more complicated than that I'll need some help.
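For what it's worth, here is a rough sketch of what that kind of version bump looks like in a Leiningen project.clj. The artifact names and version strings below are placeholders for illustration, not the actual coordinates this issue settled on:

```clojure
;; project.clj (sketch) -- the :dependencies vector is where the
;; hadoop-core pin lives. All versions below are placeholders.
(defproject drake "x.y.z"
  :dependencies [[org.clojure/clojure "1.5.1"]
                 ;; bump this to whatever version matches the cluster:
                 [org.apache.hadoop/hadoop-core "1.2.1"]
                 ;; per the comment above, clj-hdfs would likely need a
                 ;; matching update too, or the two can disagree at runtime
                 [clj-hdfs "x.y.z"]])
```

The catch, as noted above, is that the transitive dependencies (clj-hdfs, and anything Kerberos-related) have to agree with whatever hadoop-core version is chosen, so a one-line bump may not be sufficient.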
Hi Alan, assigning to you per recent conversations with you and @ahadrana. Please let me know if I can be of help.
Sounds good to me. Given that this has not been an issue until recently, and we have been using Kerberos for more than a year, I am not sure it is high enough priority to preempt any neutronic work that might be on your plate. Feel free to ping me if you have any questions.
@chip-factual, do you build your own drake.jar to run your workflow, possibly from the master branch? Could you please try it with the iflow-demo branch? I made some fixes for this on that branch earlier. However, it does not use the latest Hadoop version, which we now use for our cluster.
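For anyone following along, building a jar from that branch would look roughly like this. This is a sketch assuming a standard Leiningen setup; the exact build target may differ:

```shell
# Sketch: build a standalone Drake jar from the iflow-demo branch
# (assumes git and Leiningen are installed; adjust for your setup)
git clone https://github.com/Factual/drake.git
cd drake
git checkout iflow-demo
lein uberjar   # produces a standalone jar under target/
```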
Thanks @jinwen, I got it working using @dongshu-factual's fix from https://github.com/Factual/data-projects/issues/650.
Fixed in develop branch 92ffa2eb9b8344577b9b240c1750d50da84efead
Drake is failing to connect to Factual's Hadoop cluster, with an error similar to the one shown on the Drake wiki. @ahadrana traced the issue back to an out-of-date hadoop-core version in `project.clj`. Here's the stack trace I'm seeing: