johnsca closed this 8 years ago
It wasn't necessary to get the bundle tests in realtime-syslog-analytics to pass. It would probably be nice to have, but I didn't immediately see a way to query that from Hadoop, and I would rather leave it out than introduce hard-coded cross-charm path dependencies. The other option would be to have the hadoop-plugin interface provide more path info.
I'll run through sparkbench to see if any of those workloads trigger a need for SLP (SPARK_LIBRARY_PATH). This LGTM for now.
This LGTM, but is SPARK_CLASSPATH enough, or do we also need to set SPARK_LIBRARY_PATH as mentioned here?
http://xiaming.me/posts/2014/05/03/enable-lzo-compression-on-hadoop-pig-and-spark/#for-spark
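For context on the question above: the linked post's approach sets both variables, since the LZO codec needs its jar on the JVM classpath and its native library on the loader path. A hedged sketch of what the relevant spark-env.sh lines might look like (the paths here are illustrative assumptions, not the charm's actual install layout):

```shell
# spark-env.sh (illustrative; real paths depend on where the charm
# installs hadoop-lzo and its native libraries)

# Make the hadoop-lzo jar visible to Spark's JVM:
export SPARK_CLASSPATH="$SPARK_CLASSPATH:/usr/lib/hadoop/lib/hadoop-lzo.jar"

# Make the native compression libraries (libgplcompression, liblzo2)
# visible to the dynamic loader:
export SPARK_LIBRARY_PATH="$SPARK_LIBRARY_PATH:/usr/lib/hadoop/lib/native"
```

If only SPARK_CLASSPATH is set, the codec class would load but could still fail at runtime when it tries to resolve the native library, which is why the post recommends both.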