I'm afraid that you will not be able to call de.frosner.dds.core.DDS.bar directly, because it expects a Scala data frame. You would need to write glue code in Scala for each of these functions that takes the name of the temp table (which is the UUID, I guess?).
Also you should remove the temp table afterwards.
Thanks, updated the initial issue posting. Let's try this ...
The longer I think about this, the less sense it makes. I think it is much easier and cleaner to just add some wrapper functions in R around the existing and evolving SparkR functionality, provided one had an easy way to visualize the results in R.
I think it is much cleaner to enable a generic way of calling Spark packages from PySpark and SparkR.
I agree and googled again but something like this doesn't exist atm. As a senior scala developer, you are welcome to submit a proposal/issue. :) If you agree, I will close this issue for now.
:+1:
Problem

DDS should be integrated with SparkR so that useRs can use DDS functionality from the SparkR shell.

Literature

Possible Strategy

i. Write glue code functions for DDS which take a UUID and get the corresponding table.

ii. Create an R package which provides an init() function in DDS to hand over the current SparkContext sc object.

iii. For each visualization function in DDS, provide an R function (e.g., bar() for bar()), which

- takes a DataFrame or Column object,
- registers it via SparkR::registerAsTempTable() and computes a UUID,
- calls the glue function via SparkR:::callJStatic(...) (e.g., SparkR:::callJStatic(className = "de.frosner.dds.core.DDS", methodName = "bar", UUID)),
- converts the result (jobj) to some appropriate R object (e.g., a data.frame), and
- uses R plotting (e.g., barplot()) to visualize the results.
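To make step iii concrete, here is a minimal sketch of what such a wrapper could look like in SparkR. Everything here is hypothetical: the wrapper name ddsBar, the UUID scheme, and the conversion step are all assumptions, and the Scala glue method taking a table name does not exist yet.

```r
# Hypothetical SparkR wrapper for a DDS bar chart (untested sketch).
# Assumes a running SparkR 1.x session with `sqlContext` in scope and a
# yet-to-be-written Scala glue method DDS.bar(tableName: String).
ddsBar <- function(df) {
  # Register the DataFrame under a unique temp table name (the "UUID")
  uuid <- paste0("dds_", paste(sample(c(letters, 0:9), 16, replace = TRUE),
                               collapse = ""))
  SparkR::registerTempTable(df, uuid)

  # Call the Scala glue code via SparkR's internal JVM bridge
  result <- SparkR:::callJStatic("de.frosner.dds.core.DDS", "bar", uuid)

  # Convert the returned jobj to an R object; what this step looks like
  # depends entirely on what the glue code returns (assumed here to yield
  # a local data.frame with a category column and a count column)
  localDf <- as.data.frame(result)

  # Clean up the temp table and visualize with base R graphics
  SparkR::dropTempTable(sqlContext, uuid)
  barplot(height = localDf[[2]], names.arg = localDf[[1]])
}
```

The jobj-to-data.frame conversion is the open question here: it only works if the glue code returns something SparkR can deserialize, which is exactly the part that would need the per-function Scala glue mentioned above.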