-
Feedback from extensions webinar...
For, say:
```scala
socketTextStream(hostname: String, port: Int, storageLevel: StorageLevel = StorageLevel.MEMORY_AND_DISK_SER_2)
```
One needs to explic…
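A minimal sketch of how such a call ends up looking from sparklyr, assuming `sc` is an open `spark_connect()` connection and `ssc` is a StreamingContext handle obtained elsewhere; because Scala default arguments are not filled in over the invoke bridge, the storage level has to be passed explicitly:

``` r
# Sketch only: `sc` and `ssc` are assumed to exist; hostname and port are placeholders.
library(sparklyr)

storage <- invoke_static(sc, "org.apache.spark.storage.StorageLevel",
                         "MEMORY_AND_DISK_SER_2")
stream  <- invoke(ssc, "socketTextStream", "localhost", 9999L, storage)
```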
-
When I run the script on rstudio-server (web):
``` r
library(sparklyr)
library(dplyr)
library(DBI)
sc sparklyr::spark_home_dir()
NULL
> sc
```
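A hedged sketch of the usual way past a `NULL` result from `spark_home_dir()` on a fresh rstudio-server box, assuming a local Spark install is acceptable: install a Spark build first (or point SPARK_HOME at an existing one), then connect.

``` r
# Sketch, assuming a local install on the server is acceptable.
library(sparklyr)

spark_install()                 # download a Spark build into sparklyr's cache
sparklyr::spark_home_dir()      # should now return the install path instead of NULL
sc <- spark_connect(master = "local")
```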
-
Probably the weirdest bug I've encountered in a long time - SparkR (and sparklyr, by extension) cannot handle more than 16 concurrent Spark sessions from a single machine under the default settings.
…
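One hedged workaround sketch, assuming the ceiling comes from Spark's default `spark.port.maxRetries` of 16 (each additional concurrent session has to bind another UI/driver port); this is an assumption about the cause, not a confirmed fix:

``` r
# Assumption: session 17+ fails because port retries are exhausted under defaults.
library(sparklyr)

config <- spark_config()
config[["spark.port.maxRetries"]] <- 64   # allow more concurrent UI/driver ports
sc <- spark_connect(master = "local", config = config)
```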
-
I am trying to use sparklyr to connect to a remote Spark server.
For example, the cluster IP is 139.133.566.90 and the port is 10001, so I wrote:
sc
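The truncated `sc` line above is left as reported; independently of it, a minimal sketch of what the connection call could look like, assuming the endpoint is a standalone Spark master (the `spark://` URL scheme) at the quoted, unverified address:

``` r
# Assumes a standalone Spark master; adjust the scheme if the endpoint is
# actually YARN or a Hive thrift server.
library(sparklyr)

sc <- spark_connect(master = "spark://139.133.566.90:10001")
```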
-
It would be useful if there could be some way to parse basic Scala code declared from R, such that the following types of call are possible:
``` r
new_data_frame %>%
  invoke("map", "(x: Row) => x")
```
…
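For contrast, a hedged sketch of the pattern that works today without any Scala parsing: the closure lives in a compiled class shipped as a sparklyr extension jar, and R only reaches it through `invoke_static()`; `com.example.RowOps` and its `identityMap` method are hypothetical names.

``` r
# Hypothetical helper compiled into an extension jar; sparklyr currently only
# passes object handles and literals, it does not compile Scala source from R.
library(sparklyr)

result <- invoke_static(sc, "com.example.RowOps", "identityMap", new_data_frame)
```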
-
# Reporting an Issue with sparklyr na.omit()
---
Unfortunately I cannot provide a workable example, but here is my issue.
I have a dataset with 50 features; only 1 feature has a lot of missing v…
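A hedged stand-in for the missing reproducible example, assuming `sc` is an open connection and that the `na.omit()` method for Spark tables referred to in the title is available in the installed sparklyr version; `mtcars` plays the role of the 50-feature dataset, with NAs injected into a single column as described.

``` r
# Stand-in data: one column with many missing values, then na.omit() on the
# Spark table, which drops every row containing any NA.
library(sparklyr)
library(dplyr)

local_df <- mtcars
local_df$mpg[1:10] <- NA
tbl <- copy_to(sc, local_df, "na_example", overwrite = TRUE)
cleaned <- na.omit(tbl)
```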
-
Hello,
The code:
``` r
install.packages("sparklyr", repos = "http://cran.rstudio.com/")
library(sparklyr)
library(dplyr)
# spark_available_versions()
spark_install(version = "2.0.2", hadoop_version = …
```
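A hedged continuation sketch: once the install above finishes, a local connection against that version typically looks like the following (the choice of a local master is an assumption).

``` r
# Connect to the locally installed Spark 2.0.2 build.
library(sparklyr)

sc <- spark_connect(master = "local", version = "2.0.2")
```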
-
Hi,
We are using sparklyr 0.4.16 on our cluster and are trying to connect to Spark using the spark_connect function.
The commands are as follows:
``` r
Sys.setenv(SPARK_HOME = "/opt/cloudera/parcels/CDH/lib/spa…
```
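A hedged illustration, not the reporter's exact commands: on a CDH edge node the usual pattern sets SPARK_HOME to the parcel path quoted above and then connects with a YARN client master; `yarn-client` is an assumption about the cluster manager.

``` r
# Assumption: Spark on this CDH cluster is managed by YARN.
library(sparklyr)

Sys.setenv(SPARK_HOME = "/opt/cloudera/parcels/CDH/lib/spark")
sc <- spark_connect(master = "yarn-client")
```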
-
Hi,
I cannot set up the spark_connection on Spark 2.0.0 following the instructions at `http://spark.rstudio.com/deployment.html`.
However, I could do so on Spark 1.6.x.
For different versions of spar…
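A hedged sketch of a version-aware connect call for Spark 2.x, assuming a local master and a placeholder SPARK_HOME; passing `version` so that sparklyr loads the backend built for Spark 2.0 is an assumption about what differs between the 1.6.x and 2.0.0 setups.

``` r
# Placeholder SPARK_HOME; the key point is the explicit `version` argument.
library(sparklyr)

Sys.setenv(SPARK_HOME = "/path/to/spark-2.0.0")
sc <- spark_connect(master = "local", version = "2.0.0")
```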
-
[zeromq.org](http://zeromq.org/)