vesoft-inc / nebula-spark-utils

Spark related libraries and tools

Can PySpark use the Nebula Spark Connector? #122

Closed wuyanxin closed 2 years ago

wuyanxin commented 3 years ago

Is there a way to import data into Nebula the way Elasticsearch does? For example:

    from collections import OrderedDict

    options = OrderedDict()
    options["es.nodes"] = "127.0.0.1:9200"  # comma-separated host list, as a string
    options["es.index.auto.create"] = "true"
    options["es.resource"] = "nebula/docs"

    df.write.format("org.elasticsearch.spark.sql") \
            .options(**options) \
            .save(mode='append')
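Since Spark resolves data sources by class name at runtime, PySpark could in principle write through the JVM-only connector via `DataFrameWriter.format(...)`, provided the connector JAR is on the Spark classpath. This is only a sketch of what such an interface might look like; the `NebulaDataSource` class name and every option key below are assumptions, not a documented API.

```python
# Hypothetical sketch of writing a DataFrame to Nebula from PySpark.
# The format class name and all option keys are assumptions.
options = {
    "metaAddress": "127.0.0.1:9559",  # assumed option key for the meta service
    "spaceName": "test",              # assumed option key for the graph space
    "label": "person",                # assumed option key for the tag/edge name
}

# With a SparkSession `spark` and a DataFrame `df`, the write might look like:
# df.write.format("com.vesoft.nebula.connector.NebulaDataSource") \
#     .options(**options) \
#     .mode("append") \
#     .save()
```

The options dict mirrors the Elasticsearch pattern above; the actual keys would have to come from the connector's configuration once a PySpark path is supported.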
wey-gu commented 3 years ago

Thank you @wuyanxin for the question. @Nicole00 could you help with this? Could we do something like this?

from py4j.java_gateway import java_import

# Import the JVM class into the gateway's namespace, then instantiate it
java_import(sc._gateway.jvm, "org.foo.module.Foo")

foo = sc._gateway.jvm.Foo()
foo.fooMethod()
wey-gu commented 3 years ago

Checked with @Nicole00: we don't provide a PySpark interface for now. I labeled this as an enhancement. Also, I'm not sure whether the py4j workaround could help a bit before official support lands.

Nicole00 commented 2 years ago

We do not support PySpark at the moment.