-
Hello, we do have this requirement at the moment, but after looking at how spark-submit submits a job, I found that this is not easy to implement.
After a job is submitted, spark-submit performs a series of operations, such as initializing the SparkContext, creating the DAGScheduler and TaskScheduler, requesting resources from the ResourceManager, and allocating executor threads.
My personal understanding is that if you want to add jobs dynamically, you need to let…
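A minimal PySpark sketch of that idea, under the assumption that "dynamically adding jobs" means running additional jobs against an already-initialized driver rather than launching a new spark-submit each time; the input paths below are placeholders:
```
from pyspark.sql import SparkSession

# The expensive one-time setup described above (SparkContext initialization,
# DAGScheduler/TaskScheduler creation, resource requests to the ResourceManager,
# executor allocation) happens once, when the session is first created.
spark = SparkSession.builder.appName("long-lived-driver").getOrCreate()

def run_job(path):
    # Every action triggered through the same SparkContext becomes a new job;
    # no additional spark-submit and no second round of scheduler setup is needed.
    return spark.read.text(path).count()

# "Dynamically added" jobs are then just further calls against the live session,
# driven by whatever mechanism feeds in new work (a queue, an RPC endpoint,
# or a REST layer such as Livy).
for path in ["wasb:///data/first.txt", "wasb:///data/second.txt"]:  # placeholder paths
    print(path, run_job(path))

spark.stop()
```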
-
```
Package and deploy the job to Spark cluster
INFO: Begin uploading file C:\Users\rufan\IdeaProjects\spark23mvn_0930\out\artifacts\spark23mvn_0930_DefaultArtifact\default_artifact.jar to Azure Blo…
```
-
I've installed Combine on an AWS instance via Ansible. It's 2 cores, 8 GB RAM, 20 GB disk, Ubuntu 18.04, Python 2.7. Whenever I try to navigate to `/combine/system`, I get a number of 502 Bad Gatew…
-
This documentation on known issues with external JARs is helpful and I am glad that it is included: https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-livy-rest-interface#using-an-ext…
-
Update on:
* system page
* documentation
* background tasks screen?
-
pyspark3 kernel
livy 0.4.0
python 3.5 on a cluster
python 3.6 locally
sparkmagic 0.12.5
notebook 5.4.0
We have upgraded our cluster with Ambari, which also bumped Livy's version up to 0.…
-
Build: Private 2641
Repro Steps:
1. Create an empty artifact via File > Project Structure > Artifacts > + > JAR > Empty > OK
2. Create an HDInsight config file, select the empty artifact, submit
![image](ht…
-
xgboost does not support Spark 2.2, perhaps because the Spark version is pinned to 2.1.0 in the pom.xml below:
https://github.com/dmlc/xgboost/blob/master/jvm-packages/pom.xml
It would be great to upgrade or make it flexible for compi…
-
Is it possible to submit a Livy Spark batch job that references a Python file instead of a JAR file? I have tried something like the following, but the job fails:
curl -k --user "user:pwd" -v -H "Conten…
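For comparison, here is a minimal sketch of a batch submission against Livy's `POST /batches` endpoint using Python's `requests` instead of curl. The endpoint, credentials, and `wasb://` paths are placeholders, and the optional `pyFiles`/`args` fields are shown only for illustration:
```
import json

import requests

# Placeholder endpoint and credentials -- replace with the actual cluster values.
LIVY_URL = "https://<clustername>.azurehdinsight.net/livy/batches"
AUTH = ("user", "pwd")

# Livy's batch API takes the application in the "file" field; for a Python job
# this points at a .py script instead of a JAR.
payload = {
    "file": "wasb:///example/app.py",             # placeholder path to the script
    "pyFiles": ["wasb:///example/helpers.py"],    # optional extra .py dependencies
    "args": ["--input", "wasb:///example/data"],  # optional command-line arguments
}

response = requests.post(
    LIVY_URL,
    auth=AUTH,
    headers={
        "Content-Type": "application/json",
        "X-Requested-By": "user",  # required when Livy's CSRF protection is enabled
    },
    data=json.dumps(payload),
    verify=False,  # mirrors curl's -k; only appropriate against test clusters
)
print(response.status_code, response.text)
```
If the script is found but fails at runtime, the batch log (`GET /batches/{batchId}/log`) usually shows the underlying Python error.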
-
Beyond the basic firewall rules one can put in place, we would like to tie all proxied services into a single authentication mechanism. Whether or not that is basic auth through NGINX or something mor…