brkyvz closed this issue 9 years ago
Hi Burak, Thanks a lot for your email. I will go through the spark package tool and will make a release of this consumer code. Regards, Dibyendu
On Thursday, 2 April 2015 2:12 AM, Burak Yavuz <notifications@github.com> wrote:
Hi, Would you like to make a release of this on the Spark Packages Repository? This will allow users to easily include this package in their Spark applications simply by adding the flag --packages dibbhatt/kafka-spark-consumer:0.1 to spark-shell, spark-submit, or even pyspark. For this, you need to upload a "Release Artifact". You can make the release directly from the command line using the spark-package command tool, with the command spark-package publish. Please refer to the README. Or you can go through the Release process on the webpage. Since your project contains Java code, you will need to build your jar beforehand using Maven. Let me know if you have any questions or issues! Best, Burak
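The publish flow Burak describes can be sketched roughly as follows (a sketch only; it assumes a standard Maven project layout, and the exact options and authentication steps of the spark-package tool are covered in its README, not here):

```shell
# Build the jar first, since the project contains Java code
mvn clean package

# Then publish the release artifact to Spark Packages using the
# spark-package command-line tool (see its README for credentials
# and any additional options)
spark-package publish
```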
Hi Dibyendu, Did you have a chance to try out the command tool? Did you face any difficulties, was it confusing? Do you have any feedback?
Thanks, Burak
Hi Burak
Sorry, I have not had a chance to try the tool. In fact, I wanted to make some API changes in my consumer and could not get them done, as I was tied up with something else. Today I made those changes, and I am now ready to publish the build. I plan to try the tool over the weekend and will let you know if I face any issues. Sorry again for the delay, and thanks for following up.
Regards Dibyendu
kafka-spark-consumer is now released on Spark-Packages
http://spark-packages.org/package/dibbhatt/kafka-spark-consumer
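With the package released, users can include it as described earlier in the thread. For example (version 0.1 is the one mentioned above; substitute whatever version is current on the package page):

```shell
# Pull the consumer into an interactive Spark shell session
spark-shell --packages dibbhatt/kafka-spark-consumer:0.1

# The same flag works for spark-submit (my-app.jar is a placeholder
# for your own application jar)
spark-submit --packages dibbhatt/kafka-spark-consumer:0.1 my-app.jar
```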