sramirez / spark-MDLP-discretization

Spark implementation of Fayyad's discretizer based on Minimum Description Length Principle (MDLP)
Apache License 2.0

Publish on Spark Packages Main Repository #1

Closed: brkyvz closed this issue 8 years ago

brkyvz commented 9 years ago

Hi @sramirez,

Would you like to make a release of this package on the Spark Packages Maven Repo? There is an sbt plugin called sbt-spark-package that can help you make the release straight from your sbt console. All you need to do is set a couple of configuration keys.
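Roughly, the configuration looks like this (a minimal sketch; the version values are illustrative placeholders, and `spName` is just your `:user/:repo` name on Spark Packages):

```scala
// build.sbt — settings read by the sbt-spark-package plugin.
// spName identifies the package on Spark Packages as ":user/:repo".
spName := "sramirez/spark-MDLP-discretization"

// Spark version to build against, and the Spark components the package
// uses; the plugin wires in the matching "provided" dependencies.
sparkVersion := "1.5.0"       // illustrative
sparkComponents += "mllib"    // illustrative
```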

Publishing on the Spark Packages Repo will bump your ranking on the website, and will fill in the How To section, which shows users how to include your package in their work.

Please let me know if you have any comments/questions/suggestions!

Best, Burak

sramirez commented 9 years ago

Hi @brkyvz,

Yesterday I finished the upload process successfully, though not without some problems. I also tried to use the sbt plugin for Scala, but I ran into issues with it. Concretely, I kept getting an error about spName, which was not recognized by sbt. I tried copying build.sbt and plugins.sbt files from other packages, but the problem persisted :(

I'll also upload a release for my other package, the one on feature selection.

Thanks for your attention.

brkyvz commented 9 years ago

Hi @sramirez, thank you for the feedback. I'll document the import of the keys better. Because you have a Build.scala file, you unfortunately need to import them explicitly. Did you still have problems even after adding `import sbtsparkpackage.SparkPackagePlugin.autoImport._`?
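In other words, the top of the .scala build definition would need something like this (a sketch; the object and project names are placeholders, only the import line is the essential part):

```scala
// project/Build.scala — a .scala build definition does not get sbt's
// auto-imports, so the plugin's keys have to be imported explicitly.
import sbt._
import sbt.Keys._
import sbtsparkpackage.SparkPackagePlugin.autoImport._

object MDLPBuild extends Build {  // placeholder object name
  lazy val root = Project(id = "spark-MDLP-discretization", base = file("."))
    .settings(
      // with the explicit import above, spName resolves correctly
      spName := "sramirez/spark-MDLP-discretization"
    )
}
```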

I would greatly appreciate any more feedback you might have!

Thanks, Burak

sramirez commented 9 years ago

Hi,

No luck. I've tried it several ways, but it always fails. I'm not very familiar with sbt. I think there is a problem with the import order between the project folder and build.sbt.

This is the output for the second package, infotheoretic-feature-selection:

```
[info] Loading project definition from /home/sramirez/git/spark-infotheoretic-feature-selection/project
[info] Updating {file:/home/sramirez/git/spark-infotheoretic-feature-selection/project/}spark-infotheoretic-feature-selection-build...
[info] Resolving org.scala-sbt#compiler-interface;0.13.1 ...
[info] Done updating.
/home/sramirez/git/spark-infotheoretic-feature-selection/build.sbt:19: error: not found: value sparkPackageName
sparkPackageName := "sramirez/infotheoretic-feature-selection"
```
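If my understanding is right, "not found: value sparkPackageName" just means the key doesn't exist on the build's classpath: either the plugin isn't being loaded from project/plugins.sbt at all, or the key name doesn't match the installed plugin version (the plugin's README documents spName). So plugins.sbt should contain something like the following (a sketch; the plugin version and resolver URL are illustrative and should be checked against the plugin's README):

```scala
// project/plugins.sbt — loads sbt-spark-package so that its keys
// (spName, sparkVersion, ...) are visible to build.sbt.
// The version number and resolver URL below are illustrative.
resolvers += "Spark Packages Repo" at "https://repos.spark-packages.org/"
addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")
```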

barrybecker4 commented 8 years ago

I'm not sure if this has anything to do with your difficulties, but I think the package should be org.apache.spark... and not org.apache. In the sbt file, I think it should be `organization := "org.apache.spark"` instead of `organization := "org.apache"`. That way the artifact appears in the same place as lots of other spark-* libs in the local ivy2 and maven2 repos. I made this change in my fork, but did not open a PR for it.
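To illustrate what I mean (a sketch; I'm assuming the artifact id follows the repo name):

```scala
// The organization setting becomes the Maven groupId, which determines
// where the artifact lands in the local repositories, e.g. roughly:
//   org.apache       -> ~/.ivy2/cache/org.apache/spark-mdlp-discretization/
//   org.apache.spark -> ~/.ivy2/cache/org.apache.spark/spark-mdlp-discretization/
// and ~/.m2/repository/org/apache/spark/... on the Maven side.
organization := "org.apache.spark"
```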

sramirez commented 8 years ago

@barrybecker4, could you do a PR with your build.sbt, please? It'd be nice to have a build.sbt in the project.

barrybecker4 commented 8 years ago

I did not add a new build.sbt file; I just made a few tweaks to the existing project/Build.scala file. I don't know enough about sbt to say what the correct approach is. There should probably be a build.sbt file at the top level, but I am currently able to run "sbt assembly" from the command line, so maybe it's OK for now.
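For what it's worth, I'd guess a minimal top-level build.sbt would look something like this (a sketch only, not the project's actual build; the version numbers are placeholders):

```scala
// build.sbt — minimal sketch for building against Spark. Spark itself is
// marked "provided" because the runtime environment supplies it.
name := "spark-MDLP-discretization"
version := "0.1.0"                   // placeholder
scalaVersion := "2.10.4"             // placeholder
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.5.0" % "provided"
```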