com-lihaoyi / Ammonite

Scala Scripting
http://ammonite.io
MIT License

Ammonite does not properly import Ammonite scripts with several @ separators #744

Open antonkulaga opened 6 years ago

antonkulaga commented 6 years ago

When I create a script for Ammonite 2.11 that has one or two @ separators in it, it is not imported properly. For example, here is my spark.sc:

import coursier.MavenRepository
interp.repositories() ++= Seq(
  MavenRepository("https://dl.bintray.com/denigma/denigma-releases"),
  MavenRepository("https://dl.bintray.com/comp-bio-aging/main/"),
  MavenRepository("https://oss.sonatype.org/content/repositories/releases")
)

@
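// the @ above ends the first stage, so the extra repositories are meant to take effect before the $ivy imports below are resolved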
import $ivy.{ 
  `org.apache.spark::spark-core:2.2.1`,
  `comp.bio.aging::adam-playground:0.0.9`
}
@
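// after this second @, the artifacts loaded via $ivy are expected to be on the classpath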
import ammonite.ops._
import org.apache.spark.{SparkContext, SparkConf}

val conf = new SparkConf().setMaster("local[*]").setAppName("bxm")
val sc = new SparkContext(conf)

When I run Ammonite and then import my spark script:

import $file.spark
val sc = spark.sc //DOES NOT WORK

I do not see any of the local variables inside spark. As a workaround, I had to put the dependencies into a separate script and import $file.deps inside spark.sc. However, having two scripts, where one exists only for the dependencies, is not convenient.
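For reference, here is a rough sketch of that workaround. The file names deps.sc and spark.sc follow the imports mentioned above, and the contents are just the original script split in two:

// deps.sc: repositories and dependency imports only
import coursier.MavenRepository
interp.repositories() ++= Seq(
  MavenRepository("https://dl.bintray.com/denigma/denigma-releases"),
  MavenRepository("https://dl.bintray.com/comp-bio-aging/main/"),
  MavenRepository("https://oss.sonatype.org/content/repositories/releases")
)

@

import $ivy.{
  `org.apache.spark::spark-core:2.2.1`,
  `comp.bio.aging::adam-playground:0.0.9`
}

// spark.sc: a plain $file import pulls in the dependencies, so no @ separators are needed here
import $file.deps
import ammonite.ops._
import org.apache.spark.{SparkContext, SparkConf}

val conf = new SparkConf().setMaster("local[*]").setAppName("bxm")
val sc = new SparkContext(conf)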

leo-bogastry commented 6 years ago

Thank you @antonkulaga. I ran into this same issue, and moving the problematic dependency to a separate file solved it. It would be nice to see this open issue resolved.

dynofu commented 6 years ago

I am curious: what is this @? A script divider? Is there any documentation about its usage? Thanks.

sake92 commented 6 years ago

@antonkulaga Shouldn't import $ivy.. instead be interp.load.ivy(..)? See the multi-stage scripts section. (My mistake, that's just for artifacts that depend on runtime values.)

@dynofu It is mentioned in the multi-stage scripts section.
Since interp.load.ivy(..) isn't a normal Scala method call (Ammonite needs to download the JARs and put them on the classpath before it can compile the rest of the script), a separator is needed to mark where that should happen.
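
For completeness, here is a rough sketch of the interp.load.ivy(..) multi-stage pattern referred to above. It is only an illustration: the SPARK_VERSION environment variable is made up, and it assumes the %%/% string syntax from Ammonite's default predef for building coursier dependencies, whose exact shape varies between Ammonite versions.

// first stage: resolve a dependency whose version is only known at runtime
val sparkVersion = sys.env.getOrElse("SPARK_VERSION", "2.2.1")
interp.load.ivy("org.apache.spark" %% "spark-core" % sparkVersion)

@

// second stage: compiled only after the first stage has run,
// so spark-core is already on the classpath here
import org.apache.spark.{SparkConf, SparkContext}
val conf = new SparkConf().setMaster("local[*]").setAppName("demo")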