sbt / sbt-pom-reader

Translates xml -> awesome. Maven-ish support for sbt.

Check for scope: if test, assign test scope to interdependent projects. #14

Closed ScrapCodes closed 7 years ago

ScrapCodes commented 10 years ago

With this, the test-jar struggle ends. We no longer need to detect that case; I somehow "feel" that if a project dependency on another project (in a multi-project build) has test scope, then we should do this.
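The rule being proposed could be sketched roughly like this (a hypothetical simplification, not the plugin's actual code; `ScopeMapping` and its cases are illustrative only):

```scala
// Hypothetical sketch of the mapping rule discussed above: when an
// inter-project dependency carries Maven's test scope, map it to sbt's
// "test->test" configuration so the other project's test classes are
// visible; otherwise keep an ordinary compile-time dependency.
object ScopeMapping {
  def sbtConfiguration(mavenScope: Option[String]): String =
    mavenScope match {
      case Some("test") => "test->test"      // test code sees the other project's test code
      case Some(other)  => s"compile->$other" // e.g. "compile->provided"
      case None         => "compile->compile" // Maven's default scope
    }
}
```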

ScrapCodes commented 10 years ago

Hey @jsuereth, can you take another look? Does it look good now?

metasim commented 9 years ago

@ScrapCodes Do you think this PR is still necessary? If so, could you please add a test case that triggers the bug before the PR, and yet passes after the PR is applied?

My intuition is that messing with the translation between Maven and sbt scopes without adequate tests could miss some edge cases and potentially let issues leak through.

JoshRosen commented 8 years ago

Apache Spark currently uses a custom fork of the sbt-pom-reader plugin which incorporates this change, and getting this patch merged upstream is probably a blocker to us moving back to the official sbt-pom-reader releases (see https://issues.apache.org/jira/browse/SPARK-14401). While I don't have time to work on this personally, I bet that we could try reverting this change in Spark's build, figure out what breaks, and then use that knowledge to derive a regression test to add here.

JoshRosen commented 8 years ago

I attempted to use version 2.1.0-RC2 of this plugin in Spark (https://github.com/apache/spark/pull/12310) and have an example of a test which fails without these changes:

[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/graphx/src/test/scala/org/apache/spark/graphx/EdgeRDDSuite.scala:20: object SparkFunSuite is not a member of package org.apache.spark
[error] import org.apache.spark.SparkFunSuite
[error]        ^

In this case, we have a spark-core project whose test-jar is supposed to include SparkFunSuite and a spark-graphx project which depends on spark-core in compile scope and also depends on the spark-core test-jar in test scope:

<dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_${scala.binary.version}</artifactId>
      <version>${project.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_${scala.binary.version}</artifactId>
      <version>${project.version}</version>
      <type>test-jar</type>
      <scope>test</scope>
    </dependency>
[...]

I haven't dug in too deeply yet and will try to pull the output of dependencyTree from SBT later today, but my hunch is that this is an instance of #38. I therefore think it might be good to produce a new pull request which more narrowly scopes this change to apply only to test-jar dependencies.
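For reference, the sbt-side equivalent of the POM above would presumably look something like the following sketch (project and file names are illustrative, not Spark's actual build definition):

```scala
// build.sbt sketch (hypothetical project names): graphx's compile code
// depends on core's compile output, and graphx's test code additionally
// depends on core's test output -- the analogue of Maven's test-jar.
lazy val core = project.in(file("core"))

lazy val graphx = project.in(file("graphx"))
  .dependsOn(core % "compile->compile;test->test")
```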

JoshRosen commented 8 years ago

This also looks related to #37. In short, it seems that several folks are experiencing similar problems with test-jar dependencies.

JoshRosen commented 8 years ago

Another subtlety which I don't think this current PR handles correctly: in Maven, declaring a dependency on a test-jar does not pull in that test-jar artifact's transitive dependencies (https://issues.apache.org/jira/browse/MNG-1378), whereas I think the approach taken here does pull in those transitive deps. We've been bitten by this in Spark before when people add only a test-jar dependency, forget its transitive dependencies, then see the SBT build pass, commit, and have the Maven build fail.
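To mimic Maven's non-transitive treatment of test-jars, an sbt mapping that goes through a library dependency (rather than `dependsOn`) would presumably need sbt's `intransitive()` modifier. A sketch, with illustrative coordinates only:

```scala
// build.sbt sketch: Maven does not resolve a test-jar's transitive
// dependencies (MNG-1378), so an equivalent sbt dependency would need
// intransitive() to match that behavior. Coordinates are hypothetical.
libraryDependencies += ("org.apache.spark" %% "spark-core" % "2.0.0" % Test)
  .classifier("tests")  // Maven publishes test-jars with the "tests" classifier
  .intransitive()       // do not pull in the test-jar's own dependencies
```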

metasim commented 7 years ago

This has had open questions but no progress for long enough that I'm closing it to get it out of the list. Reopen if there's disagreement.