findify / s3mock

Embedded S3 server for easy mocking
MIT License

java.lang.ClassNotFoundException: scala.$less$colon$less #178

Open coperator opened 3 years ago

coperator commented 3 years ago

I get the error `java.lang.ClassNotFoundException: scala.$less$colon$less` at this line:

    S3Mock api = new S3Mock.Builder().withPort(8001).withInMemoryBackend().build();

My pom.xml:

    <dependency>
        <groupId>io.findify</groupId>
        <artifactId>s3mock_2.13</artifactId>
        <version>0.2.6</version>
        <scope>test</scope>
    </dependency>

What could be the reason?

shuttie commented 3 years ago

Can you show your other dependencies? It looks like you have Scala 2.12 somewhere on your classpath. You can try the s3mock_2.12 artifact and see if that helps.
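For context (hedged, based on the Scala 2.12-to-2.13 library changes): `scala.$less$colon$less` is the JVM-mangled name of the type `scala.<:<`, which exists as a top-level class only in the Scala 2.13 standard library; in 2.12 the equivalent type is the inner class `scala.Predef$$less$colon$less`. So this exception usually means a 2.12 `scala-library` jar is shadowing the 2.13 one that `s3mock_2.13` needs. A minimal probe (class name `ScalaClasspathProbe` is hypothetical) to see which library your application actually loads:

```java
// Probe which Scala standard library is on the classpath.
// scala.$less$colon$less     -> top-level scala.<:< (Scala 2.13 stdlib)
// scala.Predef$$less$colon$less -> Predef.<:< inner class (Scala 2.12 stdlib)
public class ScalaClasspathProbe {

    static boolean isPresent(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("Scala 2.13 <:< present: " + isPresent("scala.$less$colon$less"));
        System.out.println("Scala 2.12 <:< present: " + isPresent("scala.Predef$$less$colon$less"));
    }
}
```

If the 2.12 probe reports `true` while 2.13 reports `false`, running `mvn dependency:tree -Dincludes=org.scala-lang` should show which dependency is pulling in the 2.12 `scala-library`.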

zaxeer commented 3 years ago

I also get this error after upgrading my Spring Boot application from OpenJDK 8 to OpenJDK 11 and from Spring Boot 2.2.4.RELEASE to 2.4.5. These upgrades were mandatory, so I had to do them; even downgrading to s3mock_2.12 still causes the Scala error. I have no Scala know-how, so I'm writing here.

NagulSaida commented 2 years ago

I am running into the same issue. Please suggest a solution.

patiludayk commented 1 year ago

With the dependencies below, I am getting the same error while running the simple hello-world program from the README file:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.12</artifactId>
        <version>3.3.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.13</artifactId>
        <version>3.3.1</version>
    </dependency>
    import org.apache.spark.api.java.function.FilterFunction;
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.SparkSession;

    String logFile = "/java/spark/spark/README.md"; // Should be some file on your system
    SparkSession spark = SparkSession.builder()
            .appName("Simple Application")
            .config("spark.master", "local")
            .getOrCreate();
    Dataset<String> logData = spark.read().textFile(logFile).cache();

    long numAs = logData.filter((FilterFunction<String>) s -> s.contains("a")).count();
    long numBs = logData.filter((FilterFunction<String>) s -> s.contains("b")).count();

    System.out.println("Lines with a: " + numAs + ", lines with b: " + numBs);

    spark.stop();
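A likely culprit in the dependencies above: `spark-core_2.12` and `spark-sql_2.13` mix Scala binary versions. The `_2.12`/`_2.13` suffix is the Scala binary version, and artifacts compiled against different Scala binary versions are not compatible on the same classpath, which produces exactly this kind of `ClassNotFoundException`. A sketch of an aligned pair (keeping the versions from the original snippet; both artifacts could equally use `_2.13` instead):

    <!-- Both Spark artifacts on the same Scala binary version (2.12 here). -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.12</artifactId>
        <version>3.3.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.12</artifactId>
        <version>3.3.1</version>
    </dependency>

The same rule applies to s3mock itself: `s3mock_2.13` should only be combined with other `_2.13` artifacts (and the 2.13 `scala-library`).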