Closed Horkyze closed 5 years ago
Uh, what exactly are you using Maven and Ant for? My recommendation is to use one, not both (preferably Maven).
Ok, thanks. I managed to add Spark by adding the jar file to build.xml: <include name="spark-core-2.7.2.jar"/>
Now compiling with ant works fine.
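For anyone following along, a sketch of the kind of build.xml fragment involved, assuming the jar sits in a lib/ directory (the target and directory names here are illustrative, not from my actual build file):

```xml
<!-- Illustrative Ant fragment: compile against every jar in lib/ -->
<path id="classpath">
    <fileset dir="lib">
        <include name="spark-core-2.7.2.jar"/>
    </fileset>
</path>

<target name="compile">
    <javac srcdir="src" destdir="build" classpathref="classpath"/>
</target>
```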
However, when I tried to run the hello world example by adding this:
import static spark.Spark.*;
...
get("/hello", (req, res) -> "Hello World");
an exception is raised. It looks like there is a problem starting the embedded Jetty server. Is there any other dependency I need to import?
Exception in thread "Thread-0" java.lang.NoClassDefFoundError: javax/servlet/Filter
at spark.embeddedserver.EmbeddedServers.initialize(EmbeddedServers.java:40)
at spark.Service.lambda$init$2(Service.java:536)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: javax.servlet.Filter
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 3 more
I suggest you try using only Maven, see: http://sparkjava.com/tutorials/maven-setup
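With Maven, you only declare the spark-core dependency and Jetty plus javax.servlet come in transitively, which is exactly the class missing in your stack trace. Roughly what the pom.xml entry from that tutorial looks like (version chosen to match your jar):

```xml
<!-- spark-core pulls in Jetty and the javax.servlet API transitively -->
<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>2.7.2</version>
</dependency>
```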
Couldn't spot a clear "deploying Spark" guide, but the deployment instructions for Heroku have a nice example of using maven-assembly-plugin
to assemble a JAR including dependencies: http://sparkjava.com/tutorials/heroku
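A sketch of the plugin configuration that tutorial uses (the main class name below is a placeholder for your own entry point):

```xml
<!-- Build a single runnable jar with all dependencies included -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <configuration>
        <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
        <archive>
            <manifest>
                <mainClass>com.example.Main</mainClass>
            </manifest>
        </archive>
    </configuration>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```

Running mvn package then produces a *-jar-with-dependencies.jar that you can run directly with java -jar.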
Thanks, I managed to make it work now :) If you are interested, https://github.com/MWGA/floodlight/commit/a601d1897fc39263dbbf7461d08b537539a8f2a9 is the current state that works.
I compile with mvn install
I've imported the dependency into pom.xml:
Ran
mvn clean && mvn compile
and then ant
to build. However, it says the dependency cannot be found. Any idea what might be wrong?