bluenoah1991 / Vector-Tile-Spark-Process

:earth_asia: Clip geographic data into MVT files based on Apache Spark
MIT License

something wrong in jts #1

Open CaoZPLQ opened 7 months ago

CaoZPLQ commented 7 months ago

```
java.lang.NoSuchMethodError: org.geotools.geometry.jts.JTS.toGeometry(Lcom/vividsolutions/jts/geom/Envelope;)Lcom/vividsolutions/jts/geom/Polygon;
    at org.ieee.codemeow.geometric.GeometricUtils$$anonfun$intersectedTiles$1.apply(GeometricUtils.scala:60)
    at org.ieee.codemeow.geometric.GeometricUtils$$anonfun$intersectedTiles$1.apply(GeometricUtils.scala:57)
    at scala.collection.TraversableLike$$anonfun$filterImpl$1.apply(TraversableLike.scala:248)
    at scala.collection.Iterator$class.foreach(Iterator.scala:891)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableLike$class.filterImpl(TraversableLike.scala:247)
    at scala.collection.TraversableLike$class.filter(TraversableLike.scala:259)
    at scala.collection.AbstractTraversable.filter(Traversable.scala:104)
    at org.ieee.codemeow.geometric.GeometricUtils$.intersectedTiles(GeometricUtils.scala:57)
    at org.ieee.codemeow.geometric.spark.VectorTileTask$$anonfun$3$$anonfun$4.apply(VectorTileTask.scala:64)
    at org.ieee.codemeow.geometric.spark.VectorTileTask$$anonfun$3$$anonfun$4.apply(VectorTileTask.scala:63)
    at org.apache.spark.sql.execution.MapElementsExec$$anonfun$7$$anonfun$apply$1.apply(objects.scala:236)
    at org.apache.spark.sql.execution.MapElementsExec$$anonfun$7$$anonfun$apply$1.apply(objects.scala:236)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:410)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:410)
    at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
    at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
    at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:125)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
    at org.apache.spark.scheduler.Task.run(Task.scala:121)
    at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
```

Hi, I ran the mvn clean and mvn package commands and then launched the job with spark-submit, but it failed with the error above.
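For context, a NoSuchMethodError on org.geotools.geometry.jts.JTS.toGeometry(com.vividsolutions.jts.geom.Envelope) usually points to a GeoTools/JTS version mismatch rather than a bug in this project's code: GeoTools 20 and later link against the relocated org.locationtech.jts packages, while the signature in the stack trace expects the older com.vividsolutions.jts packages. If the application jar was compiled against one family but a GeoTools jar built against the other ends up on the Spark executor classpath, the method lookup fails at runtime. The sketch below only illustrates the call that is failing; the object name TileClipSketch and the method tilePolygon are hypothetical and not part of this repository.

```scala
// Minimal sketch, assuming a pre-relocation GeoTools build (JTS still under
// com.vividsolutions). Version boundaries are an assumption, not taken from
// this repository's pom.xml.
import com.vividsolutions.jts.geom.{Envelope, Polygon}
import org.geotools.geometry.jts.JTS

object TileClipSketch {
  def tilePolygon(minX: Double, minY: Double, maxX: Double, maxY: Double): Polygon = {
    val envelope = new Envelope(minX, maxX, minY, maxY)
    // A NoSuchMethodError here at runtime means the code was compiled against
    // one JTS package family while the GeoTools jar on the executor classpath
    // was built against the other.
    JTS.toGeometry(envelope)
  }
}
```

A likely remedy, under the same assumption, is to keep every GeoTools artifact pinned to a single version that still uses com.vividsolutions JTS (or to rebuild the project against the relocated packages), and to ship a single shaded jar to spark-submit so the executors do not pick up a conflicting GeoTools from elsewhere on the cluster.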

bluenoah1991 commented 7 months ago

Sorry, I haven't maintained this project for 7 years and have forgotten everything about it. You're on your own. :(