Closed: joaofauvel closed this issue 3 weeks ago
I have not reproduced this problem on DBR 15.3, so this could be a configuration problem.
The fourth parameter of ST_DWithin was introduced in Sedona 1.6.0. The error message saying that ST_DWithin takes at most 3 arguments indicates that an older version of Sedona is in use. Please check that the Sedona 1.6.1 JAR is deployed to your DBR cluster, and that it is the only Sedona JAR deployed; mixing multiple versions of Sedona JARs will lead to all sorts of strange behavior.
If you are using the init script described here, please make sure that the workspace directory /Workspace/Shared/sedona/1.6.1/ contains only one sedona-spark-shaded JAR.
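A minimal sketch of that check, runnable from a notebook cell (the directory path and `sedona-spark-shaded*` filename pattern are assumptions based on the init-script convention above):

```python
from pathlib import Path


def find_shaded_jars(workspace_dir):
    """Return all sedona-spark-shaded JARs found in workspace_dir."""
    return sorted(Path(workspace_dir).glob("sedona-spark-shaded*.jar"))


def check_single_jar(workspace_dir):
    """Raise unless exactly one shaded JAR is present; return its path."""
    jars = find_shaded_jars(workspace_dir)
    if len(jars) != 1:
        raise RuntimeError(
            f"Expected exactly one sedona-spark-shaded JAR in {workspace_dir}, "
            f"found {len(jars)}: {[j.name for j in jars]}"
        )
    return jars[0]
```

Running `check_single_jar("/Workspace/Shared/sedona/1.6.1/")` on the cluster should surface leftover JARs from a previous deployment before they cause the mixed-version behavior described above.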
It was a configuration issue. The cluster had a different version of the apache-sedona Python package (the latest, because it wasn't pinned to the same version as the JAR), which is why the function accepted all 4 arguments from Python.
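One way to catch this class of mismatch early is to compare the installed Python package version against the version embedded in the deployed JAR's filename. A sketch (the filename pattern follows the standard Sedona artifact naming; `check_versions_match` is a hypothetical helper, not a Sedona API):

```python
import re
from importlib import metadata


def jar_version(jar_name):
    """Extract the Sedona version from a shaded-JAR filename,
    e.g. 'sedona-spark-shaded-3.5_2.12-1.6.1.jar' -> '1.6.1'."""
    m = re.search(r"-(\d+\.\d+\.\d+)\.jar$", jar_name)
    if not m:
        raise ValueError(f"Cannot parse a version from {jar_name!r}")
    return m.group(1)


def check_versions_match(jar_name, pkg_name="apache-sedona"):
    """Raise if the installed package disagrees with the deployed JAR."""
    jar_ver = jar_version(jar_name)
    pkg_ver = metadata.version(pkg_name)
    if jar_ver != pkg_ver:
        raise RuntimeError(
            f"Version mismatch: JAR is {jar_ver}, {pkg_name} is {pkg_ver}; "
            f"pin the package, e.g. pip install {pkg_name}=={jar_ver}"
        )
```

Pinning `apache-sedona` to the JAR's exact version in the cluster's library configuration avoids the silent drift that caused this issue.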
Expected behavior
true
Actual behavior
IllegalArgumentException
```
IllegalArgumentException: function ST_DWithin takes at most 3 argument(s), 4 argument(s) specified at org.apache.sedona.sql.UDF.Catalog$.$anonfun$function$2(Catalog.scala:310) at org.apache.spark.sql.catalyst.analysis.SimpleFunctionRegistryBase.lookupFunction(FunctionRegistry.scala:251) at org.apache.spark.sql.catalyst.analysis.SimpleFunctionRegistryBase.lookupFunction$(FunctionRegistry.scala:245) at org.apache.spark.sql.catalyst.analysis.SimpleFunctionRegistry.lookupFunction(FunctionRegistry.scala:317) at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.$anonfun$resolveBuiltinOrTempFunctionInternal$1(SessionCatalog.scala:2835) at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.lookupTempFuncWithViewContext(SessionCatalog.scala:2857) at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.resolveBuiltinOrTempFunctionInternal(SessionCatalog.scala:2835) at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.resolveBuiltinOrTempFunction(SessionCatalog.scala:2812) at org.apache.spark.sql.catalyst.catalog.DelegatingSessionCatalog.resolveBuiltinOrTempFunction(DelegatingSessionCatalog.scala:529) at org.apache.spark.sql.catalyst.catalog.DelegatingSessionCatalog.resolveBuiltinOrTempFunction$(DelegatingSessionCatalog.scala:526) at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.resolveBuiltinOrTempFunction(ManagedCatalogSessionCatalog.scala:87) at com.databricks.sql.analyzer.UnresolvedFunctionLogging.$anonfun$resolveBuiltinOrTempFunction$1(UnresolvedFunctionLogging.scala:80) at com.databricks.sql.analyzer.UnresolvedFunctionLogging.recordFailure(UnresolvedFunctionLogging.scala:97) at com.databricks.sql.analyzer.UnresolvedFunctionLogging.resolveBuiltinOrTempFunction(UnresolvedFunctionLogging.scala:80) at com.databricks.sql.analyzer.UnresolvedFunctionLogging.resolveBuiltinOrTempFunction$(UnresolvedFunctionLogging.scala:78) at 
org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveFunctions$.resolveBuiltinOrTempFunction(Analyzer.scala:2844) at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveFunctions$.org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveFunctions$$resolveBuiltinOrTempFunction(Analyzer.scala:3138) at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveFunctions$$anonfun$apply$23$$anonfun$applyOrElse$172.$anonfun$applyOrElse$177(Analyzer.scala:3062) at org.apache.spark.sql.catalyst.analysis.package$.withPosition(package.scala:103) at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveFunctions$$anonfun$apply$23$$anonfun$applyOrElse$172.applyOrElse(Analyzer.scala:3062) at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveFunctions$$anonfun$apply$23$$anonfun$applyOrElse$172.applyOrElse(Analyzer.scala:3031) at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformUpWithPruning$4(TreeNode.scala:573) at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:83) at org.apache.spark.sql.catalyst.trees.TreeNode.transformUpWithPruning(TreeNode.scala:573) at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformUpWithPruning$1(TreeNode.scala:566) at org.apache.spark.sql.catalyst.trees.UnaryLike.mapChildren(TreeNode.scala:1319) at org.apache.spark.sql.catalyst.trees.UnaryLike.mapChildren$(TreeNode.scala:1318) at org.apache.spark.sql.catalyst.expressions.UnaryExpression.mapChildren(Expression.scala:669) at org.apache.spark.sql.catalyst.trees.TreeNode.transformUpWithPruning(TreeNode.scala:566) at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$transformExpressionsUpWithPruning$1(QueryPlan.scala:209) at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$1(QueryPlan.scala:221) at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:83) at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpression$1(QueryPlan.scala:221) at 
org.apache.spark.sql.catalyst.plans.QueryPlan.recursiveTransform$1(QueryPlan.scala:233) at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$4(QueryPlan.scala:239) at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) at scala.collection.immutable.List.foreach(List.scala:431) at scala.collection.TraversableLike.map(TraversableLike.scala:286) at scala.collection.TraversableLike.map$(TraversableLike.scala:279) at scala.collection.immutable.List.map(List.scala:305) at org.apache.spark.sql.catalyst.plans.QueryPlan.recursiveTransform$1(QueryPlan.scala:239) at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$5(QueryPlan.scala:244) at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:358) at org.apache.spark.sql.catalyst.plans.QueryPlan.mapExpressions(QueryPlan.scala:244) at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressionsUpWithPruning(QueryPlan.scala:209) at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveFunctions$$anonfun$apply$23.applyOrElse(Analyzer.scala:3031) at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveFunctions$$anonfun$apply$23.applyOrElse(Analyzer.scala:2850) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$3(AnalysisHelper.scala:141) at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:83) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUpWithPruning$1(AnalysisHelper.scala:141) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:436) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning(AnalysisHelper.scala:137) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUpWithPruning$(AnalysisHelper.scala:133) at 
org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUpWithPruning(LogicalPlan.scala:40) at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveFunctions$.apply(Analyzer.scala:2850) at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveFunctions$.apply(Analyzer.scala:2844) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$4(RuleExecutor.scala:327) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$3(RuleExecutor.scala:327) at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126) at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122) at scala.collection.immutable.List.foldLeft(List.scala:91) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$2(RuleExecutor.scala:324) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94) at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeBatch$1(RuleExecutor.scala:307) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$9(RuleExecutor.scala:411) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$9$adapted(RuleExecutor.scala:411) at scala.collection.immutable.List.foreach(List.scala:431) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:411) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94) at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:270) at org.apache.spark.sql.catalyst.analysis.Analyzer.executeSameContext(Analyzer.scala:423) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$execute$1(Analyzer.scala:416) at org.apache.spark.sql.catalyst.analysis.AnalysisContext$.withNewAnalysisContext(Analyzer.scala:329) at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:416) at 
org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:348) at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:262) at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:168) at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:262) at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:401) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:443) at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:400) at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:261) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94) at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:427) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$5(QueryExecution.scala:611) at org.apache.spark.sql.execution.SQLExecution$.withExecutionPhase(SQLExecution.scala:143) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$4(QueryExecution.scala:611) at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1164) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$2(QueryExecution.scala:610) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:606) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1180) at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:606) at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:255) at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:254) at 
org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:236) at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:130) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1180) at org.apache.spark.sql.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1187) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94) at org.apache.spark.sql.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1187) at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:122) at org.apache.spark.sql.SparkSession.$anonfun$sql$4(SparkSession.scala:959) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1180) at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:947) at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:982) at com.databricks.backend.daemon.driver.DriverLocal$DbClassicStrategy.executeSQLQuery(DriverLocal.scala:290) at com.databricks.backend.daemon.driver.DriverLocal.executeSQLSubCommand(DriverLocal.scala:390) at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$executeSql$1(DriverLocal.scala:411) at scala.collection.immutable.List.map(List.scala:293) at com.databricks.backend.daemon.driver.DriverLocal.executeSql(DriverLocal.scala:406) at com.databricks.backend.daemon.driver.JupyterDriverLocal.repl(JupyterDriverLocal.scala:930) at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$30(DriverLocal.scala:1138) at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:45) at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:103) at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$25(DriverLocal.scala:1129) at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:48) at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:253) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) at 
com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:249) at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:46) at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:43) at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:87) at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:95) at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:76) at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:87) at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$1(DriverLocal.scala:1073) at com.databricks.backend.daemon.driver.DriverLocal$.$anonfun$maybeSynchronizeExecution$4(DriverLocal.scala:1484) at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:764) at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$2(DriverWrapper.scala:826) at scala.util.Try$.apply(Try.scala:213) at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:818) at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$3(DriverWrapper.scala:858) at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:636) at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:654) at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:48) at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:253) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:249) at 
com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:46) at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:43) at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionContext(DriverWrapper.scala:70) at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:95) at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:76) at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionTags(DriverWrapper.scala:70) at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:631) at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:541) at com.databricks.backend.daemon.driver.DriverWrapper.recordOperationWithResultTags(DriverWrapper.scala:70) at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:858) at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:703) at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:770) at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$runInnerLoop$1(DriverWrapper.scala:576) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:48) at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:253) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:249) at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:46) at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:43) 
at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionContext(DriverWrapper.scala:70) at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:576) at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:498) at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:292) at java.lang.Thread.run(Thread.java:750) at com.databricks.backend.daemon.driver.DriverLocal.executeSql(DriverLocal.scala:463) at com.databricks.backend.daemon.driver.JupyterDriverLocal.repl(JupyterDriverLocal.scala:930) at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$30(DriverLocal.scala:1138) at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:45) at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:103) at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$25(DriverLocal.scala:1129) at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:48) at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:253) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:249) at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:46) at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:43) at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:87) at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:95) at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:76) at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:87) at 
com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$1(DriverLocal.scala:1073) at com.databricks.backend.daemon.driver.DriverLocal$.$anonfun$maybeSynchronizeExecution$4(DriverLocal.scala:1484) at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:764) at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$2(DriverWrapper.scala:826) at scala.util.Try$.apply(Try.scala:213) at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:818) at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$3(DriverWrapper.scala:858) at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:636) at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:654) at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:48) at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:253) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:249) at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:46) at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:43) at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionContext(DriverWrapper.scala:70) at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:95) at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:76) at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionTags(DriverWrapper.scala:70) at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:631) at 
com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:541) at com.databricks.backend.daemon.driver.DriverWrapper.recordOperationWithResultTags(DriverWrapper.scala:70) at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:858) at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:703) at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:770) at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$runInnerLoop$1(DriverWrapper.scala:576) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:48) at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:253) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:249) at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:46) at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:43) at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionContext(DriverWrapper.scala:70) at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:576) at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:498) at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:292)
```
Py4JError
```
Py4JError: An error occurred while calling z:org.apache.spark.sql.sedona_sql.expressions.st_predicates.ST_DWithin. Trace: py4j.Py4JException: Method ST_DWithin([class org.apache.spark.sql.Column, class org.apache.spark.sql.Column, class org.apache.spark.sql.Column, class org.apache.spark.sql.Column]) does not exist at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:344) at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:365) at py4j.Gateway.invoke(Gateway.java:300) at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) at py4j.commands.CallCommand.execute(CallCommand.java:79) at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199) at py4j.ClientServerConnection.run(ClientServerConnection.java:119) at java.lang.Thread.run(Thread.java:750)
```
Steps to reproduce the problem
Try to use ST_DWithin with the optional useSpheroid/use_sphere argument in either Spark SQL or PySpark.
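For example, a minimal Spark SQL query that triggers the error on the mismatched cluster (the coordinates and distance here are illustrative; with useSpheroid = true the distance is interpreted in meters per the Sedona 1.6.0 signature):

```sql
SELECT ST_DWithin(
  ST_GeomFromWKT('POINT (0 0)'),
  ST_GeomFromWKT('POINT (0 0.9)'),
  110000,
  true  -- useSpheroid: the 4th argument added in Sedona 1.6.0
);
```

On Sedona 1.6.0+ this returns a boolean; on older versions the analyzer fails with the IllegalArgumentException shown above.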
Settings
Sedona version = 1.5.3
Sedona Python package version = 1.6.0
Apache Spark version = 3.5.0
Apache Flink version = ?
API type = Python, SQL
Scala version = 2.12
JRE version = ?
Python version = 3.11
Environment = Azure Databricks (DBR 15.3 ML)