Azure / azure-sdk-for-java

This repository is for active development of the Azure SDK for Java. For consumers of the SDK we recommend visiting our public developer docs at https://docs.microsoft.com/java/azure/ or our versioned developer docs at https://azure.github.io/azure-sdk-for-java.

[BUG] azure-resourcemanager-cosmos 2.42.0 contains breaking changes within a minor version upgrade - this breaks semver guarantees #41728

Closed: FabianMeiswinkel closed this issue 1 month ago

FabianMeiswinkel commented 1 month ago

Describe the bug

After the azure-resourcemanager-cosmos dependency was upgraded from 2.41.0 to 2.42.0 in sdk/cosmos/azure-cosmos-spark_3_2-12/pom.xml, our Spark CI live tests started failing. The regression was introduced in PR https://github.com/Azure/azure-sdk-for-java/pull/41412.

This is most likely due to the change in sdk/resourcemanager/azure-resourcemanager-cosmos/src/main/java/module-info.java that disallows reflective access for com.fasterxml.jackson.databind.
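
For readers unfamiliar with the Java module system, the hypothesis above concerns the opens directives in module-info.java: a package that is not opened to com.fasterxml.jackson.databind cannot be reflected over by Jackson at runtime. The sketch below is purely illustrative (the module and package directives are hypothetical, not the actual diff from PR #41412) and only shows the kind of change that produces this failure mode:

    // Illustrative module descriptor only; see PR #41412 for the real diff.
    module com.azure.resourcemanager.cosmos {
        requires transitive com.azure.core.management;

        exports com.azure.resourcemanager.cosmos.models;

        // Before: the models package was also opened to Jackson, so
        // reflection-based serializers could discover bean properties.
        // opens com.azure.resourcemanager.cosmos.models
        //     to com.azure.core, com.fasterxml.jackson.databind;

        // After: only azure-core retains reflective access. An external
        // ObjectMapper finds no properties on classes such as IndexingPolicy
        // and throws InvalidDefinitionException.
        opens com.azure.resourcemanager.cosmos.models to com.azure.core;
    }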

Exception or Stack Trace

24/09/04 22:58:44 ERROR Uncaught throwable from user code: azure_cosmos_spark.reactor.core.Exceptions$ReactiveException: azure_cosmos_spark.com.fasterxml.jackson.databind.exc.InvalidDefinitionException: No serializer found for class azure_cosmos_spark.com.azure.resourcemanager.cosmos.models.IndexingPolicy and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS)
    at azure_cosmos_spark.reactor.core.Exceptions.propagate(Exceptions.java:396)
    at azure_cosmos_spark.reactor.core.publisher.BlockingSingleSubscriber.blockingGet(BlockingSingleSubscriber.java:98)
    at azure_cosmos_spark.reactor.core.publisher.Mono.block(Mono.java:1742)
    at azure_cosmos_spark.reactor.core.scala.publisher.SMono.block(SMono.scala:108)
    at azure_cosmos_spark.reactor.core.scala.publisher.SMono.block$(SMono.scala:107)
    at azure_cosmos_spark.reactor.core.scala.publisher.ReactiveSMono.block(ReactiveSMono.scala:8)
    at com.azure.cosmos.spark.CosmosCatalogBase.$anonfun$tryGetContainerMetadata$1(CosmosCatalogBase.scala:646)
    at com.azure.cosmos.spark.Loan$Loan.to(Using.scala:43)
    at com.azure.cosmos.spark.CosmosCatalogBase.tryGetContainerMetadata(CosmosCatalogBase.scala:641)
    at com.azure.cosmos.spark.CosmosCatalogBase.loadTableImpl(CosmosCatalogBase.scala:348)
    at com.azure.cosmos.spark.CosmosCatalogBase.$anonfun$loadTable$1(CosmosCatalogBase.scala:339)
    at com.azure.cosmos.spark.TransientErrorsRetryPolicy$.$anonfun$executeWithRetry$1(TransientErrorsRetryPolicy.scala:34)
    at scala.util.control.Breaks.breakable(Breaks.scala:42)
    at com.azure.cosmos.spark.TransientErrorsRetryPolicy$.executeWithRetry(TransientErrorsRetryPolicy.scala:28)
    at com.azure.cosmos.spark.CosmosCatalogBase.loadTable(CosmosCatalogBase.scala:339)
    at org.apache.spark.sql.connector.catalog.TableCatalog.tableExists(TableCatalog.java:207)
    at com.azure.cosmos.spark.CosmosCatalogBase.tableExists(CosmosCatalogBase.scala:38)
    at org.apache.spark.sql.execution.datasources.v2.CreateTableExec.run(CreateTableExec.scala:42)
    at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.$anonfun$result$1(V2CommandExec.scala:47)
    at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
    at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:47)
    at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:45)
    at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:54)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$3(QueryExecution.scala:286)
    at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:166)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:286)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$9(SQLExecution.scala:303)
    at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:533)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:226)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1148)
    at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:155)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:482)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:285)
    at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:259)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:280)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:265)
    at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:465)
    at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:69)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:465)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:39)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:339)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:335)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:39)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:39)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:441)
    at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:265)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:395)
    at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:265)
    at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:217)
    at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:214)
    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:261)
    at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:122)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1148)
    at org.apache.spark.sql.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1155)
    at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
    at org.apache.spark.sql.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1155)
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:112)
    at org.apache.spark.sql.SparkSession.$anonfun$sql$5(SparkSession.scala:928)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1148)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:917)
    at org.apache.spark.sql.SparkSession.$anonfun$sql$9(SparkSession.scala:951)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1148)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:951)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:984)
    at $linef9d6e2c2f6044f16b43f39a4ba44568f27.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-3664427855330845:16)
    at $linef9d6e2c2f6044f16b43f39a4ba44568f27.$read$$iw$$iw$$iw$$iw$$iw.<init>(command-3664427855330845:84)
    at $linef9d6e2c2f6044f16b43f39a4ba44568f27.$read$$iw$$iw$$iw$$iw.<init>(command-3664427855330845:86)
    at $linef9d6e2c2f6044f16b43f39a4ba44568f27.$read$$iw$$iw$$iw.<init>(command-3664427855330845:88)
    at $linef9d6e2c2f6044f16b43f39a4ba44568f27.$read$$iw$$iw.<init>(command-3664427855330845:90)
    at $linef9d6e2c2f6044f16b43f39a4ba44568f27.$read$$iw.<init>(command-3664427855330845:92)
    at $linef9d6e2c2f6044f16b43f39a4ba44568f27.$read.<init>(command-3664427855330845:94)
    at $linef9d6e2c2f6044f16b43f39a4ba44568f27.$read$.<init>(command-3664427855330845:98)
    at $linef9d6e2c2f6044f16b43f39a4ba44568f27.$read$.<clinit>(command-3664427855330845)
    at $linef9d6e2c2f6044f16b43f39a4ba44568f27.$eval$.$print$lzycompute(<notebook>:7)
    at $linef9d6e2c2f6044f16b43f39a4ba44568f27.$eval$.$print(<notebook>:6)
    at $linef9d6e2c2f6044f16b43f39a4ba44568f27.$eval.$print(<notebook>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:747)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1020)
    at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:568)
    at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
    at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
    at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:567)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:594)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:564)
    at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:223)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal.$anonfun$repl$1(ScalaDriverLocal.scala:236)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:1410)
    at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:1363)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:236)
    at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$34(DriverLocal.scala:1012)
    at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:128)
    at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$23(DriverLocal.scala:1003)
    at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:196)
    at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
    at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:69)
    at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:470)
    at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:69)
    at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:958)
    at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$2(DriverWrapper.scala:859)
    at scala.util.Try$.apply(Try.scala:213)
    at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:851)
    at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$3(DriverWrapper.scala:891)
    at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:667)
    at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:685)
    at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:196)
    at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
    at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
    at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionContext(DriverWrapper.scala:76)
    at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:470)
    at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
    at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionTags(DriverWrapper.scala:76)
    at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:662)
    at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:580)
    at com.databricks.backend.daemon.driver.DriverWrapper.recordOperationWithResultTags(DriverWrapper.scala:76)
    at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:891)
    at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:702)
    at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:803)
    at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$runInnerLoop$1(DriverWrapper.scala:577)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:196)
    at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
    at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
    at com.databricks.backend.daemon.driver.DriverWrapper.withAttributionContext(DriverWrapper.scala:76)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:577)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:487)
    at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:296)
    at java.lang.Thread.run(Thread.java:750)
Caused by: azure_cosmos_spark.com.fasterxml.jackson.databind.exc.InvalidDefinitionException: No serializer found for class azure_cosmos_spark.com.azure.resourcemanager.cosmos.models.IndexingPolicy and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS)
    at azure_cosmos_spark.com.fasterxml.jackson.databind.exc.InvalidDefinitionException.from(InvalidDefinitionException.java:77)
    at azure_cosmos_spark.com.fasterxml.jackson.databind.SerializerProvider.reportBadDefinition(SerializerProvider.java:1330)
    at azure_cosmos_spark.com.fasterxml.jackson.databind.DatabindContext.reportBadDefinition(DatabindContext.java:414)
    at azure_cosmos_spark.com.fasterxml.jackson.databind.ser.impl.UnknownSerializer.failForEmpty(UnknownSerializer.java:53)
    at azure_cosmos_spark.com.fasterxml.jackson.databind.ser.impl.UnknownSerializer.serialize(UnknownSerializer.java:30)
    at azure_cosmos_spark.com.fasterxml.jackson.databind.ser.DefaultSerializerProvider._serialize(DefaultSerializerProvider.java:502)
    at azure_cosmos_spark.com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:341)
    at azure_cosmos_spark.com.fasterxml.jackson.databind.ObjectMapper._writeValueAndClose(ObjectMapper.java:4799)
    at azure_cosmos_spark.com.fasterxml.jackson.databind.ObjectMapper.writeValueAsString(ObjectMapper.java:4040)
    at azure_cosmos_spark.com.azure.cosmos.spark.catalog.CosmosCatalogManagementSDKClient.generateTblProperties(CosmosCatalogManagementSDKClient.scala:298)
    at azure_cosmos_spark.com.azure.cosmos.spark.catalog.CosmosCatalogManagementSDKClient.$anonfun$readContainerMetadata$4(CosmosCatalogManagementSDKClient.scala:185)
    at azure_cosmos_spark.reactor.core.scala.publisher.package$.$anonfun$scalaFunction2JavaFunction$1(package.scala:51)
    at azure_cosmos_spark.reactor.core.publisher.FluxMap$MapSubscriber.onNext(FluxMap.java:106)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MonoInnerProducerBase.complete(Operators.java:2666)
    at azure_cosmos_spark.reactor.core.publisher.MonoSingle$SingleSubscriber.onComplete(MonoSingle.java:180)
    at azure_cosmos_spark.reactor.core.publisher.FluxZip$ZipCoordinator.drain(FluxZip.java:787)
    at azure_cosmos_spark.reactor.core.publisher.FluxZip$ZipInner.onNext(FluxZip.java:1018)
    at azure_cosmos_spark.reactor.core.publisher.FluxMap$MapSubscriber.onNext(FluxMap.java:122)
    at azure_cosmos_spark.reactor.core.publisher.SerializedSubscriber.onNext(SerializedSubscriber.java:99)
    at azure_cosmos_spark.reactor.core.publisher.FluxRetryWhen$RetryWhenMainSubscriber.onNext(FluxRetryWhen.java:174)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1839)
    at azure_cosmos_spark.reactor.core.publisher.MonoFlatMap$FlatMapInner.onNext(MonoFlatMap.java:249)
    at azure_cosmos_spark.reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onNext(FluxMapFuseable.java:129)
    at azure_cosmos_spark.reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onNext(FluxMapFuseable.java:129)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1839)
    at azure_cosmos_spark.reactor.core.publisher.MonoFlatMap$FlatMapInner.onNext(MonoFlatMap.java:249)
    at azure_cosmos_spark.reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:79)
    at azure_cosmos_spark.reactor.core.publisher.FluxMap$MapSubscriber.onNext(FluxMap.java:122)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1839)
    at azure_cosmos_spark.reactor.core.publisher.MonoCompletionStage.lambda$subscribe$0(MonoCompletionStage.java:100)
    at java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:836)
    at java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:811)
    at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
    at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
    at azure_cosmos_spark.reactor.core.publisher.MonoToCompletableFuture.onNext(MonoToCompletableFuture.java:64)
    at azure_cosmos_spark.reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:79)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1839)
    at azure_cosmos_spark.reactor.core.publisher.MonoFlatMap$FlatMapInner.onNext(MonoFlatMap.java:249)
    at azure_cosmos_spark.reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:79)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1839)
    at azure_cosmos_spark.reactor.core.publisher.MonoCacheTime$CoordinatorSubscriber.signalCached(MonoCacheTime.java:337)
    at azure_cosmos_spark.reactor.core.publisher.MonoCacheTime$CoordinatorSubscriber.onNext(MonoCacheTime.java:354)
    at azure_cosmos_spark.reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.onNext(MonoPeekTerminal.java:180)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1839)
    at azure_cosmos_spark.reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:151)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1839)
    at azure_cosmos_spark.reactor.core.publisher.MonoFlatMap$FlatMapInner.onNext(MonoFlatMap.java:249)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1839)
    at azure_cosmos_spark.reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:151)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1839)
    at azure_cosmos_spark.reactor.core.publisher.MonoFlatMap$FlatMapInner.onNext(MonoFlatMap.java:249)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1839)
    at azure_cosmos_spark.reactor.core.publisher.MonoCollectList$MonoCollectListSubscriber.onComplete(MonoCollectList.java:129)
    at azure_cosmos_spark.reactor.core.publisher.FluxFlatMap$FlatMapMain.checkTerminated(FluxFlatMap.java:847)
    at azure_cosmos_spark.reactor.core.publisher.FluxFlatMap$FlatMapMain.drainLoop(FluxFlatMap.java:609)
    at azure_cosmos_spark.reactor.core.publisher.FluxFlatMap$FlatMapMain.drain(FluxFlatMap.java:589)
    at azure_cosmos_spark.reactor.core.publisher.FluxFlatMap$FlatMapMain.onComplete(FluxFlatMap.java:466)
    at azure_cosmos_spark.reactor.core.publisher.SerializedSubscriber.onComplete(SerializedSubscriber.java:146)
    at azure_cosmos_spark.reactor.core.publisher.FluxRetryWhen$RetryWhenMainSubscriber.onComplete(FluxRetryWhen.java:200)
    at azure_cosmos_spark.reactor.core.publisher.FluxMergeSequential$MergeSequentialMain.drain(FluxMergeSequential.java:374)
    at azure_cosmos_spark.reactor.core.publisher.FluxMergeSequential$MergeSequentialMain.innerComplete(FluxMergeSequential.java:335)
    at azure_cosmos_spark.reactor.core.publisher.FluxMergeSequential$MergeSequentialInner.onComplete(FluxMergeSequential.java:591)
    at azure_cosmos_spark.reactor.core.publisher.FluxDoFinally$DoFinallySubscriber.onComplete(FluxDoFinally.java:128)
    at azure_cosmos_spark.reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.onComplete(MonoPeekTerminal.java:299)
    at azure_cosmos_spark.reactor.core.publisher.FluxPeekFuseable$PeekFuseableConditionalSubscriber.onComplete(FluxPeekFuseable.java:595)
    at azure_cosmos_spark.reactor.core.publisher.FluxMapFuseable$MapFuseableConditionalSubscriber.onComplete(FluxMapFuseable.java:350)
    at azure_cosmos_spark.reactor.core.publisher.FluxMapFuseable$MapFuseableConditionalSubscriber.onComplete(FluxMapFuseable.java:350)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1840)
    at azure_cosmos_spark.reactor.core.publisher.MonoFlatMap$FlatMapInner.onNext(MonoFlatMap.java:249)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1839)
    at azure_cosmos_spark.reactor.core.publisher.MonoFlatMap$FlatMapInner.onNext(MonoFlatMap.java:249)
    at azure_cosmos_spark.reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.complete(MonoIgnoreThen.java:292)
    at azure_cosmos_spark.reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.onNext(MonoIgnoreThen.java:187)
    at azure_cosmos_spark.reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.subscribeNext(MonoIgnoreThen.java:236)
    at azure_cosmos_spark.reactor.core.publisher.MonoIgnoreThen.subscribe(MonoIgnoreThen.java:51)
    at azure_cosmos_spark.reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:157)
    at azure_cosmos_spark.reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:79)
    at azure_cosmos_spark.reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.complete(MonoIgnoreThen.java:292)
    at azure_cosmos_spark.reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.onNext(MonoIgnoreThen.java:187)
    at azure_cosmos_spark.reactor.core.publisher.SerializedSubscriber.onNext(SerializedSubscriber.java:99)
    at azure_cosmos_spark.reactor.core.publisher.FluxRetryWhen$RetryWhenMainSubscriber.onNext(FluxRetryWhen.java:174)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MonoInnerProducerBase.complete(Operators.java:2666)
    at azure_cosmos_spark.reactor.core.publisher.MonoSingle$SingleSubscriber.onComplete(MonoSingle.java:180)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MultiSubscriptionSubscriber.onComplete(Operators.java:2060)
    at azure_cosmos_spark.reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onComplete(FluxMapFuseable.java:152)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1840)
    at azure_cosmos_spark.reactor.core.publisher.MonoFlatMap$FlatMapInner.onNext(MonoFlatMap.java:249)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MonoInnerProducerBase.complete(Operators.java:2666)
    at azure_cosmos_spark.reactor.core.publisher.MonoSingle$SingleSubscriber.onComplete(MonoSingle.java:180)
    at azure_cosmos_spark.reactor.core.publisher.FluxMap$MapSubscriber.onComplete(FluxMap.java:144)
    at azure_cosmos_spark.reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber.onComplete(FluxSwitchIfEmpty.java:85)
    at azure_cosmos_spark.reactor.core.publisher.FluxPeek$PeekSubscriber.onComplete(FluxPeek.java:260)
    at azure_cosmos_spark.reactor.core.publisher.FluxMap$MapSubscriber.onComplete(FluxMap.java:144)
    at azure_cosmos_spark.reactor.core.publisher.FluxDoFinally$DoFinallySubscriber.onComplete(FluxDoFinally.java:128)
    at azure_cosmos_spark.reactor.core.publisher.FluxHandleFuseable$HandleFuseableSubscriber.onComplete(FluxHandleFuseable.java:236)
    at azure_cosmos_spark.reactor.core.publisher.FluxContextWrite$ContextWriteSubscriber.onComplete(FluxContextWrite.java:126)
    at azure_cosmos_spark.reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1840)
    at azure_cosmos_spark.reactor.core.publisher.MonoCollectList$MonoCollectListSubscriber.onComplete(MonoCollectList.java:129)
    at azure_cosmos_spark.reactor.core.publisher.FluxPeek$PeekSubscriber.onComplete(FluxPeek.java:260)
    at azure_cosmos_spark.reactor.core.publisher.FluxMap$MapSubscriber.onComplete(FluxMap.java:144)
    at azure_cosmos_spark.reactor.netty.channel.FluxReceive.onInboundComplete(FluxReceive.java:415)
    at azure_cosmos_spark.reactor.netty.channel.ChannelOperations.onInboundComplete(ChannelOperations.java:439)
    at azure_cosmos_spark.reactor.netty.channel.ChannelOperations.terminate(ChannelOperations.java:493)
    at azure_cosmos_spark.reactor.netty.http.client.HttpClientOperations.onInboundNext(HttpClientOperations.java:789)
    at azure_cosmos_spark.reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:114)
    at azurecosmosspark.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
    at azurecosmosspark.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at azurecosmosspark.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
    at azurecosmosspark.io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:289)
    at azurecosmosspark.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)
    at azurecosmosspark.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at azurecosmosspark.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
    at azurecosmosspark.io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:436)
    at azurecosmosspark.io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346)
    at azurecosmosspark.io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:318)
    at azurecosmosspark.io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:251)
    at azurecosmosspark.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)
    at azurecosmosspark.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at azurecosmosspark.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
    at azurecosmosspark.io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1475)
    at azurecosmosspark.io.netty.handler.ssl.SslHandler.decodeJdkCompatible(SslHandler.java:1338)
    at azurecosmosspark.io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1387)
    at azurecosmosspark.io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:530)
    at azurecosmosspark.io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:469)
    at azurecosmosspark.io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:290)
    at azurecosmosspark.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
    at azurecosmosspark.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at azurecosmosspark.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
    at azurecosmosspark.io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1407)
    at azurecosmosspark.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
    at azurecosmosspark.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
    at azurecosmosspark.io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:918)
    at azurecosmosspark.io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:799)
    at azurecosmosspark.io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:501)
    at azurecosmosspark.io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:399)
    at azurecosmosspark.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:994)
    at azurecosmosspark.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at azurecosmosspark.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.lang.Thread.run(Thread.java:750)
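
The exception message itself points at a blunt client-side workaround: disabling SerializationFeature.FAIL_ON_EMPTY_BEANS. A minimal sketch of that workaround follows; note that it only suppresses the error, because with the Jackson annotations gone no properties are discovered and the value still serializes as an empty JSON object:

    import com.azure.resourcemanager.cosmos.models.IndexingPolicy;
    import com.fasterxml.jackson.databind.ObjectMapper;
    import com.fasterxml.jackson.databind.SerializationFeature;

    public class EmptyBeanWorkaround {
        public static void main(String[] args) throws Exception {
            // Stop Jackson from throwing on beans it cannot introspect.
            ObjectMapper mapper = new ObjectMapper()
                .disable(SerializationFeature.FAIL_ON_EMPTY_BEANS);

            // No properties are discovered on the model, so this prints "{}".
            System.out.println(mapper.writeValueAsString(new IndexingPolicy()));
        }
    }
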
github-actions[bot] commented 1 month ago

Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @kushagraThapar @pjohari-ms @TheovanKraay.

XiaofeiCao commented 1 month ago

Yes, reflective access for Jackson was disallowed because serialization/deserialization was migrated to azure-json. The migration removed the Jackson annotations from the model classes, so a plain ObjectMapper.writeValueAsString won't work anyway. For manual serialization, instead of ObjectMapper, use SerializerFactory.createDefaultManagementSerializerAdapter().serialize(object, SerializerEncoding.JSON). @alzimmermsft for awareness
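
For illustration, a minimal self-contained sketch of the suggested approach (SerializerFactory and SerializerAdapter come from azure-core-management / azure-core; IndexingPolicy is instantiated here only as an example payload):

    import com.azure.core.management.serializer.SerializerFactory;
    import com.azure.core.util.serializer.SerializerAdapter;
    import com.azure.core.util.serializer.SerializerEncoding;
    import com.azure.resourcemanager.cosmos.models.IndexingPolicy;

    import java.io.IOException;

    public class ManagementSerializerExample {
        public static void main(String[] args) throws IOException {
            // The management serializer adapter understands azure-json based
            // models, so it does not depend on Jackson reflection over the
            // model classes.
            SerializerAdapter adapter =
                SerializerFactory.createDefaultManagementSerializerAdapter();

            String json = adapter.serialize(new IndexingPolicy(), SerializerEncoding.JSON);
            System.out.println(json);
        }
    }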

FabianMeiswinkel commented 1 month ago

For azure-cosmos-spark-* we have fixed this by making the new versions work with the breaking change.