Azure / azure-cosmosdb-bulkexecutor-java-getting-started

Bulk Executor Utility for Azure Cosmos DB Java SQL API
Creative Commons Attribution 4.0 International

Bulk import Unique index constraint violation error #4

Closed vamsikt closed 5 years ago

vamsikt commented 5 years ago

When using the bulk importer library to import documents into a Cosmos DB collection that has a unique index constraint, I tried to import a few documents, including some duplicates that already exist in the collection. The whole bulk import operation fails with an error, and even the non-duplicate documents are not inserted. Is there any way to suppress these expected errors and still insert the non-duplicate documents with the bulk importer?
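One client-side workaround (a sketch only, not a BulkExecutor feature: the pre-read of existing unique-key values and the `partitionByKnownKeys` helper are illustrative assumptions) is to query the collection for the unique-key values already present and split the batch before calling the importer, so only documents with new key values are sent:

```java
import java.util.*;
import java.util.stream.*;

public class DedupBatch {
    // Illustrative helper: split a batch of unique-key values into those that
    // already exist in the collection (true) and those that are new (false).
    // Only the "false" partition would then be handed to the bulk importer.
    public static Map<Boolean, List<String>> partitionByKnownKeys(
            List<String> batchKeys, Set<String> existingKeys) {
        return batchKeys.stream()
                .collect(Collectors.partitioningBy(existingKeys::contains));
    }

    public static void main(String[] args) {
        // Assumed: "a" and "b" were read back from the collection beforehand.
        Set<String> existing = new HashSet<>(Arrays.asList("a", "b"));
        List<String> batch = Arrays.asList("a", "c", "d");

        Map<Boolean, List<String>> split = partitionByKnownKeys(batch, existing);
        System.out.println("duplicates (skip): " + split.get(true));
        System.out.println("new (import):      " + split.get(false));
    }
}
```

This avoids triggering the unique index violation at all, at the cost of an extra read before each import; it is only safe if no other writer can insert the same key values between the read and the import.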

abinav2307 commented 5 years ago

Hello @vamsikt, depending on the size of the batch of documents sent to the BulkExecutor, a majority of the documents would have been successfully ingested.

We are working on pushing out a change shortly that will provide more insight into the subset of documents to retry during failures, instead of the entire batch. This will be released in newer versions of the BulkExecutor.

bitsevn commented 4 years ago

Please let me know whether this feature has been implemented yet.

I also tried to bulk import 100 documents, 20 of which have ids that already exist in Cosmos DB. I expected 80 documents to be imported successfully and 20 to be rejected with a reason for the rejection. Instead, the entire operation fails and throws an exception.

This feature is super important for our project. Can you please let me know when it is expected to roll out, or whether it is already available?

I am using the latest version available in the Maven repository.

mounicadammalapati commented 2 years ago

Is there any update on this issue? I am having the same problem. My documents include both duplicates and new ones, but the whole batch fails and the new docs are not uploaded. Is there any workaround?
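Until the library reports per-document failures, one possible workaround (a sketch under assumptions: `insertOne` and `ConflictException` are stand-ins for a single-document write and the 409 Conflict that Cosmos DB returns on a unique index violation, not BulkExecutor APIs) is to fall back to inserting the failed batch one document at a time, recording and skipping the conflicts:

```java
import java.util.*;
import java.util.function.*;

public class ConflictTolerantImport {
    // Stand-in for the 409 Conflict a unique index violation produces.
    public static class ConflictException extends RuntimeException {}

    // Fallback path after a bulk call fails wholesale: insert documents one by
    // one, collect the ones rejected as conflicts, and let the rest through.
    public static List<String> importSkippingConflicts(
            List<String> docs, Predicate<String> insertOne) {
        List<String> rejected = new ArrayList<>();
        for (String doc : docs) {
            try {
                insertOne.test(doc);
            } catch (ConflictException e) {
                rejected.add(doc);  // duplicate: skip, but remember why
            }
        }
        return rejected;
    }

    public static void main(String[] args) {
        // Simulated store: "dup1" already exists, so it conflicts.
        Set<String> store = new HashSet<>(Arrays.asList("dup1"));
        List<String> rejected = importSkippingConflicts(
                Arrays.asList("dup1", "new1", "new2"),
                doc -> {
                    if (!store.add(doc)) throw new ConflictException();
                    return true;
                });
        System.out.println("rejected: " + rejected);
        System.out.println("stored:   " + store);
    }
}
```

This is much slower than a true bulk path, so it only makes sense as a recovery step for the batch that failed, not as the primary import route.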