Open GuillaumeVadeBe opened 1 year ago
Facing the same issue! Is there a workaround for this? Has Microsoft fixed this or is this still active?
Here is a Stack Overflow question about it. I don't know if there is another GitHub issue tracking this or not.
I would very much like confirmation of whether hierarchical partition keys (HPKs) are supported in ADF. I can't find any documentation on this.
Update: from my own tests, it's clear to me that HPKs are not supported in ADF. A Copy data activity yields the following error:
Operation on target Sample failed: Failure happened on 'Sink' side. ErrorCode=CosmosDbSqlApiOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=CosmosDbSqlApi operation Failed. ErrorMessage: Response status code does not indicate success: -1 (-1); Substatus: 0; ActivityId: ; Reason: ();.,Source=Microsoft.DataTransfer.ClientLibrary.CosmosDbSqlApiV3,''Type=Microsoft.Azure.Cosmos.CosmosException,Message=Response status code does not indicate success: -1 (-1); Substatus: 0; ActivityId: ; Reason: ();,Source=Microsoft.Azure.Cosmos.Client,'
Concerning this error:
"Schema import failed: Error converting value "MultiHash" to type 'Microsoft.Azure.Documents.PartitionKind'. Path 'partitionKey.kind', line 1, position 366.\r\nRequested value 'MultiHash' was not found."
I believe that one happens when one of the HPK values is null or empty in the data. Regardless, even if that were fixed, it wouldn't matter, because of the Copy activity error above.
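If the null/empty-value theory is worth ruling out, a quick local check over exported documents could look like the sketch below. The partition key paths `/tenantId` and `/userId` are hypothetical examples, not taken from the issue:

```python
# Minimal sketch: flag documents whose hierarchical partition key
# components are missing, null, or empty. The paths are hypothetical.
HPK_PATHS = ["/tenantId", "/userId"]

def missing_hpk_values(doc: dict, paths=HPK_PATHS) -> list:
    """Return the HPK paths that are absent, None, or empty in doc."""
    bad = []
    for path in paths:
        # Walk the path segments so nested paths like /address/city
        # are handled as well as single-level ones.
        value = doc
        for part in path.strip("/").split("/"):
            value = value.get(part) if isinstance(value, dict) else None
        if value is None or value == "":
            bad.append(path)
    return bad

docs = [
    {"id": "1", "tenantId": "t1", "userId": "u1"},
    {"id": "2", "tenantId": "t1"},               # userId missing
    {"id": "3", "tenantId": "", "userId": "u3"}, # tenantId empty
]
for d in docs:
    print(d["id"], missing_hpk_values(d))
```

Running this over a sample of the source data would show whether any documents actually have gaps in their HPK values.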
We have some Azure Cosmos DB (NoSQL API) containers with hierarchical partitioning enabled that we want to use as source and sink datasets in Azure Data Factory. When trying to create the dataset, the following error pops up:
"Schema import failed: Error converting value \"MultiHash\" to type 'Microsoft.Azure.Documents.PartitionKind'. Path 'partitionKey.kind', line 1, position 366.\r\nRequested value 'MultiHash' was not found."
Are we doing something wrong, or are hierarchically partitioned containers not supported as a dataset yet? If not, will this ever be supported? I can't find anything about this in the documentation.
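For context on what the schema-import error is likely complaining about: a hierarchical container's definition carries `"kind": "MultiHash"` in its `partitionKey` object, whereas containers with a single partition key use `"Hash"`. A client built around the older set of partition kinds would reject the new value during deserialization. The sketch below is illustrative (the container id and paths are made up), not the connector's actual code:

```python
import json

# An abridged hierarchical (MultiHash) container definition, shaped
# like the partitionKey object in a Cosmos DB container resource.
# The id and paths are illustrative values.
container_json = """{
  "id": "Sample",
  "partitionKey": {
    "paths": ["/tenantId", "/userId"],
    "kind": "MultiHash",
    "version": 2
  }
}"""

# The pre-hierarchical partition kinds. A parser that only knows
# these values cannot deserialize "MultiHash", which mirrors the
# "Requested value 'MultiHash' was not found" error above.
LEGACY_PARTITION_KINDS = {"Hash", "Range"}

definition = json.loads(container_json)
kind = definition["partitionKey"]["kind"]
if kind not in LEGACY_PARTITION_KINDS:
    print(f"Requested value '{kind}' was not found.")
```

This suggests the failure is in the connector's client library rather than anything in the dataset configuration itself.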