Closed — morikat1509 closed this issue 5 months ago
I don't really have any information on the old tool, but I'm not aware of anything in it that would limit data size.
The new tool is a complete rewrite and shares no code with the old code base. Since it is built on the newer Cosmos DB SDK, performance for Cosmos is likely better than the old tool, but I'm not aware of any comparison benchmarks.
The focus of this tool is primarily on ease of use for development and testing scenarios, given that other tools, such as Azure Data Factory, are already well suited for large-scale, high-performance data transfer. That said, nothing inherent in the architecture limits total data size. Specifics vary by the extension type used, but in general all data is asynchronously streamed at both the source and sink ends of the transfer, so there is no memory-cache size limit.
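To illustrate why streaming keeps memory bounded regardless of total data size, here is a minimal Python sketch (the actual tool is .NET-based; this is not its code). It models the pattern described above: a source task and a sink task run concurrently, connected by a bounded queue, so at most `queue_size` documents are in memory at once no matter how many documents flow through. All names here are hypothetical.

```python
import asyncio

async def transfer(source_docs, sink, queue_size=100):
    # Bounded queue: memory use is capped at queue_size in-flight
    # documents, independent of the total number transferred.
    queue = asyncio.Queue(maxsize=queue_size)
    SENTINEL = object()  # marks end of the source stream

    async def read_source():
        for doc in source_docs:    # stand-in for a source extension
            await queue.put(doc)   # suspends while the queue is full
        await queue.put(SENTINEL)

    async def write_sink():
        while True:
            doc = await queue.get()
            if doc is SENTINEL:
                break
            sink.append(doc)       # stand-in for a sink extension

    # Run reader and writer concurrently, streaming end to end.
    await asyncio.gather(read_source(), write_sink())

# Usage: stream 1,000 "documents" through a queue of only 100 slots.
docs = ({"id": i} for i in range(1000))
out = []
asyncio.run(transfer(docs, out))
print(len(out))  # 1000
```

The key design point is back-pressure: when the sink is slower than the source, `queue.put` suspends the reader, so the source can never run ahead and accumulate unbounded data in memory.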
Hi @bowencode, our service has been using this desktop tool since 2015 to migrate data between Cosmos DB instances. We upgraded the tool in early October 2023, before the old repo was retired. The tool is working fine, but I may have an upcoming migration of about 200 GB of data (around 76 million Cosmos DB documents).
My questions: