Azure / azure-data-lake-store-net

Azure Data Lake Store .Net SDK
MIT License
18 stars 24 forks

Port Exhaustion when enumerating directories #22

Closed KasperHoldum closed 6 years ago

KasperHoldum commented 6 years ago

We have a scenario where we need to enumerate all directories in the Data Lake Store and delete specific files. This results in a lot of HTTP requests to the DLS. During this enumeration, Application Insights reports the following dependency error:

"Only one usage of each socket address (protocol/network address/port) is normally permitted"

There is some sort of retry pattern built into the client, so eventually the operation succeeds, but it seems odd to run into port exhaustion.

rahuldutta90 commented 6 years ago

Are you doing enumerate from multiple threads concurrently?

Either way, can you please try increasing "ServicePointManager.DefaultConnectionLimit"? For multi-threaded applications, it is recommended to increase this. The default value is 2, and it controls the number of connections the client opens under a connection group. We have documented this here: https://docs.microsoft.com/en-us/dotnet/api/microsoft.azure.datalake.store?view=azure-dotnet
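A minimal sketch of the suggested mitigation: raise the connection limit once at startup, before any client is created. The value 48 below is an arbitrary example, not a recommendation from the thread; it should match the application's actual request concurrency.

```csharp
using System.Net;

class Startup
{
    static void Main()
    {
        // The default is 2 connections per connection group, which
        // throttles a client issuing many concurrent HTTP requests
        // and can force rapid socket churn (port exhaustion).
        // 48 is an illustrative value only.
        ServicePointManager.DefaultConnectionLimit = 48;

        // ... create the AdlsClient and run the enumeration here ...
    }
}
```

Setting this before the first request matters, because the limit is captured when the underlying ServicePoint for the host is created.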

KasperHoldum commented 6 years ago

Each ADLS client is only being used single-threaded, but it's enumerating our entire DLS. Multiple requests can occur concurrently, but each uses a separate ADLS client.

rahuldutta90 commented 6 years ago

In that case, at the WebRequest layer, one connection group will exist and it will contain 2 connections. Can you please try increasing "ServicePointManager.DefaultConnectionLimit" to the concurrency of your program? Let me know if that works.

Separate question: I am assuming you are doing the enumeration recursively. If you are enumerating the same account, why are you using separate AdlsClient instances? You can use the same client to do the whole enumeration.
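A hedged sketch of the single-client pattern being suggested: one AdlsClient reused for the whole recursive walk, deleting files that match a filter. The account name and the `.tmp` filter are placeholder assumptions, not details from the thread.

```csharp
using System;
using Microsoft.Azure.DataLake.Store;

class Cleanup
{
    // Recursively enumerate a directory tree and delete matching files,
    // reusing the SAME AdlsClient for every request so connections are pooled.
    static void DeleteMatching(AdlsClient client, string path)
    {
        foreach (var entry in client.EnumerateDirectory(path))
        {
            if (entry.Type == DirectoryEntryType.DIRECTORY)
                DeleteMatching(client, entry.FullName); // recurse with the same client
            else if (entry.FullName.EndsWith(".tmp", StringComparison.OrdinalIgnoreCase))
                client.Delete(entry.FullName);          // placeholder filter
        }
    }
}
```

Creating a fresh client per request defeats connection reuse; a single client lets the WebRequest layer keep sockets alive across the whole enumeration.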

rahuldutta90 commented 6 years ago

@KasperHoldum Just checking: did it mitigate/resolve the issue?

rahuldutta90 commented 6 years ago

Closing the issue. Please reopen it if it did not resolve the issue.