AndyShih12 opened this issue 8 years ago
This problem was fixed in 0.11.2 (at least in my case). I updated to Spark 2 and 0.12.0, and now the problem is back.
We export data from MongoDB every night, but the connections to MongoDB are never closed, so after a while our Spark cluster collapses under the accumulated connection threads... Will this be fixed in version 0.12?
Using 0.12.0 with Spark Streaming 2.0.
The number of connections keeps building up to a few thousand, until MongoDB refuses to accept any new connections.
Is there any way to reuse the same connection? Or to force-close connections? Or to set a lifespan on connections (e.g. 30 seconds)? Thanks
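Not a fix in the connector itself, but on the lifespan question: the MongoDB Java driver honors standard connection-string options such as `maxIdleTimeMS` and `maxPoolSize`, which bound how long idle pooled connections live and how many sockets a single client can hold. If your connector configuration accepts a full MongoDB URI, passing these options through may at least cap the buildup. Here is a minimal driver-level sketch (the host name and option values are placeholders, not recommendations):

```scala
import com.mongodb.{MongoClient, MongoClientURI}

object ConnectionLifespanSketch {
  def main(args: Array[String]): Unit = {
    // maxIdleTimeMS and maxPoolSize are standard MongoDB connection-string
    // options handled by the Java driver itself, independent of any Spark
    // connector: idle pooled connections are closed after 30 seconds and
    // the pool is capped at 10 connections per client.
    val uri = new MongoClientURI(
      "mongodb://mongo-host:27017/?maxIdleTimeMS=30000&maxPoolSize=10")

    val client = new MongoClient(uri)
    try {
      // ... read/write through `client` here ...
      println(client.getDatabase("test").getName)
    } finally {
      client.close() // explicitly releases all pooled connections
    }
  }
}
```

Whether the connector actually reuses one `MongoClient` per executor (rather than creating a new client per batch, which is what produces the thread buildup described above) depends on the connector version, so this only mitigates the symptom, not the underlying leak.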