Currently the maximum pool size can't be small, because the current code causes a deadlock when it establishes a new connection while another transaction is already running in the same thread. The pool size must be effectively unlimited to avoid the deadlock. In the meantime, a workaround is to use a smaller number of threads.
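To make the failure mode concrete, here is a minimal sketch of that pattern, assuming a HikariCP-style bounded pool and a hypothetical JDBC URL and credentials (this illustrates the nested-checkout deadlock, it is not digdag's actual code):

```java
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import java.sql.Connection;
import java.sql.SQLException;

public class NestedCheckoutDeadlock {
    public static void main(String[] args) throws SQLException {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:postgresql://localhost:5432/digdag"); // assumed URL
        config.setUsername("digdag");   // assumed credentials
        config.setPassword("secret");
        config.setMaximumPoolSize(1);   // a small pool triggers the problem

        try (HikariDataSource ds = new HikariDataSource(config);
             Connection outer = ds.getConnection()) {      // holds the pool's only slot
            // Same thread asks the same pool for a second connection
            // (e.g. for a nested transaction) while still holding the first one.
            try (Connection inner = ds.getConnection()) {  // blocks: pool is exhausted
                inner.createStatement().execute("SELECT 1");
            }
        }
    }
}
```

With maximumPoolSize = 1 the inner getConnection() can never be satisfied while the outer connection is held, so the thread blocks until the pool's connection timeout expires.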
Thanks, I look forward to #466 being released! 😄
@kitsuyui We released https://github.com/treasure-data/digdag/pull/515 instead of #466 in 0.9.8. It would be great if you could try it.
@komamitsu @frsyuki
I tried v0.9.16. The problem was certainly mitigated (from 1024 connections down to 80). But all of those connections still remain open on my machine, so I have to limit database.maximumPoolSize in postgresql.properties explicitly.
I don't feel so good about this...
use small connection pool size by setting database.maximumPoolSize (integer, default: available CPU cores * 32)
@frsyuki What do you think of this?
https://github.com/treasure-data/digdag/pull/280/commits/780d61780ae5658547b87b44222a870942ebc715#diff-d11545055f57d9e418a11dd1e06f680d
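For reference, the quoted default means the pool size scales with the machine, roughly like this (a sketch of the documented formula, not digdag's actual code):

```java
public class DefaultPoolSize {
    public static void main(String[] args) {
        // Documented default: available CPU cores * 32.
        // On a 32-core machine this is 32 * 32 = 1024 connections.
        int cores = Runtime.getRuntime().availableProcessors();
        System.out.println("default database.maximumPoolSize = " + (cores * 32));
    }
}
```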
My machine has 32 cores, so digdag tried to make 1024 connections. That exceeded my Postgres's max_connections, and digdag occupied every Postgres connection on the server. Other processes couldn't make new connections or reconnect, which broke other jobs on the server.
I had written

database.maximumPoolSize = 10

in postgresql.properties and the problem stopped. To prevent this tragedy for others, there are various ways, for example setting maximumPoolSize to some percentage of the max_connections parameter (obtained by running SHOW max_connections), as sketched below.
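A minimal sketch of that last idea, assuming a plain JDBC connection to the same Postgres server; the URL, credentials, and the 25% ratio are illustrative assumptions, not digdag behavior:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class PoolSizeFromMaxConnections {
    public static void main(String[] args) throws SQLException {
        // Hypothetical connection details for illustration only.
        String url = "jdbc:postgresql://localhost:5432/digdag";
        try (Connection conn = DriverManager.getConnection(url, "digdag", "secret");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SHOW max_connections")) {
            rs.next();
            int maxConnections = Integer.parseInt(rs.getString(1));
            // Leave headroom for other clients: use e.g. 25% of the server limit.
            int poolSize = Math.max(1, maxConnections / 4);
            System.out.println("database.maximumPoolSize = " + poolSize);
        }
    }
}
```

The computed value could then be written into postgresql.properties as shown above.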