yangxikun opened this issue 8 years ago
Maybe I have found the reason for the increased memory use: classes such as `EvIoManager` have properties like

```php
private $events = [];
private $timers = [];
```

The framework stops some events or timers (`$this->events[$id]->stop();`, `$this->timers[$id]->stop();`) but doesn't unset them (`unset($this->events[$id]);`, `unset($this->timers[$id]);`).
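If that diagnosis is right, the fix is to drop the reference after stopping the watcher so it can be garbage-collected. A minimal sketch of the stop-then-unset pattern (a hypothetical manager class, not Icicle's actual `EvIoManager` code):

```php
<?php

// Hypothetical watcher and manager illustrating the stop-then-unset
// pattern; this is a sketch, not Icicle's actual EvIoManager code.
class Watcher
{
    public $stopped = false;

    public function stop()
    {
        $this->stopped = true; // a real manager would stop the ev watcher here
    }
}

class Manager
{
    private $events = [];

    public function add($id, Watcher $watcher)
    {
        $this->events[$id] = $watcher;
    }

    public function cancel($id)
    {
        if (isset($this->events[$id])) {
            $this->events[$id]->stop();
            unset($this->events[$id]); // without this, the watcher stays referenced
        }
    }

    public function count()
    {
        return count($this->events);
    }
}

$m = new Manager();
$m->add(1, new Watcher());
$m->cancel(1);
echo $m->count(), "\n"; // 0: nothing keeps the stopped watcher alive now
```

Stopping alone is not enough: as long as `$events[$id]` still holds the object, its refcount never reaches zero and the memory is never reclaimed.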
I believe the memory leak was coming from the socket created by `$socket = (yield Icicle\Socket\connect('10.123.5.34', 80));` not being closed (i.e., `$socket->close()` never being called). That leaves resources in the loop associated with the object. However, I can see that being a common problem, so I updated the class extended by `NetworkSocket` so that the loop resources are freed without calling `close()`. Update to the latest version of `icicleio/socket` (you should be able to just run `composer update`) and see if your code works without changes. Please let me know if you are still seeing memory leaks after updating.
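Independent of that fix, the general defensive pattern is to guarantee the resource is closed even when the handler throws. A generic PHP illustration (plain streams standing in for a network socket; this is not Icicle API code):

```php
<?php

// Generic "always close the resource" pattern; php://memory stands in
// for a network socket here, so the example runs without an event loop.
function withStream(callable $work)
{
    $stream = fopen('php://memory', 'r+');
    try {
        return $work($stream);
    } finally {
        fclose($stream); // runs even if $work throws, so nothing leaks
    }
}

$len = withStream(function ($stream) {
    fwrite($stream, "hello");
    rewind($stream);
    return strlen(fread($stream, 16));
});
echo $len, "\n"; // 5
```

In a coroutine, the same `try`/`finally` shape around the `yield`ed work ensures `$socket->close()` is reached on every exit path.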
After updating `icicleio/socket`, my code works! However, there is still something wrong: under load the server gets into a state where I have to stop it with `kill -9`. Is there a good practice for limiting the number of concurrent connections in the server script?

When I Ctrl+C the server script, the Zend engine reports a memory leak:

```
~/dev/icicle » php server.php
^C[Fri Jan 1 07:41:41 2016] Script: '/home/rokety/dev/icicle/server.php'
/home/rokety/software/php-5.6.15/Zend/zend_vm_execute.h(944) : Freeing 0x071F6418 (32 bytes), script=/home/rokety/dev/icicle/server.php
=== Total 1 memory leaks detected ===
```
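On limiting concurrent connections: one generic approach (a sketch, not an Icicle API) is a counter checked in the accept path, so connections past the cap are closed immediately instead of being held:

```php
<?php

// Generic connection limiter sketch; not part of icicleio/server.
// The accept loop would call tryAcquire() per client and release()
// when the client handler finishes.
class ConnectionLimiter
{
    private $active = 0;
    private $max;

    public function __construct($max)
    {
        $this->max = $max;
    }

    public function tryAcquire()
    {
        if ($this->active >= $this->max) {
            return false; // at the cap: caller should close the socket now
        }
        $this->active++;
        return true;
    }

    public function release()
    {
        $this->active--;
    }

    public function active()
    {
        return $this->active;
    }
}

$limiter = new ConnectionLimiter(2);
var_dump($limiter->tryAcquire()); // bool(true)
var_dump($limiter->tryAcquire()); // bool(true)
var_dump($limiter->tryAcquire()); // bool(false): at the cap, reject
$limiter->release();
var_dump($limiter->tryAcquire()); // bool(true) again
```

Capping held connections bounds the per-connection buffers the server keeps alive at once, which directly bounds peak memory use.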
If I pass `false` to `create()` in `Icicle\Loop\loop()`:

```php
function loop(Loop $loop = null)
{
    static $instance;

    if (null !== $loop) {
        $instance = $loop;
    } elseif (null === $instance) {
        $instance = create(false);
    }

    return $instance;
}
```

the Zend engine doesn't complain about the memory leak. So is there something wrong with the signal handling?
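For what it's worth, the function above is a memoized singleton with an override hook: pass an instance to install it, otherwise the first call creates and caches a default. A generic runnable sketch of the same pattern (not Icicle code), which shows why passing your own loop in first avoids the default `create()` path entirely:

```php
<?php

// Generic version of the loop()-style memoized singleton with override.
function registry($value = null)
{
    static $instance;

    if (null !== $value) {
        $instance = $value;        // explicit override installs the instance
    } elseif (null === $instance) {
        $instance = 'default';     // lazily created on first plain call
    }

    return $instance;
}

echo registry(), "\n";          // default
echo registry('custom'), "\n";  // custom: override installed
echo registry(), "\n";          // custom: the override sticks
```

So instead of editing the function, the same effect should be reachable by installing a loop built with signals disabled before anything else calls `loop()`.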
Eventually, try compiling the php-src master branch, which fixes a bunch of memory leaks; older versions leak during cleanup of Generators and Exceptions.

As long as these leaks aren't numerous, there's nothing you need to worry about. (After all, it's just 32 bytes...)
@yangxikun What sort of high concurrency are you talking about? I've successfully tested a couple thousand concurrent requests with no issues. Does the server just hang (even with no load) after this happens? I'm going to need a little more information to track down what might be happening.
Under high concurrency, the server script holds many connections at the same time. In my case, each request makes a remote request to another web service, so each request takes a long time. The growing number of connections held by the server leads to more memory use, until memory consumption exceeds the PHP limit.
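That effect can be seen in numbers by sampling `memory_get_usage()` while references accumulate and after they are dropped (a generic illustration, not the server code):

```php
<?php

// Illustrates how held references keep memory from being reclaimed.
$held = [];
$before = memory_get_usage();

for ($i = 0; $i < 1000; $i++) {
    $held[] = str_repeat('x', 1024); // stands in for one connection's buffers
}

$during = memory_get_usage();
$held = []; // dropping the references lets PHP reclaim the memory

$after = memory_get_usage();
echo ($during > $before) ? "grew\n" : "flat\n";      // grew
echo ($after < $during) ? "shrank\n" : "still held\n"; // shrank
```

If the server never drops its per-connection references, usage only ever takes the "grew" branch until the `memory_limit` fatal error hits.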
I used Apache JMeter to benchmark the example server (with a little modification):

I found that the memory used by the process keeps increasing and never decreases:

And there is a lot of error output like:

If I continue the benchmark, it eventually hits a fatal error:
Also, if I Ctrl+C the server script, it prints a memory leak message: