p5pRT opened this issue 19 years ago
I first observed this behavior when attempting to save a directed acyclic graph (DAG) consisting of about 5,000 nodes, but below I give a minimal example that exhibits the same behavior.
The following script creates a long sequence of nested references. The command-line argument to the script should be a positive integer, corresponding to the length of this sequence of references.
With an argument greater than some system-dependent critical size, the script causes perl to segfault:
use strict; use warnings;
my $size = shift;
my $x = []; $x = [ $x ] for 1..$size;
require Storable;                                # *
Storable::store( \$x, "/tmp/$0.$$" ) or die;    # *
warn "ok\n";
__END__
For this script, on the system described in this report, the critical value of the argument is 30826. With any argument below this number, the script terminates normally; with an argument greater than or equal to it, perl segfaults.
Similarly, if the two lines indicated by the "# *" in the script above are replaced by
require Data::Dumper; Data::Dumper->Dump( [ \$x ] ) or die;
the resulting script fails for any argument greater than or equal to 15406.
Two important differences between the two versions of the script are worth noting. The Storable version fails after the 'warn "ok\n"' line, and always terminates quickly (whether successfully or not), even for inputs close to the critical value. In contrast, when the Data::Dumper version fails, the 'warn "ok\n"' line never executes, and the time to success or failure grows sharply as the input approaches the critical value. With the DAG object that originally triggered the segfault, however, I observed this pronounced slowdown near the critical input size with both Data::Dumper and Storable.
Most likely, the reason for the segfault is that a sufficiently long chain of links causes the serialization process to start thrashing. Simply breaking the chain of links into two unconnected halves allows the serialization to succeed, even though the total number of links is reduced by only 1.
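The chain-splitting workaround can be sketched as follows. This is my own illustration, not code from the report: the variable names, the chain length of 10,000, and the output path are all assumptions.

```perl
use strict;
use warnings;
use Storable ();

# Build a chain of $size nested array refs, as in the report's script
# (10_000 is an illustrative size, chosen well below the critical depth).
my $size = 10_000;
my $x = [];
$x = [ $x ] for 1 .. $size;

# Walk halfway down the chain, remember the lower half, and sever the
# link there, leaving two independent chains of roughly $size/2 links.
my $mid = $x;
$mid = $mid->[0] for 1 .. int( $size / 2 ) - 1;
my $lower_half = $mid->[0];
$mid->[0] = undef;    # break the chain in two

# Each half is now only about half as deep, so Storable's recursive
# traversal needs only about half the stack depth.
my $file = "/tmp/halves.$$";
Storable::store( [ $x, $lower_half ], $file ) or die "store failed: $!";
unlink $file;
```

Note that this only halves the required recursion depth; for deep enough structures the same failure returns, which is why it is a workaround rather than a fix.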
On Wed, Jun 29, 2005 at 09:46:33PM -0000, kynn jones wrote:
> With an argument greater than some system-dependent critical size, the script causes perl to segfault:
> use strict; use warnings;
> my $size = shift;
> my $x = []; $x = [ $x ] for 1..$size;
> require Storable; Storable::store( \$x, "/tmp/$0.$$" ) or die;
It's the stack being trashed due to excessive recursion. No real way to fix it short of rewriting Storable to be iterative rather than recursive.
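For illustration only (this is not Storable's code), an iterative traversal replaces C-stack recursion with an explicit Perl-level work list, so the reachable depth is bounded by available memory rather than by the C stack. The function name and the counting task are my own:

```perl
use strict;
use warnings;

# Count every array ref reachable from $root without recursing:
# an explicit @todo list stands in for the call stack.
sub count_refs_iterative {
    my ($root) = @_;
    my @todo  = ($root);
    my $count = 0;
    while (@todo) {
        my $node = pop @todo;
        next unless ref $node eq 'ARRAY';
        $count++;
        push @todo, @$node;    # children go on the work list, not the C stack
    }
    return $count;
}

# A chain far deeper than the recursive critical depth is handled fine.
my $x = [];
$x = [ $x ] for 1 .. 100_000;
print count_refs_iterative($x), "\n";    # prints 100001
```

A real serializer would also have to track already-seen references and emit output in traversal order, but the same explicit-stack pattern applies.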
The workaround is to increase the stack size available to the process, e.g. with 'ulimit -s'.
-- In defeat, indomitable; in victory, insufferable -- Churchill on Montgomery
The RT System itself - Status changed from 'new' to 'open'
Migrated from rt.perl.org#36427 (status was 'open')
Searchable as RT36427$