Automatically exported from code.google.com/p/newsbeuter

newsbeuter -x reload segfaults #110


GoogleCodeExporter commented 9 years ago
I'm getting a segfault when running newsbeuter -x reload. Starting newsbeuter
normally and typing "R" to reload all feeds usually works without crashing, but
it has occasionally segfaulted as well (with no other UI activity; I just trigger
the reload and wait long enough).

I'm using newsbeuter-svn 1576, with a fresh newsbeuter-cache.db file, and a
config file containing only:
    reload-threads 6

If I leave that line out, I don't see the crashes. I have about 400 feeds in my
urls file, and I've tried winnowing them down to see whether any specific feed
is causing the crash. As far as I can tell, no single feed is responsible;
however, with too few feeds in the urls file I can't reproduce the crash at all.
I haven't yet identified a minimal test case.

Here's the tail of the log file produced when running newsbeuter -l6 -d logfile:

[2008-11-30 10:33:42] INFO: fmtstr_formatter::register_fmt: char = d value = 
[2008-11-30 10:33:42] DEBUG: fmtstr_formatter::do_wformat: fmt = `%4i %n %11u %t' width = 170
[2008-11-30 10:33:42] DEBUG: fmtstr_formatter::do_wformat: fmtlen = 14
[2008-11-30 10:33:42] DEBUG: fmtstr_formatter::do_wformat: number = 4
[2008-11-30 10:33:42] DEBUG: fmtstr_formatter::do_wformat: swprintf result =  387
[2008-11-30 10:33:42] DEBUG: fmtstr_formatter::do_wformat: number = 11
[2008-11-30 10:33:42] DEBUG: fmtstr_formatter::do_wformat: swprintf result =       (1/1)
[2008-11-30 10:33:42] DEBUG: end of do_wformat
[2008-11-30 10:33:42] INFO: fmtstr_formatter::do_format: result =  387 N       (1/1) Effraie@blog - Balise - upc
[2008-11-30 10:33:42] DEBUG: feedlist_formaction::set_feedlist: format result =  387 N       (1/1) Effraie@blog - Balise - upc

And here's the tail of the logfile produced when running newsbeuter -x reload -l6 -d logfile
(deleting the newsbeuter-cache.db file first):

[2008-11-30 10:39:18] DEBUG: rssitem_callback: title = dear lazywebs: help me fix the internet
[2008-11-30 10:39:18] DEBUG: rssitem_callback: title = Chicago Hardy Party, part 2
[2008-11-30 10:39:18] DEBUG: rssitem_callback: title = Ubuntu-Chicago Hardy Heron Release Party
[2008-11-30 10:39:18] DEBUG: rssitem_callback: title = dear lazywebs…
[2008-11-30 10:39:18] INFO: scope_measure: function `cache::internalize_rssfeed' took 0.030447 s
[2008-11-30 10:39:18] DEBUG: controller::reload: after internalize_rssfeed
[2008-11-30 10:39:18] DEBUG: controller::reload_range: reloading feed #325
[2008-11-30 10:39:18] DEBUG: controller::reload: pos = 325 max = 461
[2008-11-30 10:39:18] DEBUG: controller::reload: created parser
[2008-11-30 10:39:18] DEBUG: running: query: SELECT lastmodified FROM rss_feed WHERE rssurl = 'http://effiejayx.velugmaracaibo.org.ve/?feed=rss2&cat=5';

And here's a backtrace from running the previous command under gdb:

$ gdb newsbeuter
GNU gdb 6.8
...
(gdb) run -x reload
Starting program: /usr/bin/newsbeuter -x reload
(no debugging symbols found)
...
Program received signal SIGSEGV, Segmentation fault.
[Switching to Thread 0xb7b5ba50 (LWP 999)]
0x00000019 in ?? ()
(gdb)
(gdb) bt
#0  0x00000019 in ?? ()
#1  0xb7f8b2d7 in showit () from /usr/lib/libcurl.so.4
#2  0xb7f8b44f in Curl_debug () from /usr/lib/libcurl.so.4
#3  0xb7f8b7c2 in Curl_failf () from /usr/lib/libcurl.so.4
#4  0xb7f823a8 in Curl_resolv_timeout () from /usr/lib/libcurl.so.4
#5  0x086e08e0 in ?? ()
#6  0xb7e4fff4 in ?? () from /lib/libc.so.6
#7  0x088874ce in ?? ()
#8  0xb59b9b3c in ?? ()
#9  0xb7d7d19f in _IO_str_init_static_internal () from /lib/libc.so.6
Backtrace stopped: previous frame inner to this frame (corrupt stack?)

What version of the product are you using? On what operating system?

newsbeuter-svn 1576, stfl-svn 101, libcurl 7.19.2
Arch Linux with kernel 2.6.27.6

Original issue reported on code.google.com by prof...@jimpryor.net on 30 Nov 2008 at 4:31

GoogleCodeExporter commented 9 years ago
OK, I'm out of my depth here, but since the backtraces always point to a
function in libcurl (not always the same function), and since the segfaults
only happen when I have more than one reload thread, it didn't seem unreasonable
to try the following patch to newsbeuter:

--- libnxml-0.18.1/xmlrss/nxml_download.c   2008-11-27 14:42:54.000000000 -0500
+++ libnxml-0.18.3/xmlrss/nxml_download.c   2008-08-21 17:55:49.000000000 -0400
@@ -76,6 +82,7 @@
   curl_easy_setopt (curl, CURLOPT_URL, fl);
   curl_easy_setopt (curl, CURLOPT_WRITEFUNCTION, __nxml_memorize_file);
   curl_easy_setopt (curl, CURLOPT_FOLLOWLOCATION, 1);
+  curl_easy_setopt (curl, CURLOPT_NOSIGNAL, 1);
   curl_easy_setopt (curl, CURLOPT_FILE, (void *) chunk);
   curl_easy_setopt (curl, CURLOPT_ENCODING, "gzip, deflate");

This is one of the changes introduced in the move from libnxml 0.18.1, which
newsbeuter's copy is forked from, to the more recent libnxml 0.18.3. Googling
says the CURLOPT_NOSIGNAL flag controls whether libcurl may use signals to
abort a long DNS lookup. That's all I know.
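
For what it's worth, the usual explanation of this option is that libcurl's default name-resolution timeout is implemented with alarm()/SIGALRM, which is not safe once several threads are resolving names at the same time, and CURLOPT_NOSIGNAL simply turns that mechanism off. Below is a minimal sketch, not newsbeuter's actual code, of a multi-threaded fetch with the option set; fetch_one(), discard() and the example URLs are made up for illustration:

    /* Minimal sketch (not newsbeuter code): one curl easy handle per worker
     * thread, with CURLOPT_NOSIGNAL set so libcurl never installs its
     * SIGALRM-based resolve timeout. */
    #include <curl/curl.h>
    #include <pthread.h>
    #include <stdio.h>

    /* write callback that throws the body away; we only care whether the
     * transfer itself survives */
    static size_t discard(void *ptr, size_t size, size_t nmemb, void *userdata)
    {
        (void)ptr; (void)userdata;
        return size * nmemb;
    }

    static void *fetch_one(void *arg)           /* one of these per reload thread */
    {
        const char *url = arg;
        CURL *curl = curl_easy_init();
        if (!curl)
            return NULL;
        curl_easy_setopt(curl, CURLOPT_URL, url);
        curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
        curl_easy_setopt(curl, CURLOPT_NOSIGNAL, 1L);   /* the flag added by the patch above */
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, discard);
        CURLcode rc = curl_easy_perform(curl);
        if (rc != CURLE_OK)
            fprintf(stderr, "%s: %s\n", url, curl_easy_strerror(rc));
        curl_easy_cleanup(curl);
        return NULL;
    }

    int main(void)
    {
        /* placeholder URLs standing in for entries from the urls file */
        const char *urls[] = { "http://example.org/a.rss", "http://example.org/b.rss" };
        pthread_t tids[2];

        curl_global_init(CURL_GLOBAL_ALL);      /* must run before any threads exist */
        for (int i = 0; i < 2; i++)
            pthread_create(&tids[i], NULL, fetch_one, (void *)urls[i]);
        for (int i = 0; i < 2; i++)
            pthread_join(tids[i], NULL);
        curl_global_cleanup();
        return 0;
    }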

But I can report that turning this flag on (and hence disabling libcurl's
ability to do those aborts) seems to cure my segfaults. I'll report back if
they reappear...but so far as I can now see, they're gone.

I don't know if this is the best long-term solution, since now my reloads take 
a lot longer...

Original comment by prof...@jimpryor.net on 30 Nov 2008 at 7:24

GoogleCodeExporter commented 9 years ago
I wrote:
> since now my reloads take a lot longer...

In fact, sometimes they finish promptly; other times they hang indefinitely.
If I run "newsbeuter -x reload -l6 -d LOGFILE" and "tail -f LOGFILE" in another
terminal, I see that the hang always happens after a line like this is printed:
DEBUG: controller::reload: after internalize_rssfeed

This is with multi-threaded reloading enabled, and the CURLOPT_NOSIGNAL flag set
as described in the previous comment. Following some hints on the curl website,
I also recompiled my libcurl with c-ares support enabled (and recompiled
newsbeuter after doing so).

So I've made the segfaults go away but now sometimes (not always) a request to 
"newsbeuter -x reload" will hang indefinitely...

Original comment by prof...@jimpryor.net on 30 Nov 2008 at 11:35