Open — unixwitch opened this issue 7 years ago
@unixwitch We are seeing similar issues with our Django application. We'll see random AttributeError exceptions in really unexpected parts of the code, e.g. from Django REST Framework:
AttributeError: rest_framework.fields in get_attribute
But we've also seen the same error:
AttributeError: 'WSGIRequest' object has no attribute 'path'
Followed by a very similar core dump.
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: !!! uWSGI process 95 got Segmentation Fault !!!
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: *** backtrace of 95 ***
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: uwsgi(uwsgi_backtrace+0x30) [0x468300]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: uwsgi(uwsgi_segfault+0x21) [0x4686a1]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /lib/x86_64-linux-gnu/libc.so.6(+0x35180) [0x7fd97ee61180]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(+0x97a94) [0x7fd97f4a3a94]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(_PyObject_GenericSetAttrWithDict+0x132) [0x7fd97f4ab9e2]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_SetAttr+0x8f) [0x7fd97f4ab36f]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalFrameEx+0x24d1) [0x7fd97f511db1]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalCodeEx+0x830) [0x7fd97f516730]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(+0x845cc) [0x7fd97f4905cc]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_Call+0x43) [0x7fd97f45ecb3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_CallFunction+0xbb) [0x7fd97f45ee0b]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(+0x7267a) [0x7fd97f47e67a]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(_PyObject_GenericSetAttrWithDict+0x107) [0x7fd97f4ab9b7]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_SetAttr+0x8f) [0x7fd97f4ab36f]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalFrameEx+0x24d1) [0x7fd97f511db1]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalCodeEx+0x830) [0x7fd97f516730]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalFrameEx+0x594e) [0x7fd97f51522e]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalCodeEx+0x830) [0x7fd97f516730]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(+0x845cc) [0x7fd97f4905cc]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_Call+0x43) [0x7fd97f45ecb3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_CallFunction+0xbb) [0x7fd97f45ee0b]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(_PyObject_GenericGetAttrWithDict+0xb5) [0x7fd97f4ab655]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_Call+0x43) [0x7fd97f45ecb3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalFrameEx+0x3b3e) [0x7fd97f51341e]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalCodeEx+0x830) [0x7fd97f516730]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(+0x845cc) [0x7fd97f4905cc]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_Call+0x43) [0x7fd97f45ecb3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(+0x616cd) [0x7fd97f46d6cd]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_Call+0x43) [0x7fd97f45ecb3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(+0xbf0c8) [0x7fd97f4cb0c8]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalFrameEx+0x163d) [0x7fd97f510f1d]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalCodeEx+0x830) [0x7fd97f516730]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalFrameEx+0x594e) [0x7fd97f51522e]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalCodeEx+0x830) [0x7fd97f516730]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(+0x84695) [0x7fd97f490695]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_Call+0x43) [0x7fd97f45ecb3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalFrameEx+0x11c3) [0x7fd97f510aa3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalCodeEx+0x830) [0x7fd97f516730]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(+0x84695) [0x7fd97f490695]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_Call+0x43) [0x7fd97f45ecb3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalFrameEx+0x11c3) [0x7fd97f510aa3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalCodeEx+0x830) [0x7fd97f516730]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(+0x84695) [0x7fd97f490695]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_Call+0x43) [0x7fd97f45ecb3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalFrameEx+0x11c3) [0x7fd97f510aa3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalCodeEx+0x830) [0x7fd97f516730]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(+0x845cc) [0x7fd97f4905cc]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_Call+0x43) [0x7fd97f45ecb3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_CallFunctionObjArgs+0x137) [0x7fd97f45f797]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/python2.7/site-packages/newrelic-2.70.0.51/newrelic/packages/wrapt/_wrappers.so(+0x42bd) [0x7fd97a5882bd]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_Call+0x43) [0x7fd97f45ecb3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalFrameEx+0x11c3) [0x7fd97f510aa3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalCodeEx+0x830) [0x7fd97f516730]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(+0x84695) [0x7fd97f490695]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_Call+0x43) [0x7fd97f45ecb3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalFrameEx+0x11c3) [0x7fd97f510aa3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalCodeEx+0x830) [0x7fd97f516730]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(+0x84695) [0x7fd97f490695]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_Call+0x43) [0x7fd97f45ecb3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalFrameEx+0x11c3) [0x7fd97f510aa3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyEval_EvalCodeEx+0x830) [0x7fd97f516730]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(+0x845cc) [0x7fd97f4905cc]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_Call+0x43) [0x7fd97f45ecb3]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: /usr/local/lib/libpython2.7.so.1.0(PyObject_CallFunctionObjArgs+0x137) [0x7fd97f45f797]
Mar 15 14:47:49 ip-172-31-19-222.ec2.internal docker[31241]: *** end of backtrace ***
Packages used: Python==2.7.10 Django==1.8.10 uWSGI==2.0.12
Are you by chance using the newrelic python wrapper? We've had a bunch of AttributeError surface through that. Which version of Python 2.7 did you see this happen on?
Our uWSGI config looks like...
[uwsgi]
master = true
processes = 4
http = :80
module = app.config.wsgi
enable-threads = true
single-interpreter = true
wsgi-env-behavior = holy
die-on-term = true
harakiri = 60
harakiri-verbose = true
buffer-size = 65535
chdir = /code/
So this happens even on 2.0, but maybe it's related to the 'holy' env behaviour? (It became the default setting in 2.1.) I will investigate in this area.
We don't use New Relic, but we do use Raven (the Sentry client), which does some things to the WSGI request. I've asked a developer to check whether the problem persists after removing Raven, but I haven't heard back yet.
Ran into a similar-looking problem with another Django application, Python 3.6:
AttributeError: 'SessionStore' object has no attribute '_SessionBase__session_key'
Internal Server Error: /
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/django/core/handlers/base.py", line 235, in get_response
response = middleware_method(request, response)
File "/usr/local/lib/python3.6/site-packages/django/contrib/messages/middleware.py", line 23, in process_response
unstored_messages = request._messages.update(response)
File "/usr/local/lib/python3.6/site-packages/django/contrib/messages/storage/base.py", line 137, in update
self._prepare_messages(self._queued_messages)
AttributeError: 'FallbackStorage' object has no attribute '_queued_messages'
[pid: 15|app: 0|req: 7/14] 172.31.240.26 () {46 vars in 1669 bytes} [Thu Mar 23 11:17:33 2017] GET / => generated 27 bytes in 40 msecs (HTTP/1.1 500) 1 headers in 63 bytes (1 switches on core 0)
Not Found: /favicon.ico
[pid: 12|app: 0|req: 8/15] 172.31.240.26 () {42 vars in 1073 bytes} [Thu Mar 23 11:17:33 2017] GET /favicon.ico => generated 85 bytes in 44 msecs (HTTP/1.1 404) 2 headers in 80 bytes (1 switches on core 0)
Thu Mar 23 11:20:24 2017 - !!! uWSGI process 15 got Segmentation Fault !!!
Thu Mar 23 11:20:24 2017 - DAMN ! worker 2 (pid: 15) died :( trying respawn ...
I've switched it to use --wsgi-env-behaviour=cheat to see whether that makes any difference.
The users report that setting wsgi-env-behaviour = cheat on our 2.1-based sites has fixed the problem.
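For anyone else landing here, the workaround amounts to explicitly overriding the environ behaviour in the app's uwsgi config (a sketch based on the config posted above; both the `behavior` and `behaviour` spellings appear in this thread, so check which one your uWSGI build accepts):

```ini
[uwsgi]
master = true
processes = 4
module = app.config.wsgi
; 'holy' is the default in 2.1 and appears to trigger the intermittent
; AttributeError / SIGSEGV problems described in this issue; switching
; back to 'cheat' reportedly fixes them.
wsgi-env-behaviour = cheat
```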
Removing Raven didn't have any effect, so I don't think that's related.
I was having the same issue (lots of SIGSEGV), and I can confirm @unixwitch's hack (--wsgi-env-behaviour cheat) worked.
Using master@f262899bc5eb890007ba157d8b1da1e836049b74, we've been seeing strange problems with Django applications. The first is that WSGIRequest will be randomly missing attributes it should have:
or
The second is intermittent SEGVs:
And other evidence of memory corruption:
We've only seen this on our staging sites, which use master. Production sites using a 2.0-based branch don't seem to be affected. At least two affected applications use Python 3.4, but we've seen 'WSGIRequest' object has no attribute 'path' on 2.7 as well, so I don't think it's related to the Python version. (We'll try updating one application to a newer Python version anyway.)

These problems are all intermittent and don't seem to have an obvious cause; they manifest as internal server errors to users, but the application works fine after the worker is restarted. I'm not entirely sure this is caused by uWSGI (it could be a Python or C module bug...), but the fact that it only appears in 2.1 suggests it may be.
Here's an example uwsgi config of an affected application:
I'm not really sure where to start with debugging this - at the point the crash happens, the problem has presumably already occurred, and I can't reproduce it...
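One low-effort starting point (a sketch, assuming a CPython with the faulthandler module — stdlib on 3.3+, a PyPI backport on 2.7) is to enable faulthandler in the WSGI entry module, so that when a worker takes a SIGSEGV you get the Python-level traceback of every thread alongside uWSGI's C backtrace, showing which Python frame the corruption surfaced in:

```python
# At the top of the WSGI entry module (e.g. app/config/wsgi.py, per the
# config above): dump Python tracebacks for all threads to stderr when
# the interpreter receives SIGSEGV, SIGFPE, SIGABRT, SIGBUS or SIGILL.
import faulthandler

faulthandler.enable(all_threads=True)
```

This doesn't prevent the crash, but it narrows down where in the application (or in a C extension) the bad memory access happens, which is useful when the crash can't be reproduced on demand.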