Closed: bsdphk closed this issue 8 years ago
Does this mean it is solved (the way it was described in https://varnish-cache.org/trac/ticket/1635)?
We are facing the same issue with CentOS Linux release 7.4.1708 and this varnish:
Installed Packages
Name : varnish
Arch : x86_64
Version : 4.0.5
Release : 1.el7
Size : 1.2 M
Repo : installed
From repo : epel
Hi,
4.0 has been EOL for some time; can you upgrade and report back, please?
Cheers,
-- Guillaume Quintard
On Mon, Jul 30, 2018 at 1:23 PM, Francesco R. notifications@github.com wrote:
Yes, great! Updated to this one
Installed Packages
Name : varnish
Arch : x86_64
Version : 4.1.10
Release : 1.el7
Size : 7.9 M
Repo : installed
From repo : varnishcache_varnish41
and MAIN.bans_deleted started moving up from zero; it is now one less than MAIN.bans_added.
thank you very much!
Just an update in case somebody else hits the same issue:
As I wrote above, the problem seemed solved by updating Varnish to 4.1.10.
Well, kind of: it held for 10 days, then we faced the same issue again: completed bans stopped being removed, so the ban list keeps growing. As a temporary workaround we restart Varnish every week, losing all of its cache :(
[side note: this happened on two independent but identically configured Varnish servers]
We are aware our figures are pretty big; maybe we are hitting some kind of limit?
MAIN.bans 1218515 . Count of bans
MAIN.bans_completed 1218502 . Number of bans marked 'completed'
MAIN.bans_obj 1218513 . Number of bans using obj.*
MAIN.bans_req 2 . Number of bans using req.*
MAIN.bans_added 2752831 1.41 Bans added
MAIN.bans_deleted 1534316 0.78 Bans deleted
MAIN.bans_tested 4628230 2.36 Bans tested against objects (lookup)
MAIN.bans_obj_killed 9058 0.00 Objects killed by bans (lookup)
MAIN.bans_lurker_tested 202174933397 103285.64 Bans tested against objects (lurker)
MAIN.bans_tests_tested 420808781 214.98 Ban tests tested against objects (lookup)
MAIN.bans_lurker_tests_tested 202791678582 103600.72 Ban tests tested against objects (lurker)
MAIN.bans_lurker_obj_killed 3416020 1.75 Objects killed by bans (lurker)
MAIN.bans_dups 1304078 0.67 Bans superseded by other bans
MAIN.bans_lurker_contention 1099 0.00 Lurker gave way for lookup
MAIN.bans_persisted_bytes 766079485 . Bytes used by the persisted ban lists
MAIN.bans_persisted_fragmentation 746580277 . Extra bytes in persisted ban lists due to fragmentation
Here is the full output:
$ sudo varnishstat -1
MAIN.uptime 1957435 1.00 Child process uptime
MAIN.sess_conn 4864384 2.49 Sessions accepted
MAIN.sess_drop 0 0.00 Sessions dropped
MAIN.sess_fail 0 0.00 Session accept failures
MAIN.client_req_400 83 0.00 Client requests received, subject to 400 errors
MAIN.client_req_417 0 0.00 Client requests received, subject to 417 errors
MAIN.client_req 23092025 11.80 Good client requests received
MAIN.cache_hit 9599210 4.90 Cache hits
MAIN.cache_hit_grace 69464 0.04 Cache grace hits
MAIN.cache_hitpass 601239 0.31 Cache hits for pass
MAIN.cache_miss 8866506 4.53 Cache misses
MAIN.backend_conn 252673 0.13 Backend conn. success
MAIN.backend_unhealthy 0 0.00 Backend conn. not attempted
MAIN.backend_busy 0 0.00 Backend conn. too many
MAIN.backend_fail 9 0.00 Backend conn. failures
MAIN.backend_reuse 10552029 5.39 Backend conn. reuses
MAIN.backend_recycle 10676632 5.45 Backend conn. recycles
MAIN.backend_retry 9082 0.00 Backend conn. retry
MAIN.fetch_head 1402 0.00 Fetch no body (HEAD)
MAIN.fetch_length 1932420 0.99 Fetch with Length
MAIN.fetch_chunked 8726956 4.46 Fetch chunked
MAIN.fetch_eof 0 0.00 Fetch EOF
MAIN.fetch_bad 0 0.00 Fetch bad T-E
MAIN.fetch_none 644 0.00 Fetch no body
MAIN.fetch_1xx 0 0.00 Fetch no body (1xx)
MAIN.fetch_204 0 0.00 Fetch no body (204)
MAIN.fetch_304 95347 0.05 Fetch no body (304)
MAIN.fetch_failed 40220 0.02 Fetch failed (all causes)
MAIN.fetch_no_thread 0 0.00 Fetch failed (no thread)
MAIN.pools 2 . Number of thread pools
MAIN.threads 291 . Total number of threads
MAIN.threads_limited 283 0.00 Threads hit max
MAIN.threads_created 202802 0.10 Threads created
MAIN.threads_destroyed 202511 0.10 Threads destroyed
MAIN.threads_failed 0 0.00 Thread creation failed
MAIN.thread_queue_len 0 . Length of session queue
MAIN.busy_sleep 17087 0.01 Number of requests sent to sleep on busy objhdr
MAIN.busy_wakeup 17083 0.01 Number of requests woken after sleep on busy objhdr
MAIN.busy_killed 0 0.00 Number of requests killed after sleep on busy objhdr
MAIN.sess_queued 313922 0.16 Sessions queued for thread
MAIN.sess_dropped 0 0.00 Sessions dropped for thread
MAIN.n_object 158538 . object structs made
MAIN.n_vampireobject 0 . unresurrected objects
MAIN.n_objectcore 158733 . objectcore structs made
MAIN.n_objecthead 167714 . objecthead structs made
MAIN.n_waitinglist 280 . waitinglist structs made
MAIN.n_backend 2 . Number of backends
MAIN.n_expired 5202972 . Number of expired objects
MAIN.n_lru_nuked 0 . Number of LRU nuked objects
MAIN.n_lru_moved 8885215 . Number of LRU moved objects
MAIN.n_lru_limited 0 0.00 Reached nuke_limit
MAIN.losthdr 0 0.00 HTTP header overflows
MAIN.s_sess 4864384 2.49 Total sessions seen
MAIN.s_req 23092025 11.80 Total requests seen
MAIN.s_pipe 0 0.00 Total pipe sessions seen
MAIN.s_pass 1865919 0.95 Total pass-ed requests seen
MAIN.s_fetch 10732425 5.48 Total backend fetches initiated
MAIN.s_synth 2760390 1.41 Total synthethic responses made
MAIN.s_req_hdrbytes 12885606129 6582.90 Request header bytes
MAIN.s_req_bodybytes 7773783633 3971.41 Request body bytes
MAIN.s_resp_hdrbytes 11736504264 5995.86 Response header bytes
MAIN.s_resp_bodybytes 1071844286187 547575.93 Response body bytes
MAIN.s_pipe_hdrbytes 0 0.00 Pipe request header bytes
MAIN.s_pipe_in 0 0.00 Piped bytes from client
MAIN.s_pipe_out 0 0.00 Piped bytes to client
MAIN.sess_closed 839346 0.43 Session Closed
MAIN.sess_closed_err 1158135 0.59 Session Closed with error
MAIN.sess_readahead 0 0.00 Session Read Ahead
MAIN.sess_herd 17993398 9.19 Session herd
MAIN.sc_rem_close 2900756 1.48 Session OK REM_CLOSE
MAIN.sc_req_close 0 0.00 Session OK REQ_CLOSE
MAIN.sc_req_http10 0 0.00 Session Err REQ_HTTP10
MAIN.sc_rx_bad 0 0.00 Session Err RX_BAD
MAIN.sc_rx_body 0 0.00 Session Err RX_BODY
MAIN.sc_rx_junk 83 0.00 Session Err RX_JUNK
MAIN.sc_rx_overflow 0 0.00 Session Err RX_OVERFLOW
MAIN.sc_rx_timeout 1158053 0.59 Session Err RX_TIMEOUT
MAIN.sc_tx_pipe 0 0.00 Session OK TX_PIPE
MAIN.sc_tx_error 0 0.00 Session Err TX_ERROR
MAIN.sc_tx_eof 0 0.00 Session OK TX_EOF
MAIN.sc_resp_close 805205 0.41 Session OK RESP_CLOSE
MAIN.sc_overload 0 0.00 Session Err OVERLOAD
MAIN.sc_pipe_overflow 0 0.00 Session Err PIPE_OVERFLOW
MAIN.sc_range_short 0 0.00 Session Err RANGE_SHORT
MAIN.shm_records 2148360485 1097.54 SHM records
MAIN.shm_writes 96938226 49.52 SHM writes
MAIN.shm_flushes 171552 0.09 SHM flushes due to overflow
MAIN.shm_cont 328840 0.17 SHM MTX contention
MAIN.shm_cycles 1024 0.00 SHM cycles through buffer
MAIN.backend_req 10805688 5.52 Backend requests made
MAIN.n_vcl 1 . Number of loaded VCLs in total
MAIN.n_vcl_avail 1 . Number of VCLs available
MAIN.n_vcl_discard 0 . Number of discarded VCLs
MAIN.bans 1218515 . Count of bans
MAIN.bans_completed 1218502 . Number of bans marked 'completed'
MAIN.bans_obj 1218513 . Number of bans using obj.*
MAIN.bans_req 2 . Number of bans using req.*
MAIN.bans_added 2752831 1.41 Bans added
MAIN.bans_deleted 1534316 0.78 Bans deleted
MAIN.bans_tested 4628230 2.36 Bans tested against objects (lookup)
MAIN.bans_obj_killed 9058 0.00 Objects killed by bans (lookup)
MAIN.bans_lurker_tested 202174933397 103285.64 Bans tested against objects (lurker)
MAIN.bans_tests_tested 420808781 214.98 Ban tests tested against objects (lookup)
MAIN.bans_lurker_tests_tested 202791678582 103600.72 Ban tests tested against objects (lurker)
MAIN.bans_lurker_obj_killed 3416020 1.75 Objects killed by bans (lurker)
MAIN.bans_dups 1304078 0.67 Bans superseded by other bans
MAIN.bans_lurker_contention 1099 0.00 Lurker gave way for lookup
MAIN.bans_persisted_bytes 766079485 . Bytes used by the persisted ban lists
MAIN.bans_persisted_fragmentation 746580277 . Extra bytes in persisted ban lists due to fragmentation
MAIN.n_purges 0 . Number of purge operations executed
MAIN.n_obj_purged 0 . Number of purged objects
MAIN.exp_mailed 12500457 6.39 Number of objects mailed to expiry thread
MAIN.exp_received 12500457 6.39 Number of objects received by expiry thread
MAIN.hcb_nolock 19066955 9.74 HCB Lookups without lock
MAIN.hcb_lock 8660129 4.42 HCB Lookups with lock
MAIN.hcb_insert 8660087 4.42 HCB Inserts
MAIN.esi_errors 0 0.00 ESI parse errors (unlock)
MAIN.esi_warnings 0 0.00 ESI parse warnings (unlock)
MAIN.vmods 1 . Loaded VMODs
MAIN.n_gzip 0 0.00 Gzip operations
MAIN.n_gunzip 9886510 5.05 Gunzip operations
MAIN.vsm_free 973232 . Free VSM space
MAIN.vsm_used 83961376 . Used VSM space
MAIN.vsm_cooling 0 . Cooling VSM space
MAIN.vsm_overflow 0 . Overflow VSM space
MAIN.vsm_overflowed 0 0.00 Overflowed VSM space
MGT.uptime 1957435 1.00 Management process uptime
MGT.child_start 1 0.00 Child process started
MGT.child_exit 0 0.00 Child process normal exit
MGT.child_stop 0 0.00 Child process unexpected exit
MGT.child_died 0 0.00 Child process died (signal)
MGT.child_dump 0 0.00 Child process core dumped
MGT.child_panic 0 0.00 Child process panic
MEMPOOL.busyobj.live 2 . In use
MEMPOOL.busyobj.pool 12 . In Pool
MEMPOOL.busyobj.sz_wanted 65536 . Size requested
MEMPOOL.busyobj.sz_actual 65504 . Size allocated
MEMPOOL.busyobj.allocs 10796927 5.52 Allocations
MEMPOOL.busyobj.frees 10796925 5.52 Frees
MEMPOOL.busyobj.recycle 10776983 5.51 Recycled from pool
MEMPOOL.busyobj.timeout 1214718 0.62 Timed out from pool
MEMPOOL.busyobj.toosmall 0 0.00 Too small to recycle
MEMPOOL.busyobj.surplus 17221 0.01 Too many for pool
MEMPOOL.busyobj.randry 19944 0.01 Pool ran dry
MEMPOOL.req0.live 0 . In use
MEMPOOL.req0.pool 12 . In Pool
MEMPOOL.req0.sz_wanted 65536 . Size requested
MEMPOOL.req0.sz_actual 65504 . Size allocated
MEMPOOL.req0.allocs 10837194 5.54 Allocations
MEMPOOL.req0.frees 10837194 5.54 Frees
MEMPOOL.req0.recycle 9788877 5.00 Recycled from pool
MEMPOOL.req0.timeout 1317731 0.67 Timed out from pool
MEMPOOL.req0.toosmall 0 0.00 Too small to recycle
MEMPOOL.req0.surplus 699734 0.36 Too many for pool
MEMPOOL.req0.randry 1048317 0.54 Pool ran dry
MEMPOOL.sess0.live 7 . In use
MEMPOOL.sess0.pool 10 . In Pool
MEMPOOL.sess0.sz_wanted 512 . Size requested
MEMPOOL.sess0.sz_actual 480 . Size allocated
MEMPOOL.sess0.allocs 2426174 1.24 Allocations
MEMPOOL.sess0.frees 2426167 1.24 Frees
MEMPOOL.sess0.recycle 1341870 0.69 Recycled from pool
MEMPOOL.sess0.timeout 874927 0.45 Timed out from pool
MEMPOOL.sess0.toosmall 0 0.00 Too small to recycle
MEMPOOL.sess0.surplus 728970 0.37 Too many for pool
MEMPOOL.sess0.randry 1084304 0.55 Pool ran dry
MEMPOOL.req1.live 2 . In use
MEMPOOL.req1.pool 10 . In Pool
MEMPOOL.req1.sz_wanted 65536 . Size requested
MEMPOOL.req1.sz_actual 65504 . Size allocated
MEMPOOL.req1.allocs 10862516 5.55 Allocations
MEMPOOL.req1.frees 10862514 5.55 Frees
MEMPOOL.req1.recycle 9806180 5.01 Recycled from pool
MEMPOOL.req1.timeout 1319976 0.67 Timed out from pool
MEMPOOL.req1.toosmall 0 0.00 Too small to recycle
MEMPOOL.req1.surplus 709200 0.36 Too many for pool
MEMPOOL.req1.randry 1056336 0.54 Pool ran dry
MEMPOOL.sess1.live 9 . In use
MEMPOOL.sess1.pool 10 . In Pool
MEMPOOL.sess1.sz_wanted 512 . Size requested
MEMPOOL.sess1.sz_actual 480 . Size allocated
MEMPOOL.sess1.allocs 2438210 1.25 Allocations
MEMPOOL.sess1.frees 2438201 1.25 Frees
MEMPOOL.sess1.recycle 1343995 0.69 Recycled from pool
MEMPOOL.sess1.timeout 874583 0.45 Timed out from pool
MEMPOOL.sess1.toosmall 0 0.00 Too small to recycle
MEMPOOL.sess1.surplus 739810 0.38 Too many for pool
MEMPOOL.sess1.randry 1094215 0.56 Pool ran dry
SMA.Transient.c_req 14837597 7.58 Allocator requests
SMA.Transient.c_fail 0 0.00 Allocator failures
SMA.Transient.c_bytes 85173714824 43512.92 Bytes allocated
SMA.Transient.c_freed 85173269688 43512.69 Bytes freed
SMA.Transient.g_alloc 603 . Allocations outstanding
SMA.Transient.g_bytes 445136 . Bytes outstanding
SMA.Transient.g_space 0 . Bytes available
SMF.s0.c_req 14397526 7.36 Allocator requests
SMF.s0.c_fail 0 0.00 Allocator failures
SMF.s0.c_bytes 899634298880 459598.56 Bytes allocated
SMF.s0.c_freed 894315216896 456881.18 Bytes freed
SMF.s0.g_alloc 347727 . Allocations outstanding
SMF.s0.g_bytes 5319081984 . Bytes outstanding
SMF.s0.g_space 5418336256 . Bytes available
SMF.s0.g_smf 406641 . N struct smf
SMF.s0.g_smf_frag 55035 . N small free smf
SMF.s0.g_smf_large 3879 . N large free smf
VBE.boot.server1.happy 18446744073709551615 . Happy health probes
VBE.boot.server1.bereq_hdrbytes 3023417456 1544.58 Request header bytes
VBE.boot.server1.bereq_bodybytes 3875030936 1979.65 Request body bytes
VBE.boot.server1.beresp_hdrbytes 2346630624 1198.83 Response header bytes
VBE.boot.server1.beresp_bodybytes 429896950979 219622.59 Response body bytes
VBE.boot.server1.pipe_hdrbytes 0 0.00 Pipe request header bytes
VBE.boot.server1.pipe_out 0 0.00 Piped bytes to backend
VBE.boot.server1.pipe_in 0 0.00 Piped bytes from backend
VBE.boot.server1.conn 1 . Concurrent connections to backend
VBE.boot.server1.req 5403083 2.76 Backend requests sent
VBE.boot.server2.happy 18446744073709551615 . Happy health probes
VBE.boot.server2.bereq_hdrbytes 3022782365 1544.26 Request header bytes
VBE.boot.server2.bereq_bodybytes 3895173388 1989.94 Request body bytes
VBE.boot.server2.beresp_hdrbytes 2347135711 1199.09 Response header bytes
VBE.boot.server2.beresp_bodybytes 425342540710 217295.87 Response body bytes
VBE.boot.server2.pipe_hdrbytes 0 0.00 Pipe request header bytes
VBE.boot.server2.pipe_out 0 0.00 Piped bytes to backend
VBE.boot.server2.pipe_in 0 0.00 Piped bytes from backend
VBE.boot.server2.conn 1 . Concurrent connections to backend
VBE.boot.server2.req 5402917 2.76 Backend requests sent
LCK.backend.creat 4 0.00 Created locks
LCK.backend.destroy 0 0.00 Destroyed locks
LCK.backend.locks 21872198 11.17 Lock Operations
LCK.backend_tcp.creat 2 0.00 Created locks
LCK.backend_tcp.destroy 0 0.00 Destroyed locks
LCK.backend_tcp.locks 42841788 21.89 Lock Operations
LCK.ban.creat 1 0.00 Created locks
LCK.ban.destroy 0 0.00 Destroyed locks
LCK.ban.locks 15783420800 8063.32 Lock Operations
LCK.busyobj.creat 10796561 5.52 Created locks
LCK.busyobj.destroy 10796923 5.52 Destroyed locks
LCK.busyobj.locks 173605223 88.69 Lock Operations
LCK.cli.creat 1 0.00 Created locks
LCK.cli.destroy 0 0.00 Destroyed locks
LCK.cli.locks 652372 0.33 Lock Operations
LCK.exp.creat 1 0.00 Created locks
LCK.exp.destroy 0 0.00 Destroyed locks
LCK.exp.locks 55943824 28.58 Lock Operations
LCK.hcb.creat 1 0.00 Created locks
LCK.hcb.destroy 0 0.00 Destroyed locks
LCK.hcb.locks 17172571 8.77 Lock Operations
LCK.lru.creat 2 0.00 Created locks
LCK.lru.destroy 0 0.00 Destroyed locks
LCK.lru.locks 39305248 20.08 Lock Operations
LCK.mempool.creat 5 0.00 Created locks
LCK.mempool.destroy 0 0.00 Destroyed locks
LCK.mempool.locks 92912559 47.47 Lock Operations
LCK.objhdr.creat 8715815 4.45 Created locks
LCK.objhdr.destroy 8549145 4.37 Destroyed locks
LCK.objhdr.locks 8733217235 4461.56 Lock Operations
LCK.pipestat.creat 1 0.00 Created locks
LCK.pipestat.destroy 0 0.00 Destroyed locks
LCK.pipestat.locks 0 0.00 Lock Operations
LCK.sess.creat 4861099 2.48 Created locks
LCK.sess.destroy 4864095 2.48 Destroyed locks
LCK.sess.locks 34744 0.02 Lock Operations
LCK.smp.creat 0 0.00 Created locks
LCK.smp.destroy 0 0.00 Destroyed locks
LCK.smp.locks 0 0.00 Lock Operations
LCK.vbe.creat 1 0.00 Created locks
LCK.vbe.destroy 0 0.00 Destroyed locks
LCK.vbe.locks 652363 0.33 Lock Operations
LCK.vcapace.creat 1 0.00 Created locks
LCK.vcapace.destroy 0 0.00 Destroyed locks
LCK.vcapace.locks 0 0.00 Lock Operations
LCK.vcl.creat 1 0.00 Created locks
LCK.vcl.destroy 0 0.00 Destroyed locks
LCK.vcl.locks 25536414 13.05 Lock Operations
LCK.vxid.creat 1 0.00 Created locks
LCK.vxid.destroy 0 0.00 Destroyed locks
LCK.vxid.locks 196191 0.10 Lock Operations
LCK.waiter.creat 2 0.00 Created locks
LCK.waiter.destroy 0 0.00 Destroyed locks
LCK.waiter.locks 84236374 43.03 Lock Operations
LCK.wq.creat 3 0.00 Created locks
LCK.wq.destroy 0 0.00 Destroyed locks
LCK.wq.locks 101288733 51.75 Lock Operations
LCK.wstat.creat 1 0.00 Created locks
LCK.wstat.destroy 0 0.00 Destroyed locks
LCK.wstat.locks 48032380 24.54 Lock Operations
LCK.smf.creat 1 0.00 Created locks
LCK.smf.destroy 0 0.00 Destroyed locks
LCK.smf.locks 28447326 14.53 Lock Operations
LCK.sma.creat 1 0.00 Created locks
LCK.sma.destroy 0 0.00 Destroyed locks
LCK.sma.locks 29674591 15.16 Lock Operations
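As a side note for anyone debugging a similar situation: the ban backlog can be watched by diffing a few counters from the `varnishstat -1` output above. A minimal sketch (assuming the flat NAME / VALUE / RATE / DESCRIPTION column format shown above; the counter names are the real MAIN.* fields, the parsing helper itself is hypothetical):

```python
# Sketch: parse `varnishstat -1` output and report the ban backlog.
# Assumes the flat "NAME  VALUE  RATE  DESCRIPTION" format shown above.

def parse_varnishstat(text):
    """Return {counter_name: int_value} from `varnishstat -1` output."""
    stats = {}
    for line in text.splitlines():
        parts = line.split()
        if len(parts) >= 2 and "." in parts[0]:
            try:
                stats[parts[0]] = int(parts[1])
            except ValueError:
                continue  # skip lines whose second column is not an integer
    return stats

def ban_backlog(stats):
    """Summarize the ban list state.

    A healthy ban lurker drains completed bans, so `bans` stays small;
    a large, growing `bans_completed` (close to `bans`) means completed
    bans are not being removed from the list.
    """
    return {
        "on_list": stats["MAIN.bans"],
        "completed_but_not_deleted": stats["MAIN.bans_completed"],
        "added_minus_deleted": stats["MAIN.bans_added"] - stats["MAIN.bans_deleted"],
    }

sample = """\
MAIN.bans                 1218515  . Count of bans
MAIN.bans_completed       1218502  . Number of bans marked 'completed'
MAIN.bans_added           2752831  1.41 Bans added
MAIN.bans_deleted         1534316  0.78 Bans deleted
"""
print(ban_backlog(parse_varnishstat(sample)))
# → {'on_list': 1218515, 'completed_but_not_deleted': 1218502, 'added_minus_deleted': 1218515}
```

With the numbers from this report, `added_minus_deleted` equals `on_list` (1218515), and almost all of those bans are already marked completed: the list is not being drained.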
A lot of improvements regarding bans have gone into more recent versions.
In your particular case, I suspect this is related to the fact that you have request bans (see MAIN.bans_req): bans referencing req.* cannot be evaluated by the ban lurker in the background, so they may prevent the lurker from working the ban list down.
So besides recommending an upgrade to a current weekly version: you absolutely have to avoid request bans.
If you continue to have problems after following these recommendations, we will need your varnishadm ban.list output to help you.
Great, @nigoroll, you are perfectly right! Indeed, the ban list (and the server's CPU usage) started growing exactly on the day we added these two req-based bans:
sudo varnishadm ban req.http.host == "http://www.xyz.it"
thanks! :1st_place_medal:
This is our ban.list
sudo varnishadm ban.list
Present bans:
1535027175.263366 720 - obj.http.X-Location-Id ~ 8329 && obj.http.X-Ban-Host ~ \.aaa\.xyz\.it
1535027175.263361 0 - obj.http.X-Location-Id ~ 8732 && obj.http.X-Ban-Host ~ \.bbb\.xyz\.it
1535027154.731680 6190 C
1535027154.731668 0 C
1535027128.112949 0 C
...
where in default.vcl we have
...
sub vcl_recv {
    if (req.method == "PURGE") {
        if (req.http.X-Ban-Condition) {
            ban(req.http.X-Ban-Condition);
            return (synth(200, "Purged: " + req.http.X-Ban-Condition));
        }
        return (synth(500, "Missing X-Ban-Condition header."));
    }
    # ...
}
(more details on the ban logic in https://github.com/mugoweb/mugo_varnish/blob/master/doc/INSTALL.md )
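For reference, the lurker-friendly obj.* bans visible in the ban.list above require the cached object to carry the headers the bans test. A hedged sketch of how that can be wired up (the header names X-Ban-Host and the use of bereq.http.host are taken from the ban.list above; the exact backend-side logic is an assumption, not the poster's confirmed configuration):

```vcl
sub vcl_backend_response {
    # Copy request data the bans need onto the cached object, so bans
    # can be expressed on obj.* and the ban lurker can evaluate them
    # in the background (req.* bans can only be tested at lookup time).
    set beresp.http.X-Ban-Host = bereq.http.host;
}

sub vcl_deliver {
    # Optional: hide the helper header from clients.
    unset resp.http.X-Ban-Host;
}
```

With such a header in place, an invalidation equivalent to the problematic req-based ban could be issued as an obj-based one, e.g. varnishadm ban 'obj.http.X-Ban-Host ~ www\.xyz\.it', which the lurker can then work off.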
Old ticket imported from Trac. See archived copy here: https://varnish-cache.org/trac/ticket/1635