openethereum / parity-ethereum

The fast, light, and robust client for Ethereum-like networks.

Losing peers and falling out of sync #8974

Closed. madmartian5 closed this issue 6 years ago.

madmartian5 commented 6 years ago

Before filing a new issue, please provide the following information.

I'm running:

  • Which Parity version?: Parity/v1.11.4-beta-cc44ae9-20180619/x86_64-linux-gnu/rustc1.26.1
  • Which operating system?: Linux (Ubuntu)
  • How installed?: via installer
  • Are you fully synchronized?: yes
  • Which network are you connected to?: ethereum
  • Did you try to restart the node?: yes

My Parity node is dropping peers quickly and then falling out of sync. I have to restart often; it then connects to a bunch of peers, manages to sync, and shortly thereafter peers start dropping off and the cycle continues.

What can I do to keep it connected to peers and in sync? Also, could someone recommend default settings (CLI flags) for running a node that is used primarily to broadcast transactions and also to read logs and pending transactions?

Thanks, Adam

5chdn commented 6 years ago

Please share logs. Is it a warp sync or a regular sync?

madmartian5 commented 6 years ago

How do I enable logs, sorry?
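
For reference, logging is controlled from the command line; a minimal sketch using the -l flag requested above (the log-file path is only an example):

parity -l network,sync --log-file /var/log/parity.log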

madmartian5 commented 6 years ago

[screenshot: node output, 2018-06-25 1:44 PM]

In the meantime: this is after about 30 minutes.

5chdn commented 6 years ago

And after that? Please show some more :)

madmartian5 commented 6 years ago

After that it stalled; you can see it fell to 0 and then nothing. I tried going back to Parity 1.10 stable and that was better for a while, but then the same issue.

atlanticcrypto commented 6 years ago

Having the same issue with 1.11.4. For me it happens with higher frequency, roughly every 30-40 blocks. Checked it on multiple instances where previous clients ran fine.

Will troubleshoot more when I have time and respond with log info.

Tbaut commented 6 years ago

Does it always drop to 0 suddenly? Could you provide some logs, e.g. with -l network?

madmartian5 commented 6 years ago

Yes, it always drops quickly; pretty much as soon as it is completely synced it starts dropping. Unfortunately, for now I have switched to trying geth again, as I have a server that costs me too much to have sitting not working.

I increased CPU and memory, and again it helped for a bit, but not for long.

roninkaizen commented 6 years ago

Is a reduction of peers possible? I had the same; with a hard limit of 20 peers I got "productive" results: a node which ran continuously for weeks.
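
A minimal sketch of such a hard limit, using the peer flags that also appear in the configuration later in this thread:

parity --min-peers 10 --max-peers 20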

madmartian5 commented 6 years ago

[screenshot: node output, 2018-06-25 7:13 PM]

Note the "too many peers" error there.

Tbaut commented 6 years ago

@amexperts Could you please copy/paste the last 30 seconds into a gist, as screenshots are not convenient to look at.

dafky2000 commented 6 years ago

This issue has been plaguing me lately as well... I'll try to get a -l network log ASAP, but essentially it drops connections to all peers once it finally catches up, and then gets out of sync again... Parity v0.10.7

2018-06-26 12:33:24 UTC Syncing #5857469 00af…ba79     0 blk/s    9 tx/s   0 Mgas/s      0+    4 Qed  #5857474   67/128 peers  598 MiB chain  172 MiB db  470 KiB queue  190 KiB sync  RPC:  0 conn,  3 req/s, 208 µs
2018-06-26 12:33:34 UTC Syncing #5857470 1623…6658     0 blk/s   26 tx/s   0 Mgas/s      0+    6 Qed  #5857476   68/128 peers  599 MiB chain  172 MiB db  556 KiB queue  190 KiB sync  RPC:  0 conn,  3 req/s, 209 µs
2018-06-26 12:33:44 UTC Syncing #5857470 1623…6658     0 blk/s    0 tx/s   0 Mgas/s      0+    7 Qed  #5857476   69/128 peers  599 MiB chain  172 MiB db  682 KiB queue  190 KiB sync  RPC:  0 conn,  3 req/s, 204 µs
2018-06-26 12:33:54 UTC Syncing #5857471 4eab…906e     0 blk/s   46 tx/s   1 Mgas/s      0+    4 Qed  #5857477   68/128 peers  605 MiB chain  172 MiB db  224 KiB queue  190 KiB sync  RPC:  0 conn,  3 req/s, 198 µs
2018-06-26 12:34:04 UTC Syncing #5857472 8c97…9f3e     0 blk/s    6 tx/s   0 Mgas/s      0+    5 Qed  #5857478   67/128 peers  614 MiB chain  172 MiB db  285 KiB queue  190 KiB sync  RPC:  0 conn,  3 req/s, 197 µs
2018-06-26 12:34:14 UTC Syncing #5857473 a3b6…0e05     0 blk/s   25 tx/s   1 Mgas/s      0+    8 Qed  #5857479   68/128 peers  613 MiB chain  172 MiB db  568 KiB queue  190 KiB sync  RPC:  0 conn,  3 req/s, 197 µs
2018-06-26 12:34:24 UTC Syncing #5857473 a3b6…0e05     0 blk/s    0 tx/s   0 Mgas/s      0+    9 Qed  #5857480   67/128 peers  607 MiB chain  172 MiB db  646 KiB queue  190 KiB sync  RPC:  0 conn,  3 req/s, 206 µs
2018-06-26 12:34:34 UTC Syncing #5857475 0b31…7bbc     0 blk/s   23 tx/s   1 Mgas/s      0+    5 Qed  #5857480   69/128 peers  608 MiB chain  172 MiB db  422 KiB queue  190 KiB sync  RPC:  0 conn,  3 req/s, 206 µs
2018-06-26 12:35:07 UTC   69/128 peers    616 MiB chain  172 MiB db  0 bytes queue  190 KiB sync  RPC:  0 conn,  3 req/s, 206 µs
2018-06-26 12:37:10 UTC Imported #5857481 e94d…6652 (66 txs, 8.00 Mgas, 7929.57 ms, 16.54 KiB) + another 1 block(s) containing 92 tx(s)
2018-06-26 12:37:28 UTC   46/64 peers    528 MiB chain  172 MiB db  310 KiB queue  190 KiB sync  RPC:  0 conn,  3 req/s, 183 µs 
2018-06-26 12:37:59 UTC Imported #5857485 23b9…4ebd (40 txs, 8.00 Mgas, 11063.39 ms, 21.14 KiB) + another 2 block(s) containing 305 tx(s)
2018-06-26 12:38:13 UTC    5/64 peers    528 MiB chain  172 MiB db  0 bytes queue  190 KiB sync  RPC:  0 conn,  3 req/s, 211 µs
2018-06-26 12:38:14 UTC    0/64 peers    528 MiB chain  172 MiB db  0 bytes queue  190 KiB sync  RPC:  0 conn,  3 req/s, 211 µs
2018-06-26 12:38:14 UTC    0/64 peers    528 MiB chain  172 MiB db  0 bytes queue  190 KiB sync  RPC:  0 conn,  3 req/s, 211 µs
2018-06-26 12:38:52 UTC Syncing #5857485 23b9…4ebd     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5857485    1/64 peers 528 MiB chain  172 MiB db  0 bytes queue  191 KiB sync  RPC:  0 conn,  2 req/s, 213 µs
2018-06-26 12:38:54 UTC Syncing #5857485 23b9…4ebd     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5857485    1/64 peers 528 MiB chain  172 MiB db  0 bytes queue  191 KiB sync  RPC:  0 conn,  3 req/s, 213 µs
2018-06-26 12:39:23 UTC Syncing #5857485 23b9…4ebd     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5857482    1/64 peers 528 MiB chain  172 MiB db  0 bytes queue  190 KiB sync  RPC:  0 conn,  3 req/s, 216 µs
2018-06-26 12:39:26 UTC Syncing #5857485 23b9…4ebd     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5857482    1/64 peers 528 MiB chain  172 MiB db  0 bytes queue  190 KiB sync  RPC:  0 conn,  3 req/s, 201 µs
2018-06-26 12:39:44 UTC Syncing #5857485 23b9…4ebd     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5857482    2/64 peers 528 MiB chain  172 MiB db  0 bytes queue  199 KiB sync  RPC:  0 conn,  3 req/s, 200 µs
phutchins commented 6 years ago

I have the same issue. It's odd, however, as I have 2 Debian nodes and 2 Ubuntu nodes, and it seems like some will lose peers and others will not. Working on tracking down why... Any thoughts on troubleshooting and getting helpful information?

madmartian5 commented 6 years ago

So, I tried again on a new server (32 GB RAM, 8 vCPUs, and a dedicated 500 GB SSD) and it seems to be holding for now; it hovers around 10 peers.

roninkaizen commented 6 years ago

https://gist.github.com/roninkaizen/c8afc10d3f949c287899ec175f077bbc same here; the reductions brought just a little "cure". Only the continuous addition of reserved peers via the Parity UI made it productive again.
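
For reference, reserved peers can also be pinned at startup with --reserved-peers, which takes a file of enode URLs (the address below is a placeholder):

# reserved.txt: one enode URL per line
enode://<node-public-key>@203.0.113.5:30303

parity --reserved-peers reserved.txt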

atlanticcrypto commented 6 years ago

This was fixed for me on 1.11.5. Considerably better performance both in peering and chain reorgs.

Also, the low number of peers you are suggesting seems like a problem in and of itself. @amexperts you should be able to support 250+ peers with that hardware. I have nodes similar in construction handling 500+ peers.

peterbitfly commented 6 years ago

I am seeing the same behavior on 1.11.5 on multiple production nodes (all dedicated servers or high performance cloud machines). I am currently gathering logs with -lnetwork,sync on one of the affected nodes. For us the issue started after upgrading from 1.10 to the 1.11 series.

Edit: Here is a log file with -lnetwork,sync during the time the node went out of sync: https://gist.github.com/ppratscher/288908767d4db1e9da82bc039b20f9b4

Grepping for "Syncing #" in the current log file yields the following:

2018-07-09 11:10:17  IO Worker #0 INFO import  Syncing #5932312 0xce78…99d5     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5932312   91/100 peers      1 GiB chain  128 MiB db  0 bytes queue    9 MiB sync  RPC:  0 conn, 13 req/s, 321 µs
2018-07-09 11:12:52  IO Worker #1 INFO import  Syncing #5932319 0xcc55…a442     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5932319   90/100 peers      1 GiB chain  128 MiB db  0 bytes queue   10 MiB sync  RPC:  0 conn, 13 req/s, 248 µs
2018-07-09 11:13:17  IO Worker #0 INFO import  Syncing #5932321 0x7ac1…18e5     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5932316   90/100 peers      1 GiB chain  128 MiB db  0 bytes queue   10 MiB sync  RPC:  0 conn, 13 req/s, 202 µs
2018-07-09 11:13:27  IO Worker #1 INFO import  Syncing #5932322 0x654d…baf8     0 blk/s   21 tx/s   0 Mgas/s      0+    0 Qed  #5932318   89/100 peers      1 GiB chain  128 MiB db  0 bytes queue   10 MiB sync  RPC:  0 conn, 13 req/s, 229 µs
2018-07-09 11:14:47  IO Worker #3 INFO import  Syncing #5932330 0xc0af…5d1d     0 blk/s    9 tx/s   0 Mgas/s      0+    0 Qed  #5932329   83/100 peers      1 GiB chain  125 MiB db  0 bytes queue   16 MiB sync  RPC:  0 conn, 12 req/s, 367 µs
2018-07-09 11:14:57  IO Worker #1 INFO import  Syncing #5932331 0x547f…0c63     0 blk/s   13 tx/s   0 Mgas/s      0+    0 Qed  #5932326   85/100 peers      1 GiB chain  125 MiB db  0 bytes queue   16 MiB sync  RPC:  0 conn, 12 req/s, 322 µs
2018-07-09 11:17:07  IO Worker #1 INFO import  Syncing #5932339 0x2cc7…1f32     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5932339   89/100 peers      1 GiB chain  124 MiB db  0 bytes queue   16 MiB sync  RPC:  0 conn,  5 req/s, 540 µs
2018-07-09 11:18:37  IO Worker #1 INFO import  Syncing #5932346 0x1eff…1e23     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5932346   87/100 peers      1 GiB chain  122 MiB db  0 bytes queue   16 MiB sync  RPC:  0 conn, 11 req/s, 372 µs
2018-07-09 11:21:22  IO Worker #2 INFO import  Syncing #5932355 0xcb06…5995     0 blk/s    4 tx/s   0 Mgas/s      0+    0 Qed  #5932355   90/100 peers      1 GiB chain  120 MiB db  0 bytes queue   16 MiB sync  RPC:  0 conn, 10 req/s, 389 µs
2018-07-09 11:21:32  IO Worker #3 INFO import  Syncing #5932356 0xedf9…4b7d     0 blk/s   12 tx/s   1 Mgas/s      0+    0 Qed  #5932350   82/100 peers      1 GiB chain  121 MiB db  0 bytes queue   20 MiB sync  RPC:  0 conn, 13 req/s, 471 µs
2018-07-09 11:21:42  IO Worker #1 INFO import  Syncing #5932357 0xfe88…dae6     0 blk/s    6 tx/s   0 Mgas/s      0+    0 Qed  #5932344   82/100 peers      1 GiB chain  121 MiB db  0 bytes queue   20 MiB sync  RPC:  0 conn, 13 req/s, 435 µs
2018-07-09 11:22:37  IO Worker #2 INFO import  Syncing #5932357 0xfe88…dae6     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5932357   86/100 peers      1 GiB chain  121 MiB db  0 bytes queue   20 MiB sync  RPC:  0 conn, 12 req/s, 272 µs
2018-07-09 11:23:07  IO Worker #0 INFO import  Syncing #5932360 0x652e…e5ad     0 blk/s    9 tx/s   0 Mgas/s      0+    0 Qed  #5932360   83/100 peers      1 GiB chain  121 MiB db  0 bytes queue   20 MiB sync  RPC:  0 conn, 12 req/s, 749 µs
2018-07-09 11:24:42  IO Worker #0 INFO import  Syncing #5932364 0x5331…d809     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5932364   91/100 peers      1 GiB chain  121 MiB db  0 bytes queue   24 MiB sync  RPC:  0 conn, 11 req/s, 148 µs
2018-07-09 11:25:27  IO Worker #2 INFO import  Syncing #5932369 0xbd75…d88e     0 blk/s   13 tx/s   1 Mgas/s      0+    0 Qed  #5932368   91/100 peers      1 GiB chain  122 MiB db  0 bytes queue   24 MiB sync  RPC:  0 conn, 12 req/s, 175 µs
2018-07-09 11:26:37  IO Worker #0 INFO import  Syncing #5932372 0xdf5b…e19e     0 blk/s    6 tx/s   0 Mgas/s      0+    0 Qed  #5932364   89/100 peers      1 GiB chain  121 MiB db  0 bytes queue   25 MiB sync  RPC:  0 conn, 13 req/s, 219 µs
2018-07-09 11:31:47  IO Worker #3 INFO import  Syncing #5932391 0xbb78…dd1e     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5932391   89/100 peers      1 GiB chain  120 MiB db  0 bytes queue   28 MiB sync  RPC:  0 conn, 13 req/s, 339 µs
2018-07-09 11:32:47  IO Worker #3 INFO import  Syncing #5932397 0xfa74…c1da     0 blk/s    5 tx/s   0 Mgas/s      0+    0 Qed  #5932393   82/100 peers      1 GiB chain  122 MiB db  0 bytes queue   28 MiB sync  RPC:  0 conn, 10 req/s, 565 µs
2018-07-09 11:34:32  IO Worker #1 INFO import  Syncing #5932404 0xc300…5371     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5932404   86/100 peers      1 GiB chain  125 MiB db  0 bytes queue   33 MiB sync  RPC:  0 conn, 12 req/s, 741 µs
2018-07-09 11:35:42  IO Worker #2 INFO import  Syncing #5932408 0x26b7…b501     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5932408   90/100 peers      1 GiB chain  126 MiB db  0 bytes queue   33 MiB sync  RPC:  0 conn, 12 req/s, 672 µs
2018-07-09 11:35:52  IO Worker #3 INFO import  Syncing #5932408 0x26b7…b501     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5932408   90/100 peers      1 GiB chain  126 MiB db  0 bytes queue   33 MiB sync  RPC:  0 conn, 12 req/s, 635 µs
2018-07-09 11:54:57  IO Worker #0 INFO import  Syncing #5932482 0x256a…e2ab     0 blk/s   13 tx/s   0 Mgas/s      0+    0 Qed  #5932482   92/100 peers      1 GiB chain  133 MiB db  0 bytes queue   46 MiB sync  RPC:  0 conn, 12 req/s, 631 µs
2018-07-09 12:02:42  IO Worker #1 INFO import  Syncing #5932510 0x4af0…5227     0 blk/s   23 tx/s   0 Mgas/s      0+    0 Qed  #5932510   94/100 peers      1 GiB chain  134 MiB db  0 bytes queue   47 MiB sync  RPC:  0 conn, 11 req/s, 569 µs

As can be seen, the node switches to sync mode every 10 minutes.

tomusdrw commented 6 years ago

@ppratscher could you dump the list of peers when that happens? Just invoke the parity_netPeers RPC.
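
A minimal way to do that, assuming the default HTTP JSON-RPC endpoint on localhost:8545:

curl -s -X POST -H 'Content-Type: application/json' \
     --data '{"jsonrpc":"2.0","method":"parity_netPeers","params":[],"id":1}' \
     http://localhost:8545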

vogelito commented 6 years ago

Hi - over the last 24 hours we've seen this behavior on our Parity nodes as well. This is new behavior.

We are running a slightly older version, Parity/v1.10.6-unstable-bc0d134-20180605/x86_64-linux-gnu/rustc1.25.0 (it's a stable release, but since we build the binaries ourselves the tag says unstable).

Here's a graph of the block-height difference between geth and Parity (Geth block height minus Parity block height). A positive number means Parity is lagging behind Geth; a negative number would mean Geth is lagging behind Parity.

Graphs are in ET (UTC-04:00).

Last 48 hours:

[screenshot: block-height difference graph, last 48 hours, 2018-07-13 11:09 AM]

Zoomed in graph for the period between 6:20 and 8:20 AM ET on July 13 (10:20 - 12:20 UTC on 2018-07-13):

[screenshot: block-height difference graph, zoomed to 10:20-12:20 UTC, 2018-07-13]

Here's a similar grep to the one @ppratscher did:

2018-07-13 01:23:48 UTC IO Worker #1 INFO import  Syncing #5953955 4936…8ff1     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5953956   25/25 peers    147 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn,  0 req/s, 2449 µs
2018-07-13 01:44:19 UTC IO Worker #2 INFO import  Syncing #5954009 8a77…6fb7     0 blk/s   10 tx/s   0 Mgas/s      0+    0 Qed  #5954009    9/25 peers    126 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 21 req/s, 2566 µs
2018-07-13 01:44:19 UTC IO Worker #1 INFO import  Syncing #5954009 8a77…6fb7     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5954009    9/25 peers    126 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 21 req/s, 2566 µs
2018-07-13 01:44:59 UTC IO Worker #2 INFO import  Syncing #5954018 033f…0427     0 blk/s   75 tx/s   6 Mgas/s      0+   24 Qed  #5954044    4/25 peers    126 MiB chain  306 MiB db    2 MiB queue    2 MiB sync  RPC:  0 conn, 24 req/s, 258 µs
2018-07-13 01:54:09 UTC IO Worker #0 INFO import  Syncing #5954072 51c6…e87a     0 blk/s    0 tx/s   0 Mgas/s      0+    4 Qed  #5954077    7/25 peers    141 MiB chain  306 MiB db  436 KiB queue    2 MiB sync  RPC:  0 conn, 24 req/s, 590 µs
2018-07-13 03:27:11 UTC IO Worker #3 INFO import  Syncing #5954472 fced…e6bb     0 blk/s    2 tx/s   0 Mgas/s      0+    0 Qed  #5954471   22/25 peers    149 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn,  2 req/s, 1142 µs
2018-07-13 03:27:11 UTC IO Worker #1 INFO import  Syncing #5954472 fced…e6bb     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5954471   22/25 peers    149 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn,  2 req/s, 1142 µs
2018-07-13 04:20:34 UTC IO Worker #3 INFO import  Syncing #5954719 d537…b5fe     0 blk/s    5 tx/s   0 Mgas/s      0+    0 Qed  #5954718   24/25 peers    142 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 16 req/s, 221 µs
2018-07-13 04:20:44 UTC IO Worker #2 INFO import  Syncing #5954719 d537…b5fe     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5954577   23/25 peers    142 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn,  0 req/s, 3803 µs
2018-07-13 04:20:54 UTC IO Worker #3 INFO import  Syncing #5954719 d537…b5fe     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5953681   24/25 peers    142 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 38 req/s, 3803 µs
2018-07-13 04:21:04 UTC IO Worker #2 INFO import  Syncing #5954719 d537…b5fe     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5953038   24/25 peers    142 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 38 req/s, 3803 µs
2018-07-13 04:21:15 UTC IO Worker #3 INFO import  Syncing #5954720 7c28…cec8     0 blk/s    9 tx/s   0 Mgas/s      0+    0 Qed  #5954718   27/50 peers    142 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 38 req/s, 232 µs
2018-07-13 04:21:24 UTC IO Worker #2 INFO import  Syncing #5954720 7c28…cec8     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5954698   25/25 peers    142 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 39 req/s, 249 µs
2018-07-13 04:21:34 UTC IO Worker #2 INFO import  Syncing #5954720 7c28…cec8     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5954417   26/50 peers    142 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn,  0 req/s, 2370 µs
2018-07-13 04:21:44 UTC IO Worker #2 INFO import  Syncing #5954720 7c28…cec8     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5953873   27/50 peers    142 MiB chain  306 MiB db  0 bytes queue   20 MiB sync  RPC:  0 conn, 21 req/s, 2370 µs
2018-07-13 04:21:54 UTC IO Worker #2 INFO import  Syncing #5954720 7c28…cec8     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5953263   27/50 peers    142 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 21 req/s, 2370 µs
2018-07-13 05:05:00 UTC IO Worker #2 INFO import  Syncing #5954892 3781…81aa     0 blk/s    3 tx/s   0 Mgas/s      0+    0 Qed  #5954892   25/25 peers    123 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn,  0 req/s, 252 µs
2018-07-13 05:05:00 UTC IO Worker #1 INFO import  Syncing #5954892 3781…81aa     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5954892   25/25 peers    123 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn,  0 req/s, 252 µs
2018-07-13 05:16:42 UTC IO Worker #3 INFO import  Syncing #5954914 c6d0…984a     0 blk/s    3 tx/s   0 Mgas/s      0+    0 Qed  #5954914    6/25 peers    125 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 27 req/s, 205 µs
2018-07-13 05:16:42 UTC IO Worker #1 INFO import  Syncing #5954914 c6d0…984a     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5954914    6/25 peers    125 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 27 req/s, 205 µs
2018-07-13 05:16:49 UTC IO Worker #1 INFO import  Syncing #5954914 c6d0…984a     0 blk/s    0 tx/s   0 Mgas/s      0+   24 Qed  #5954941    2/25 peers    125 MiB chain  306 MiB db    3 MiB queue    2 MiB sync  RPC:  0 conn, 32 req/s, 205 µs
2018-07-13 05:16:59 UTC IO Worker #1 INFO import  Syncing #5954929 03b8…b79d     1 blk/s  198 tx/s  11 Mgas/s      0+    8 Qed  #5954941    3/25 peers    125 MiB chain  306 MiB db 1023 KiB queue    2 MiB sync  RPC:  0 conn, 37 req/s, 3184 µs
2018-07-13 05:51:29 UTC IO Worker #1 INFO import  Syncing #5955074 c394…68e2     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5955075   30/50 peers    108 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 23 req/s, 188 µs
2018-07-13 06:17:03 UTC IO Worker #2 INFO import  Syncing #5955171 c3bb…6f04     0 blk/s    3 tx/s   0 Mgas/s      0+    0 Qed  #5955169   24/25 peers    123 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 47 req/s, 248 µs
2018-07-13 06:22:50 UTC IO Worker #2 INFO import  Syncing #5955197 8c1a…26dd     0 blk/s   12 tx/s   0 Mgas/s      0+    0 Qed  #5955197   25/25 peers    131 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 34 req/s, 166 µs
2018-07-13 07:13:08 UTC IO Worker #0 INFO import  Syncing #5955388 cf58…0516     0 blk/s   26 tx/s   1 Mgas/s      0+    0 Qed  #5955388   22/25 peers    118 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 35 req/s, 242 µs
2018-07-13 07:25:15 UTC IO Worker #1 INFO import  Syncing #5955446 b9db…4e51     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5955447   22/25 peers    130 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 49 req/s, 310 µs
2018-07-13 07:25:27 UTC IO Worker #3 INFO import  Syncing #5955448 90e6…ca94     0 blk/s   18 tx/s   1 Mgas/s      0+    0 Qed  #5955446   22/25 peers    130 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 56 req/s, 233 µs
2018-07-13 07:25:35 UTC IO Worker #2 INFO import  Syncing #5955448 90e6…ca94     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5955425   23/25 peers    130 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 37 req/s, 200 µs
2018-07-13 07:50:05 UTC IO Worker #2 INFO import  Syncing #5955516 04fd…dd18     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5955516    3/25 peers    125 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn,  1 req/s, 1517 µs
2018-07-13 07:50:15 UTC IO Worker #2 INFO import  Syncing #5955529 46ce…14aa     1 blk/s  215 tx/s  10 Mgas/s      0+   15 Qed  #5955548    5/25 peers    125 MiB chain  306 MiB db    2 MiB queue    2 MiB sync  RPC:  0 conn,  1 req/s, 1517 µs
2018-07-13 07:50:35 UTC IO Worker #0 INFO import  Syncing #5955551 cf43…9eb7     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5955551   10/25 peers    135 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 30 req/s, 251 µs
2018-07-13 07:50:45 UTC IO Worker #0 INFO import  Syncing #5955551 cf43…9eb7     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5955551    8/25 peers    135 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 20 req/s, 232 µs
2018-07-13 07:50:55 UTC IO Worker #0 INFO import  Syncing #5955551 cf43…9eb7     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5955551    9/25 peers    136 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 18 req/s, 203 µs
2018-07-13 07:59:15 UTC IO Worker #2 INFO import  Syncing #5955587 9b1d…f2ee     0 blk/s    4 tx/s   0 Mgas/s      0+    0 Qed  #5955587   18/25 peers    142 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 17 req/s, 2060 µs
2018-07-13 08:08:30 UTC IO Worker #2 INFO import  Syncing #5955632 fd95…1770     0 blk/s   22 tx/s   1 Mgas/s      0+    0 Qed  #5955632   27/50 peers    147 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 22 req/s, 1027 µs
2018-07-13 08:22:37 UTC IO Worker #0 INFO import  Syncing #5955658 4a23…3092     0 blk/s   93 tx/s   5 Mgas/s      0+    0 Qed  #5955658   13/25 peers    129 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 11 req/s,  48 µs
2018-07-13 08:23:10 UTC IO Worker #3 INFO import  Syncing #5955658 4a23…3092     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5955658   11/25 peers    129 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn,  0 req/s, 2188 µs
2018-07-13 08:23:10 UTC IO Worker #0 INFO import  Syncing #5955658 4a23…3092     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5955658   11/25 peers    129 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn,  0 req/s, 2188 µs
2018-07-13 08:23:10 UTC IO Worker #2 INFO import  Syncing #5955658 4a23…3092     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5955658   11/25 peers    129 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn,  0 req/s, 2188 µs
2018-07-13 08:23:10 UTC IO Worker #1 INFO import  Syncing #5955658 4a23…3092     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5955658   11/25 peers    129 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn,  0 req/s, 2188 µs
2018-07-13 08:23:25 UTC IO Worker #0 INFO import  Syncing #5955658 4a23…3092     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5955658    2/25 peers    129 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 21 req/s, 2145 µs
2018-07-13 08:23:35 UTC IO Worker #0 INFO import  Syncing #5955658 4a23…3092     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5955658    3/25 peers    129 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 21 req/s, 2145 µs
2018-07-13 08:23:45 UTC IO Worker #2 INFO import  Syncing #5955669 4e03…8958     1 blk/s  125 tx/s   8 Mgas/s      0+   25 Qed  #5955695    4/25 peers    130 MiB chain  306 MiB db    3 MiB queue    2 MiB sync  RPC:  0 conn, 17 req/s, 179 µs
2018-07-13 08:23:55 UTC IO Worker #3 INFO import  Syncing #5955693 35be…a6ba     2 blk/s  221 tx/s  19 Mgas/s      0+    7 Qed  #5955701    3/25 peers    139 MiB chain  306 MiB db  929 KiB queue    2 MiB sync  RPC:  0 conn, 47 req/s, 805 µs
2018-07-13 08:24:40 UTC IO Worker #0 INFO import  Syncing #5955704 797a…12db     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5955704    8/25 peers    140 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 46 req/s, 183 µs
2018-07-13 09:20:35 UTC IO Worker #3 INFO import  Syncing #5955938 1f84…ac79     0 blk/s    5 tx/s   0 Mgas/s      0+    0 Qed  #5955938   26/50 peers    147 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 56 req/s, 328 µs
2018-07-13 09:20:45 UTC IO Worker #2 INFO import  Syncing #5955938 1f84…ac79     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5955938   23/25 peers    147 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn,  0 req/s, 7532 µs
2018-07-13 09:58:41 UTC IO Worker #1 INFO import  Syncing #5956103 bf64…c027     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5956104   31/50 peers    108 MiB chain  306 MiB db 776 bytes queue    2 MiB sync  RPC:  0 conn, 31 req/s, 234 µs
2018-07-13 10:27:15 UTC IO Worker #1 INFO import  Syncing #5956194 7111…4c75     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5956194    2/25 peers    152 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 21 req/s, 1228 µs
2018-07-13 10:27:25 UTC IO Worker #2 INFO import  Syncing #5956214 a4c8…3acf     2 blk/s  246 tx/s  15 Mgas/s      0+   15 Qed  #5956230    5/25 peers    152 MiB chain  306 MiB db    2 MiB queue    2 MiB sync  RPC:  0 conn, 31 req/s, 270 µs
2018-07-13 10:41:15 UTC IO Worker #3 INFO import  Syncing #5956262 aca3…f965     0 blk/s   19 tx/s   1 Mgas/s      0+    7 Qed  #5956270    2/25 peers    135 MiB chain  306 MiB db  793 KiB queue    2 MiB sync  RPC:  0 conn, 42 req/s, 182 µs
2018-07-13 10:46:38 UTC IO Worker #3 INFO import  Syncing #5956292 d82a…9798     0 blk/s    3 tx/s   0 Mgas/s      0+    0 Qed  #5956292   17/25 peers    159 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn,  0 req/s, 211 µs
2018-07-13 10:46:45 UTC IO Worker #1 INFO import  Syncing #5956293 ff0e…e839     0 blk/s   23 tx/s   1 Mgas/s      0+    0 Qed  #5956293   21/25 peers    157 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 25 req/s, 201 µs
2018-07-13 11:12:25 UTC IO Worker #0 INFO import  Syncing #5956394 888c…266c     0 blk/s    0 tx/s   0 Mgas/s      0+    1 Qed  #5956395   26/50 peers    138 MiB chain  306 MiB db   38 KiB queue    2 MiB sync  RPC:  0 conn, 26 req/s, 640 µs
2018-07-13 11:12:25 UTC IO Worker #1 INFO import  Syncing #5956394 888c…266c     0 blk/s    0 tx/s   0 Mgas/s      0+    1 Qed  #5956395   26/50 peers    138 MiB chain  306 MiB db   38 KiB queue    2 MiB sync  RPC:  0 conn, 26 req/s, 640 µs
2018-07-13 11:33:15 UTC IO Worker #0 INFO import  Syncing #5956457 79bf…9a88     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5956457    1/25 peers    149 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 21 req/s, 475 µs
2018-07-13 11:33:25 UTC IO Worker #0 INFO import  Syncing #5956457 79bf…9a88     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5956457    2/25 peers    149 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 21 req/s, 475 µs
2018-07-13 11:33:35 UTC IO Worker #3 INFO import  Syncing #5956481 0761…ded4     2 blk/s  251 tx/s  19 Mgas/s      0+    6 Qed  #5956488    3/25 peers    150 MiB chain  306 MiB db  925 KiB queue    2 MiB sync  RPC:  0 conn, 48 req/s, 158 µs
2018-07-13 11:33:45 UTC IO Worker #2 INFO import  Syncing #5956488 d7da…edce     0 blk/s   92 tx/s   5 Mgas/s      0+    0 Qed  #5956488    4/25 peers    150 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 47 req/s, 2829 µs
2018-07-13 11:34:30 UTC IO Worker #2 INFO import  Syncing #5956500 114e…ceb9     1 blk/s  137 tx/s   9 Mgas/s      0+    4 Qed  #5956506    6/25 peers    151 MiB chain  306 MiB db  435 KiB queue    2 MiB sync  RPC:  0 conn, 35 req/s, 275 µs
2018-07-13 12:15:30 UTC IO Worker #2 INFO import  Syncing #5956631 2af3…3ce9     0 blk/s   10 tx/s   1 Mgas/s      0+   34 Qed  #5956668    2/25 peers    140 MiB chain  306 MiB db    3 MiB queue    2 MiB sync  RPC:  0 conn, 26 req/s, 283 µs
2018-07-13 12:15:40 UTC IO Worker #0 INFO import  Syncing #5956656 3818…2531     2 blk/s  187 tx/s  19 Mgas/s      0+   12 Qed  #5956670    3/25 peers    140 MiB chain  306 MiB db    1 MiB queue    2 MiB sync  RPC:  0 conn, 28 req/s, 1156 µs
2018-07-13 12:21:26 UTC IO Worker #3 INFO import  Syncing #5956692 a1a5…520f     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5956692   15/25 peers    148 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn,  2 req/s, 284 µs
2018-07-13 13:17:37 UTC IO Worker #1 INFO import  Syncing #5956934 941b…9520     0 blk/s    0 tx/s   0 Mgas/s      0+    0 Qed  #5956935   25/25 peers    139 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn,  0 req/s, 281 µs
2018-07-13 13:18:33 UTC IO Worker #3 INFO import  Syncing #5956943 b2d6…8271     0 blk/s    5 tx/s   0 Mgas/s      0+    1 Qed  #5956944   26/50 peers    140 MiB chain  306 MiB db   62 KiB queue    2 MiB sync  RPC:  0 conn,  0 req/s, 206 µs
2018-07-13 13:18:40 UTC IO Worker #3 INFO import  Syncing #5956944 9c0c…69a4     0 blk/s   11 tx/s   2 Mgas/s      0+    0 Qed  #5956944   27/50 peers    140 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 18 req/s, 233 µs
2018-07-13 14:36:05 UTC IO Worker #3 INFO import  Syncing #5957254 f5dc…e959     0 blk/s    3 tx/s   0 Mgas/s      0+    0 Qed  #5957254   20/25 peers    152 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn, 16 req/s, 341 µs
2018-07-13 14:36:15 UTC IO Worker #0 INFO import  Syncing #5957256 7061…81c8     0 blk/s    6 tx/s   1 Mgas/s      0+    0 Qed  #5957256   22/25 peers    152 MiB chain  306 MiB db  0 bytes queue    2 MiB sync  RPC:  0 conn,  2 req/s, 354 µs

You can see similar behavior.

If there is other information I can provide, please let me know. I will try to invoke the parity_netPeers RPC next time I see it hanging.

vogelito commented 6 years ago

Oh, and our DB dir is ~1.4TB (we're running an unpruned node):

# du -sh /root/.local/share/io.parity.ethereum/chains/ethereum/db/
1.4T    /root/.local/share/io.parity.ethereum/chains/ethereum/db/
5chdn commented 6 years ago

it's a stable release but since we build the binaries ourselves the tag says unstable

Compile with --features final to get a stable version string.
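
Roughly, assuming a source checkout of the parity repository:

cargo build --release --features final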

we're running an unpruned node

I can reproduce this on pruned nodes, too, so I guess it's unrelated.

vogelito commented 6 years ago

Thanks @5chdn . I can confirm we haven't seen this issue since my report last Friday.

phutchins commented 6 years ago

At first I thought that this was resolved with the x.6 release, as I hit 100 peers and stayed there for a few days. Unfortunately, however, as soon as I pushed transactions through the node it dropped back to the single digits and isn't holding up now.

tomusdrw commented 6 years ago

@phutchins Can you elaborate on "pushing transactions through the node"? Did you submit multiple transactions over the RPC? Were these signed (eth_sendRawTransaction) or do you use eth_sendTransaction? What transaction queue settings do you have?

5chdn commented 6 years ago

pushing transactions through the node

@MicahZoltu is that what your nodes experience, too?

MicahZoltu commented 6 years ago

No, when my node gets into the bad state I just see this line repeated every n seconds and nothing else:

6/10 peers      4 MiB chain  105 MiB db  0 bytes queue  461 KiB sync  RPC:  0 conn,  6 req/s,  88 µs

When in this state, the latest block returned by eth_getBlockByNumber('latest') never changes, and I don't see the other log line I normally see when blocks arrive.
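
For reference, that check over the default JSON-RPC endpoint (localhost:8545 assumed) looks like:

curl -s -X POST -H 'Content-Type: application/json' \
     --data '{"jsonrpc":"2.0","method":"eth_getBlockByNumber","params":["latest",false],"id":1}' \
     http://localhost:8545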

peterbitfly commented 6 years ago

We are still seeing this with the new 2.0.0 version:

2018-07-20 10:26:04  Syncing #5997032 0x6ce8…7ec8     0.10 blk/s   17.2 tx/s    0 Mgas/s      0+    0 Qed  #5997028   94/100 peers     24 MiB chain  113 MiB db  0 bytes queue  124 KiB sync  RPC:  0 conn,    6 req/s,  371 µs
2018-07-20 10:26:14  Syncing #5997032 0x6ce8…7ec8     0.00 blk/s    0.0 tx/s    0 Mgas/s      0+    0 Qed  #5997026   95/100 peers     24 MiB chain  113 MiB db  0 bytes queue  123 KiB sync  RPC:  0 conn,    5 req/s,  231 µs
2018-07-20 10:26:24  Syncing #5997032 0x6ce8…7ec8     0.00 blk/s    0.0 tx/s    0 Mgas/s      0+    0 Qed  #5997026   96/100 peers     24 MiB chain  113 MiB db  0 bytes queue  123 KiB sync  RPC:  0 conn,    5 req/s,  223 µs
2018-07-20 10:26:34  Syncing #5997032 0x6ce8…7ec8     0.00 blk/s    0.0 tx/s    0 Mgas/s      0+    0 Qed  #5997029   96/100 peers     24 MiB chain  113 MiB db  0 bytes queue  125 KiB sync  RPC:  0 conn,    5 req/s,  225 µs
2018-07-20 10:26:44  Syncing #5997033 0xf57e…3e37     0.20 blk/s   26.8 tx/s    1 Mgas/s      0+    0 Qed  #5997033   93/100 peers     24 MiB chain  114 MiB db  0 bytes queue  123 KiB sync  RPC:  0 conn,    6 req/s,  223 µs
2018-07-20 10:26:54  Syncing #5997033 0xf57e…3e37     0.00 blk/s    0.0 tx/s    0 Mgas/s      0+    0 Qed  #5997014   93/100 peers     24 MiB chain  114 MiB db  0 bytes queue  135 KiB sync  RPC:  0 conn,    5 req/s,  296 µs

All of the incidents we observed have one thing in common: the highest block number seen on the network is never ahead of the block that is currently being processed. For example, in the output above the currently processed block is 5997033, but the highest block seen on the network varies between 5997026, 5997033 and 5997014.

phutchins commented 6 years ago

@tomusdrw Some details on "pushing transactions": I have written an application that sends out batches (300) of transactions from a list of transactions (can be 10k-150k in total volume). I'm currently using RPC and calling the transfer method on an ERC-20 contract, which I assume is calling eth.sendTransaction in the background. I can dig through the library to find out if necessary.

After this, the nodes that I used to send those transactions had trouble keeping peers, but the others didn't. I'll have to check the peer health of the two unused nodes when I am able to, to see if they still have good peer counts.

tomusdrw commented 6 years ago

@phutchins I see. I suppose the issue you are experiencing is caused by the following:

  1. We run 4 threads for sync processing (IoWorkers)
  2. We run an additional 4 threads for JSON-RPC method handling (on supported platforms)

When you start sending multiple transactions via RPC, those 4 RPC threads are most likely saturating all the cores you have, starving the sync threads. That causes sync requests to time out, which in turn causes the remote peers to disconnect you.

Could you please try one of the following:

  1. Running on beefier hardware
  2. Submitting pre-signed transactions
  3. Lowering the number of RPC threads using --jsonrpc-threads 2 (a sketch combining 2 and 3 follows below)
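
A minimal sketch combining options 2 and 3, assuming the default JSON-RPC endpoint on localhost:8545 (the raw transaction payload is a placeholder):

# start the node with fewer RPC threads
parity --jsonrpc-threads 2

# submit a pre-signed transaction instead of eth_sendTransaction
curl -s -X POST -H 'Content-Type: application/json' \
     --data '{"jsonrpc":"2.0","method":"eth_sendRawTransaction","params":["0x<signed-tx-rlp>"],"id":1}' \
     http://localhost:8545
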
tomusdrw commented 6 years ago

Also, please note that @phutchins's issue is completely unrelated to the logs @ppratscher provided in a recent comment. From those logs it seems that the node has a higher block than every other node in the network, which causes it to go into "Syncing" mode; why that happens is yet to be debugged. @Tbaut could you open a separate issue for this? @ppratscher is the node mining? Is it possible that it mined all the blocks but somehow they are not propagated to other peers? (This would also explain the deep reorgs on mainnet.)

Tbaut commented 6 years ago

Thanks for your help on that, @tomusdrw. This particular issue seems to be related to many different causes. I will close it, as many users were able to "solve" it with better hardware or by updating their node. I'd ask @ppratscher to open a specific issue that we can debug accordingly.

tzapu commented 6 years ago

@tomusdrw could you recommend a configuration that should not have a peer-losing issue (for X cores: how many JSON-RPC threads, how many server threads, how many verifiers, and whatever else counts)? Thank you. Here is a week's worth of peer counts: it goes to 100 (min-peers) after each restart, then drops after a short while. It's weird.

[screenshot: peer count over the past week, 2018-08-11]

Parity 1.11.8 on 8 cores with these settings:

--mode=active
--base-path=/paritydb
--db-compaction=ssd
--pruning=fast
--no-ancient-blocks
--jsonrpc-interface=all
--jsonrpc-cors=all
--ws-interface=all
--tx-queue-gas=off
--tx-queue-mem-limit=0
--tx-queue-size=4294967295
--cache-size=6000
--min-peers=100
--max-peers=200
--scale-verifiers
--num-verifiers=2
--jsonrpc-server-threads=4
--jsonrpc-threads=4
--unsafe-expose
Tbaut commented 6 years ago

@tzapu this thread was closed to make sure it doesn't become a general thread for a problem that has many causes. Please open a dedicated issue that we'll be happy to answer.

askucher commented 4 years ago

It happens