Hello @AuHau. Was this a problem after updating to libp2p@0.30, or is it unrelated?
EADDRNOTAVAIL should be a result of attempting to use a port already in use. Can you provide us with the libp2p configuration of your node, as well as the debug information, so that we can learn some more? You can get the debug information by running DEBUG=libp2p* node index.js or similar.
@vasco-santos Thanks for your help. Just want to clarify that the IP in question isn't something we own; it should not be trying to bind there.
This is our host's network configuration:
link/ether REDACTED brd ff:ff:ff:ff:ff:ff
inet 10.10.183.89/20 brd 10.10.191.255 scope global dynamic ens5
valid_lft 3105sec preferred_lft 3105sec
inet6 2600:1f14:de2::REDACTED/128 scope global dynamic noprefixroute
valid_lft 424sec preferred_lft 124sec
inet6 fe80::REDACTED/64 scope link
valid_lft forever preferred_lft forever
👍🏼 Can I have your libp2p config and the logs? I don't currently understand what this could be.
We set DEBUG=libp2p* and we'll get back to you as soon as it crashes again. As for the config, we're loading it as
// (imports elided in the original snippet: PeerId comes from the peer-id
// package; createLibP2P and isValidPeerId are our own helpers)
const libp2pConfig = config.get('libp2p') as Record<string, any>
let libp2p: Libp2p
if (config.has('peerId')) {
const cnfId = config.get<{ id: string, privKey: string, pubKey: string }>('peerId')
const peerId = await PeerId.createFromJSON(cnfId)
if (!isValidPeerId(peerId)) {
throw new Error('Supplied PeerId is not valid!')
}
libp2p = await createLibP2P({ ...libp2pConfig, peerId })
} else {
libp2p = await createLibP2P(libp2pConfig)
}
with the following (JSON5) config piece:
libp2p: {
addresses: {
listen: [
"/ip4/127.0.0.1/tcp/8998",
"/ip4/127.0.0.1/tcp/8999/ws"
]
},
config: {
peerDiscovery: {
bootstrap: {
enabled: false
}
}
}
},
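For completeness, the peerId entry consumed by the loader above, when present, has the following shape (the values here are placeholders, not real keys):

peerId: {
  id: "Qm...",        // base58-encoded peer id
  privKey: "CAAS...", // base64-encoded private key
  pubKey: "CAAS..."   // base64-encoded public key
}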
Hmm, to clarify the information that my colleagues already put up: the config Lucas posted is an extension of our base config; see the base config below. It is worth noting that this one is for our boot node, so we don't use bootstrap here, but we do in our other components. Also, even though we have listening addresses bound only to localhost, it runs in a Docker container, so Docker translates the requests as coming from localhost.
We had this problem prior to upgrading to 0.30.0 and upgraded afterwards to see if the problem would be fixed.
Just an idea: could it be a problem that we do not actually listen on an IPv6 address, and some peer wants to connect to it?
The base libp2p config that we use is:
// (imports elided in the original; for js-libp2p ~0.29/0.30 these identifiers
// typically come from libp2p-tcp, libp2p-websockets, libp2p-mplex,
// libp2p-secio, libp2p-noise, libp2p-bootstrap, libp2p-mdns,
// libp2p-kad-dht and libp2p-gossipsub)
const config = {
dialer: {
maxParallelDials: 150, // 150 total parallel multiaddr dials
maxDialsPerPeer: 4, // Allow 4 multiaddrs to be dialed per peer in parallel
dialTimeout: 10e3 // 10 second dial timeout per peer dial
},
modules: {
transport: [
TCP,
WS
],
streamMuxer: [Multiplex],
connEncryption: [
SECIO,
NOISE
],
peerDiscovery: [Bootstrap, MulticastDNS],
dht: KadDHT,
pubsub: GossipSub
},
config: {
peerDiscovery: {
autoDial: true,
[MulticastDNS.tag]: {
enabled: true
},
bootstrap: {
enabled: true
}
},
dht: { // The DHT options (and defaults) can be found in its documentation
kBucketSize: 20,
enabled: true,
randomWalk: {
enabled: true, // Allows to disable discovery (enabled by default)
interval: 300e3,
timeout: 10e3
}
},
pubsub: {
enabled: true,
emitSelf: true
}
},
metrics: {
enabled: true
},
peerStore: {
persistence: true
}
}
Okay, now back with a crash. I'm pasting the whole last minute before it happened, as I don't know how much is needed. Let me know if it's enough or if I should go further back.
2020-12-23T09:09:22.008Z libp2p:mplex incoming message { id: 1,
type: 'MESSAGE_RECEIVER',
data:
<Buffer 13 2f 6d 75 6c 74 69 73 74 72 65 61 6d 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.024Z libp2p:mplex incoming message { id: 0,
type: 'MESSAGE_RECEIVER',
data:
<Buffer 13 2f 6d 75 6c 74 69 73 74 72 65 61 6d 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.031Z libp2p:secio 3. finish - finish
2020-12-23T09:09:22.031Z libp2p:upgrader outbound selecting muxer /mplex/6.7.0
2020-12-23T09:09:22.039Z libp2p:mplex incoming message { id: 1,
type: 'MESSAGE_RECEIVER',
data: <Buffer 10 2f 69 70 66 73 2f 6b 61 64 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.040Z libp2p:mplex initiator stream 1 1 send { id: 1,
type: 'MESSAGE_INITIATOR',
data:
<Buffer 28 08 04 50 00 12 22 12 20 b1 3b 1d f7 71 06 1d 47 9c a8 30 39 83 b6 a9 b7 fc 8d d0 5b d3 b7 e4 33 ac 4e ef a9 87 16 cf 58> }
2020-12-23T09:09:22.040Z libp2p:mplex initiator stream 1 1 send { id: 1, type: 'CLOSE_INITIATOR', data: undefined }
2020-12-23T09:09:22.040Z libp2p:mplex:stream initiator stream 1 sink end undefined
2020-12-23T09:09:22.040Z libp2p:mplex incoming message { id: 0,
type: 'MESSAGE_RECEIVER',
data: <Buffer 0f 2f 69 70 66 73 2f 69 64 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.040Z libp2p:mplex incoming message { id: 0,
type: 'MESSAGE_RECEIVER',
data:
<Buffer ad 09 0a ab 02 08 00 12 a6 02 30 82 01 22 30 0d 06 09 2a 86 48 86 f7 0d 01 01 01 05 00 03 82 01 0f 00 30 82 01 0a 02 82 01 01 00 c4 aa b2 04 96 27 a4 ... > }
2020-12-23T09:09:22.041Z libp2p:mplex initiator stream 0 0 send { id: 0, type: 'CLOSE_INITIATOR', data: undefined }
2020-12-23T09:09:22.041Z libp2p:mplex:stream initiator stream 0 sink end undefined
2020-12-23T09:09:22.041Z libp2p:mplex:stream initiator stream 0 source end undefined
2020-12-23T09:09:22.041Z libp2p:mplex initiator stream 0 0 ended
2020-12-23T09:09:22.041Z libp2p:mplex incoming message { id: 0, type: 'CLOSE_RECEIVER', data: <Buffer > }
2020-12-23T09:09:22.041Z libp2p:mplex missing stream 0
2020-12-23T09:09:22.043Z libp2p:peer-store:address-book stored provided peer record for QmXRk6qHuieoFdQF9zQr7g5m8GDx1hLAX77btnQ7hDKC1M
2020-12-23T09:09:22.043Z libp2p:peer-store:proto-book the protocols provided to store are equal to the already stored for QmXRk6qHuieoFdQF9zQr7g5m8GDx1hLAX77btnQ7hDKC1M
2020-12-23T09:09:22.073Z libp2p:connection-manager checking limit of maxEventLoopDelay. current value: 0.6810230000000956 of Infinity
2020-12-23T09:09:22.124Z libp2p:connection-manager checking limit of maxReceivedData. current value: 6414.775672180776 of Infinity
2020-12-23T09:09:22.124Z libp2p:connection-manager checking limit of maxSentData. current value: 2613.3718675111245 of Infinity
2020-12-23T09:09:22.125Z libp2p:connection-manager checking limit of maxData. current value: 9028.147539691901 of Infinity
2020-12-23T09:09:22.125Z libp2p:connection-manager metrics update 9028.147539691901
2020-12-23T09:09:22.158Z libp2p:mplex incoming message { id: 0, type: 'CLOSE_INITIATOR', data: <Buffer > }
2020-12-23T09:09:22.158Z libp2p:mplex:stream receiver stream 0 source end undefined
2020-12-23T09:09:22.158Z libp2p:mplex receiver stream 0 0 ended
2020-12-23T09:09:22.249Z libp2p:mplex incoming message { id: 1,
type: 'MESSAGE_INITIATOR',
data: <Buffer 0e 2f 69 70 66 73 2f 62 69 74 73 77 61 70 0a> }
2020-12-23T09:09:22.249Z libp2p:mplex receiver stream 1 1 send { id: 1, type: 'MESSAGE_RECEIVER', data: <Buffer 03 6e 61 0a> }
2020-12-23T09:09:22.290Z libp2p:upgrader /mplex/6.7.0 selected as muxer protocol
2020-12-23T09:09:22.290Z libp2p:upgrader Successfully upgraded outbound connection
2020-12-23T09:09:22.291Z libp2p:upgrader outbound: starting new stream on /ipfs/id/1.0.0
2020-12-23T09:09:22.291Z libp2p:mplex new initiator stream 0 0
2020-12-23T09:09:22.291Z libp2p:mplex initiator stream 0 0 send { id: 0, type: 'NEW_STREAM', data: '0' }
2020-12-23T09:09:22.291Z libp2p:connection-manager checking limit of maxConnections. current value: 534 of Infinity
2020-12-23T09:09:22.291Z libp2p:tcp outbound connection /ip4/1.64.231.44/tcp/41811/p2p/QmTzTX4PyCfb4MeqBj7zsVjztPLmLhQvP12UV6hUjJXThg upgraded
2020-12-23T09:09:22.294Z libp2p:tcp dialing {"family":"ipv4","host":"42.2.149.167","transport":"tcp","port":45749}
2020-12-23T09:09:22.294Z libp2p:mplex initiator stream 0 0 send { id: 0,
type: 'MESSAGE_INITIATOR',
data:
<Buffer 13 2f 6d 75 6c 74 69 73 74 72 65 61 6d 2f 31 2e 30 2e 30 0a 0f 2f 69 70 66 73 2f 69 64 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.294Z libp2p:dht:net:QmRstX1G added to the routing table: QmTzTX4PyCfb4MeqBj7zsVjztPLmLhQvP12UV6hUjJXThg
2020-12-23T09:09:22.294Z libp2p:tcp connection aborted {"family":"ipv4","host":"42.2.149.167","transport":"tcp","port":37885}
2020-12-23T09:09:22.294Z libp2p:tcp connection aborted {"family":"ipv4","host":"42.2.149.167","transport":"tcp","port":42557}
2020-12-23T09:09:22.295Z libp2p:tcp connection aborted {"family":"ipv4","host":"42.2.149.167","transport":"tcp","port":32963}
2020-12-23T09:09:22.295Z libp2p:tcp connection aborted {"family":"ipv4","host":"42.2.149.167","transport":"tcp","port":45749}
2020-12-23T09:09:22.295Z libp2p:dialer token 112 released
2020-12-23T09:09:22.295Z libp2p:dialer token 35 released
2020-12-23T09:09:22.295Z libp2p:dialer token 63 released
2020-12-23T09:09:22.295Z libp2p:dialer token 39 released
2020-12-23T09:09:22.295Z libp2p:dialer dial succeeded to QmTzTX4PyCfb4MeqBj7zsVjztPLmLhQvP12UV6hUjJXThg
2020-12-23T09:09:22.295Z libp2p:upgrader outbound: starting new stream on /ipfs/kad/1.0.0
2020-12-23T09:09:22.295Z libp2p:mplex new initiator stream 1 1
2020-12-23T09:09:22.295Z libp2p:mplex initiator stream 1 1 send { id: 1, type: 'NEW_STREAM', data: '1' }
2020-12-23T09:09:22.295Z libp2p:mplex initiator stream 1 1 send { id: 1,
type: 'MESSAGE_INITIATOR',
data:
<Buffer 13 2f 6d 75 6c 74 69 73 74 72 65 61 6d 2f 31 2e 30 2e 30 0a 10 2f 69 70 66 73 2f 6b 61 64 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.300Z libp2p:mplex incoming message { id: 0, type: 'NEW_STREAM', data: <Buffer 30> }
2020-12-23T09:09:22.300Z libp2p:mplex new receiver stream 0 0
2020-12-23T09:09:22.301Z libp2p:mplex incoming message { id: 0,
type: 'MESSAGE_INITIATOR',
data:
<Buffer 13 2f 6d 75 6c 74 69 73 74 72 65 61 6d 2f 31 2e 30 2e 30 0a 0f 2f 69 70 66 73 2f 69 64 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.301Z libp2p:mplex receiver stream 0 0 send { id: 0,
type: 'MESSAGE_RECEIVER',
data:
<Buffer 13 2f 6d 75 6c 74 69 73 74 72 65 61 6d 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.301Z libp2p:upgrader outbound: incoming stream opened on /ipfs/id/1.0.0
2020-12-23T09:09:22.303Z libp2p:mplex receiver stream 0 0 send { id: 0,
type: 'MESSAGE_RECEIVER',
data: <Buffer 0f 2f 69 70 66 73 2f 69 64 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.304Z libp2p:mplex receiver stream 0 0 send { id: 0,
type: 'MESSAGE_RECEIVER',
data:
<Buffer d6 09 2a 0a 69 70 66 73 2f 30 2e 31 2e 30 32 10 6a 73 2d 6c 69 62 70 32 70 2f 30 2e 32 39 2e 33 0a ab 02 08 00 12 a6 02 30 82 01 22 30 0d 06 09 2a 86 ... > }
2020-12-23T09:09:22.304Z libp2p:mplex receiver stream 0 0 send { id: 0, type: 'CLOSE_RECEIVER', data: undefined }
2020-12-23T09:09:22.304Z libp2p:mplex:stream receiver stream 0 sink end undefined
2020-12-23T09:09:22.332Z libp2p:mplex incoming message { id: 1,
type: 'MESSAGE_RECEIVER',
data:
<Buffer 83 10 08 04 42 42 0a 22 12 20 8d 58 64 cd 68 19 de 41 b8 84 1e 79 e1 a9 10 06 45 29 c9 a1 a9 c5 27 9b 6a 98 b6 0e 64 44 32 a8 12 08 04 7f 00 00 01 06 ... > }
2020-12-23T09:09:22.332Z libp2p:mplex:stream initiator stream 1 source end undefined
2020-12-23T09:09:22.332Z libp2p:mplex initiator stream 1 1 ended
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for QmXrQNTfKBJSQ6ttDh8p5B9uaegu3Lxt6sJ76isru5z3Bu
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for 12D3KooWSHvvpHN8j4MXkUx5mx1MJExkYwqM9rQcivCqitXN7upW
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for QmSdKzrtdS6MmN6s28i6r1SNL79dL8rbqyMCy2ffckre2s
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for QmeZA5o6oL33dTuUdP194D76QXJUQ4JbYtcmAWruHmSMZJ
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for 12D3KooWP4a9qcbm5ouvtstmadoBMCGtRiH3mwjKfwdTZc9AvSh9
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for QmT84iE433j1WrFWunmx6J7BjbJBeqzLxfPmzGfuSVMEsz
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for QmXbb7PwU5savYyTTLCFst7JarASdJ5CNNv5SSb5mFbKPu
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for QmQjWkWSWiDe5REaR67UPK5c3U2s4yvHF2ANSW5W3WuWN8
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for QmVV9MnVQJkt6zrDNKtd1XZQ3a3FaquP6491iWaZLS7WVT
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for QmSfRXjPQyso69NifBZkcSG4VFKaiu46J2hDizPz9532JP
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for QmVmufesiTFnghT4D41HRxz6tFNxCHUwAzKFSGAuV7UJQx
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for 12D3KooWDuMR6FUiRngU1RHRUw61v6XRwfnPZzuVRvY2FBpNWFvd
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for QmQERLYYQVat7KYLTe1LBg5DeXco2G1ppBerZvh5xkuzyp
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for QmWUKufrFfeYMF7Pr8iSCx8pumXjfrCXpSxGUEgv1q3YLB
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for 12D3KooWGrbCpUBqjGsiatbxc7FYPrKw2aeRAM34e3Wrf3qu3MGj
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for QmfUeXuMnLreMZmJNaNBF581BVBF3piG8qDYuRX4V4azPH
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for 12D3KooWQRJBxjnA6CdxnDwk5wbCjSMaJwcfHmzA7bVVpjuW3chf
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for 12D3KooWHN68dMbGGWLNCSvnjzKHEGQ77N4hAVJW9K4Krc1jEbvf
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for QmcY3efozCYwoD3RuYPZxEN8rtRjd7Xys9k6Ams78KUdmG
2020-12-23T09:09:22.334Z libp2p:peer-store:address-book the addresses provided to store are already stored for 12D3KooWMsUYSLmFBc62ChtiYUBTPPVpRZqazBYbLUFHc8tuXurC
2020-12-23T09:09:22.334Z libp2p:dht:query:QmaGV9yF1F8QMucmf4zUSXUWqU1ANvKxEENsLYuCHA6Dt3:QmRstX1G queue:work:done undefined undefined
2020-12-23T09:09:22.334Z libp2p:dht:peer-queue dequeue QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM
2020-12-23T09:09:22.335Z libp2p:dht:query:QmaGV9yF1F8QMucmf4zUSXUWqU1ANvKxEENsLYuCHA6Dt3:QmRstX1G queue:work
2020-12-23T09:09:22.335Z libp2p:dht:QmRstX1G findPeerSingle QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM
2020-12-23T09:09:22.335Z libp2p:dht:net:QmRstX1G sending to: QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM
2020-12-23T09:09:22.338Z libp2p:dialer 14 tokens request, returning 4, 99 remaining
2020-12-23T09:09:22.339Z libp2p:tcp dialing {"family":"ipv4","host":"167.99.62.163","transport":"tcp","port":4001}
2020-12-23T09:09:22.339Z libp2p:tcp dialing {"family":"ipv6","host":"::1","transport":"tcp","port":4001}
2020-12-23T09:09:22.339Z libp2p:tcp dialing {"family":"ipv4","host":"10.17.0.7","transport":"tcp","port":4001}
2020-12-23T09:09:22.339Z libp2p:websockets dialing /ip6/::1/tcp/4002/ws/p2p/QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM
2020-12-23T09:09:22.340Z libp2p:websockets dialing ::1:4002
2020-12-23T09:09:22.341Z libp2p:tcp dialing {"family":"ipv6","host":"2604:a880:800:a1::1738:a001","transport":"tcp","port":4001}
2020-12-23T09:09:22.341Z libp2p:websockets dialing /ip4/127.0.0.1/tcp/4002/ws/p2p/QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM
2020-12-23T09:09:22.341Z libp2p:websockets dialing 127.0.0.1:4002
2020-12-23T09:09:22.342Z libp2p:websockets dialing /ip4/10.17.0.7/tcp/4002/ws/p2p/QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM
2020-12-23T09:09:22.342Z libp2p:websockets dialing 10.17.0.7:4002
2020-12-23T09:09:22.343Z libp2p:websockets dialing /ip4/167.99.62.163/tcp/4002/ws/p2p/QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM
2020-12-23T09:09:22.343Z libp2p:websockets dialing 167.99.62.163:4002
2020-12-23T09:09:22.360Z libp2p:upgrader /mplex/6.7.0 selected as muxer protocol
2020-12-23T09:09:22.361Z libp2p:upgrader Successfully upgraded outbound connection
2020-12-23T09:09:22.361Z libp2p:upgrader outbound: starting new stream on /ipfs/id/1.0.0
2020-12-23T09:09:22.361Z libp2p:mplex new initiator stream 0 0
2020-12-23T09:09:22.361Z libp2p:mplex initiator stream 0 0 send { id: 0, type: 'NEW_STREAM', data: '0' }
2020-12-23T09:09:22.362Z libp2p:connection-manager checking limit of maxConnections. current value: 535 of Infinity
2020-12-23T09:09:22.362Z libp2p:tcp outbound connection /ip4/222.164.51.47/tcp/35394/p2p/QmQiauJHFq77WhzWwtKV6ZeySGk2foTpnVD6YAByFiWHaB upgraded
2020-12-23T09:09:22.362Z libp2p:dialer token 128 released
2020-12-23T09:09:22.362Z libp2p:mplex initiator stream 0 0 send { id: 0,
type: 'MESSAGE_INITIATOR',
data:
<Buffer 13 2f 6d 75 6c 74 69 73 74 72 65 61 6d 2f 31 2e 30 2e 30 0a 0f 2f 69 70 66 73 2f 69 64 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.363Z libp2p:dht:net:QmRstX1G added to the routing table: QmQiauJHFq77WhzWwtKV6ZeySGk2foTpnVD6YAByFiWHaB
2020-12-23T09:09:22.363Z libp2p:dialer dial succeeded to QmQiauJHFq77WhzWwtKV6ZeySGk2foTpnVD6YAByFiWHaB
2020-12-23T09:09:22.363Z libp2p:upgrader outbound: starting new stream on /ipfs/kad/1.0.0
2020-12-23T09:09:22.363Z libp2p:mplex new initiator stream 1 1
2020-12-23T09:09:22.363Z libp2p:mplex initiator stream 1 1 send { id: 1, type: 'NEW_STREAM', data: '1' }
2020-12-23T09:09:22.364Z libp2p:mplex initiator stream 1 1 send { id: 1,
type: 'MESSAGE_INITIATOR',
data:
<Buffer 13 2f 6d 75 6c 74 69 73 74 72 65 61 6d 2f 31 2e 30 2e 30 0a 10 2f 69 70 66 73 2f 6b 61 64 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.366Z libp2p:mplex incoming message { id: 0, type: 'NEW_STREAM', data: <Buffer 30> }
2020-12-23T09:09:22.366Z libp2p:mplex new receiver stream 0 0
2020-12-23T09:09:22.367Z libp2p:mplex incoming message { id: 0,
type: 'MESSAGE_INITIATOR',
data:
<Buffer 13 2f 6d 75 6c 74 69 73 74 72 65 61 6d 2f 31 2e 30 2e 30 0a 0f 2f 69 70 66 73 2f 69 64 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.367Z libp2p:mplex receiver stream 0 0 send { id: 0,
type: 'MESSAGE_RECEIVER',
data:
<Buffer 13 2f 6d 75 6c 74 69 73 74 72 65 61 6d 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.367Z libp2p:upgrader outbound: incoming stream opened on /ipfs/id/1.0.0
2020-12-23T09:09:22.369Z libp2p:mplex receiver stream 0 0 send { id: 0,
type: 'MESSAGE_RECEIVER',
data: <Buffer 0f 2f 69 70 66 73 2f 69 64 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.369Z libp2p:mplex receiver stream 0 0 send { id: 0,
type: 'MESSAGE_RECEIVER',
data:
<Buffer d6 09 2a 0a 69 70 66 73 2f 30 2e 31 2e 30 32 10 6a 73 2d 6c 69 62 70 32 70 2f 30 2e 32 39 2e 33 0a ab 02 08 00 12 a6 02 30 82 01 22 30 0d 06 09 2a 86 ... > }
2020-12-23T09:09:22.369Z libp2p:mplex receiver stream 0 0 send { id: 0, type: 'CLOSE_RECEIVER', data: undefined }
2020-12-23T09:09:22.369Z libp2p:mplex:stream receiver stream 0 sink end undefined
2020-12-23T09:09:22.414Z libp2p:tcp connection opened {"family":"ipv4","host":"167.99.62.163","transport":"tcp","port":4001}
2020-12-23T09:09:22.415Z libp2p:tcp new outbound connection /ip4/167.99.62.163/tcp/4001/p2p/QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM
2020-12-23T09:09:22.415Z libp2p:upgrader Starting the outbound connection upgrade
2020-12-23T09:09:22.415Z libp2p:upgrader selecting outbound crypto protocol [ '/secio/1.0.0', '/noise' ]
2020-12-23T09:09:22.474Z libp2p:mplex incoming message { id: 0,
type: 'MESSAGE_RECEIVER',
data:
<Buffer 13 2f 6d 75 6c 74 69 73 74 72 65 61 6d 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.484Z libp2p:mplex incoming message { id: 0,
type: 'MESSAGE_RECEIVER',
data: <Buffer 0f 2f 69 70 66 73 2f 69 64 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.484Z libp2p:mplex initiator stream 0 0 send { id: 0, type: 'CLOSE_INITIATOR', data: undefined }
2020-12-23T09:09:22.485Z libp2p:mplex:stream initiator stream 0 sink end undefined
2020-12-23T09:09:22.485Z libp2p:mplex incoming message { id: 0,
type: 'MESSAGE_RECEIVER',
data:
<Buffer f5 08 0a ab 02 08 00 12 a6 02 30 82 01 22 30 0d 06 09 2a 86 48 86 f7 0d 01 01 01 05 00 03 82 01 0f 00 30 82 01 0a 02 82 01 01 00 dc 04 75 f8 1d a4 87 ... > }
2020-12-23T09:09:22.485Z libp2p:mplex:stream initiator stream 0 source end undefined
2020-12-23T09:09:22.485Z libp2p:mplex initiator stream 0 0 ended
2020-12-23T09:09:22.488Z libp2p:peer-store:address-book stored provided multiaddrs for QmTzTX4PyCfb4MeqBj7zsVjztPLmLhQvP12UV6hUjJXThg
2020-12-23T09:09:22.489Z libp2p:peer-store:proto-book the protocols provided to store are equal to the already stored for QmTzTX4PyCfb4MeqBj7zsVjztPLmLhQvP12UV6hUjJXThg
2020-12-23T09:09:22.489Z libp2p:identify received observed address of /ip4/34.219.99.159/tcp/51562
2020-12-23T09:09:22.490Z libp2p:websockets connected /ip4/167.99.62.163/tcp/4002/ws/p2p/QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM
2020-12-23T09:09:22.490Z libp2p:websockets new outbound connection /ip4/167.99.62.163/tcp/4002/ws/p2p/QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM
2020-12-23T09:09:22.491Z libp2p:upgrader Starting the outbound connection upgrade
2020-12-23T09:09:22.491Z libp2p:upgrader selecting outbound crypto protocol [ '/secio/1.0.0', '/noise' ]
2020-12-23T09:09:22.500Z libp2p:mplex incoming message { id: 0, type: 'CLOSE_RECEIVER', data: <Buffer > }
2020-12-23T09:09:22.500Z libp2p:mplex missing stream 0
2020-12-23T09:09:22.501Z libp2p:mplex incoming message { id: 1,
type: 'MESSAGE_RECEIVER',
data:
<Buffer 13 2f 6d 75 6c 74 69 73 74 72 65 61 6d 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.501Z libp2p:mplex incoming message { id: 1,
type: 'MESSAGE_RECEIVER',
data: <Buffer 10 2f 69 70 66 73 2f 6b 61 64 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.501Z libp2p:mplex initiator stream 1 1 send { id: 1,
type: 'MESSAGE_INITIATOR',
data:
<Buffer 28 08 04 50 00 12 22 12 20 b1 3b 1d f7 71 06 1d 47 9c a8 30 39 83 b6 a9 b7 fc 8d d0 5b d3 b7 e4 33 ac 4e ef a9 87 16 cf 58> }
2020-12-23T09:09:22.501Z libp2p:mplex initiator stream 1 1 send { id: 1, type: 'CLOSE_INITIATOR', data: undefined }
2020-12-23T09:09:22.502Z libp2p:mplex:stream initiator stream 1 sink end undefined
2020-12-23T09:09:22.543Z libp2p:mplex incoming message { id: 0,
type: 'MESSAGE_RECEIVER',
data:
<Buffer 13 2f 6d 75 6c 74 69 73 74 72 65 61 6d 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.551Z libp2p:mplex incoming message { id: 1,
type: 'MESSAGE_RECEIVER',
data:
<Buffer 13 2f 6d 75 6c 74 69 73 74 72 65 61 6d 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.551Z libp2p:mplex incoming message { id: 0,
type: 'MESSAGE_RECEIVER',
data: <Buffer 0f 2f 69 70 66 73 2f 69 64 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.551Z libp2p:mplex initiator stream 0 0 send { id: 0, type: 'CLOSE_INITIATOR', data: undefined }
2020-12-23T09:09:22.552Z libp2p:mplex:stream initiator stream 0 sink end undefined
2020-12-23T09:09:22.552Z libp2p:mplex incoming message { id: 0,
type: 'MESSAGE_RECEIVER',
data:
<Buffer fe 08 0a ab 02 08 00 12 a6 02 30 82 01 22 30 0d 06 09 2a 86 48 86 f7 0d 01 01 01 05 00 03 82 01 0f 00 30 82 01 0a 02 82 01 01 00 b7 79 07 4f 6d ee d2 ... > }
2020-12-23T09:09:22.553Z libp2p:mplex:stream initiator stream 0 source end undefined
2020-12-23T09:09:22.553Z libp2p:mplex initiator stream 0 0 ended
2020-12-23T09:09:22.553Z libp2p:mplex incoming message { id: 1,
type: 'MESSAGE_RECEIVER',
data: <Buffer 10 2f 69 70 66 73 2f 6b 61 64 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.555Z libp2p:peer-store:address-book stored provided multiaddrs for QmQiauJHFq77WhzWwtKV6ZeySGk2foTpnVD6YAByFiWHaB
2020-12-23T09:09:22.555Z libp2p:peer-store:proto-book the protocols provided to store are equal to the already stored for QmQiauJHFq77WhzWwtKV6ZeySGk2foTpnVD6YAByFiWHaB
2020-12-23T09:09:22.555Z libp2p:identify received observed address of /ip4/34.219.99.159/tcp/43208
2020-12-23T09:09:22.555Z libp2p:mplex initiator stream 1 1 send { id: 1,
type: 'MESSAGE_INITIATOR',
data:
<Buffer 28 08 04 50 00 12 22 12 20 b1 3b 1d f7 71 06 1d 47 9c a8 30 39 83 b6 a9 b7 fc 8d d0 5b d3 b7 e4 33 ac 4e ef a9 87 16 cf 58> }
2020-12-23T09:09:22.556Z libp2p:mplex initiator stream 1 1 send { id: 1, type: 'CLOSE_INITIATOR', data: undefined }
2020-12-23T09:09:22.556Z libp2p:mplex:stream initiator stream 1 sink end undefined
2020-12-23T09:09:22.556Z libp2p:mplex incoming message { id: 0, type: 'CLOSE_RECEIVER', data: <Buffer > }
2020-12-23T09:09:22.556Z libp2p:mplex missing stream 0
2020-12-23T09:09:22.565Z libp2p:upgrader encrypting outbound connection to {"id":"QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM"}
2020-12-23T09:09:22.568Z libp2p:noise Stage 0 - Initiator starting to send first message.
2020-12-23T09:09:22.591Z libp2p:noise Stage 0 - Initiator finished sending first message.
2020-12-23T09:09:22.591Z libp2p:noise Stage 1 - Initiator waiting to receive first message from responder...
2020-12-23T09:09:22.602Z libp2p:mplex incoming message { id: 1, type: 'RESET_INITIATOR', data: <Buffer > }
2020-12-23T09:09:22.602Z libp2p:mplex:stream receiver stream 1 source end { Error: stream reset
at Object.reset (/srv/app/node_modules/libp2p-mplex/src/stream.js:72:27)
at Mplex._handleIncoming (/srv/app/node_modules/libp2p-mplex/src/mplex.js:227:16)
at pipe (/srv/app/node_modules/libp2p-mplex/src/mplex.js:156:22)
at process._tickCallback (internal/process/next_tick.js:68:7) code: 'ERR_MPLEX_STREAM_RESET' }
2020-12-23T09:09:22.602Z libp2p:mplex:stream receiver stream 1 sink end { Error: stream reset
at Object.reset (/srv/app/node_modules/libp2p-mplex/src/stream.js:72:27)
at Mplex._handleIncoming (/srv/app/node_modules/libp2p-mplex/src/mplex.js:227:16)
at pipe (/srv/app/node_modules/libp2p-mplex/src/mplex.js:156:22)
at process._tickCallback (internal/process/next_tick.js:68:7) code: 'ERR_MPLEX_STREAM_RESET' }
2020-12-23T09:09:22.602Z libp2p:mplex receiver stream 1 1 ended
2020-12-23T09:09:22.603Z libp2p:upgrader:error Error: stream reset
at Object.reset (/srv/app/node_modules/libp2p-mplex/src/stream.js:72:27)
at Mplex._handleIncoming (/srv/app/node_modules/libp2p-mplex/src/mplex.js:227:16)
at pipe (/srv/app/node_modules/libp2p-mplex/src/mplex.js:156:22)
at process._tickCallback (internal/process/next_tick.js:68:7)
2020-12-23T09:09:22.625Z libp2p:mplex incoming message { id: 1, type: 'CLOSE_RECEIVER', data: <Buffer > }
2020-12-23T09:09:22.625Z libp2p:mplex missing stream 1
2020-12-23T09:09:22.637Z libp2p:upgrader encrypting outbound connection to {"id":"QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM"}
2020-12-23T09:09:22.640Z libp2p:noise Stage 0 - Initiator starting to send first message.
2020-12-23T09:09:22.641Z libp2p:noise Stage 0 - Initiator finished sending first message.
2020-12-23T09:09:22.641Z libp2p:noise Stage 1 - Initiator waiting to receive first message from responder...
2020-12-23T09:09:22.655Z libp2p:mplex incoming message { id: 0, type: 'CLOSE_INITIATOR', data: <Buffer > }
2020-12-23T09:09:22.656Z libp2p:mplex:stream receiver stream 0 source end undefined
2020-12-23T09:09:22.656Z libp2p:mplex receiver stream 0 0 ended
2020-12-23T09:09:22.673Z libp2p:noise Stage 1 - Initiator received the message.
2020-12-23T09:09:22.673Z libp2p:noise Initiator going to check remote's signature...
2020-12-23T09:09:22.675Z libp2p:noise All good with the signature!
2020-12-23T09:09:22.675Z libp2p:noise Stage 2 - Initiator sending third handshake message.
2020-12-23T09:09:22.678Z libp2p:noise Stage 2 - Initiator sent message with signed payload.
2020-12-23T09:09:22.679Z libp2p:upgrader outbound selecting muxer /mplex/6.7.0
2020-12-23T09:09:22.722Z libp2p:noise Stage 1 - Initiator received the message.
2020-12-23T09:09:22.723Z libp2p:noise Initiator going to check remote's signature...
2020-12-23T09:09:22.724Z libp2p:noise All good with the signature!
2020-12-23T09:09:22.724Z libp2p:noise Stage 2 - Initiator sending third handshake message.
2020-12-23T09:09:22.728Z libp2p:noise Stage 2 - Initiator sent message with signed payload.
2020-12-23T09:09:22.728Z libp2p:upgrader outbound selecting muxer /mplex/6.7.0
2020-12-23T09:09:22.735Z libp2p:mplex incoming message { id: 0, type: 'CLOSE_INITIATOR', data: <Buffer > }
2020-12-23T09:09:22.735Z libp2p:mplex:stream receiver stream 0 source end undefined
2020-12-23T09:09:22.736Z libp2p:mplex receiver stream 0 0 ended
2020-12-23T09:09:22.803Z libp2p:upgrader /mplex/6.7.0 selected as muxer protocol
2020-12-23T09:09:22.803Z libp2p:upgrader Successfully upgraded outbound connection
2020-12-23T09:09:22.803Z libp2p:upgrader outbound: starting new stream on /ipfs/id/1.0.0
2020-12-23T09:09:22.803Z libp2p:mplex new initiator stream 0 0
2020-12-23T09:09:22.803Z libp2p:mplex initiator stream 0 0 send { id: 0, type: 'NEW_STREAM', data: '0' }
2020-12-23T09:09:22.804Z libp2p:connection-manager checking limit of maxConnections. current value: 536 of Infinity
2020-12-23T09:09:22.804Z libp2p:websockets outbound connection /ip4/167.99.62.163/tcp/4002/ws/p2p/QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM upgraded
2020-12-23T09:09:22.805Z libp2p:websockets dialing /ip6/2604:a880:800:a1::1738:a001/tcp/4002/ws/p2p/QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM
2020-12-23T09:09:22.805Z libp2p:websockets dialing 2604:a880:800:a1::1738:a001:4002
2020-12-23T09:09:22.806Z libp2p:mplex initiator stream 0 0 send { id: 0,
type: 'MESSAGE_INITIATOR',
data:
<Buffer 13 2f 6d 75 6c 74 69 73 74 72 65 61 6d 2f 31 2e 30 2e 30 0a 0f 2f 69 70 66 73 2f 69 64 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.806Z libp2p:dht:net:QmRstX1G added to the routing table: QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM
2020-12-23T09:09:22.806Z libp2p:tcp connection aborted {"family":"ipv4","host":"10.17.0.7","transport":"tcp","port":4001}
2020-12-23T09:09:22.807Z libp2p:dialer token 67 released
2020-12-23T09:09:22.807Z libp2p:dialer token 71 released
2020-12-23T09:09:22.807Z libp2p:dialer token 42 released
2020-12-23T09:09:22.807Z libp2p:dialer token 94 released
2020-12-23T09:09:22.807Z libp2p:dialer dial succeeded to QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM
2020-12-23T09:09:22.808Z libp2p:upgrader outbound: starting new stream on /ipfs/kad/1.0.0
2020-12-23T09:09:22.808Z libp2p:mplex new initiator stream 1 1
2020-12-23T09:09:22.808Z libp2p:mplex initiator stream 1 1 send { id: 1, type: 'NEW_STREAM', data: '1' }
2020-12-23T09:09:22.808Z libp2p:mplex initiator stream 1 1 send { id: 1,
type: 'MESSAGE_INITIATOR',
data:
<Buffer 13 2f 6d 75 6c 74 69 73 74 72 65 61 6d 2f 31 2e 30 2e 30 0a 10 2f 69 70 66 73 2f 6b 61 64 2f 31 2e 30 2e 30 0a> }
2020-12-23T09:09:22.809Z libp2p:upgrader:error Failed to upgrade outbound connection { Error: The operation was aborted
at nextAbortHandler (/srv/app/node_modules/abortable-iterator/index.js:37:20)
at AbortSignal.abortHandler (/srv/app/node_modules/abortable-iterator/index.js:16:29)
at AbortSignal.dispatchEvent (/srv/app/node_modules/event-target-shim/dist/event-target-shim.js:818:35)
at abortSignal (/srv/app/node_modules/abort-controller/dist/abort-controller.js:52:12)
at AbortController.abort (/srv/app/node_modules/abort-controller/dist/abort-controller.js:91:9)
at AbortSignal.onAbort (/srv/app/node_modules/any-signal/index.js:13:16)
at AbortSignal.dispatchEvent (/srv/app/node_modules/event-target-shim/dist/event-target-shim.js:818:35)
at abortSignal (/srv/app/node_modules/abort-controller/dist/abort-controller.js:52:12)
at AbortController.abort (/srv/app/node_modules/abort-controller/dist/abort-controller.js:91:9)
at dialAbortControllers.map.c (/srv/app/node_modules/libp2p/src/dialer/dial-request.js:75:39) type: 'aborted', code: 'ERR_MUXER_UNAVAILABLE' }
events.js:174
throw er; // Unhandled 'error' event
^
Error: connect EADDRNOTAVAIL 2604:a880:800:a1::1738:a001:4002 - Local (:::0)
at internalConnect (net.js:882:16)
at defaultTriggerAsyncIdScope (internal/async_hooks.js:294:19)
at defaultTriggerAsyncIdScope (net.js:972:9)
at process._tickCallback (internal/process/next_tick.js:61:11)
Emitted 'error' event at:
at emitErrorNT (internal/streams/destroy.js:91:8)
at emitErrorAndCloseNT (internal/streams/destroy.js:59:3)
at process._tickCallback (internal/process/next_tick.js:63:19)
Thanks for all the detailed information. So, from my initial diagnosis, the flow to the error is: during a DHT query on your node QmaGV9yF1F8QMucmf4zUSXUWqU1ANvKxEENsLYuCHA6Dt3, the dialer dials QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM on several multiaddrs in parallel; once one dial to QmPEr3yxLMg37sg7yM39qfDMgo18BjpXcwtcgqcChFVbNM succeeds, the remaining in-flight dials are aborted. For the websockets addresses, the transport calls connect on the raw socket to get the raw connection, and we are not listening for error events on it. A thing to validate this against: the parallel dials are raced with p-any, not caring about the order 🤔
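To illustrate the behaviour (a simplified sketch only, not the actual libp2p dialer code, which uses p-any and per-dial abort controllers as the stack trace above shows; this version uses Promise.any and the global AbortController, i.e. Node 15+):

// Every multiaddr is dialed in parallel; the first dial to succeed wins
// and the abort controllers of all other in-flight dials are fired.
async function raceDials (addrs, dial) {
  const controllers = addrs.map(() => new AbortController())
  // Resolves with the first dial to fulfil, regardless of the input order
  const winner = await Promise.any(addrs.map(async (addr, i) => {
    return { conn: await dial(addr, controllers[i].signal), index: i }
  }))
  // Abort the losing dials. If a transport then closes its raw socket with no
  // 'error' listener attached, a late socket error (like the EADDRNOTAVAIL
  // above) becomes an unhandled 'error' event and kills the process.
  controllers.forEach((c, i) => { if (i !== winner.index) c.abort() })
  return winner.conn
}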
A piece of feedback regarding this setup: if your setup relies on a public network, which it seems to, I really recommend using the new announce address filter (announceFilter) in production. This filter function allows you to filter announced addresses programmatically. The nice thing here is that other nodes would not attempt to dial private addresses (like the /ip4/10.17.0.7 and /ip6/::1 dials visible in the logs above); the node would only advertise its reachable (public) addresses, improving the dial performance of the network. You can look into https://github.com/libp2p/js-libp2p/blob/master/doc/CONFIGURATION.md#customizing-libp2p (search for announceFilter) and use the libp2p-utils filter to filter out private addresses. Bear in mind that you will probably want this only in production and not in a local test environment.
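For illustration, a minimal sketch of what that could look like, assuming js-libp2p >= 0.30 and the is-private helper shipped in libp2p-utils (check the linked CONFIGURATION.md for the exact API; the rest of the options are elided):

const Libp2p = require('libp2p')
const isPrivate = require('libp2p-utils/src/multiaddr/is-private')

// `otherOptions` stands in for your transports, muxers, crypto, etc.
async function createNode (otherOptions) {
  return Libp2p.create({
    ...otherOptions,
    addresses: {
      listen: [
        '/ip4/127.0.0.1/tcp/8998',
        '/ip4/127.0.0.1/tcp/8999/ws'
      ],
      // Drop private/loopback multiaddrs before announcing them to other peers
      announceFilter: (multiaddrs) => multiaddrs.filter((ma) => !isPrivate(ma))
    }
  })
}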
@jacobheun we should look into a design to improve the dialer logic a bit, thinking about the following two points:
Meanwhile, I will look into a PR to fix the websockets error handler and will report back once I have it. That way you can install the branch directly and try it out.
Can you try this out with https://github.com/libp2p/js-libp2p-websockets/pull/118? Use "libp2p-websockets": "libp2p/js-libp2p-websockets#fix/add-error-event-handler" instead of "libp2p-websockets": "^0.15.0" and let me know how it goes.
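In package.json terms, that is just swapping the dependency (a sketch; other fields omitted):

{
  "dependencies": {
    "libp2p-websockets": "libp2p/js-libp2p-websockets#fix/add-error-event-handler"
  }
}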
Thanks a lot! Will try.
Btw, there was an error on our side and the log produced was indeed from 0.29.0. But the other components should be running 0.30.0 and are still crashing.
Hello @AuHau Did you manage to get this tested?
@vasco-santos We did try the branch you created for us and we still got the error. We have now set up DEBUG so we can provide you a new log once it crashes again. Any other suggestions for us to test? We still haven't checked with announceFilter yet.
Ok, keep me posted with the logs. After it fails, I think you should try to set up the announceFilter. It does not fix the underlying issue, but it should mitigate this problem while we look into it more deeply.
Hey there @vasco-santos, so we had another crash. Below are the logs of our two components that crashed:
Looking back on all this and researching a bit, this might be an IPv6-related issue. Usually, EADDRNOTAVAIL errors happen on listen, when an attempt is made to listen on an address that is already in use. In this case, though, it is a connect problem. EADDRNOTAVAIL indicates that the network address is currently unavailable for use: the calling host cannot reach the specified destination. Apparently the error is thrown somewhere in the Node.js stack, and this might be an issue there.
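For what it's worth, the crash mode itself is standard Node behaviour: a net.Socket that emits 'error' with no listener attached turns it into an uncaught exception. A minimal sketch (the address is illustrative):

const net = require('net')

// Connect towards an address this host has no usable route/source for.
// With no 'error' listener on the socket, the resulting connect error is
// re-emitted as an unhandled 'error' event and crashes the process, which
// matches the stack trace in the report above.
const socket = net.connect({ host: '2604:1380:45f1:3f00::1', port: 4002, family: 6 })
// socket.on('error', (err) => { /* handling it here prevents the crash */ })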
With some research, I think the issue is around invalid usage of IPv6 addresses. It seems that the address 2604:1380:45f1:3f00::1:4002 falls under the category of the Global Unicast Address space. To clearly understand your topology: you don't own the peer with id 12D3KooWJfq7u5cYc65HAZT3mE3irDH5KYSUdFxXTdf2KTB7FQgf, which has the address /ip6/2604:1380:45f1:3f00::1/tcp/4002/ws/p2p/12D3KooWJfq7u5cYc65HAZT3mE3irDH5KYSUdFxXTdf2KTB7FQgf, is that correct?
@jacobheun do you have any ideas about this?
Hello,
2000::/3 is the global unicast address prefix from which IANA currently hands out assignments. It is essentially the same as the public IPv4 space: addresses that are routable on the Internet.
You can inspect where a given address belongs on https://bgp.he.net/ . In particular, all of our IPv6 addresses belong to the AWS prefix https://bgp.he.net/net/2600:1f14::/35. 2604:1380:45f1:3f00::1 falls outside that prefix, and the same applies to 2604:a880:800:a1::1738:a001 (pasted in the OP) and ::1 (not pictured, but we had a couple of crashes with that one too).
TL;DR: we don't own that IPv6 address. As for the peer ID, it seems all of ours start with Qm.
Cheers.
EDIT: changed ::1:4002 -> ::1, as it might be misinterpreted as the address ::1:4002 instead of the address ::1, port 4002.
Ok, so I did some experiments and I have "good" news. I could create a local test scenario replicating the logs you have, and I got to a similar error, which should be the same one; I am not sure why the error code differs.
node:events:353
throw er; // Unhandled 'error' event
^
Error: connect EHOSTUNREACH 2604:1380:45f1:3f00::1:4002 - Local (:::54287)
at internalConnect (node:net:910:16)
at defaultTriggerAsyncIdScope (node:internal/async_hooks:430:12)
at node:net:1001:9
at processTicksAndRejections (node:internal/process/task_queues:75:11)
Emitted 'error' event on Socket instance at:
at emitErrorNT (node:internal/streams/destroy:188:8)
at emitErrorCloseNT (node:internal/streams/destroy:153:3)
at processTicksAndRejections (node:internal/process/task_queues:80:21) {
errno: -65,
code: 'EHOSTUNREACH',
syscall: 'connect',
address: '2604:1380:45f1:3f00::1',
port: 4002
}
I have been doing a debug session on this and I could get to the root cause of the problem. It happens when a websockets connect starts and is immediately aborted by libp2p. Once this happens, libp2p will do rawSocket.close() on the raw socket. Apparently, there is an issue in ws, or even in Node, dealing with this scenario. I was looking into Node's codebase and they have tests for destroying an immediately opened socket.
The code stack is:
That last place is where it fails. I will open an issue in websockets and try to get some feedback on how this should be fixed, or whether this is a problem within Node.js.
Meanwhile, I will update my previous websockets PR to provide a temporary solution for you to try out. It fixes my issue locally, so I think it will do the trick for you. It will be as simple as wrapping the rawSocket close in a setTimeout, so that it closes on the next event-loop iteration. I will let you know here once you can update your dependencies, and I will also post updates on the issue itself.
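Roughly, the workaround amounts to this (a hedged sketch of the idea, not the exact diff in libp2p/js-libp2p-websockets#118; rawSocket stands in for the websocket of an aborted outbound dial):

function deferredClose (rawSocket) {
  // An aborted socket may still emit a late 'error' (e.g. EADDRNOTAVAIL);
  // a no-op handler keeps it from becoming an unhandled 'error' event
  rawSocket.once('error', () => {})
  // Close on the next event-loop iteration instead of synchronously
  setTimeout(() => rawSocket.close(), 0)
}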
Temporary fix is done: https://github.com/libp2p/js-libp2p-websockets/pull/118/commits/cc0bbc70f22b77f9802eac32aa04c1a581653cbf
Can you retry installing the npm dependencies? It is under the same websockets branch: fix/add-error-event-handler. Let me know how it goes.
~@lucasvuotto @AuHau I did a new commit there. It seems that the problem would be fixed by the initial approach, but as it was wrapped inside a promise, the error handler was only added in the next event-loop iteration. Now I have extracted it to outside the Promise executor, and it seems to work smoothly in my test case.~
EDIT: This does not work after all; I added back the setTimeout as mentioned earlier today.
Please try it out again (reinstalling your deps) and let me know.
Is there any update on whether this fixed the issue?
Unfortunately, I have already left the project, so I can't give you the current status. The latest update was that we employed a somewhat different approach (without libp2p) and were planning to revisit it later on, as we were pressed for time.
@julianmrodri might give you more current info.
I released libp2p-websockets@0.15.2 with the error handler for this.
I am going to close the issue; feel free to reopen if this is still an issue with the update.
Versions:
Type: bug
Severity: Critical
Description:
We are using libp2p and gossipsub to create a messaging channel for our application. We tested it locally and everything worked OK; now we have deployed it and have a big problem where our backends crash rather randomly without much information, but I am pretty sure it is libp2p-related.
The error is:
Steps to reproduce the error:
To be honest, I have no idea how to reproduce this :-(
Would you have any pointers on what is happening and how to handle this?