littleskunk closed this issue 8 years ago.
Noticed this today too. @phutchins any ideas?
I have to assume that it's related to the NGINX proxy in front of bridge, since we are running the same code as the other working seeds.
Looking into this.
Ok, I believe what is happening is that the seed node (593844dc7f0076a1aeda9a6b9788af17e67c1052) that is hardcoded into the source for our clients is no longer up. It is the old API node and was taken down after the migration to the new cluster.
We believed that you would still get a list of known good seeds even if this seed was down, but that may only hold when the seed is still up and simply isn't the node you're talking to initially.
I've brought the seed node (593844dc7f0076a1aeda9a6b9788af17e67c1052) back up which should resolve the issue.
@gordonwritescode we will likely want to update the seed node id in our codebase with one of the new API hosts.
api1 - 78cfca0e01235db817728aec056d007672ffac63
api2 - 3a3145039bf2dec51698fc6cfb3a4b68655523f2
Ah! Good, then it's a feature, not a bug! I'll update on Monday unless @littleskunk wants to send a PR to update the node ID in lib/network/index.js under Network.DEFAULTS.
Tuesday**
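For reference, the change would roughly look like this; this is only a sketch, and the exact shape of Network.DEFAULTS in lib/network/index.js may differ:

Network.DEFAULTS = {
  // ...other defaults unchanged...
  seeds: [
    // replace the old seed (593844dc...) with the new API hosts
    'storj://api.storj.io:8443/78cfca0e01235db817728aec056d007672ffac63',
    'storj://api.storj.io:8443/3a3145039bf2dec51698fc6cfb3a4b68655523f2'
  ]
};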
I will update core, storjshare-cli and storjshare-gui this weekend. They are all affected.
@gordonwritescode I changed my config like this:
"seeds": [
"storj://api.storj.io:8443/78cfca0e01235db817728aec056d007672ffac63",
"storj://api.storj.io:8443/3a3145039bf2dec51698fc6cfb3a4b68655523f2"
],
I would expect 2 FIND_NODE requests, but I see 4 of them, then a little break, and then 4 more. Why ask both nodes twice without waiting for an answer? Looks like a new bug to me.
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"3a3145039bf2dec51698
fc6cfb3a4b68655523f2\",\"lastSeen\":1463165881056}","timestamp":"2016-05-13T18:58:01.056Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"78cfca0e01235db81772
8aec056d007672ffac63\",\"lastSeen\":1463165881130}","timestamp":"2016-05-13T18:58:01.130Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"3a3145039bf2dec51698
fc6cfb3a4b68655523f2\",\"lastSeen\":1463165881218}","timestamp":"2016-05-13T18:58:01.218Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"78cfca0e01235db81772
8aec056d007672ffac63\",\"lastSeen\":1463165881278}","timestamp":"2016-05-13T18:58:01.279Z"}
{"level":"info","message":"received valid message from {\"protocol\":\"0.6.12\",\"address\":\"104.196.56.59\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda
9a6b9788af17e67c1052\",\"lastSeen\":1463165881676}","timestamp":"2016-05-13T18:58:01.677Z"}
{"level":"info","message":"received valid message from {\"protocol\":\"0.6.12\",\"address\":\"104.196.56.59\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda
9a6b9788af17e67c1052\",\"lastSeen\":1463165881776}","timestamp":"2016-05-13T18:58:01.776Z"}
{"level":"info","message":"received valid message from {\"protocol\":\"0.6.12\",\"address\":\"104.196.56.59\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda
9a6b9788af17e67c1052\",\"lastSeen\":1463165881868}","timestamp":"2016-05-13T18:58:01.868Z"}
{"level":"info","message":"received valid message from {\"protocol\":\"0.6.12\",\"address\":\"104.196.56.59\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda
9a6b9788af17e67c1052\",\"lastSeen\":1463165881958}","timestamp":"2016-05-13T18:58:01.958Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"78cfca0e01235db81772
8aec056d007672ffac63\",\"lastSeen\":1463165882059}","timestamp":"2016-05-13T18:58:02.059Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"3a3145039bf2dec51698
fc6cfb3a4b68655523f2\",\"lastSeen\":1463165882095}","timestamp":"2016-05-13T18:58:02.095Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"78cfca0e01235db817728aec056d007672ffac63\",\"lastSeen\":1463165882180}","timestamp":"2016-05-13T18:58:02.181Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"3a3145039bf2dec51698fc6cfb3a4b68655523f2\",\"lastSeen\":1463165882248}","timestamp":"2016-05-13T18:58:02.248Z"}
I believe the second set of FIND_NODE requests is part of a bucket refresh flow that is implemented in kad per the paper. When we connect to a new node, we ask it for nodes close to us, then we ask it for nodes that are further than our closest: https://github.com/kadtools/kad/blob/master/lib/router.js#L405
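Roughly, the flow on connect looks like this; a simplified sketch, not kad's actual router code, and the method names here are only illustrative:

function connect(seed, callback) {
  // add the seed to our routing table
  router.updateContact(seed);
  // first round: iterative FIND_NODE for our own node ID (nodes close to us)
  router.findNode(self.nodeID, function(err, contacts) {
    if (err) return callback(err);
    // second round: refresh the buckets further away than our closest
    // known contact, which sends another FIND_NODE to the same seeds
    router.refreshBucketsBeyondClosest(contacts, callback);
  });
}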
@gordonwritescode Nope:
New config:
"seeds": [
"storj://api.storj.io:8443/593844dc7f0076a1aeda9a6b9788af17e67c1052",
"storj://api.storj.io:8443/78cfca0e01235db817728aec056d007672ffac63",
"storj://api.storj.io:8443/3a3145039bf2dec51698fc6cfb3a4b68655523f2"
],
I would expect 3 or 6 FIND_NODE requests, but now I get 9. As you can see there are two blocks of them, which would match your closest and further requests.
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"3a3145039bf2dec51698fc6cfb3a4b68655523f2\",\"lastSeen\":1463267340520}","timestamp":"2016-05-14T23:09:00.520Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda9a6b9788af17e67c1052\",\"lastSeen\":1463267340585}","timestamp":"2016-05-14T23:09:00.585Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"78cfca0e01235db817728aec056d007672ffac63\",\"lastSeen\":1463267340679}","timestamp":"2016-05-14T23:09:00.679Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"3a3145039bf2dec51698fc6cfb3a4b68655523f2\",\"lastSeen\":1463267340789}","timestamp":"2016-05-14T23:09:00.789Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda9a6b9788af17e67c1052\",\"lastSeen\":1463267340830}","timestamp":"2016-05-14T23:09:00.830Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"78cfca0e01235db817728aec056d007672ffac63\",\"lastSeen\":1463267340871}","timestamp":"2016-05-14T23:09:00.871Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"3a3145039bf2dec51698fc6cfb3a4b68655523f2\",\"lastSeen\":1463267340945}","timestamp":"2016-05-14T23:09:00.945Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda9a6b9788af17e67c1052\",\"lastSeen\":1463267340986}","timestamp":"2016-05-14T23:09:00.986Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"78cfca0e01235db817728aec056d007672ffac63\",\"lastSeen\":1463267341054}","timestamp":"2016-05-14T23:09:01.054Z"}
{"level":"info","message":"received valid message from {\"protocol\":\"0.6.12\",\"address\":\"104.196.56.59\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda9a6b9788af17e67c1052\",\"lastSeen\":1463267341447}","timestamp":"2016-05-14T23:09:01.447Z"}
{"level":"info","message":"received valid message from {\"protocol\":\"0.6.12\",\"address\":\"104.196.56.59\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda9a6b9788af17e67c1052\",\"lastSeen\":1463267341560}","timestamp":"2016-05-14T23:09:01.560Z"}
{"level":"info","message":"received valid message from {\"protocol\":\"0.6.12\",\"address\":\"104.196.56.59\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda9a6b9788af17e67c1052\",\"lastSeen\":1463267341636}","timestamp":"2016-05-14T23:09:01.637Z"}
{"level":"info","message":"received valid message from {\"protocol\":\"0.6.12\",\"address\":\"104.196.56.59\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda9a6b9788af17e67c1052\",\"lastSeen\":1463267341732}","timestamp":"2016-05-14T23:09:01.733Z"}
{"level":"info","message":"received valid message from {\"protocol\":\"0.6.12\",\"address\":\"104.196.56.59\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda9a6b9788af17e67c1052\",\"lastSeen\":1463267341818}","timestamp":"2016-05-14T23:09:01.818Z"}
{"level":"info","message":"received valid message from {\"protocol\":\"0.6.12\",\"address\":\"104.196.56.59\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda9a6b9788af17e67c1052\",\"lastSeen\":1463267341901}","timestamp":"2016-05-14T23:09:01.901Z"}
{"level":"info","message":"received valid message from {\"protocol\":\"0.6.12\",\"address\":\"104.196.56.59\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda9a6b9788af17e67c1052\",\"lastSeen\":1463267341988}","timestamp":"2016-05-14T23:09:01.988Z"}
{"level":"info","message":"received valid message from {\"protocol\":\"0.6.12\",\"address\":\"104.196.56.59\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda9a6b9788af17e67c1052\",\"lastSeen\":1463267342063}","timestamp":"2016-05-14T23:09:02.063Z"}
{"level":"info","message":"received valid message from {\"protocol\":\"0.6.12\",\"address\":\"104.196.56.59\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda9a6b9788af17e67c1052\",\"lastSeen\":1463267342138}","timestamp":"2016-05-14T23:09:02.138Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"3a3145039bf2dec51698fc6cfb3a4b68655523f2\",\"lastSeen\":1463267342224}","timestamp":"2016-05-14T23:09:02.224Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda9a6b9788af17e67c1052\",\"lastSeen\":1463267342261}","timestamp":"2016-05-14T23:09:02.261Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"78cfca0e01235db817728aec056d007672ffac63\",\"lastSeen\":1463267342303}","timestamp":"2016-05-14T23:09:02.303Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"3a3145039bf2dec51698fc6cfb3a4b68655523f2\",\"lastSeen\":1463267342372}","timestamp":"2016-05-14T23:09:02.372Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda9a6b9788af17e67c1052\",\"lastSeen\":1463267342437}","timestamp":"2016-05-14T23:09:02.437Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"78cfca0e01235db817728aec056d007672ffac63\",\"lastSeen\":1463267342501}","timestamp":"2016-05-14T23:09:02.501Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"78cfca0e01235db817728aec056d007672ffac63\",\"lastSeen\":1463267342537}","timestamp":"2016-05-14T23:09:02.537Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"593844dc7f0076a1aeda9a6b9788af17e67c1052\",\"lastSeen\":1463267342572}","timestamp":"2016-05-14T23:09:02.572Z"}
{"level":"info","message":"sending FIND_NODE message to {\"protocol\":\"0.6.13\",\"address\":\"api.storj.io\",\"port\":8443,\"nodeID\":\"3a3145039bf2dec51698fc6cfb3a4b68655523f2\",\"lastSeen\":1463267342638}","timestamp":"2016-05-14T23:09:02.638Z"}
Interesting. Okay, this is a bug in kad, so let's open an issue there. If the original issue here is resolved, let's close it.
Actually, what might be happening is that each of those original seeds returns the others in its FIND_NODE responses, causing another FIND_NODE to each one.
The FIND_NODE lookup is iterative: it asks for nodes and then issues a FIND_NODE to the contacts returned from the original query.
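As a rough sketch of what an iterative lookup does (illustrative only, not kad's actual code; sendFindNode is a made-up helper):

function iterativeFindNode(target, startingContacts) {
  var queried = {};                          // nodeIDs asked in this lookup
  var shortlist = startingContacts.slice();  // contacts still worth querying

  (function queryNext() {
    shortlist.filter(function(contact) {
      return !queried[contact.nodeID];
    }).forEach(function(contact) {
      queried[contact.nodeID] = true;
      sendFindNode(contact, target, function(err, closer) {
        if (err) return;
        // contacts returned in the response (e.g. the other api.storj.io
        // nodes) are added to the shortlist and queried as well
        shortlist = shortlist.concat(closer);
        queryNext();
      });
    });
  })();
}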
I have a stupid question; don't hit me if I am wrong. This will call kad connect for each seed: https://github.com/Storj/core/blob/19c805115d1f7ad750f3c50354e208aacc1e8497/lib/network/index.js#L116 And this will send out a FIND_NODE to everyone: https://github.com/kadtools/kad/blob/0b000446e54217ca211a5f3e4c620834f0fb076b/lib/node.js#L116
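For context, a sketch of what that per-seed loop might look like; the actual code at the links above may differ, and Contact.fromURI here is just a placeholder for however core parses the seed URI:

self._options.seeds.forEach(function(uri) {
  // every configured seed gets its own kad connect(), and every connect()
  // runs its own FIND_NODE lookup, so the message count multiplies with
  // the number of seeds
  self._node.connect(Contact.fromURI(uri), function(err) {
    if (err) {
      self._logger.warn('failed to connect to seed ' + uri);
    }
  });
});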
I am sure it is a core issue: https://github.com/Storj/core/issues/98
Let's close this issue.
Workaround
I asked Slack user blackduck to change his config and use my node. He was connected in a few seconds and got shards. He changed his config back to api.storj.io and is still connected. I assume the same workaround will work for "unable to find tunnel connection" as well.
Here is a list of nodes with good uptime. Use one node from the list and start storjshare. Once you are connected, please stop your node and change your config back to the default node api.storj.io. Please don't forget that last step, or these nodes will go offline as well and the workaround will be useless.
"storj://api.storj.io:8443/593844dc7f0076a1aeda9a6b9788af17e67c1052"
"storj://littleskunk.myqnapcloud.com:4000/b408aeb67291e4d1994a95394e674c09eb683d3c"
"storj://cybaer.dyndns.org:4000/1037e361a7a5f0ad918f45a551a9ed85105b3845"
"storj://server046.storj.dk:5046/83ce8b82ead3a5c7daaca2996567f6bbdc83c669"
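For example, assuming the standard storjshare config file, the seeds entry would look something like this with one of the nodes above (pick any single one):

"seeds": [
  "storj://littleskunk.myqnapcloud.com:4000/b408aeb67291e4d1994a95394e674c09eb683d3c"
],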