jtgrassie / monero-pool

A Monero mining pool server written in C
BSD 3-Clause "New" or "Revised" License
344 stars 121 forks

how to run, always fails, did I miss something #47

Closed kamisama23 closed 3 years ago

kamisama23 commented 3 years ago

16:36:24 INFO src/pool.c:4382: Starting pool on: 0.0.0.0:3333
16:36:24 DEBUG src/pool.c:488: Database (used/free): 8192/5368700928
16:36:24 INFO src/webui.c:182: Starting Web UI on 0.0.0.0:3334
16:36:24 INFO src/pool.c:4165: Starting trusted listener on: 0.0.0.0:6666
16:36:24 INFO src/pool.c:4177: Starting upstream connection to: 0.0.0.0:8888
16:36:24 INFO src/pool.c:2248: Fetching last block header
16:36:24 TRACE src/pool.c:1650: Payload: {"jsonrpc":"2.0","id":"0","method":"get_last_block_header"}
16:36:24 TRACE src/pool.c:1650: Payload: {"jsonrpc":"2.0","id":"0","method":"query_key","params":{"key_type":"view_key"}}
16:36:24 DEBUG src/pool.c:488: Database (used/free): 8192/5368700928
16:36:24 TRACE src/pool.c:2336: Sending message ping upstream
16:36:24 DEBUG src/pool.c:2679: Upstream connection error: 111
16:36:24 WARN src/pool.c:2697: No connection to upstream; retrying in 10s
16:36:24 ERROR src/pool.c:1549: HTTP status code 0 for /json_rpc. Aborting.
16:36:24 ERROR src/pool.c:1549: HTTP status code 0 for /json_rpc. Aborting.
16:36:34 INFO src/pool.c:2739: Reconnecting to upstream: 0.0.0.0:8888
16:36:34 DEBUG src/pool.c:2679: Upstream connection error: 111
16:36:34 WARN src/pool.c:2697: No connection to upstream; retrying in 10s
16:36:44 INFO src/pool.c:2739: Reconnecting to upstream: 0.0.0.0:8888
16:36:44 DEBUG src/pool.c:2679: Upstream connection error: 111
16:36:44 WARN src/pool.c:2697: No connection to upstream; retrying in 10s
16:36:54 INFO src/pool.c:2739: Reconnecting to upstream: 0.0.0.0:8888
16:36:54 DEBUG src/pool.c:2679: Upstream connection error: 111
16:36:54 WARN src/pool.c:2697: No connection to upstream; retrying in 10s
16:37:04 INFO src/pool.c:2739: Reconnecting to upstream: 0.0.0.0:8888
16:37:04 DEBUG src/pool.c:2679: Upstream connection error: 111
16:37:04 WARN src/pool.c:2697: No connection to upstream; retrying in 10s
16:37:14 INFO src/pool.c:2739: Reconnecting to upstream: 0.0.0.0:8888
16:37:14 DEBUG src/pool.c:2679: Upstream connection error: 111
16:37:14 WARN src/pool.c:2697: No connection to upstream; retrying in 10s
16:37:24 INFO src/pool.c:2739: Reconnecting to upstream: 0.0.0.0:8888
16:37:24 DEBUG src/pool.c:2679: Upstream connection error: 111
16:37:24 WARN src/pool.c:2697: No connection to upstream; retrying in 10s
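For reference, "Upstream connection error: 111" is errno 111, which on Linux is ECONNREFUSED: the pool is trying to connect() to 0.0.0.0:8888 and nothing is accepting there. A minimal sketch of the same failure (assumes Linux errno numbering; it connects to a local port that is known to have no listener):

import errno
import socket

# Grab a port that nothing is listening on: bind to an ephemeral port,
# note its number, then close the socket so the port is free again.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
_, port = probe.getsockname()
probe.close()

# Connecting to a port with no listener fails exactly like the pool's
# upstream connection attempt: ECONNREFUSED, errno 111 on Linux.
try:
    socket.create_connection(("127.0.0.1", port), timeout=2)
except ConnectionRefusedError as e:
    assert e.errno == errno.ECONNREFUSED
    print("connect failed with errno", e.errno)

The same errno also explains the "HTTP status code 0 for /json_rpc" errors: the HTTP request never got a response because the TCP connection itself was refused.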

jtgrassie commented 3 years ago

Your config file is bad. At a minimum, you have not read interconnected-pools properly: your logs above show you have set upstream-host to 0.0.0.0, which is a bind address meaning "any interface", not an address another process can be connected to.

jtgrassie commented 3 years ago

If you are just running a single pool, you can comment out the trusted / upstream lines (as they are in the default conf file).
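For a single-pool setup, that part of the config would look something like this (upstream-host is the option named above and 8888 is the port from the logs; the IP shown is just a placeholder):

#upstream-host = 192.0.2.1
#upstream-port = 8888

With those lines commented out, the pool never starts the upstream connection loop, and the "Reconnecting to upstream" / "error: 111" messages go away.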

jtgrassie commented 3 years ago

Set the correct things in your config file:

rpc-host = 127.0.0.1
rpc-port = 28081
wallet-rpc-host = 127.0.0.1
wallet-rpc-port = 28084

so that they point to the ip:port on which your monerod and monero-wallet-rpc daemons are actually listening for RPC.

This should all be painfully obvious.
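Those rpc-host / rpc-port values can be sanity-checked with a small script (a sketch, not part of the pool: it POSTs the same get_last_block_header JSON-RPC call that appears in the logs above and returns None if the daemon is unreachable):

import json
import urllib.error
import urllib.request

def check_rpc(host, port):
    """POST the get_last_block_header call the pool makes on startup
    and return the parsed reply, or None if the daemon is unreachable."""
    payload = json.dumps(
        {"jsonrpc": "2.0", "id": "0", "method": "get_last_block_header"}
    ).encode()
    req = urllib.request.Request(
        "http://%s:%d/json_rpc" % (host, port),
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return json.load(resp)
    except (urllib.error.URLError, OSError):
        return None

if __name__ == "__main__":
    # Same values as the rpc-host / rpc-port lines above.
    if check_rpc("127.0.0.1", 28081) is None:
        print("monerod unreachable: check rpc-host / rpc-port")
    else:
        print("monerod RPC reachable")

The same check works for the wallet side by pointing it at the wallet-rpc-host / wallet-rpc-port values instead.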

jtgrassie commented 3 years ago

"I use ip, no domain name, when i type the ip address it just show error page" - again, config. You have to use the same IP and port as you set in the pool config. And of course make sure you bound to an accessible interface and that you don't have any firewall blocking it. This is all very basic stuff that has nothing to do with the pool. This is not a support channel but a bug/issue tracker.

kamisama23 commented 3 years ago


thanks for your reply, all done now!