stratum-mining / stratum

https://stratumprotocol.org

Improve flaky MG tests #807

Closed: pavlenex closed this 5 months ago

pavlenex commented 7 months ago

Placeholder, until someone more knowledgeable documents the specifics. But life shouldn't be spent waiting for MG tests to ✅, so we should ensure MG tests aren't flaky.


plebhash commented 7 months ago

this is a similar issue from a while ago: https://github.com/stratum-mining/stratum/issues/377

we should check if it is still relevant, and if not, close it

plebhash commented 6 months ago

I'm starting to investigate this. MG is quite non-deterministic, so I think a good first step is to log here the different error messages we get from it. I'll do that in the comments below.
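Since the failures only show up after many repetitions, one way to collect them systematically is a small driver that re-runs the suite until it fails and records which run broke. This is just a sketch: `run_once` below is a deterministic stand-in for `./message-generator-tests.sh` (it "fails" on the 5th run purely for illustration), not anything in the repo.

```shell
#!/bin/sh
# Re-run a flaky command until it fails, so the failing run's log can be
# captured. run_once is a stand-in for ./message-generator-tests.sh; here
# it deterministically "fails" on the 5th run for illustration.
run_once() {
  [ "$1" -lt 5 ]
}

run=1
while run_once "$run"; do
  run=$((run + 1))
done
echo "first failure on run $run"
```

In practice you would replace `run_once` with the real script invocation and tee its output to a per-run log file.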

plebhash commented 6 months ago

from CI at https://github.com/stratum-mining/stratum/commit/ecb9c76b87275a5d49dbd9eed9d383b8841241ad

it went away after re-running the test

EXECUTING ../../test/message-generator/test/bad-pool-config-test.json

STD ERR:     Finished dev [unoptimized + debuginfo] target(s) in 0.06s
STD ERR:      Running `target/debug/message_generator_sv2 ../../test/message-generator/mock/template-provider-mock0.json`
STD OUT: 
STD OUT: EXECUTING ../../test/message-generator/mock/template-provider-mock0.json
STD OUT: 
STD ERR: info: cargo-llvm-cov currently setting cfg(coverage); you can opt-out it by passing --no-cfg-coverage
...
STD ERR:     Finished dev [optimized + debuginfo] target(s) in 1m 25s
STD ERR:      Running `target/llvm-cov-target/debug/pool_sv2 -c ../test/config/pool-mock-tp-bad-coinbase.toml`
STD OUT: 2024-04-05T13:35:44.513419Z  INFO pool_sv2: Pool INITIALIZING with config: "../test/config/pool-mock-tp-bad-coinbase.toml"
STD OUT: 2024-04-05T13:35:44.513465Z ERROR pool_sv2: Failed to get coinbase output: InvalidOutputScript
STD ERR: info: cargo-llvm-cov currently setting cfg(coverage); you can opt-out it by passing --no-cfg-coverage
STD ERR:     Finished dev [optimized + debuginfo] target(s) in 0.11s
STD ERR:      Running `target/llvm-cov-target/debug/pool_sv2 -h -c ../test/config/pool-mock-tp.toml`
STD OUT: 2024-04-05T13:35:44.847786Z ERROR pool_sv2: Usage: -h/--help, -c/--config <path|default pool-config.toml>
STD ERR: info: cargo-llvm-cov currently setting cfg(coverage); you can opt-out it by passing --no-cfg-coverage
STD ERR:     Finished dev [optimized + debuginfo] target(s) in 0.11s
STD ERR:      Running `target/llvm-cov-target/debug/pool_sv2 -c ../test/config/pool-mock-tp-bad-config.toml`
STD OUT: 2024-04-05T13:35:45.182615Z ERROR pool_sv2: Failed to parse config: missing field `listen_address` at line 1 column 1
STD ERR: info: cargo-llvm-cov currently setting cfg(coverage); you can opt-out it by passing --no-cfg-coverage
STD ERR:     Finished dev [optimized + debuginfo] target(s) in 0.11s
STD ERR:      Running `target/llvm-cov-target/debug/pool_sv2 -c ../test/config/no-config-at-this-path.toml`
STD OUT: 2024-04-05T13:35:45.517442Z ERROR pool_sv2: Failed to read config: No such file or directory (os error 2)
STD ERR: info: cargo-llvm-cov currently setting cfg(coverage); you can opt-out it by passing --no-cfg-coverage
STD ERR:     Finished dev [optimized + debuginfo] target(s) in 0.11s
STD ERR:      Running `target/llvm-cov-target/debug/pool_sv2 -c ../test/config/pool-mock-tp.toml`
STD OUT: 2024-04-05T13:35:45.852768Z  INFO pool_sv2: Pool INITIALIZING with config: "../test/config/pool-mock-tp.toml"
STD OUT: 2024-04-05T13:35:45.853549Z  INFO pool_sv2::lib::template_receiver: Connected to template distribution server at 127.0.0.1:8442
STD OUT: 2024-04-05T13:35:45.853789Z DEBUG network_helpers_sv2::noise_connection_tokio: Initializing as downstream for - 127.0.0.1:8442
STD OUT: Initializing as upstream for - 127.0.0.1:44744
STD OUT: 2024-04-05T13:35:45.860531Z DEBUG network_helpers_sv2::noise_connection_tokio: Noise handshake complete - 127.0.0.1:8442
STD OUT: Noise handshake complete - 127.0.0.1:44744
STD OUT: actiondoc: "This action checks that a Setupconnection message is received"
STD OUT: Working on result 1/1: MatchMessageType: 0 (0x0)
thread 'main' panicked at src/main.rs:429:13:
TEST FAILED
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
mg test failed
plebhash commented 6 months ago

from my local setup at 96c593e0127b9fb59d2bbe32f8fb43ab5e4013f6

I actually ran ./message-generator-tests.sh multiple times and most of them were successful. But after many repetitions I got this error:

EXECUTING ../../test/message-generator/test/pool-sri-test-1-standard.json

STD ERR:     Finished dev [unoptimized + debuginfo] target(s) in 0.04s
STD ERR:      Running `target/debug/message_generator_sv2 ../../test/message-generator/mock/template-provider-mock0.json`
STD ERR:     Finished dev [unoptimized + debuginfo] target(s) in 0.05s
STD ERR:      Running `target/debug/message_generator_sv2 ../../../test/message-generator/mock/template-provider-mock0.json`
STD ERR:     Finished dev [optimized + debuginfo] target(s) in 0.05s
STD ERR:      Running `target/llvm-cov-target/debug/pool_sv2 -c ../test/config/pool-mock-tp.toml`
STD OUT: 2024-04-05T20:21:11.438508Z  INFO pool_sv2: Pool INITIALIZING with config: "../test/config/pool-mock-tp.toml"
STD OUT: 2024-04-05T20:21:11.438904Z ERROR pool_sv2: Could not connect to Template Provider: I/O error: `Os { code: 61, kind: ConnectionRefused, message: "Connection refused" }
STD OUT: 
STD OUT: EXECUTING ../../../test/message-generator/mock/template-provider-mock0.json
STD OUT: 
STD ERR: thread 'main' panicked at src/main.rs:389:16:
STD ERR: file not found in load_str!("../../../../test/message-generator/mock/template-provider-mock0.json") (resolved to: "src/../../../../test/message-generator/mock/template-provider-mock0.json")
STD ERR: note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
STD OUT: 
STD OUT: EXECUTING ../../test/message-generator/mock/template-provider-mock0.json
STD OUT: 
STD ERR:     Finished dev [unoptimized + debuginfo] target(s) in 0.05s
STD ERR:      Running `target/debug/message_generator_sv2 ../../test/message-generator/mock/job-declarator-mock.json`
thread 'main' panicked at src/main.rs:429:13:
TEST FAILED
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
mg test failed
plebhash commented 6 months ago

from my local setup at 96c593e0127b9fb59d2bbe32f8fb43ab5e4013f6

I also ran ./message-generator-tests.sh multiple times and most of them were successful. But after many repetitions I got this error:

EXECUTING ../../test/message-generator/test/standard-coverage-test.json

STD ERR:     Finished dev [unoptimized + debuginfo] target(s) in 0.04s
STD ERR:      Running `target/debug/message_generator_sv2 ../../test/message-generator/mock/template-provider-mock1.json`
STD ERR:     Finished dev [unoptimized + debuginfo] target(s) in 0.04s
STD ERR:      Running `target/debug/message_generator_sv2 ../../../test/message-generator/mock/template-provider-mock1.json`
STD ERR:     Finished dev [optimized + debuginfo] target(s) in 0.05s
STD ERR:      Running `target/llvm-cov-target/debug/pool_sv2 -c ../test/config/pool-mock-tp-standard-coverage.toml`
STD OUT: 2024-04-05T20:58:33.382705Z  INFO pool_sv2: Pool INITIALIZING with config: "../test/config/pool-mock-tp-standard-coverage.toml"
STD OUT: 2024-04-05T20:58:33.383092Z ERROR pool_sv2: Could not connect to Template Provider: I/O error: `Os { code: 61, kind: ConnectionRefused, message: "Connection refused" }
thread 'main' panicked at src/main.rs:429:13:
TEST FAILED
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
mg test failed
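This failure and the `pool-sri-test-1-standard` one above both die on `Connection refused`, which looks like a startup race: the pool tries to reach the Template Provider mock before the mock's listener is bound. One generic mitigation (a sketch, not something the test scripts currently do; the helper name, host, port, and retry budget are all illustrative) is to poll the port before launching the dependent role. Uses bash's `/dev/tcp` redirection:

```shell
#!/bin/bash
# Sketch: poll a TCP port until it accepts connections, so a role is only
# started once its mock peer is actually listening.
wait_for_port() {
  local host=$1 port=$2 tries=${3:-50} n=0
  until (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; do
    n=$((n + 1))
    if [ "$n" -ge "$tries" ]; then
      return 1  # gave up: nothing is listening
    fi
    sleep 0.1
  done
  return 0
}
```

For example, `wait_for_port 127.0.0.1 8442 || exit 1` before starting `pool_sv2` would rule out this race against a TP mock on 8442, at the cost of depending on bash rather than POSIX sh.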
plebhash commented 6 months ago

from my local setup at 96c593e0127b9fb59d2bbe32f8fb43ab5e4013f6

I also ran ./message-generator-tests.sh multiple times and most of them were successful. But after many repetitions I got this error:

EXECUTING ../../test/message-generator/test/translation-proxy.json

STD ERR:     Finished dev [optimized + debuginfo] target(s) in 0.06s
STD ERR:      Running `target/debug/translator_sv2 -c ../test/config/tproxy-config-no-jd-sv1-cpu-md.toml`
STD OUT: 2024-04-05T21:13:40.240804Z  INFO translator_sv2: PC: ProxyConfig { upstream_address: "127.0.0.1", upstream_port: 34254, upstream_authority_pubkey: Secp256k1PublicKey(XOnlyPublicKey(e76c2b09eed7baa394dbb794896e913c86a5f719ea803bc0a4aaa104383cee24ac5b32268edbcc58d105534c281f112f5e7a5c1ff0e2d113bd938dc7698e2cce)), downstream_address: "0.0.0.0", downstream_port: 34255, max_supported_version: 2, min_supported_version: 2, min_extranonce2_size: 8, downstream_difficulty_config: DownstreamDifficultyConfig { min_individual_miner_hashrate: 100000.0, shares_per_minute: 100.0, submits_since_last_update: 0, timestamp_of_last_update: 0 }, upstream_difficulty_config: UpstreamDifficultyConfig { channel_diff_update_interval: 60, channel_nominal_hashrate: 500.0, timestamp_of_last_update: 0, should_aggregate: false } }
STD OUT: 2024-04-05T21:13:40.241461Z  INFO translator_sv2::lib::upstream_sv2::upstream: PROXY SERVER - ACCEPTING FROM UPSTREAM: 127.0.0.1:34254
Initializing as upstream for - 127.0.0.1:59456
STD OUT: 2024-04-05T21:13:40.241860Z DEBUG network_helpers_sv2::noise_connection_async_std: Initializing as downstream for - 127.0.0.1:34254
Noise handshake complete - 127.0.0.1:59456
actiondoc: "This action checks that a Setupconnection message is received"
Working on result 1/1: GetMessageField: "CommonMessages" [SaveField { field_name: "flags", keyword: "setup_connection_success_flag" }, SaveField { field_name: "min_version", keyword: "setup_connection_success_used_version" }]
STD OUT: 2024-04-05T21:13:40.245401Z DEBUG network_helpers_sv2::noise_connection_async_std: Noise handshake complete - 127.0.0.1:34254
STD OUT: 2024-04-05T21:13:40.245420Z DEBUG translator_sv2: Starting up signal listener
STD OUT: 2024-04-05T21:13:40.245422Z DEBUG translator_sv2: Starting up status listener
RECV Sv2Frame {
    header: Header {
        extension_type: 0,
        msg_type: 0,
        msg_length: U24(
            23,
        ),
    },
    payload: None,
    serialized: Some(
        Slice {
            offset: 0x0000000140058000,
            len: 29,
            index: 1,
            shared_state: SharedState(
                128,
            ),
            owned: None,
        },
    ),
}
SEND Common(
    SetupConnectionSuccess(
        SetupConnectionSuccess {
            used_version: 2,
            flags: 0,
        },
    ),
)
actiondoc: "Checks that OpenExtendedMiningChannel is received"
Working on result 1/1: GetMessageField: "MiningProtocol" [SaveField { field_name: "request_id", keyword: "open_extended_mining_channel_success_request_id" }, SaveField { field_name: "max_target", keyword: "open_extended_mining_channel_success_target" }]
STD OUT: 2024-04-05T21:13:40.245955Z  INFO roles_logic_sv2::handlers::common: Received SetupConnectionSuccess: version=2, flags=0
STD OUT: 2024-04-05T21:13:40.246005Z  INFO translator_sv2: Connected to Upstream!
STD OUT: 2024-04-05T21:13:40.246026Z DEBUG translator_sv2: Finished starting upstream listener
RECV Sv2Frame {
    header: Header {
        extension_type: 0,
        msg_type: 19,
        msg_length: U24(
            46,
        ),
    },
    payload: None,
    serialized: Some(
        Slice {
            offset: 0x0000000140058000,
            len: 52,
            index: 1,
            shared_state: SharedState(
                128,
            ),
            owned: None,
        },
    ),
}
actiondoc: "This action responds with OpenExtendedMiningChannelSuccess"
SEND Mining(
    OpenExtendedMiningChannelSuccess(
        OpenExtendedMiningChannelSuccess {
            request_id: 0,
            channel_id: 213491456,
            target: U256(
                Owned(
                    [
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        255,
                        255,
                        255,
                        255,
                        255,
                        255,
                        255,
                        255,
                    ],
                ),
            ),
            extranonce_size: 16,
            extranonce_prefix: B032(
                Owned(
                    [
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        1,
                    ],
                ),
            ),
        },
    ),
)
actiondoc: "Sends NewExtendedMiningJob and SetNewPrevHash and waits that a SubmitsShareExtended is submitted"
SEND Mining(
    NewExtendedMiningJob(
        NewExtendedMiningJob {
            channel_id: 0,
            job_id: 1,
            min_ntime: Sv2Option {
                seq: None,
                data: Some(
                    [],
                ),
            },
            version: 536870912,
            version_rolling_allowed: true,
            merkle_path: Seq0255 {
                seq: None,
                data: Some(
                    [],
                ),
            },
            coinbase_tx_prefix: B064K(
                Owned(
                    [
                        2,
                        0,
                        0,
                        0,
                        0,
                        1,
                        1,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        255,
                        255,
                        255,
                        255,
                        36,
                        2,
                        3,
                        15,
                        0,
                    ],
                ),
            ),
            coinbase_tx_suffix: B064K(
                Owned(
                    [
                        255,
                        255,
                        255,
                        255,
                        2,
                        149,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        67,
                        65,
                        4,
                        70,
                        109,
                        127,
                        202,
                        229,
                        99,
                        229,
                        203,
                        9,
                        160,
                        209,
                        135,
                        11,
                        181,
                        128,
                        52,
                        72,
                        4,
                        97,
                        120,
                        121,
                        161,
                        73,
                        73,
                        207,
                        34,
                        40,
                        95,
                        27,
                        174,
                        63,
                        39,
                        103,
                        40,
                        23,
                        108,
                        60,
                        100,
                        49,
                        248,
                        238,
                        218,
                        69,
                        56,
                        220,
                        55,
                        200,
                        101,
                        226,
                        120,
                        79,
                        58,
                        158,
                        119,
                        208,
                        68,
                        243,
                        62,
                        64,
                        119,
                        151,
                        225,
                        39,
                        138,
                        172,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        38,
                        106,
                        36,
                        170,
                        33,
                        169,
                        237,
                        226,
                        246,
                        28,
                        63,
                        113,
                        209,
                        222,
                        253,
                        63,
                        169,
                        153,
                        223,
                        163,
                        105,
                        83,
                        117,
                        92,
                        105,
                        6,
                        137,
                        121,
                        153,
                        98,
                        180,
                        139,
                        235,
                        216,
                        54,
                        151,
                        78,
                        140,
                        249,
                        1,
                        32,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                        0,
                    ],
                ),
            ),
        },
    ),
)
SEND Mining(
    SetNewPrevHash(
        SetNewPrevHash {
            channel_id: 0,
            job_id: 1,
            prev_hash: U256(
                Owned(
                    [
                        91,
                        30,
                        84,
                        205,
                        18,
                        124,
                        218,
                        102,
                        28,
                        163,
                        155,
                        204,
                        173,
                        55,
                        119,
                        61,
                        224,
                        199,
                        68,
                        229,
                        144,
                        22,
                        92,
                        0,
                        53,
                        44,
                        15,
                        204,
                        200,
                        245,
                        149,
                        0,
                    ],
                ),
            ),
            min_ntime: 1679128496,
            nbits: 545259519,
        },
    ),
)
Working on result 1/1: MatchMessageType: 27 (0x1b)
STD OUT: 2024-04-05T21:13:40.246907Z  INFO roles_logic_sv2::handlers::mining: Received OpenExtendedMiningChannelSuccess with request id: 0 and channel id: 213491456
STD OUT: 2024-04-05T21:13:40.246914Z DEBUG roles_logic_sv2::handlers::mining: OpenStandardMiningChannelSuccess: OpenExtendedMiningChannelSuccess { request_id: 0, channel_id: 213491456, target: Ref([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 255, 255, 255, 255, 255, 255, 255]), extranonce_size: 16, extranonce_prefix: Ref([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1]) }
STD OUT: 2024-04-05T21:13:40.246937Z  INFO translator_sv2::lib::upstream_sv2::upstream: Up: Successfully Opened Extended Mining Channel
STD OUT: 2024-04-05T21:13:40.246976Z  INFO roles_logic_sv2::handlers::mining: Received new extended mining job for channel id: 0 with job id: 1 is_future: true
STD OUT: 2024-04-05T21:13:40.246980Z DEBUG roles_logic_sv2::handlers::mining: NewExtendedMiningJob: NewExtendedMiningJob { channel_id: 0, job_id: 1, min_ntime: Sv2Option([], PhantomData<&u32>), version: 536870912, version_rolling_allowed: true, merkle_path: Seq0255([], PhantomData<&binary_codec_sv2::datatypes::non_copy_data_types::inner::Inner<true, 32, 0, 0>>), coinbase_tx_prefix: Ref([2, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 255, 255, 255, 36, 2, 3, 15, 0]), coinbase_tx_suffix: Ref([255, 255, 255, 255, 2, 149, 0, 0, 0, 0, 0, 0, 0, 67, 65, 4, 70, 109, 127, 202, 229, 99, 229, 203, 9, 160, 209, 135, 11, 181, 128, 52, 72, 4, 97, 120, 121, 161, 73, 73, 207, 34, 40, 95, 27, 174, 63, 39, 103, 40, 23, 108, 60, 100, 49, 248, 238, 218, 69, 56, 220, 55, 200, 101, 226, 120, 79, 58, 158, 119, 208, 68, 243, 62, 64, 119, 151, 225, 39, 138, 172, 0, 0, 0, 0, 0, 0, 0, 0, 38, 106, 36, 170, 33, 169, 237, 226, 246, 28, 63, 113, 209, 222, 253, 63, 169, 153, 223, 163, 105, 83, 117, 92, 105, 6, 137, 121, 153, 98, 180, 139, 235, 216, 54, 151, 78, 140, 249, 1, 32, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]) }
STD OUT: 2024-04-05T21:13:40.246988Z DEBUG translator_sv2::lib::proxy::bridge: Starting handle_new_prev_hash task
STD OUT: 2024-04-05T21:13:40.246992Z  INFO roles_logic_sv2::handlers::mining: Received SetNewPrevHash channel id: 0, job id: 1
STD OUT: 2024-04-05T21:13:40.246995Z DEBUG roles_logic_sv2::handlers::mining: SetNewPrevHash: SetNewPrevHash { channel_id: 0, job_id: 1, prev_hash: Ref([91, 30, 84, 205, 18, 124, 218, 102, 28, 163, 155, 204, 173, 55, 119, 61, 224, 199, 68, 229, 144, 22, 92, 0, 53, 44, 15, 204, 200, 245, 149, 0]), min_ntime: 1679128496, nbits: 545259519 }
STD OUT: 2024-04-05T21:13:40.246999Z DEBUG translator_sv2::lib::proxy::bridge: Starting handle_new_extended_mining_job task
STD OUT: 2024-04-05T21:13:40.247000Z DEBUG translator_sv2::lib::proxy::bridge: handle_new_prev_hash job_id: 1
STD OUT: 2024-04-05T21:13:40.247032Z DEBUG translator_sv2::lib::proxy::bridge: handle_new_extended_mining_job job_id: 1
STD OUT: 2024-04-05T21:13:40.247053Z DEBUG translator_sv2::lib::proxy::next_mining_notify: 
STD OUT: NextMiningNotify: Notify { job_id: "1", prev_hash: PrevHash(Owned([91, 30, 84, 205, 18, 124, 218, 102, 28, 163, 155, 204, 173, 55, 119, 61, 224, 199, 68, 229, 144, 22, 92, 0, 53, 44, 15, 204, 200, 245, 149, 0])), coin_base1: HexBytes([2, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 255, 255, 255, 255, 36, 2, 3, 15, 0]), coin_base2: HexBytes([255, 255, 255, 255, 2, 149, 0, 0, 0, 0, 0, 0, 0, 67, 65, 4, 70, 109, 127, 202, 229, 99, 229, 203, 9, 160, 209, 135, 11, 181, 128, 52, 72, 4, 97, 120, 121, 161, 73, 73, 207, 34, 40, 95, 27, 174, 63, 39, 103, 40, 23, 108, 60, 100, 49, 248, 238, 218, 69, 56, 220, 55, 200, 101, 226, 120, 79, 58, 158, 119, 208, 68, 243, 62, 64, 119, 151, 225, 39, 138, 172, 0, 0, 0, 0, 0, 0, 0, 0, 38, 106, 36, 170, 33, 169, 237, 226, 246, 28, 63, 113, 209, 222, 253, 63, 169, 153, 223, 163, 105, 83, 117, 92, 105, 6, 137, 121, 153, 98, 180, 139, 235, 216, 54, 151, 78, 140, 249, 0, 0, 0, 0]), merkle_branch: [], version: HexU32Be(536870912), bits: HexU32Be(545259519), time: HexU32Be(1679128496), clean_jobs: true }
STD OUT: 
STD OUT: 2024-04-05T21:13:40.495081Z  INFO translator_sv2::lib::downstream_sv1::downstream: PROXY SERVER - ACCEPTING FROM DOWNSTREAM: 127.0.0.1:59457
STD OUT: 2024-04-05T21:13:40.495234Z DEBUG translator_sv2::lib::downstream_sv1::downstream: Receiving from Mining Device 127.0.0.1:59457: "{\"id\":1712351620,\"method\":\"mining.configure\",\"params\":[[],{}]}"
STD OUT: 2024-04-05T21:13:40.495313Z DEBUG sv1_api: Configure { extensions: [], id: 1712351620 }
STD OUT: 2024-04-05T21:13:40.495316Z  INFO translator_sv2::lib::downstream_sv1::downstream: Down: Configuring
STD OUT: 2024-04-05T21:13:40.495317Z DEBUG translator_sv2::lib::downstream_sv1::downstream: Down: Handling mining.configure: Configure { extensions: [], id: 1712351620 }
STD OUT: 2024-04-05T21:13:40.495319Z DEBUG translator_sv2::lib::downstream_sv1::downstream: Negotiated version_rolling_mask is None
STD OUT: 2024-04-05T21:13:40.495323Z DEBUG translator_sv2::lib::downstream_sv1::downstream: To DOWN: OkResponse(Response { id: 1712351620, error: None, result: Object {"minimum-difficulty": Bool(false), "version-rolling": Bool(true), "version-rolling.mask": String("00000000"), "version-rolling.min-bit-count": String("00000000")} })
STD OUT: 2024-04-05T21:13:40.495339Z DEBUG translator_sv2::lib::downstream_sv1::downstream: Sending to Mining Device: 127.0.0.1:59457 - "{\"id\":1712351620,\"error\":null,\"result\":{\"minimum-difficulty\":false,\"version-rolling\":true,\"version-rolling.mask\":\"00000000\",\"version-rolling.min-bit-count\":\"00000000\"}}\n"
STD OUT: 2024-04-05T21:13:40.495491Z DEBUG translator_sv2::lib::downstream_sv1::downstream: Receiving from Mining Device 127.0.0.1:59457: "{\"id\":1712351620,\"method\":\"mining.subscribe\",\"params\":[\"80\",\"08000002\"]}"
STD OUT: 2024-04-05T21:13:40.495500Z  INFO translator_sv2::lib::downstream_sv1::downstream: Down: Subscribing
STD OUT: 2024-04-05T21:13:40.495502Z DEBUG translator_sv2::lib::downstream_sv1::downstream: Down: Handling mining.subscribe: Subscribe { id: 1712351620, agent_signature: "80", extranonce1: Some(Extranonce(Owned([8, 0, 0, 2]))) }
STD OUT: 2024-04-05T21:13:40.495508Z DEBUG translator_sv2::lib::downstream_sv1::downstream: To DOWN: OkResponse(Response { id: 1712351620, error: None, result: Array [Array [Array [String("mining.set_difficulty"), String("ae6812eb4cd7735a302a8a9dd95cf71f")], Array [String("mining.notify"), String("ae6812eb4cd7735a302a8a9dd95cf71f")]], String("000000000000000000000000000000010000000000000001"), Number(8)] })
STD OUT: 2024-04-05T21:13:40.495519Z DEBUG translator_sv2::lib::downstream_sv1::downstream: Sending to Mining Device: 127.0.0.1:59457 - "{\"id\":1712351620,\"error\":null,\"result\":[[[\"mining.set_difficulty\",\"ae6812eb4cd7735a302a8a9dd95cf71f\"],[\"mining.notify\",\"ae6812eb4cd7735a302a8a9dd95cf71f\"]],\"000000000000000000000000000000010000000000000001\",8]}\n"
STD OUT: 2024-04-05T21:13:40.495634Z DEBUG translator_sv2::lib::downstream_sv1::downstream: Receiving from Mining Device 127.0.0.1:59457: "{\"id\":1712351620,\"method\":\"mining.authorize\",\"params\":[\"user\",\"password\"]}"
STD OUT: 2024-04-05T21:13:40.495640Z  INFO translator_sv2::lib::downstream_sv1::downstream: Down: Authorizing
STD OUT: 2024-04-05T21:13:40.495641Z DEBUG translator_sv2::lib::downstream_sv1::downstream: Down: Handling mining.authorize: Authorize { id: 1712351620, name: "user", password: "password" }
STD OUT: 2024-04-05T21:13:40.495644Z DEBUG translator_sv2::lib::downstream_sv1::downstream: To DOWN: OkResponse(Response { id: 1712351620, error: None, result: Bool(true) })
STD OUT: 2024-04-05T21:13:40.495650Z DEBUG translator_sv2::lib::downstream_sv1::downstream: Sending to Mining Device: 127.0.0.1:59457 - "{\"id\":1712351620,\"error\":null,\"result\":true}\n"
STD OUT: 2024-04-05T21:13:41.496250Z DEBUG translator_sv2::lib::downstream_sv1::downstream: To Bridge: SetDownstreamTarget(SetDownstreamTarget { channel_id: 1, new_target: Target { head: 6555997669382258025093932537643106821, tail: 5671278260711295869458419150210300 } })
STD OUT: 2024-04-05T21:13:41.496281Z DEBUG translator_sv2::lib::downstream_sv1::diff_management: Target: [0, 1, 23, 157, 152, 97, 167, 97, 255, 218, 221, 17, 195, 7, 196, 252, 4, 238, 163, 164, 24, 247, 214, 135, 88, 78, 68, 52, 175, 21, 130, 5]
STD OUT: 2024-04-05T21:13:41.496285Z DEBUG translator_sv2::lib::downstream_sv1::diff_management: Difficulty from target: 1.3970187619619732e-5
STD OUT: 2024-04-05T21:13:41.496292Z DEBUG translator_sv2::lib::downstream_sv1::downstream: To DOWN: Notification(Notification { method: "mining.set_difficulty", params: Array [Number(0.000013970187619619732)] })
STD OUT: 2024-04-05T21:13:41.496315Z DEBUG translator_sv2::lib::downstream_sv1::downstream: To DOWN: Notification(Notification { method: "mining.notify", params: Array [String("1"), String("cd541e5b66da7c12cc9ba31c3d7737ade544c7e0005c1690cc0f2c350095f5c8"), String("02000000010000000000000000000000000000000000000000000000000000000000000000ffffffff2402030f00"), String("ffffffff029500000000000000434104466d7fcae563e5cb09a0d1870bb580344804617879a14949cf22285f1bae3f276728176c3c6431f8eeda4538dc37c865e2784f3a9e77d044f33e407797e1278aac0000000000000000266a24aa21a9ede2f61c3f71d1defd3fa999dfa36953755c690689799962b48bebd836974e8cf900000000"), Array [], String("20000000"), String("207fffff"), String("641577b0"), Bool(true)] })
STD OUT: 2024-04-05T21:13:41.496339Z DEBUG translator_sv2::lib::downstream_sv1::downstream: Sending to Mining Device: 127.0.0.1:59457 - "{\"method\":\"mining.set_difficulty\",\"params\":[0.000013970187619619732]}\n"
STD OUT: 2024-04-05T21:13:41.496395Z DEBUG translator_sv2::lib::downstream_sv1::downstream: Sending to Mining Device: 127.0.0.1:59457 - "{\"method\":\"mining.notify\",\"params\":[\"1\",\"cd541e5b66da7c12cc9ba31c3d7737ade544c7e0005c1690cc0f2c350095f5c8\",\"02000000010000000000000000000000000000000000000000000000000000000000000000ffffffff2402030f00\",\"ffffffff029500000000000000434104466d7fcae563e5cb09a0d1870bb580344804617879a14949cf22285f1bae3f276728176c3c6431f8eeda4538dc37c865e2784f3a9e77d044f33e407797e1278aac0000000000000000266a24aa21a9ede2f61c3f71d1defd3fa999dfa36953755c690689799962b48bebd836974e8cf900000000\",[],\"20000000\",\"207fffff\",\"641577b0\",true]}\n"
STD OUT: 2024-04-05T21:13:41.499698Z DEBUG translator_sv2::lib::downstream_sv1::downstream: Receiving from Mining Device 127.0.0.1:59457: "{\"id\":0,\"method\":\"mining.submit\",\"params\":[\"user\",\"1\",\"0000000000000000\",\"661069c1\",\"00000feb\"]}"
STD OUT: 2024-04-05T21:13:41.499728Z  INFO translator_sv2::lib::downstream_sv1::downstream: Down: Submitting Share
STD OUT: 2024-04-05T21:13:41.499730Z DEBUG translator_sv2::lib::downstream_sv1::downstream: Down: Handling mining.submit: Submit { user_name: "user", job_id: "1", extra_nonce2: Extranonce(Owned([0, 0, 0, 0, 0, 0, 0, 0])), time: HexU32Be(1712351681), nonce: HexU32Be(4075), version_bits: None, id: 0 }
STD OUT: 2024-04-05T21:13:41.499738Z DEBUG translator_sv2::lib::downstream_sv1::downstream: To DOWN: OkResponse(Response { id: 0, error: None, result: Bool(true) })
STD OUT: 2024-04-05T21:13:41.499750Z DEBUG translator_sv2::lib::downstream_sv1::downstream: Sending to Mining Device: 127.0.0.1:59457 - "{\"id\":0,\"error\":null,\"result\":true}\n"
STD OUT: 2024-04-05T21:13:41.499792Z DEBUG roles_logic_sv2::channel_logic::channel_factory: Checking target for share Extended(SubmitSharesExtended { channel_id: 1, sequence_number: 0, job_id: 1, nonce: 4075, ntime: 1712351681, version: 536870912, extranonce: Owned([0, 0, 0, 0, 0, 0, 0, 0]) })
STD OUT: 2024-04-05T21:13:41.499816Z DEBUG roles_logic_sv2::channel_logic::channel_factory: Bitcoin target: Target { head: 0, tail: 0 }
STD OUT: 2024-04-05T21:13:41.499820Z DEBUG roles_logic_sv2::channel_logic::channel_factory: Upstream target: "000002cbd3e84cb8c95196d545017a21b956ff20c06c583a8aaebeee82979711"
STD OUT: 2024-04-05T21:13:41.499822Z DEBUG roles_logic_sv2::channel_logic::channel_factory: Hash: "00009b85f49babceeb78e7fbc7f133542a432bd5f02889b615d30ac1b9a3edfd"
STD OUT: 2024-04-05T21:13:41.499826Z DEBUG translator_sv2::lib::proxy::bridge: SHARE MEETS DOWNSTREAM TARGET
...
STD OUT: 2024-04-05T21:13:50.223597Z DEBUG translator_sv2::lib::downstream_sv1::downstream: Sending to Mining Device: 127.0.0.1:59457 - "{\"id\":0,\"error\":null,\"result\":true}\n"
RECV Sv2Frame {
    header: Header {
        extension_type: 32768,
        msg_type: 22,
        msg_length: U24(
            40,
        ),
    },
    payload: None,
    serialized: Some(
        Slice {
            offset: 0x0000000140058000,
            len: 46,
            index: 1,
            shared_state: SharedState(
                128,
            ),
            owned: None,
        },
    ),
}
WRONG MESSAGE TYPE expected: 27 received: 22
Disconnected from client while reading : early eof - 127.0.0.1:59456
thread 'main' panicked at src/main.rs:429:13:
TEST FAILED
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
mg test failed
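The panic above comes from a strict, order-sensitive check: the harness expected message type 27 but received 22 first. A plausible source of this kind of flakiness is non-deterministic message ordering between the roles. The sketch below is hypothetical (not the actual `message_generator_sv2` code, and the function names are illustrative): it contrasts a strict check like the one that panicked with an order-tolerant variant that accepts interleaved frame types as long as the expected one eventually arrives.

```rust
// Hypothetical sketch, not the actual MG source. Message type numbers
// (expected 27, received 22) are taken from the log above.

/// Strict check: fails as soon as an unexpected message type arrives,
/// like the "WRONG MESSAGE TYPE" panic in the log.
fn strict_check(expected: u8, received: u8) -> Result<(), String> {
    if received == expected {
        Ok(())
    } else {
        Err(format!(
            "WRONG MESSAGE TYPE expected: {} received: {}",
            expected, received
        ))
    }
}

/// Order-tolerant check: scan the received frame types and succeed if the
/// expected type shows up anywhere, tolerating interleaved frames.
fn tolerant_check(expected: u8, received_types: &[u8]) -> Result<(), String> {
    if received_types.iter().any(|&t| t == expected) {
        Ok(())
    } else {
        Err(format!("message type {} never arrived", expected))
    }
}

fn main() {
    // The strict check fails on the out-of-order frame from the log (22 before 27).
    assert!(strict_check(27, 22).is_err());
    // A tolerant check passes because 27 does eventually arrive.
    assert!(tolerant_check(27, &[22, 27]).is_ok());
    // It still fails if the expected message truly never arrives.
    assert!(tolerant_check(27, &[22]).is_err());
}
```

Whether tolerating reordering is acceptable depends on the test: some MG scenarios deliberately assert strict ordering, so this would only be a fix where the order genuinely isn't part of the contract being tested.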
Fi3 commented 6 months ago

@plebhash Good idea. The next step should be to add to the reports (maybe editing them so we don't lose context) the most likely cause of each error, a description of it, and, when possible, a way to reproduce it.

Fi3 commented 6 months ago

@pavlenex Two things that will make us wait less: (1) split the MG tests into single units so they can be re-run independently when they fail; (2) use the GitHub cache for `target` so that we don't recompile everything from scratch every time. These two things are pretty easy to do and will improve the UX A LOT.
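Point (2) could be sketched in the workflow roughly like this, e.g. with the `Swatinem/rust-cache` action (the job name and test script path here are placeholders, not the repo's actual CI config):

```yaml
# Hypothetical sketch of point (2): cache the cargo registry and target
# directory so CI does not rebuild everything from scratch on every run.
jobs:
  mg-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Caches ~/.cargo and ./target, keyed on the lockfile and toolchain.
      - uses: Swatinem/rust-cache@v2
      - name: Run MG tests
        run: bash ./run-mg-tests.sh   # placeholder script name
```

Splitting the tests into separate jobs (point 1) would additionally let a single failed scenario be re-run without repeating the whole suite.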

Fi3 commented 6 months ago

Another solution, which I like less, is to not require them on dev and to fix the tests only before merging into main. That way we wait only once instead of several times, since multiple PRs into dev become a single merge into main.

plebhash commented 6 months ago

Another solution, which I like less, is to not require them on dev and to fix the tests only before merging into main. That way we wait only once instead of several times, since multiple PRs into dev become a single merge into main.

concept NACK

I think this one has potential to create chaos and confusion. If a PR to dev is breaking MG, it's better that we catch it early and fix it ASAP.

If we only see that MG is broken when merging from dev to main, the debugging process will be much harder, because we won't know which change introduced the bug.

Debugging MG is already quite complex, we should avoid introducing more complexity.

plebhash commented 6 months ago

@plebhash Good idea. The next step should be to add to the reports (maybe editing them so we don't lose context) the most likely cause of each error, a description of it, and, when possible, a way to reproduce it.

Reports are not possible, because mg_coverage.xml is only generated after a successful execution of MG.

If it breaks, no report is generated.

Fi3 commented 6 months ago

@plebhash Good idea. The next step should be to add to the reports (maybe editing them so we don't lose context) the most likely cause of each error, a description of it, and, when possible, a way to reproduce it.

Reports are not possible, because mg_coverage.xml is only generated after a successful execution of MG.

If it breaks, no report is generated.

I don't get your point? Maybe you misunderstood; I meant editing your messages above.

plebhash commented 6 months ago

@plebhash Good idea. The next step should be to add to the reports (maybe editing them so we don't lose context) the most likely cause of each error, a description of it, and, when possible, a way to reproduce it.

Reports are not possible, because mg_coverage.xml is only generated after a successful execution of MG. If it breaks, no report is generated.

I don't get your point? Maybe you misunderstood; I meant editing your messages above.

Yes, I guess I misunderstood. I thought that by "report" you meant mg_coverage.xml, which is the report file MG generates after every successful run.

For the logs I pasted above, I already cleaned them up following these criteria:

pavlenex commented 5 months ago

I believe this one should be fixed (hopefully by #905). If MG tests keep being unreliable, we should log a new, more specific issue.