tarantool / tarantool-qa

QA-related issues of Tarantool

[2.11] integration between tarantool and cartridge is broken #315

Closed by ylobankov 1 year ago

ylobankov commented 1 year ago

Since commit https://github.com/tarantool/tarantool/commit/9749b8af070240f11681feadc1d2edbf3883e873, the integration between Tarantool and Cartridge has been broken, as the following failed test shows:
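
The assertion that fires at rejoin_test.lua:55 is luatest's `assert_error` (see the traceback below): the test expects the wrapped call to raise, but it now returns normally. A minimal, self-contained sketch of that failure mode, with a hypothetical function body (the actual code at line 55 is not reproduced here):

```lua
-- Minimal luatest sketch: only the assert_error() semantics are taken
-- from the traceback below; the test body is hypothetical.
local t = require('luatest')
local g = t.group('rejoin_sketch')

g.test_rebootstrap = function()
    -- assert_error() passes only if the wrapped function raises.
    -- If it returns normally, luatest reports "Expected an error when
    -- calling function but no error generated", as in this issue.
    t.assert_error(function()
        -- e.g. a rebootstrap step that raised before the commit above
    end)
end
```

Running this sketch as-is fails with exactly that message, since the wrapped function never raises.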

Failed tests:
-------------

1) integration.rejoin.test_rebootstrap
...ork/tarantool/tarantool/test/integration/rejoin_test.lua:55: Expected an error when calling function but no error generated
stack traceback:
    ...ntool/tarantool/.rocks/share/tarantool/luatest/utils.lua:53: in function 'luatest_error'
    .../tarantool/.rocks/share/tarantool/luatest/assertions.lua:61: in function 'failure'
    .../tarantool/.rocks/share/tarantool/luatest/assertions.lua:94: in function 'assert_error'
    ...ork/tarantool/tarantool/test/integration/rejoin_test.lua:55: in function 'integration.rejoin.test_rebootstrap'
    ...tool/tarantool/.rocks/share/tarantool/luatest/runner.lua:348: in function <...tool/tarantool/.rocks/share/tarantool/luatest/runner.lua:347>
    [C]: in function 'xpcall'
    ...tool/tarantool/.rocks/share/tarantool/luatest/runner.lua:347: in function 'super'
    ...l/tarantool/.rocks/share/tarantool/luatest/capturing.lua:106: in function 'protected_call'
    ...tool/tarantool/.rocks/share/tarantool/luatest/runner.lua:424: in function 'super'
    ...
    [C]: in function 'xpcall'
    ...ntool/tarantool/.rocks/share/tarantool/luatest/utils.lua:37: in function 'run_tests'
    ...tool/tarantool/.rocks/share/tarantool/luatest/runner.lua:262: in function <...tool/tarantool/.rocks/share/tarantool/luatest/runner.lua:258>
    [C]: in function 'xpcall'
    ...l/tarantool/.rocks/share/tarantool/luatest/capturing.lua:74: in function <...l/tarantool/.rocks/share/tarantool/luatest/capturing.lua:72>
    [C]: in function 'xpcall'
    ...tool/tarantool/.rocks/share/tarantool/luatest/runner.lua:51: in function 'fn'
    ...tool/.rocks/share/tarantool/luatest/sandboxed_runner.lua:14: in function 'run'
    ...antool/.rocks/share/tarantool/luatest/cli_entrypoint.lua:4: in function <...antool/.rocks/share/tarantool/luatest/cli_entrypoint.lua:3>
    ....rocks/share/tarantool/rocks/luatest/0.5.7-1/bin/luatest:3: in main chunk
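
Two things stand out in the captured stdout below: every instance logs `Deprecated option replication_connect_quorum, please use bootstrap_strategy instead` while calling box.cfg(), and the appliers repeatedly fail with `ER_NIL_UUID: Nil UUID is reserved and can't be used in replication`. For reference, a sketch of the two configuration styles the deprecation warning contrasts (illustrative values; the log shows the instances ending up with bootstrap_strategy = "legacy"):

```lua
-- Two alternative configurations, not sequential calls.

-- Old style, as Cartridge 2.8.0 still passes it; this is what triggers
-- the deprecation warning in the log:
box.cfg{
    listen = '127.0.0.1:13301',
    replication = {'admin@localhost:13301', 'admin@localhost:13302'},
    replication_connect_quorum = 0,  -- deprecated in Tarantool 2.11
}

-- New 2.11 style suggested by the warning; "legacy" is the value the
-- log shows being applied for compatibility:
box.cfg{
    listen = '127.0.0.1:13301',
    replication = {'admin@localhost:13301', 'admin@localhost:13302'},
    bootstrap_strategy = 'legacy',
}
```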

Captured stdout:
master | 2023-06-14 08:39:58.539 [12930] main/103/srv_basic.lua/cartridge I> Using advertise_uri "localhost:13301"
master | 2023-06-14 08:39:58.539 [12930] main/103/srv_basic.lua/membership.options I> Membership encryption enabled
master | 2023-06-14 08:39:58.541 [12930] main/103/srv_basic.lua/cartridge I> Probe uri was successful
master | 2023-06-14 08:39:58.541 [12930] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 127.0.0.1:13302
master | 2023-06-14 08:39:58.541 [12930] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 10.1.255.255:13302
master | 2023-06-14 08:39:58.542 [12930] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 172.17.255.255:13302
master | 2023-06-14 08:39:58.542 [12930] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 127.0.0.1:13301
master | 2023-06-14 08:39:58.542 [12930] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 10.1.255.255:13301
master | 2023-06-14 08:39:58.542 [12930] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 172.17.255.255:13301
master | 2023-06-14 08:39:58.543 [12930] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 127.0.0.1:3301
master | 2023-06-14 08:39:58.543 [12930] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 10.1.255.255:3301
master | 2023-06-14 08:39:58.543 [12930] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 172.17.255.255:3301
master | 2023-06-14 08:39:58.543 [12930] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 127.0.0.1:13300
master | 2023-06-14 08:39:58.543 [12930] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 10.1.255.255:13300
master | 2023-06-14 08:39:58.544 [12930] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 172.17.255.255:13300
master | 2023-06-14 08:39:58.545 [12930] main/107/http/0.0.0.0:8081/socket I> started
master | 2023-06-14 08:39:58.546 [12930] main/103/srv_basic.lua/cartridge I> Listening HTTP on 0.0.0.0:8081
master | 2023-06-14 08:39:58.546 [12930] main/103/srv_basic.lua/cartridge.roles I> Load modules and register
master | 2023-06-14 08:39:58.565 [12930] main/108/remote_control/127.0.0.1:13301/socket I> started
master | 2023-06-14 08:39:58.565 [12930] main/103/srv_basic.lua/cartridge.confapplier I> Remote control bound to 127.0.0.1:13301
master | 2023-06-14 08:39:58.565 [12930] main/103/srv_basic.lua/cartridge.confapplier I> Remote control ready to accept connections
master | 2023-06-14 08:39:58.565 [12930] main/103/srv_basic.lua/cartridge.confapplier I> Instance state changed:  -> Unconfigured
master | 2023-06-14 08:39:58.565 [12930] main/103/srv_basic.lua/cartridge.confapplier I> Cartridge 2.8.0
master | 2023-06-14 08:39:58.565 [12930] main/103/srv_basic.lua/cartridge.confapplier I> server alias master
master | 2023-06-14 08:39:58.565 [12930] main/103/srv_basic.lua/cartridge.confapplier I> advertise uri localhost:13301
master | 2023-06-14 08:39:58.565 [12930] main/103/srv_basic.lua/cartridge.confapplier I> working directory /tmp/tmp.cartridge.voAgxvz4hqGS/localhost-13301
master | 2023-06-14 08:39:58.565 [12930] main I> entering the event loop
slave | 2023-06-14 08:39:58.793 [12940] main/103/srv_basic.lua/cartridge I> Using advertise_uri "localhost:13302"
slave | 2023-06-14 08:39:58.793 [12940] main/103/srv_basic.lua/membership.options I> Membership encryption enabled
slave | 2023-06-14 08:39:58.794 [12940] main/103/srv_basic.lua/cartridge I> Probe uri was successful
slave | 2023-06-14 08:39:58.795 [12940] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 127.0.0.1:13303
slave | 2023-06-14 08:39:58.795 [12940] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 10.1.255.255:13303
slave | 2023-06-14 08:39:58.796 [12940] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 172.17.255.255:13303
slave | 2023-06-14 08:39:58.796 [12940] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 127.0.0.1:13302
slave | 2023-06-14 08:39:58.796 [12940] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 10.1.255.255:13302
slave | 2023-06-14 08:39:58.796 [12940] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 172.17.255.255:13302
slave | 2023-06-14 08:39:58.797 [12940] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 127.0.0.1:3301
slave | 2023-06-14 08:39:58.797 [12940] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 10.1.255.255:3301
slave | 2023-06-14 08:39:58.797 [12940] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 172.17.255.255:3301
slave | 2023-06-14 08:39:58.798 [12940] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 127.0.0.1:13301
slave | 2023-06-14 08:39:58.798 [12940] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 10.1.255.255:13301
slave | 2023-06-14 08:39:58.798 [12940] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 172.17.255.255:13301
slave | 2023-06-14 08:39:58.800 [12940] main/107/http/0.0.0.0:8082/socket I> started
slave | 2023-06-14 08:39:58.801 [12940] main/103/srv_basic.lua/cartridge I> Listening HTTP on 0.0.0.0:8082
slave | 2023-06-14 08:39:58.801 [12940] main/103/srv_basic.lua/cartridge.roles I> Load modules and register
slave | 2023-06-14 08:39:58.820 [12940] main/108/remote_control/127.0.0.1:13302/socket I> started
slave | 2023-06-14 08:39:58.820 [12940] main/103/srv_basic.lua/cartridge.confapplier I> Remote control bound to 127.0.0.1:13302
slave | 2023-06-14 08:39:58.821 [12940] main/103/srv_basic.lua/cartridge.confapplier I> Remote control ready to accept connections
slave | 2023-06-14 08:39:58.821 [12940] main/103/srv_basic.lua/cartridge.confapplier I> Instance state changed:  -> Unconfigured
slave | 2023-06-14 08:39:58.821 [12940] main/103/srv_basic.lua/cartridge.confapplier I> Cartridge 2.8.0
slave | 2023-06-14 08:39:58.821 [12940] main/103/srv_basic.lua/cartridge.confapplier I> server alias slave
slave | 2023-06-14 08:39:58.821 [12940] main/103/srv_basic.lua/cartridge.confapplier I> advertise uri localhost:13302
slave | 2023-06-14 08:39:58.821 [12940] main/103/srv_basic.lua/cartridge.confapplier I> working directory /tmp/tmp.cartridge.voAgxvz4hqGS/localhost-13302
slave | 2023-06-14 08:39:58.822 [12940] main I> entering the event loop
master | 2023-06-14 08:39:58.904 [12930] main/110/http/127.0.0.1:56466/cartridge.twophase twophase.lua:576 W> Updating config clusterwide...
master | 2023-06-14 08:39:58.905 [12930] main/110/http/127.0.0.1:56466/cartridge.twophase twophase.lua:452 W> (2PC) patch_clusterwide upload phase...
master | 2023-06-14 08:39:58.909 [12930] main/110/http/127.0.0.1:56466/cartridge.twophase twophase.lua:465 W> (2PC) patch_clusterwide prepare phase...
master | 2023-06-14 08:39:58.909 [12930] main/120/remote_control/127.0.0.1:49396/cartridge.roles I> Validate roles configurations
master | 2023-06-14 08:39:58.909 [12930] main/120/remote_control/127.0.0.1:49396/cartridge.roles I> Validate config "ddl-manager" role
master | 2023-06-14 08:39:58.909 [12930] main/120/remote_control/127.0.0.1:49396/cartridge.roles I> Successfully validated config "ddl-manager" role in 0.000010 sec
master | 2023-06-14 08:39:58.909 [12930] main/120/remote_control/127.0.0.1:49396/cartridge.roles I> Validate config "vshard-router" role
master | 2023-06-14 08:39:58.910 [12930] main/120/remote_control/127.0.0.1:49396/cartridge.roles I> Successfully validated config "vshard-router" role in 0.000023 sec
master | 2023-06-14 08:39:58.910 [12930] main/120/remote_control/127.0.0.1:49396/cartridge.roles I> Validate config "myrole" role
master | 2023-06-14 08:39:58.910 [12930] main/120/remote_control/127.0.0.1:49396/cartridge.roles I> Successfully validated config "myrole" role in 0.000001 sec
master | 2023-06-14 08:39:58.910 [12930] main/120/remote_control/127.0.0.1:49396/cartridge.roles I> Roles configuration validation finished
slave | 2023-06-14 08:39:58.910 [12940] main/114/remote_control/127.0.0.1:60638/cartridge.roles I> Validate roles configurations
slave | 2023-06-14 08:39:58.910 [12940] main/114/remote_control/127.0.0.1:60638/cartridge.roles I> Validate config "ddl-manager" role
slave | 2023-06-14 08:39:58.910 [12940] main/114/remote_control/127.0.0.1:60638/cartridge.roles I> Successfully validated config "ddl-manager" role in 0.000011 sec
slave | 2023-06-14 08:39:58.910 [12940] main/114/remote_control/127.0.0.1:60638/cartridge.roles I> Validate config "vshard-router" role
slave | 2023-06-14 08:39:58.910 [12940] main/114/remote_control/127.0.0.1:60638/cartridge.roles I> Successfully validated config "vshard-router" role in 0.000026 sec
slave | 2023-06-14 08:39:58.910 [12940] main/114/remote_control/127.0.0.1:60638/cartridge.roles I> Validate config "myrole" role
slave | 2023-06-14 08:39:58.910 [12940] main/114/remote_control/127.0.0.1:60638/cartridge.roles I> Successfully validated config "myrole" role in 0.000001 sec
slave | 2023-06-14 08:39:58.910 [12940] main/114/remote_control/127.0.0.1:60638/cartridge.roles I> Roles configuration validation finished
master | 2023-06-14 08:39:58.914 [12930] main/110/http/127.0.0.1:56466/cartridge.twophase twophase.lua:474 W> Prepared for patch_clusterwide at localhost:13301
master | 2023-06-14 08:39:58.914 [12930] main/110/http/127.0.0.1:56466/cartridge.twophase twophase.lua:474 W> Prepared for patch_clusterwide at localhost:13302
master | 2023-06-14 08:39:58.914 [12930] main/110/http/127.0.0.1:56466/cartridge.twophase twophase.lua:498 W> (2PC) patch_clusterwide commit phase...
slave | 2023-06-14 08:39:58.914 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.confapplier I> Instance state changed: Unconfigured -> BootstrappingBox
master | 2023-06-14 08:39:58.914 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.confapplier I> Instance state changed: Unconfigured -> BootstrappingBox
master | 2023-06-14 08:39:58.915 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.confapplier confapplier.lua:556 W> Calling box.cfg()...
master | 2023-06-14 08:39:58.915 [12930] main/121/remote_control/127.0.0.1:49396/box.load_cfg load_cfg.lua:734 W> Deprecated option replication_connect_quorum, please use bootstrap_strategy instead
slave | 2023-06-14 08:39:58.915 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.confapplier confapplier.lua:556 W> Calling box.cfg()...
slave | 2023-06-14 08:39:58.915 [12940] main/115/remote_control/127.0.0.1:60638/box.load_cfg load_cfg.lua:734 W> Deprecated option replication_connect_quorum, please use bootstrap_strategy instead
slave | 2023-06-14 08:39:58.916 [12940] main/115/remote_control/127.0.0.1:60638 I> Tarantool 2.11.1-entrypoint-125-g9749b8af0 Linux-x86_64-RelWithDebInfo
slave | 2023-06-14 08:39:58.917 [12940] main/115/remote_control/127.0.0.1:60638 I> log level 5
master | 2023-06-14 08:39:58.917 [12930] main/121/remote_control/127.0.0.1:49396 I> Tarantool 2.11.1-entrypoint-125-g9749b8af0 Linux-x86_64-RelWithDebInfo
master | 2023-06-14 08:39:58.917 [12930] main/121/remote_control/127.0.0.1:49396 I> log level 5
master | 2023-06-14 08:39:58.917 [12930] main/121/remote_control/127.0.0.1:49396 I> wal/engine cleanup is paused
master | 2023-06-14 08:39:58.917 [12930] main/121/remote_control/127.0.0.1:49396 I> mapping 268435456 bytes for memtx tuple arena...
master | 2023-06-14 08:39:58.917 [12930] main/121/remote_control/127.0.0.1:49396 I> Actual slab_alloc_factor calculated on the basis of desired slab_alloc_factor = 1.044274
slave | 2023-06-14 08:39:58.917 [12940] main/115/remote_control/127.0.0.1:60638 I> wal/engine cleanup is paused
master | 2023-06-14 08:39:58.917 [12930] main/121/remote_control/127.0.0.1:49396 I> mapping 134217728 bytes for vinyl tuple arena...
slave | 2023-06-14 08:39:58.917 [12940] main/115/remote_control/127.0.0.1:60638 I> mapping 268435456 bytes for memtx tuple arena...
slave | 2023-06-14 08:39:58.917 [12940] main/115/remote_control/127.0.0.1:60638 I> Actual slab_alloc_factor calculated on the basis of desired slab_alloc_factor = 1.044274
slave | 2023-06-14 08:39:58.917 [12940] main/115/remote_control/127.0.0.1:60638 I> mapping 134217728 bytes for vinyl tuple arena...
slave | 2023-06-14 08:39:58.925 [12940] main/115/remote_control/127.0.0.1:60638 I> update replication_synchro_quorum = 1
slave | 2023-06-14 08:39:58.925 [12940] main/115/remote_control/127.0.0.1:60638 I> instance uuid aaaaaaaa-aaaa-0000-0000-000000000002
slave | 2023-06-14 08:39:58.925 [12940] main/116/remote_control/127.0.0.1:60638/cartridge.confapplier I> Cartridge 2.8.0
slave | 2023-06-14 08:39:58.925 [12940] main/116/remote_control/127.0.0.1:60638/cartridge.confapplier I> server alias slave
slave | 2023-06-14 08:39:58.925 [12940] main/116/remote_control/127.0.0.1:60638/cartridge.confapplier I> advertise uri localhost:13302
slave | 2023-06-14 08:39:58.925 [12940] main/116/remote_control/127.0.0.1:60638/cartridge.confapplier I> working directory /tmp/tmp.cartridge.voAgxvz4hqGS/localhost-13302
master | 2023-06-14 08:39:58.923 [12930] main/121/remote_control/127.0.0.1:49396 I> update replication_synchro_quorum = 1
master | 2023-06-14 08:39:58.923 [12930] main/121/remote_control/127.0.0.1:49396 I> instance uuid aaaaaaaa-aaaa-0000-0000-000000000001
master | 2023-06-14 08:39:58.923 [12930] main/122/remote_control/127.0.0.1:49396/cartridge.confapplier I> Cartridge 2.8.0
master | 2023-06-14 08:39:58.923 [12930] main/122/remote_control/127.0.0.1:49396/cartridge.confapplier I> server alias master
master | 2023-06-14 08:39:58.923 [12930] main/122/remote_control/127.0.0.1:49396/cartridge.confapplier I> advertise uri localhost:13301
master | 2023-06-14 08:39:58.924 [12930] main/122/remote_control/127.0.0.1:49396/cartridge.confapplier I> working directory /tmp/tmp.cartridge.voAgxvz4hqGS/localhost-13301
master | 2023-06-14 08:39:58.924 [12930] main/121/remote_control/127.0.0.1:49396 I> initializing an empty data directory
slave | 2023-06-14 08:39:58.925 [12940] main/115/remote_control/127.0.0.1:60638 I> connecting to 1 replicas
master | 2023-06-14 08:39:58.933 [12930] main/121/remote_control/127.0.0.1:49396 I> assigned id 1 to replica aaaaaaaa-aaaa-0000-0000-000000000001
master | 2023-06-14 08:39:58.933 [12930] main/121/remote_control/127.0.0.1:49396 I> update replication_synchro_quorum = 1
master | 2023-06-14 08:39:58.933 [12930] main/121/remote_control/127.0.0.1:49396 I> cluster uuid aaaaaaaa-0000-0000-0000-000000000000
slave | 2023-06-14 08:39:58.933 [12940] main/123/applier/admin@localhost:13301 applier.cc:373 E> ER_NIL_UUID: Nil UUID is reserved and can't be used in replication
slave | 2023-06-14 08:39:58.933 [12940] main/123/applier/admin@localhost:13301 I> can't connect to master
slave | 2023-06-14 08:39:58.933 [12940] main/123/applier/admin@localhost:13301 applier.cc:373 E> ER_NIL_UUID: Nil UUID is reserved and can't be used in replication
master | 2023-06-14 08:39:58.933 [12930] main/129/remote_control/127.0.0.1:49412/cartridge.remote-control I> Peer closed when read packet size
master | 2023-06-14 08:39:58.934 [12930] snapshot/101/main I> saving snapshot `/tmp/tmp.cartridge.voAgxvz4hqGS/localhost-13301/00000000000000000000.snap.inprogress'
master | 2023-06-14 08:39:58.936 [12930] snapshot/101/main I> done
master | 2023-06-14 08:39:58.936 [12930] main/121/remote_control/127.0.0.1:49396 I> ready to accept requests
master | 2023-06-14 08:39:58.936 [12930] main/123/gc I> wal/engine cleanup is resumed
master | 2023-06-14 08:39:58.936 [12930] main/121/remote_control/127.0.0.1:49396/box.load_cfg I> set 'instance_uuid' configuration option to "aaaaaaaa-aaaa-0000-0000-000000000001"
master | 2023-06-14 08:39:58.936 [12930] main/121/remote_control/127.0.0.1:49396/box.load_cfg I> set 'replication_connect_quorum' configuration option to 0
master | 2023-06-14 08:39:58.936 [12930] main/121/remote_control/127.0.0.1:49396/box.load_cfg I> set 'bootstrap_strategy' configuration option to "legacy"
master | 2023-06-14 08:39:58.936 [12930] main/121/remote_control/127.0.0.1:49396/box.load_cfg I> set 'replicaset_uuid' configuration option to "aaaaaaaa-0000-0000-0000-000000000000"
master | 2023-06-14 08:39:58.936 [12930] main/124/checkpoint_daemon I> scheduled next checkpoint for Wed Jun 14 10:17:52 2023
master | 2023-06-14 08:39:58.937 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.confapplier I> Making sure user "admin" exists...
master | 2023-06-14 08:39:58.937 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.confapplier I> Granting replication permissions to "admin"...
master | 2023-06-14 08:39:58.937 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.confapplier I> Setting password for user "admin" ...
master | 2023-06-14 08:39:58.938 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.confapplier I> Remote control stopped
master | 2023-06-14 08:39:58.938 [12930] main/108/remote_control/127.0.0.1:13301/socket I> stopped
master | 2023-06-14 08:39:58.939 [12930] main/121/remote_control/127.0.0.1:49396 I> tx_binary: bound to 127.0.0.1:13301
master | 2023-06-14 08:39:58.939 [12930] main/121/remote_control/127.0.0.1:49396/box.load_cfg I> set 'listen' configuration option to "127.0.0.1:13301"
master | 2023-06-14 08:39:58.939 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.remote-control I> dropping fd 1787, aka 127.0.0.1:13301, peer 127.0.0.1:49394
master | 2023-06-14 08:39:58.939 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.confapplier I> Instance state changed: BootstrappingBox -> ConnectingFullmesh
master | 2023-06-14 08:39:58.939 [12930] main/121/remote_control/127.0.0.1:49396 I> connecting to 2 replicas
master | 2023-06-14 08:39:58.940 [12930] main/121/remote_control/127.0.0.1:49396 C> failed to connect to 2 out of 2 replicas
master | 2023-06-14 08:39:58.940 [12930] main/109/remote_control/127.0.0.1:49394/cartridge.remote-control I> Peer closed when read packet size
master | 2023-06-14 08:39:58.940 [12930] main/121/remote_control/127.0.0.1:49396 I> leaving orphan mode
master | 2023-06-14 08:39:58.940 [12930] main/121/remote_control/127.0.0.1:49396/box.load_cfg I> set 'replication' configuration option to ["admin@localhost:13301","admin@localhost:13302"]
master | 2023-06-14 08:39:58.940 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.confapplier I> Instance state changed: ConnectingFullmesh -> BoxConfigured
master | 2023-06-14 08:39:58.940 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.confapplier I> Instance state changed: BoxConfigured -> ConfiguringRoles
master | 2023-06-14 08:39:58.940 [12930] main/121/remote_control/127.0.0.1:49396/box.load_cfg load_cfg.lua:734 W> Deprecated option replication_connect_quorum, please use bootstrap_strategy instead
master | 2023-06-14 08:39:58.940 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.failover I> Failover disabled
master | 2023-06-14 08:39:58.941 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.failover I> Replicaset aaaaaaaa-0000-0000-0000-000000000000 (me): new leader aaaaaaaa-aaaa-0000-0000-000000000001 (me), was nil
master | 2023-06-14 08:39:58.941 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.roles I> Start applying roles config
master | 2023-06-14 08:39:58.941 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.roles I> Init "ddl-manager" role
master | 2023-06-14 08:39:58.941 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.roles I> Successfully initialized "ddl-manager" role in 0.000002 sec
master | 2023-06-14 08:39:58.941 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.roles I> Applying "ddl-manager" role config
master | 2023-06-14 08:39:58.941 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.roles I> Successfully applied "ddl-manager" role config in 0.000019 sec
master | 2023-06-14 08:39:58.941 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.roles I> Init "myrole-permanent" role
master | 2023-06-14 08:39:58.941 [12930] main/121/remote_control/127.0.0.1:49396/test.entrypoint.srv_basic I> --- init({is_master = true})
master | 2023-06-14 08:39:58.941 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.roles I> Successfully initialized "myrole-permanent" role in 0.000007 sec
master | 2023-06-14 08:39:58.941 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.roles I> Applying "myrole-permanent" role config
master | 2023-06-14 08:39:58.941 [12930] main/121/remote_control/127.0.0.1:49396/test.entrypoint.srv_basic I> --- apply_config({is_master = true})
master | 2023-06-14 08:39:58.941 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.roles I> Successfully applied "myrole-permanent" role config in 0.000006 sec
master | 2023-06-14 08:39:58.941 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.roles I> Init "myrole" role
master | 2023-06-14 08:39:58.941 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.roles I> Successfully initialized "myrole" role in 0.000026 sec
master | 2023-06-14 08:39:58.941 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.roles I> Applying "myrole" role config
master | 2023-06-14 08:39:58.947 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.roles I> Successfully applied "myrole" role config in 0.006688 sec
master | 2023-06-14 08:39:58.947 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.roles I> Roles configuration finished
master | 2023-06-14 08:39:58.947 [12930] main/121/remote_control/127.0.0.1:49396/cartridge.confapplier I> Instance state changed: ConfiguringRoles -> RolesConfigured
master | 2023-06-14 08:39:58.948 [12930] main/115/remote_control/127.0.0.1:49396/cartridge.remote-control I> Peer closed when read packet size
master | 2023-06-14 08:39:58.948 [12930] main/115/remote_control/127.0.0.1:49396 utils.c:610 E> LuajitError: builtin/socket.lua:88: attempt to use closed socket
slave | 2023-06-14 08:39:58.949 [12940] main/124/remote_control/127.0.0.1:60646/cartridge.remote-control I> Peer closed when read packet size
master | 2023-06-14 08:39:58.949 [12930] main/136/applier/admin@localhost:13301 I> remote master aaaaaaaa-aaaa-0000-0000-000000000001 at 127.0.0.1:13301 running Tarantool 2.11.1
master | 2023-06-14 08:39:58.949 [12930] main/135/applier/admin@localhost:13302 applier.cc:373 E> ER_NIL_UUID: Nil UUID is reserved and can't be used in replication
master | 2023-06-14 08:39:58.949 [12930] main/135/applier/admin@localhost:13302 I> can't connect to master
master | 2023-06-14 08:39:58.949 [12930] main/135/applier/admin@localhost:13302 applier.cc:373 E> ER_NIL_UUID: Nil UUID is reserved and can't be used in replication
master | 2023-06-14 08:39:58.949 [12930] main/136/applier/admin@localhost:13301 I> leaving orphan mode
slave | 2023-06-14 08:39:59.934 [12940] main/123/applier/admin@localhost:13301 I> remote master aaaaaaaa-aaaa-0000-0000-000000000001 at 127.0.0.1:13301 running Tarantool 2.11.1
slave | 2023-06-14 08:39:59.935 [12940] main/115/remote_control/127.0.0.1:60638 I> connected to 1 replicas
slave | 2023-06-14 08:39:59.935 [12940] main/123/applier/admin@localhost:13301 I> authenticated
slave | 2023-06-14 08:39:59.935 [12940] main/115/remote_control/127.0.0.1:60638 I> bootstrapping replica from aaaaaaaa-aaaa-0000-0000-000000000001 at 127.0.0.1:13301
master | 2023-06-14 08:39:59.935 [12930] main/128/main I> joining replica aaaaaaaa-aaaa-0000-0000-000000000002 at fd 1789, aka 127.0.0.1:13301, peer of 127.0.0.1:56562
slave | 2023-06-14 08:39:59.939 [12940] main/123/applier/admin@localhost:13301 I> cluster uuid aaaaaaaa-0000-0000-0000-000000000000
master | 2023-06-14 08:39:59.943 [12930] main/128/main I> initial data sent.
master | 2023-06-14 08:39:59.943 [12930] main/128/main I> assigned id 2 to replica aaaaaaaa-aaaa-0000-0000-000000000002
master | 2023-06-14 08:39:59.943 [12930] main/128/main I> update replication_synchro_quorum = 2
master | 2023-06-14 08:39:59.943 [12930] main/128/main I> RAFT: fencing paused
master | 2023-06-14 08:39:59.943 [12930] relay/127.0.0.1:56562/101/main I> recover from `/tmp/tmp.cartridge.voAgxvz4hqGS/localhost-13301/00000000000000000000.xlog'
master | 2023-06-14 08:39:59.944 [12930] main/128/main I> final data sent.
slave | 2023-06-14 08:39:59.946 [12940] main/123/applier/admin@localhost:13301 I> assigned id 1 to replica aaaaaaaa-aaaa-0000-0000-000000000001
slave | 2023-06-14 08:39:59.946 [12940] main/123/applier/admin@localhost:13301 I> update replication_synchro_quorum = 1
slave | 2023-06-14 08:39:59.946 [12940] main/123/applier/admin@localhost:13301 I> initial data received
slave | 2023-06-14 08:39:59.946 [12940] main/123/applier/admin@localhost:13301 I> assigned id 2 to replica aaaaaaaa-aaaa-0000-0000-000000000002
slave | 2023-06-14 08:39:59.946 [12940] main/123/applier/admin@localhost:13301 I> update replication_synchro_quorum = 2
slave | 2023-06-14 08:39:59.946 [12940] main/123/applier/admin@localhost:13301 I> RAFT: fencing paused
slave | 2023-06-14 08:39:59.946 [12940] main/123/applier/admin@localhost:13301 I> final data received
slave | 2023-06-14 08:39:59.948 [12940] snapshot/101/main I> saving snapshot `/tmp/tmp.cartridge.voAgxvz4hqGS/localhost-13302/00000000000000000003.snap.inprogress'
slave | 2023-06-14 08:39:59.950 [12940] snapshot/101/main I> done
master | 2023-06-14 08:39:59.950 [12930] main/135/applier/admin@localhost:13302 applier.cc:373 E> ER_NIL_UUID: Nil UUID is reserved and can't be used in replication
slave | 2023-06-14 08:39:59.950 [12940] main/115/remote_control/127.0.0.1:60638 I> ready to accept requests
slave | 2023-06-14 08:39:59.950 [12940] main/115/remote_control/127.0.0.1:60638 I> synchronizing with 1 replicas
master | 2023-06-14 08:39:59.951 [12930] main/128/main I> subscribed replica aaaaaaaa-aaaa-0000-0000-000000000002 at fd 1789, aka 127.0.0.1:13301, peer of 127.0.0.1:56562
master | 2023-06-14 08:39:59.951 [12930] main/128/main I> remote vclock {1: 3} local vclock {1: 3}
slave | 2023-06-14 08:39:59.951 [12940] main/126/remote_control/127.0.0.1:53260/cartridge.remote-control I> Peer closed when read packet size
slave | 2023-06-14 08:39:59.951 [12940] main/123/applier/admin@localhost:13301 I> subscribed
slave | 2023-06-14 08:39:59.951 [12940] main/123/applier/admin@localhost:13301 I> remote vclock {1: 3} local vclock {1: 3}
slave | 2023-06-14 08:39:59.951 [12940] main/123/applier/admin@localhost:13301 I> RAFT: message {term: 1, state: follower} from 1
master | 2023-06-14 08:39:59.951 [12930] relay/127.0.0.1:56562/101/main I> recover from `/tmp/tmp.cartridge.voAgxvz4hqGS/localhost-13301/00000000000000000000.xlog'
slave | 2023-06-14 08:39:59.951 [12940] main/123/applier/admin@localhost:13301 I> leaving orphan mode
slave | 2023-06-14 08:39:59.951 [12940] main/115/remote_control/127.0.0.1:60638 I> replica set sync complete
slave | 2023-06-14 08:39:59.951 [12940] main/115/remote_control/127.0.0.1:60638 I> leaving orphan mode
slave | 2023-06-14 08:39:59.952 [12940] main/115/remote_control/127.0.0.1:60638/box.load_cfg I> set 'replication' configuration option to ["admin@localhost:13301"]
slave | 2023-06-14 08:39:59.952 [12940] main/115/remote_control/127.0.0.1:60638/box.load_cfg I> set 'instance_uuid' configuration option to "aaaaaaaa-aaaa-0000-0000-000000000002"
slave | 2023-06-14 08:39:59.952 [12940] main/115/remote_control/127.0.0.1:60638/box.load_cfg I> set 'replication_connect_quorum' configuration option to 100
slave | 2023-06-14 08:39:59.952 [12940] main/115/remote_control/127.0.0.1:60638/box.load_cfg I> set 'bootstrap_strategy' configuration option to "legacy"
slave | 2023-06-14 08:39:59.952 [12940] main/115/remote_control/127.0.0.1:60638/box.load_cfg I> set 'replicaset_uuid' configuration option to "aaaaaaaa-0000-0000-0000-000000000000"
slave | 2023-06-14 08:39:59.952 [12940] main/118/checkpoint_daemon I> scheduled next checkpoint for Wed Jun 14 09:53:23 2023
slave | 2023-06-14 08:39:59.952 [12940] main/115/remote_control/127.0.0.1:60638/box.load_cfg I> set 'read_only' configuration option to true
slave | 2023-06-14 08:39:59.952 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.confapplier I> Making sure user "admin" exists...
slave | 2023-06-14 08:39:59.952 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.confapplier I> Granting replication permissions to "admin"...
slave | 2023-06-14 08:39:59.952 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.confapplier I> Remote control stopped
slave | 2023-06-14 08:39:59.953 [12940] main/108/remote_control/127.0.0.1:13302/socket I> stopped
slave | 2023-06-14 08:39:59.953 [12940] main/115/remote_control/127.0.0.1:60638 I> tx_binary: bound to 127.0.0.1:13302
slave | 2023-06-14 08:39:59.953 [12940] main/115/remote_control/127.0.0.1:60638/box.load_cfg I> set 'listen' configuration option to "127.0.0.1:13302"
slave | 2023-06-14 08:39:59.953 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.remote-control I> dropping fd 1794, aka 127.0.0.1:13302, peer 127.0.0.1:60626
slave | 2023-06-14 08:39:59.953 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.confapplier I> Instance state changed: BootstrappingBox -> ConnectingFullmesh
slave | 2023-06-14 08:39:59.953 [12940] main/115/remote_control/127.0.0.1:60638 I> connecting to 2 replicas
slave | 2023-06-14 08:39:59.954 [12940] main/109/remote_control/127.0.0.1:60626/cartridge.remote-control I> Peer closed when read packet size
slave | 2023-06-14 08:39:59.955 [12940] main/130/applier/admin@localhost:13301 I> remote master aaaaaaaa-aaaa-0000-0000-000000000001 at 127.0.0.1:13301 running Tarantool 2.11.1
slave | 2023-06-14 08:39:59.956 [12940] main/131/applier/admin@localhost:13302 I> remote master aaaaaaaa-aaaa-0000-0000-000000000002 at 127.0.0.1:13302 running Tarantool 2.11.1
slave | 2023-06-14 08:39:59.957 [12940] main/115/remote_control/127.0.0.1:60638 I> connected to 2 replicas
slave | 2023-06-14 08:39:59.957 [12940] main/130/applier/admin@localhost:13301 I> authenticated
slave | 2023-06-14 08:39:59.957 [12940] main/130/applier/admin@localhost:13301 I> can't join/subscribe
slave | 2023-06-14 08:39:59.957 [12940] main/130/applier/admin@localhost:13301 box.cc:3939 E> ER_CFG: Incorrect value for option 'replication': duplicate connection with the same replica UUID
slave | 2023-06-14 08:39:59.957 [12940] main/130/applier/admin@localhost:13301 I> will retry every 1.00 second
master | 2023-06-14 08:40:00.952 [12930] main/135/applier/admin@localhost:13302 I> remote master aaaaaaaa-aaaa-0000-0000-000000000002 at 127.0.0.1:13302 running Tarantool 2.11.1
master | 2023-06-14 08:40:00.952 [12930] main/135/applier/admin@localhost:13302 I> authenticated
slave | 2023-06-14 08:40:00.953 [12940] main/122/main I> subscribed replica aaaaaaaa-aaaa-0000-0000-000000000001 at fd 1804, aka 127.0.0.1:13302, peer of 127.0.0.1:53294
slave | 2023-06-14 08:40:00.953 [12940] main/122/main I> remote vclock {1: 3} local vclock {1: 3}
master | 2023-06-14 08:40:00.953 [12930] main/135/applier/admin@localhost:13302 I> subscribed
master | 2023-06-14 08:40:00.953 [12930] main/135/applier/admin@localhost:13302 I> remote vclock {1: 3} local vclock {1: 3}
slave | 2023-06-14 08:40:00.953 [12940] main/117/gc I> wal/engine cleanup is resumed
master | 2023-06-14 08:40:00.953 [12930] main/135/applier/admin@localhost:13302 I> RAFT: message {term: 1, state: follower} from 2
master | 2023-06-14 08:40:00.954 [12930] main/135/applier/admin@localhost:13302 I> leaving orphan mode
slave | 2023-06-14 08:40:00.958 [12940] main/115/remote_control/127.0.0.1:60638 I> synchronizing with 2 replicas
slave | 2023-06-14 08:40:00.958 [12940] main/131/applier/admin@localhost:13302 I> leaving orphan mode
slave | 2023-06-14 08:40:00.958 [12940] main/115/remote_control/127.0.0.1:60638 I> replica set sync complete
slave | 2023-06-14 08:40:00.958 [12940] main/115/remote_control/127.0.0.1:60638 I> leaving orphan mode
slave | 2023-06-14 08:40:00.958 [12940] main/115/remote_control/127.0.0.1:60638/box.load_cfg I> set 'replication' configuration option to ["admin@localhost:13301","admin@localhost:13302"]
slave | 2023-06-14 08:40:00.958 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.confapplier I> Instance state changed: ConnectingFullmesh -> BoxConfigured
slave | 2023-06-14 08:40:00.958 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.confapplier I> Instance state changed: BoxConfigured -> ConfiguringRoles
slave | 2023-06-14 08:40:00.958 [12940] main/115/remote_control/127.0.0.1:60638/box.load_cfg load_cfg.lua:734 W> Deprecated option replication_connect_quorum, please use bootstrap_strategy instead
slave | 2023-06-14 08:40:00.958 [12940] main/115/remote_control/127.0.0.1:60638 I> leaving orphan mode
slave | 2023-06-14 08:40:00.958 [12940] main/115/remote_control/127.0.0.1:60638/box.load_cfg I> set 'replication_connect_quorum' configuration option to 0
slave | 2023-06-14 08:40:00.958 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.failover I> Failover disabled
slave | 2023-06-14 08:40:00.959 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.failover I> Replicaset aaaaaaaa-0000-0000-0000-000000000000 (me): new leader aaaaaaaa-aaaa-0000-0000-000000000001 ("localhost:13301"), was nil
slave | 2023-06-14 08:40:00.959 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.roles I> Start applying roles config
slave | 2023-06-14 08:40:00.959 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.roles I> Init "ddl-manager" role
slave | 2023-06-14 08:40:00.959 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.roles I> Successfully initialized "ddl-manager" role in 0.000003 sec
slave | 2023-06-14 08:40:00.959 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.roles I> Applying "ddl-manager" role config
slave | 2023-06-14 08:40:00.959 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.roles I> Successfully applied "ddl-manager" role config in 0.000001 sec
slave | 2023-06-14 08:40:00.959 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.roles I> Init "myrole-permanent" role
slave | 2023-06-14 08:40:00.959 [12940] main/115/remote_control/127.0.0.1:60638/test.entrypoint.srv_basic I> --- init({is_master = false})
slave | 2023-06-14 08:40:00.959 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.roles I> Successfully initialized "myrole-permanent" role in 0.000008 sec
slave | 2023-06-14 08:40:00.959 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.roles I> Applying "myrole-permanent" role config
slave | 2023-06-14 08:40:00.959 [12940] main/115/remote_control/127.0.0.1:60638/test.entrypoint.srv_basic I> --- apply_config({is_master = false})
slave | 2023-06-14 08:40:00.959 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.roles I> Successfully applied "myrole-permanent" role config in 0.000005 sec
slave | 2023-06-14 08:40:00.959 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.roles I> Init "myrole" role
slave | 2023-06-14 08:40:00.959 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.roles I> Successfully initialized "myrole" role in 0.000002 sec
slave | 2023-06-14 08:40:00.959 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.roles I> Applying "myrole" role config
slave | 2023-06-14 08:40:00.966 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.roles I> Successfully applied "myrole" role config in 0.006936 sec
slave | 2023-06-14 08:40:00.966 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.roles I> Roles configuration finished
slave | 2023-06-14 08:40:00.966 [12940] main/115/remote_control/127.0.0.1:60638/cartridge.confapplier I> Instance state changed: ConfiguringRoles -> RolesConfigured
master | 2023-06-14 08:40:00.966 [12930] main/110/http/127.0.0.1:56466/cartridge.twophase twophase.lua:507 W> Committed patch_clusterwide at localhost:13301
master | 2023-06-14 08:40:00.966 [12930] main/110/http/127.0.0.1:56466/cartridge.twophase twophase.lua:507 W> Committed patch_clusterwide at localhost:13302
master | 2023-06-14 08:40:00.966 [12930] main/110/http/127.0.0.1:56466/cartridge.twophase twophase.lua:652 W> Clusterwide config updated successfully
slave | 2023-06-14 08:40:00.966 [12940] main/110/remote_control/127.0.0.1:60638/cartridge.remote-control I> Peer closed when read packet size
slave | 2023-06-14 08:40:00.966 [12940] main/110/remote_control/127.0.0.1:60638 utils.c:610 E> LuajitError: builtin/socket.lua:88: attempt to use closed socket
spare | 2023-06-14 08:40:01.414 [12964] main/103/srv_basic.lua/cartridge I> Using advertise_uri "localhost:13303"
spare | 2023-06-14 08:40:01.414 [12964] main/103/srv_basic.lua/membership.options I> Membership encryption enabled
spare | 2023-06-14 08:40:01.415 [12964] main/103/srv_basic.lua/cartridge I> Probe uri was successful
spare | 2023-06-14 08:40:01.416 [12964] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 127.0.0.1:13304
spare | 2023-06-14 08:40:01.416 [12964] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 10.1.255.255:13304
spare | 2023-06-14 08:40:01.417 [12964] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 172.17.255.255:13304
spare | 2023-06-14 08:40:01.417 [12964] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 127.0.0.1:13303
spare | 2023-06-14 08:40:01.417 [12964] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 10.1.255.255:13303
spare | 2023-06-14 08:40:01.418 [12964] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 172.17.255.255:13303
spare | 2023-06-14 08:40:01.418 [12964] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 127.0.0.1:3301
spare | 2023-06-14 08:40:01.418 [12964] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 10.1.255.255:3301
spare | 2023-06-14 08:40:01.419 [12964] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 172.17.255.255:3301
spare | 2023-06-14 08:40:01.419 [12964] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 127.0.0.1:13302
spare | 2023-06-14 08:40:01.420 [12964] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 10.1.255.255:13302
spare | 2023-06-14 08:40:01.420 [12964] main/103/srv_basic.lua/membership I> Membership BROADCAST sent to 172.17.255.255:13302
spare | 2023-06-14 08:40:01.422 [12964] main/107/http/0.0.0.0:8083/socket I> started
spare | 2023-06-14 08:40:01.422 [12964] main/103/srv_basic.lua/cartridge I> Listening HTTP on 0.0.0.0:8083
spare | 2023-06-14 08:40:01.423 [12964] main/103/srv_basic.lua/cartridge.roles I> Load modules and register
spare | 2023-06-14 08:40:01.442 [12964] main/108/remote_control/127.0.0.1:13303/socket I> started
spare | 2023-06-14 08:40:01.442 [12964] main/103/srv_basic.lua/cartridge.confapplier I> Remote control bound to 127.0.0.1:13303
spare | 2023-06-14 08:40:01.442 [12964] main/103/srv_basic.lua/cartridge.confapplier I> Remote control ready to accept connections
spare | 2023-06-14 08:40:01.442 [12964] main/103/srv_basic.lua/cartridge.confapplier I> Instance state changed:  -> Unconfigured
spare | 2023-06-14 08:40:01.442 [12964] main/103/srv_basic.lua/cartridge.confapplier I> Cartridge 2.8.0
spare | 2023-06-14 08:40:01.442 [12964] main/103/srv_basic.lua/cartridge.confapplier I> server alias spare
spare | 2023-06-14 08:40:01.442 [12964] main/103/srv_basic.lua/cartridge.confapplier I> advertise uri localhost:13303
spare | 2023-06-14 08:40:01.442 [12964] main/103/srv_basic.lua/cartridge.confapplier I> working directory /tmp/tmp.cartridge.voAgxvz4hqGS/13303
spare | 2023-06-14 08:40:01.442 [12964] main I> entering the event loop
master | 2023-06-14 08:40:01.531 [12930] main/141/http/127.0.0.1:55350/cartridge.twophase twophase.lua:576 W> Updating config clusterwide...
master | 2023-06-14 08:40:01.532 [12930] main/141/http/127.0.0.1:55350/cartridge.twophase twophase.lua:452 W> (2PC) patch_clusterwide upload phase...
master | 2023-06-14 08:40:01.536 [12930] main/141/http/127.0.0.1:55350/cartridge.twophase twophase.lua:465 W> (2PC) patch_clusterwide prepare phase...
slave | 2023-06-14 08:40:01.536 [12940] main/135/main/cartridge.roles I> Validate roles configurations
slave | 2023-06-14 08:40:01.536 [12940] main/135/main/cartridge.roles I> Validate config "ddl-manager" role
slave | 2023-06-14 08:40:01.536 [12940] main/135/main/cartridge.roles I> Successfully validated config "ddl-manager" role in 0.000011 sec
slave | 2023-06-14 08:40:01.536 [12940] main/135/main/cartridge.roles I> Validate config "myrole" role
slave | 2023-06-14 08:40:01.536 [12940] main/135/main/cartridge.roles I> Successfully validated config "myrole" role in 0.000002 sec
slave | 2023-06-14 08:40:01.536 [12940] main/135/main/cartridge.roles I> Roles configuration validation finished
master | 2023-06-14 08:40:01.537 [12930] main/139/main/cartridge.roles I> Validate roles configurations
master | 2023-06-14 08:40:01.537 [12930] main/139/main/cartridge.roles I> Validate config "ddl-manager" role
master | 2023-06-14 08:40:01.537 [12930] main/139/main/cartridge.roles I> Successfully validated config "ddl-manager" role in 0.000011 sec
master | 2023-06-14 08:40:01.537 [12930] main/139/main/cartridge.roles I> Validate config "myrole" role
master | 2023-06-14 08:40:01.537 [12930] main/139/main/cartridge.roles I> Successfully validated config "myrole" role in 0.000001 sec
master | 2023-06-14 08:40:01.537 [12930] main/139/main/cartridge.roles I> Roles configuration validation finished
spare | 2023-06-14 08:40:01.537 [12964] main/117/remote_control/127.0.0.1:39996/cartridge.roles I> Validate roles configurations
spare | 2023-06-14 08:40:01.537 [12964] main/117/remote_control/127.0.0.1:39996/cartridge.roles I> Validate config "ddl-manager" role
spare | 2023-06-14 08:40:01.537 [12964] main/117/remote_control/127.0.0.1:39996/cartridge.roles I> Successfully validated config "ddl-manager" role in 0.000011 sec
spare | 2023-06-14 08:40:01.537 [12964] main/117/remote_control/127.0.0.1:39996/cartridge.roles I> Validate config "vshard-router" role
spare | 2023-06-14 08:40:01.537 [12964] main/117/remote_control/127.0.0.1:39996/cartridge.roles I> Successfully validated config "vshard-router" role in 0.000043 sec
spare | 2023-06-14 08:40:01.537 [12964] main/117/remote_control/127.0.0.1:39996/cartridge.roles I> Validate config "myrole" role
spare | 2023-06-14 08:40:01.537 [12964] main/117/remote_control/127.0.0.1:39996/cartridge.roles I> Successfully validated config "myrole" role in 0.000001 sec
spare | 2023-06-14 08:40:01.537 [12964] main/117/remote_control/127.0.0.1:39996/cartridge.roles I> Roles configuration validation finished
master | 2023-06-14 08:40:01.542 [12930] main/141/http/127.0.0.1:55350/cartridge.twophase twophase.lua:474 W> Prepared for patch_clusterwide at localhost:13301
master | 2023-06-14 08:40:01.542 [12930] main/141/http/127.0.0.1:55350/cartridge.twophase twophase.lua:474 W> Prepared for patch_clusterwide at localhost:13302
master | 2023-06-14 08:40:01.542 [12930] main/141/http/127.0.0.1:55350/cartridge.twophase twophase.lua:474 W> Prepared for patch_clusterwide at localhost:13303
master | 2023-06-14 08:40:01.542 [12930] main/141/http/127.0.0.1:55350/cartridge.twophase twophase.lua:498 W> (2PC) patch_clusterwide commit phase...
master | 2023-06-14 08:40:01.542 [12930] main/139/main/cartridge.twophase I> Backup of active config created: "/tmp/tmp.cartridge.voAgxvz4hqGS/localhost-13301/config.backup"
master | 2023-06-14 08:40:01.543 [12930] main/139/main/cartridge.confapplier I> Instance state changed: RolesConfigured -> ConfiguringRoles
master | 2023-06-14 08:40:01.543 [12930] main/139/main/box.load_cfg load_cfg.lua:734 W> Deprecated option replication_connect_quorum, please use bootstrap_strategy instead
master | 2023-06-14 08:40:01.543 [12930] main/139/main/cartridge.failover I> Failover disabled
master | 2023-06-14 08:40:01.543 [12930] main/139/main/cartridge.failover I> Replicaset bbbbbbbb-0000-0000-0000-000000000000: new leader bbbbbbbb-bbbb-0000-0000-000000000001 ("localhost:13303"), was nil
master | 2023-06-14 08:40:01.543 [12930] main/139/main/cartridge.roles I> Start applying roles config
master | 2023-06-14 08:40:01.543 [12930] main/139/main/cartridge.roles I> Applying "ddl-manager" role config
master | 2023-06-14 08:40:01.543 [12930] main/139/main/cartridge.roles I> Successfully applied "ddl-manager" role config in 0.000013 sec
master | 2023-06-14 08:40:01.543 [12930] main/139/main/cartridge.roles I> Applying "myrole-permanent" role config
master | 2023-06-14 08:40:01.543 [12930] main/139/main/test.entrypoint.srv_basic I> --- apply_config({is_master = true})
master | 2023-06-14 08:40:01.543 [12930] main/139/main/cartridge.roles I> Successfully applied "myrole-permanent" role config in 0.000008 sec
master | 2023-06-14 08:40:01.543 [12930] main/139/main/cartridge.roles I> Applying "myrole" role config
slave | 2023-06-14 08:40:01.542 [12940] main/135/main/cartridge.twophase I> Backup of active config created: "/tmp/tmp.cartridge.voAgxvz4hqGS/localhost-13302/config.backup"
slave | 2023-06-14 08:40:01.542 [12940] main/135/main/cartridge.confapplier I> Instance state changed: RolesConfigured -> ConfiguringRoles
slave | 2023-06-14 08:40:01.543 [12940] main/135/main/box.load_cfg load_cfg.lua:734 W> Deprecated option replication_connect_quorum, please use bootstrap_strategy instead
slave | 2023-06-14 08:40:01.543 [12940] main/135/main/cartridge.failover I> Failover disabled
slave | 2023-06-14 08:40:01.543 [12940] main/135/main/cartridge.failover I> Replicaset bbbbbbbb-0000-0000-0000-000000000000: new leader bbbbbbbb-bbbb-0000-0000-000000000001 ("localhost:13303"), was nil
slave | 2023-06-14 08:40:01.543 [12940] main/135/main/cartridge.roles I> Start applying roles config
slave | 2023-06-14 08:40:01.543 [12940] main/135/main/cartridge.roles I> Applying "ddl-manager" role config
slave | 2023-06-14 08:40:01.543 [12940] main/135/main/cartridge.roles I> Successfully applied "ddl-manager" role config in 0.000002 sec
slave | 2023-06-14 08:40:01.543 [12940] main/135/main/cartridge.roles I> Applying "myrole-permanent" role config
slave | 2023-06-14 08:40:01.543 [12940] main/135/main/test.entrypoint.srv_basic I> --- apply_config({is_master = false})
slave | 2023-06-14 08:40:01.543 [12940] main/135/main/cartridge.roles I> Successfully applied "myrole-permanent" role config in 0.000008 sec
slave | 2023-06-14 08:40:01.543 [12940] main/135/main/cartridge.roles I> Applying "myrole" role config
slave | 2023-06-14 08:40:01.550 [12940] main/135/main/cartridge.roles I> Successfully applied "myrole" role config in 0.007410 sec
slave | 2023-06-14 08:40:01.550 [12940] main/135/main/cartridge.roles I> Roles configuration finished
slave | 2023-06-14 08:40:01.551 [12940] main/135/main/cartridge.confapplier I> Instance state changed: ConfiguringRoles -> RolesConfigured
master | 2023-06-14 08:40:01.549 [12930] main/139/main/cartridge.roles I> Successfully applied "myrole" role config in 0.006442 sec
master | 2023-06-14 08:40:01.549 [12930] main/139/main/cartridge.roles I> Roles configuration finished
master | 2023-06-14 08:40:01.550 [12930] main/139/main/cartridge.confapplier I> Instance state changed: ConfiguringRoles -> RolesConfigured
spare | 2023-06-14 08:40:01.551 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.confapplier I> Instance state changed: Unconfigured -> BootstrappingBox
spare | 2023-06-14 08:40:01.551 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.confapplier confapplier.lua:556 W> Calling box.cfg()...
spare | 2023-06-14 08:40:01.552 [12964] main/118/remote_control/127.0.0.1:39996/box.load_cfg load_cfg.lua:734 W> Deprecated option replication_connect_quorum, please use bootstrap_strategy instead
spare | 2023-06-14 08:40:01.554 [12964] main/118/remote_control/127.0.0.1:39996 I> Tarantool 2.11.1-entrypoint-125-g9749b8af0 Linux-x86_64-RelWithDebInfo
spare | 2023-06-14 08:40:01.554 [12964] main/118/remote_control/127.0.0.1:39996 I> log level 5
spare | 2023-06-14 08:40:01.554 [12964] main/118/remote_control/127.0.0.1:39996 I> wal/engine cleanup is paused
spare | 2023-06-14 08:40:01.554 [12964] main/118/remote_control/127.0.0.1:39996 I> mapping 67108864 bytes for memtx tuple arena...
spare | 2023-06-14 08:40:01.555 [12964] main/118/remote_control/127.0.0.1:39996 I> Actual slab_alloc_factor calculated on the basis of desired slab_alloc_factor = 1.044274
spare | 2023-06-14 08:40:01.555 [12964] main/118/remote_control/127.0.0.1:39996 I> mapping 134217728 bytes for vinyl tuple arena...
spare | 2023-06-14 08:40:01.559 [12964] main/119/remote_control/127.0.0.1:39996/cartridge.confapplier I> Cartridge 2.8.0
spare | 2023-06-14 08:40:01.559 [12964] main/119/remote_control/127.0.0.1:39996/cartridge.confapplier I> server alias spare
spare | 2023-06-14 08:40:01.559 [12964] main/119/remote_control/127.0.0.1:39996/cartridge.confapplier I> advertise uri localhost:13303
spare | 2023-06-14 08:40:01.559 [12964] main/119/remote_control/127.0.0.1:39996/cartridge.confapplier I> working directory /tmp/tmp.cartridge.voAgxvz4hqGS/13303
spare | 2023-06-14 08:40:01.561 [12964] main/118/remote_control/127.0.0.1:39996 I> update replication_synchro_quorum = 1
spare | 2023-06-14 08:40:01.561 [12964] main/118/remote_control/127.0.0.1:39996 I> instance uuid bbbbbbbb-bbbb-0000-0000-000000000001
spare | 2023-06-14 08:40:01.561 [12964] main/118/remote_control/127.0.0.1:39996 I> initializing an empty data directory
spare | 2023-06-14 08:40:01.570 [12964] main/118/remote_control/127.0.0.1:39996 I> assigned id 1 to replica bbbbbbbb-bbbb-0000-0000-000000000001
spare | 2023-06-14 08:40:01.570 [12964] main/118/remote_control/127.0.0.1:39996 I> update replication_synchro_quorum = 1
spare | 2023-06-14 08:40:01.571 [12964] main/118/remote_control/127.0.0.1:39996 I> cluster uuid bbbbbbbb-0000-0000-0000-000000000000
spare | 2023-06-14 08:40:01.572 [12964] snapshot/101/main I> saving snapshot `/tmp/tmp.cartridge.voAgxvz4hqGS/13303/00000000000000000000.snap.inprogress'
spare | 2023-06-14 08:40:01.574 [12964] snapshot/101/main I> done
spare | 2023-06-14 08:40:01.574 [12964] main/118/remote_control/127.0.0.1:39996 I> ready to accept requests
spare | 2023-06-14 08:40:01.574 [12964] main/120/gc I> wal/engine cleanup is resumed
spare | 2023-06-14 08:40:01.574 [12964] main/118/remote_control/127.0.0.1:39996/box.load_cfg I> set 'memtx_memory' configuration option to 1
spare | 2023-06-14 08:40:01.574 [12964] main/118/remote_control/127.0.0.1:39996/box.load_cfg I> set 'instance_uuid' configuration option to "bbbbbbbb-bbbb-0000-0000-000000000001"
spare | 2023-06-14 08:40:01.574 [12964] main/118/remote_control/127.0.0.1:39996/box.load_cfg I> set 'replication_connect_quorum' configuration option to 0
spare | 2023-06-14 08:40:01.574 [12964] main/118/remote_control/127.0.0.1:39996/box.load_cfg I> set 'bootstrap_strategy' configuration option to "legacy"
spare | 2023-06-14 08:40:01.574 [12964] main/118/remote_control/127.0.0.1:39996/box.load_cfg I> set 'replicaset_uuid' configuration option to "bbbbbbbb-0000-0000-0000-000000000000"
spare | 2023-06-14 08:40:01.574 [12964] main/121/checkpoint_daemon I> scheduled next checkpoint for Wed Jun 14 09:44:33 2023
spare | 2023-06-14 08:40:01.575 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.confapplier I> Making sure user "admin" exists...
spare | 2023-06-14 08:40:01.575 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.confapplier I> Granting replication permissions to "admin"...
spare | 2023-06-14 08:40:01.575 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.confapplier I> Setting password for user "admin" ...
spare | 2023-06-14 08:40:01.575 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.confapplier I> Remote control stopped
spare | 2023-06-14 08:40:01.576 [12964] main/108/remote_control/127.0.0.1:13303/socket I> stopped
spare | 2023-06-14 08:40:01.576 [12964] main/118/remote_control/127.0.0.1:39996 I> tx_binary: bound to 127.0.0.1:13303
spare | 2023-06-14 08:40:01.577 [12964] main/118/remote_control/127.0.0.1:39996/box.load_cfg I> set 'listen' configuration option to "127.0.0.1:13303"
spare | 2023-06-14 08:40:01.577 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.remote-control I> dropping fd 1803, aka 127.0.0.1:13303, peer 127.0.0.1:39990
spare | 2023-06-14 08:40:01.577 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.confapplier I> Instance state changed: BootstrappingBox -> ConnectingFullmesh
spare | 2023-06-14 08:40:01.577 [12964] main/118/remote_control/127.0.0.1:39996 I> connecting to 1 replicas
spare | 2023-06-14 08:40:01.577 [12964] main/118/remote_control/127.0.0.1:39996 C> failed to connect to 1 out of 1 replicas
spare | 2023-06-14 08:40:01.577 [12964] main/109/remote_control/127.0.0.1:39990/cartridge.remote-control I> Peer closed when read packet size
spare | 2023-06-14 08:40:01.577 [12964] main/118/remote_control/127.0.0.1:39996 I> leaving orphan mode
spare | 2023-06-14 08:40:01.577 [12964] main/118/remote_control/127.0.0.1:39996/box.load_cfg I> set 'replication' configuration option to ["admin@localhost:13303"]
spare | 2023-06-14 08:40:01.577 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.confapplier I> Instance state changed: ConnectingFullmesh -> BoxConfigured
spare | 2023-06-14 08:40:01.577 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.confapplier I> Instance state changed: BoxConfigured -> ConfiguringRoles
spare | 2023-06-14 08:40:01.578 [12964] main/118/remote_control/127.0.0.1:39996/box.load_cfg load_cfg.lua:734 W> Deprecated option replication_connect_quorum, please use bootstrap_strategy instead
spare | 2023-06-14 08:40:01.578 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.failover I> Failover disabled
spare | 2023-06-14 08:40:01.578 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.failover I> Replicaset bbbbbbbb-0000-0000-0000-000000000000 (me): new leader bbbbbbbb-bbbb-0000-0000-000000000001 (me), was nil
spare | 2023-06-14 08:40:01.578 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.failover I> Replicaset aaaaaaaa-0000-0000-0000-000000000000: new leader aaaaaaaa-aaaa-0000-0000-000000000001 ("localhost:13301"), was nil
spare | 2023-06-14 08:40:01.578 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.roles I> Start applying roles config
spare | 2023-06-14 08:40:01.578 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.roles I> Init "ddl-manager" role
spare | 2023-06-14 08:40:01.578 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.roles I> Successfully initialized "ddl-manager" role in 0.000002 sec
spare | 2023-06-14 08:40:01.578 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.roles I> Applying "ddl-manager" role config
spare | 2023-06-14 08:40:01.578 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.roles I> Successfully applied "ddl-manager" role config in 0.000014 sec
spare | 2023-06-14 08:40:01.578 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.roles I> Init "myrole-permanent" role
spare | 2023-06-14 08:40:01.578 [12964] main/118/remote_control/127.0.0.1:39996/test.entrypoint.srv_basic I> --- init({is_master = true})
spare | 2023-06-14 08:40:01.578 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.roles I> Successfully initialized "myrole-permanent" role in 0.000006 sec
spare | 2023-06-14 08:40:01.578 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.roles I> Applying "myrole-permanent" role config
spare | 2023-06-14 08:40:01.578 [12964] main/118/remote_control/127.0.0.1:39996/test.entrypoint.srv_basic I> --- apply_config({is_master = true})
spare | 2023-06-14 08:40:01.578 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.roles I> Successfully applied "myrole-permanent" role config in 0.000006 sec
spare | 2023-06-14 08:40:01.578 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.roles I> Roles configuration finished
spare | 2023-06-14 08:40:01.578 [12964] main/118/remote_control/127.0.0.1:39996/cartridge.confapplier I> Instance state changed: ConfiguringRoles -> RolesConfigured
spare | 2023-06-14 08:40:01.578 [12964] main/113/remote_control/127.0.0.1:39996/cartridge.remote-control I> Peer closed when read packet size
spare | 2023-06-14 08:40:01.578 [12964] main/113/remote_control/127.0.0.1:39996 utils.c:610 E> LuajitError: builtin/socket.lua:88: attempt to use closed socket
spare | 2023-06-14 08:40:01.578 [12964] main/130/applier/admin@localhost:13303 I> remote master bbbbbbbb-bbbb-0000-0000-000000000001 at 127.0.0.1:13303 running Tarantool 2.11.1
spare | 2023-06-14 08:40:01.579 [12964] main/130/applier/admin@localhost:13303 I> leaving orphan mode
master | 2023-06-14 08:40:01.578 [12930] main/141/http/127.0.0.1:55350/cartridge.twophase twophase.lua:507 W> Committed patch_clusterwide at localhost:13301
master | 2023-06-14 08:40:01.578 [12930] main/141/http/127.0.0.1:55350/cartridge.twophase twophase.lua:507 W> Committed patch_clusterwide at localhost:13302
master | 2023-06-14 08:40:01.578 [12930] main/141/http/127.0.0.1:55350/cartridge.twophase twophase.lua:507 W> Committed patch_clusterwide at localhost:13303
master | 2023-06-14 08:40:01.578 [12930] main/141/http/127.0.0.1:55350/cartridge.twophase twophase.lua:652 W> Clusterwide config updated successfully

Ran 460 tests in 570.849 seconds, 457 succeeded, 2 xfailed, 1 failed, 30 skipped

=========================================================
Failed tests:

integration.rejoin.test_rebootstrap

Workflow run.