oxidecomputer / propolis

VMM userspace for illumos bhyve

propolis-server can fail to stop/reboot wedged instances #371

Open · jordanhendricks opened this issue 1 year ago

jordanhendricks commented 1 year ago

One thing I noticed in my latest round of debugging #336 was that I couldn't stop or reboot the guest in question. In the case of #336, the crucible upstairs and downstairs were incompatible, and the failure mode was that only a handful of I/Os were making it through, so the guest wasn't able to do much. While this obviously isn't the happy path, I think we need to be able to stop and reboot instances that get stuck in this way.

It's easy to reproduce some form of #336 by combining a downstairs at 894d44 and an upstairs at e7ce7a. In this case, reboot seems to work, but stop fails, first with a 400, then a 500:

```
jordan@maxwell ~/propolis $ ./cli.sh state run
Apr 23 20:26:23.437 INFO PUT request to http://172.20.3.73:8000/instance/state, propolis_client address: 172.20.3.73:8000
Error: failed to set instance state

Caused by:
    Bad Status: 400
jordan@maxwell ~/propolis $
jordan@maxwell ~/propolis $ ./cli.sh state run
Apr 23 20:26:57.455 INFO PUT request to http://172.20.3.73:8000/instance/state, propolis_client address: 172.20.3.73:8000
Error: failed to set instance state

Caused by:
    Bad Status: 500
```

server logs:

```
Apr 23 20:26:23.438 INFO Requested state Run via API, component: vm_controller
Apr 23 20:26:23.438 INFO Queuing external request, disposition: Deny(HaltPending), request: Start, component: external_request_queue
Apr 23 20:26:23.439 INFO request completed, error_message_external: Instance operation failed: Failed to queue requested state change: Instance is preparing to stop, error_message_internal: Instance operation failed: Failed to queue requested state change: Instance is preparing to stop, response_code: 400, uri: /instance/state, method: PUT, req_id: da972575-1e38-46c4-9b06-88fff4c227de, remote_addr: 172.20.3.73:54058, local_addr: 172.20.3.73:8000

Apr 23 20:26:57.455 INFO accepted connection, remote_addr: 172.20.3.73:61958, local_addr: 172.20.3.73:8000
Apr 23 20:26:57.456 INFO request completed, error_message_external: Internal Server Error, error_message_internal: Server not initialized (no instance), response_code: 500, uri: /instance/state, method: PUT, req_id: 40a97378-4272-4907-a44b-2320c0c7c1b2, remote_addr: 172.20.3.73:61958, local_addr: 172.20.3.73:8000
```

I can't recall if this is the exact failure mode I saw when debugging #336, but this particular instance of it feels in the same realm as #363 (maybe?).

gjcolombo commented 1 year ago

@jordanhendricks Just to check, is that the entire unabridged server log or just a snippet? This reminds me a bit of #302 (Crucible was slow to pause during clean VM shutdown), but there we had some extra clues in the logs (messages about state driver-led state transitions and device pause requests) that helped point us to the problem.

jordanhendricks commented 1 year ago

the rest of the server log:

propolis-server log:

```
jordan@maxwell ~/propolis $ ./server.sh confs/minimal.conf
Apr 23 19:48:53.441 INFO reservoir too small (0MiB) to use for guest memory
Apr 23 19:48:53.442 INFO Starting server...
Apr 23 19:48:53.443 INFO listening, local_addr: 172.20.3.73:8000
Apr 23 19:49:53.971 INFO accepted connection, remote_addr: 172.20.3.73:54475, local_addr: 172.20.3.73:8000
Apr 23 19:49:54.467 INFO Creating storage device helios-blockdev of kind Virtio
Apr 23 19:49:54.467 INFO Creating Crucible disk from request Volume { id: 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89, block_size: 512, sub_volumes: [Region { block_size: 512, blocks_per_extent: 64000, extent_count: 64, opts: CrucibleOpts { id: 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89, target: [172.20.3.73:2210, 172.20.3.73:2220, 172.20.3.73:2230], lossy: false, flush_timeout: None, key: None, cert_pem: None, key_pem: None, root_cert_pem: None, control: None, read_only: false }, gen: 2 }], read_only_parent: None }
Apr 23 19:49:54.468 INFO Upstairs starts
Apr 23 19:49:54.468 INFO Crucible stats registered with UUID: 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89
Apr 23 19:49:54.468 INFO Crucible 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 has session id: 4f166805-3598-4eda-80a0-ea0ec14dbefb
Apr 23 19:49:54.468 INFO [2] connecting to 172.20.3.73:2230, looper: 2
Apr 23 19:49:54.468 INFO up_listen starts, task: up_listen
Apr 23 19:49:54.468 INFO Wait for all three downstairs to come online
Apr 23 19:49:54.468 INFO Flush timeout: 5
Apr 23 19:49:54.468 INFO [1] connecting to 172.20.3.73:2220, looper: 1
Apr 23 19:49:54.469 INFO [0] connecting to 172.20.3.73:2210, looper: 0
Apr 23 19:49:54.469 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 looper connected, looper: 1
Apr 23 19:49:54.469 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 looper connected, looper: 2
Apr 23 19:49:54.469 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 looper connected, looper: 0
Apr 23 19:49:54.469 INFO [1] Proc runs for 172.20.3.73:2220 in state New
Apr 23 19:49:54.469 INFO [2] Proc runs for 172.20.3.73:2230 in state New
Apr 23 19:49:54.469 INFO [0] Proc runs for 172.20.3.73:2210 in state New
Apr 23 19:49:54.469 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) New New New ds_transition to WaitActive
Apr 23 19:49:54.469 INFO [0] Transition from New to WaitActive
Apr 23 19:49:54.469 INFOApr 23 19:49:54.469 [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) WaitActive New New ds_transition to WaitActiveINFO Apr 23 19:49:54.470Starting vCPU thread INFO , [1] Transition from New to WaitActivevcpu :Apr 23 19:49:54.470 0INFO, component[2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) WaitActive WaitActive New ds_transition to WaitActive: vcpu_tasksApr 23 19:49:54.470 INFO Apr 23 19:49:54.470[2] Transition from New to WaitActive INFO vCPU paused, vcpu: 0, component: vcpu_tasks
Apr 23 19:49:54.470 INFO Starting vCPU thread, vcpu: 2, component: vcpu_tasks
Apr 23 19:49:54.470 INFO vCPU paused, vcpu: 2, component: vcpu_tasks
Apr 23 19:49:54.470 INFO Starting vCPU thread, vcpu: 1, component: vcpu_tasks
Apr 23 19:49:54.470 INFO vCPU paused, vcpu: 1, component: vcpu_tasks
Apr 23 19:49:54.470 INFO Starting vCPU thread, vcpu: 3, component: vcpu_tasks
Apr 23 19:49:54.470 INFO State worker launched, component: vm_state_worker
Apr 23 19:49:54.470 INFO vCPU paused, vcpu: 3, component: vcpu_tasks
Apr 23 19:49:54.471 INFO Entered serial task, component: serial task, uri: /instance, method: PUT, req_id:
e74b5039-eedb-451c-8aff-f0e6c670545e, remote_addr: 172.20.3.73:54475, local_addr: 172.20.3.73:8000 Apr 23 19:49:54.471 INFO request completed, response_code: 201, uri: /instance, method: PUT, req_id: e74b 5039-eedb-451c-8aff-f0e6c670545e, remote_addr: 172.20.3.73:54475, local_addr: 172.20.3.73:8000 Apr 23 19:49:54.511 INFO accepted connection, remote_addr: 172.20.3.73:39750, local_addr: 172.20.3.73:800 0 Apr 23 19:49:54.512 INFO Requested state Run via API, component: vm_controller Apr 23 19:49:54.512 INFO Queuing external request, disposition: Enqueue, request: Start, component: exter nal_request_queue Apr 23 19:49:54.512 INFO State worker handling event, event: External(Start), component: vm_state_worker Apr 23 19:49:54.512 INFO Starting instance, reason: ExplicitRequest, component: vm_state_worker Apr 23 19:49:54.513 INFO request completed, response_code: The guest has requested activation 204, uri: /instance/state, method: PUTApr 23 19:49:54.513, req_id:INFO 02c30aeb-6d38-455e-ba84-100aad0e ed470cedae45-3d6e-4d90-b2cb-56f1a1a42a89 active request set, remote_addr:Apr 23 19:49:54.513 172.20INFO.3 .73[2] received activate with gen 2:39750 , local_addrApr 23 19:49:54.513: INFO172 .20[2] client got ds_active_rx, promote! session 96f8ff7c-9345- 474a-9252-74a7704ac738.3.73 :8000 Apr 23 19:49:54.513 INFOApr 23 19:49:54.513 [0] received activate with gen 2INFO Resetting vCPU 0Apr 23 19:49:54.513, componentINFO: [0] client got ds_active_rx, promote! session 96f8 ff7c-9345-474a-9252-74a7704ac738vm_controller Apr 23 19:49:54.513Apr 23 19:49:54.513 INFOINFO [1] received activate with gen 2Resetting BSP vCPU 0 , componentApr 23 19:49:54.513: INFOvm_controller [1] client got ds_active_rx, promote! session 96f8ff7c-9345-474a-9252-74a7704ac738 Apr 23 19:49:54.513 Apr 23 19:49:54.513INFO INFOResetting vCPU 1 , [2] downstairs client at 172.20.3.73: 2230 has UUID bf4a2dfd-8dfb-6a57-d95e-9a9d2bcab7b2component: vm_controllerApr 23 19:49:54.513 INFOApr 23 19:49:54.513 [2] Got region def RegionDefinition { block_size: 512, extent_size: Block { val ue: 64000, shift: 9 }, extent_count: 64, uuid: bf4a2dfd-8dfb-6a57-d95e-9a9d2bcab7b2, encrypted: false }IN FO Resetting vCPU 2Apr 23 19:49:54.513, componentINFO: [0] downstairs client at 172.20.3.73:2210 has UUID 14173ed5-e3b9-4bd6-d63d-ed81b791c86cvm_controller Apr 23 19:49:54.513Apr 23 19:49:54.513 INFOINFO [1] downstairs client at 172.20.3.73:2220 has UUID 4715 48f8-e726-65f7-c31c-a62f570393adResetting vCPU 3 , componentApr 23 19:49:54.513: INFOvm_controller [0] Got region def RegionDefinition { block_size: 512, extent_size: Block { value: 64000, shift: 9 }, ext ent_count: 64, uuid: 14173ed5-e3b9-4bd6-d63d-ed81b791c86c, encrypted: false } Apr 23 19:49:54.513 Apr 23 19:49:54.513INFO INFOSending startup complete to lpc-bhyve-atpic , [1] Got re gion def RegionDefinition { block_size: 512, extent_size: Block { value: 64000, shift: 9 }, extent_count: 64, uuid: 471548f8-e726-65f7-c31c-a62f570393ad, encrypted: false }component : Apr 23 19:49:54.513vm_controller INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 WaitActive WaitActive WaitActiveApr 23 19:49:54.513 INFOApr 23 19:49:54.513 Sending startup complete to lpc-bhyve-atpitINFO, component0cedae45-3d6e-4d90-b 2cb-56f1a1a42a89 WaitActive WaitActive WaitActive: vm_controllerApr 23 19:49:54.513 INFOApr 23 19:49:54.514 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 WaitActive WaitActive WaitActiveINFO Sending startup complete to lpc-bhyve-hpet, component: vm_controller Apr 23 19:49:54.514 INFO Sending startup complete to lpc-bhyve-ioapic, 
component: vm_controller Apr 23 19:49:54.514 INFO Sending startup complete to lpc-bhyve-rtc, component: vm_controller Apr 23 19:49:54.514 INFO Sending startup complete to bhyve-vcpu-0, component: vm_controller Apr 23 19:49:54.514 INFO Sending startup complete to bhyve-vcpu-1, component: vm_controller Apr 23 19:49:54.514 INFO Sending startup complete to bhyve-vcpu-2, component: vm_controller Apr 23 19:49:54.514 INFO Sending startup complete to bhyve-vcpu-3, component: vm_controller Apr 23 19:49:54.514 INFO Sending startup complete to chipset-i440fx, component: vm_controller Apr 23 19:49:54.514 INFO Sending startup complete to pci-piix4-hb, component: vm_controller Apr 23 19:49:54.514 INFO Sending startup complete to pci-piix3-lpc, component: vm_controller Apr 23 19:49:54.514 INFO Sending startup complete to pci-piix3-pm, component: vm_controller Apr 23 19:49:54.514 INFO Sending startup complete to lpc-bhyve-pmtimer, component: vm_controller Apr 23 19:49:54.514 INFO Sending startup complete to lpc-uart-com2, component: vm_controller Apr 23 19:49:54.514 INFO Sending startup complete to lpc-uart-com1, component: vm_controller Apr 23 19:49:54.514 INFO Sending startup complete to lpc-uart-com3, component: vm_controller Apr 23 19:49:54.514 INFO Sending startup complete to lpc-uart-com4, component: vm_controller Apr 23 19:49:54.514 INFO Sending startup complete to lpc-ps2ctrl, component: vm_controller Apr 23 19:49:54.515 INFO Sending startup complete to qemu-lpc-debug, component: vm_controller Apr 23 19:49:54.515 INFO Sending startup complete to pci-virtio-block-0.17.0, component: vm_controller Apr 23 19:49:54.515 INFO Sending startup complete to block-crucible-0cedae45-3d6e-4d90-b2cb-56f1a1a42a89, component: vm_controller Apr 23 19:49:54.516 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) WaitActive WaitActive WaitActive ds_transition to WaitQuorum Apr 23 19:49:54.516 INFO [2] Transition from WaitActive to WaitQuorum Apr 23 19:49:54.516 WARN [2] new RM replaced this: None Apr 23 19:49:54.516 INFO [2] Starts reconcile loop Apr 23 19:49:54.516 INFO [2] 172.20.3.73:2230 task reports connection:true Apr 23 19:49:54.516 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 WaitActive WaitActive WaitQuorum Apr 23 19:49:54.516 INFO Waiting for 2 more clients to be ready Apr 23 19:49:54.516 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) WaitActive WaitActive WaitQuorum ds_transition to WaitQuorum Apr 23 19:49:54.516 INFO [0] Transition from WaitActive to WaitQuorum Apr 23 19:49:54.516 WARN [0] new RM replaced this: None Apr 23 19:49:54.516 INFO [0] Starts reconcile loop Apr 23 19:49:54.516 INFO [0] 172.20.3.73:2210 task reports connection:true Apr 23 19:49:54.516 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 WaitQuorum WaitActive WaitQuorum Apr 23 19:49:54.516 INFO Waiting for 1 more clients to be ready Apr 23 19:49:54.516 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) WaitQuorum WaitActive WaitQuorum ds_transition to WaitQuorum Apr 23 19:49:54.516 INFO [1] Transition from WaitActive to WaitQuorum The guest has finished waiting for activation Apr 23 19:49:54.516 WARN [1] new RM replaced this: None Apr 23 19:49:54.516 INFO [1] Starts reconcile loop Apr 23 19:49:54.516 INFO [1] 172.20.3.73:2220 task reports connection:true Apr 23 19:49:54.516 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 WaitQuorum WaitQuorum WaitQuorum Apr 23 19:49:54.516 Apr 23 19:49:54.516INFO INFO[1]R flush_numbers[0..12]: [1, 1, 1, 1, 1, 1, 1, 
1, 1, 1 , 1, 1] Sending startup complete to qemu-fwcfg, Apr 23 19:49:54.517component :INFO vm_controller[1]R generation[ 0..12]: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] Apr 23 19:49:54.517Apr 23 19:49:54.517 INFOINFO [1]R dirty[0..12]: [false, false, false, false, false, false, false, false, false, false, false, false]Sending startup complete to qemu-ramfb , componentApr 23 19:49:54.517: INFOvm_controller [2]R flush_numbers[0..12]: [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1] Apr 23 19:49:54.517 Apr 23 19:49:54.517INFO INFOvCPU released from hold , [2]R generation[0..12]: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]vcpu : Apr 23 19:49:54.5170 , INFOcomponent :[2]R dirty[0..12]: [false, false, false, false, false, false, fal se, false, false, false, false, false] vcpu_tasks Apr 23 19:49:54.517 Apr 23 19:49:54.517INFO INFO[0]R flush_numbers[0..12]: [1, 1, 1, 1, 1, 1, 1, 1, 1, 1 , 1, 1] vCPU released from hold, Apr 23 19:49:54.517vcpu :INFO 1[0]R generation[0..12]: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], component:Apr 23 19:49:54.517 vcpu_tasksINFO [0]R dirty[0..12]: [false, false, false, false, false, false, false, false, false, false, false, false]A pr 23 19:49:54.517 INFOApr 23 19:49:54.517 vCPU released from holdINFO, vcpuMax found gen is 1: 2Apr 23 19:49:54.517, componentINFO: Generation requested: 2 >= found:1vcpu_tasks Apr 23 19:49:54.517Apr 23 19:49:54.517 INFOINFO Next flush: 2vCPU released from hold , vcpuApr 23 19:49:54.517: INFO3 , All extents matchcomponent : Apr 23 19:49:54.517vcpu_tasks INFO Apr 23 19:49:54.517No downstairs repair required INFO Apr 23 19:49:54.517State worker handled event , INFOoutcome :No repair work was required Continue, Apr 23 19:49:54.517component :INFO vm_state_workerSet Downstairs and Upstairs active Apr 23 19:49:54.517 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 is now active with session: 96f8ff7c-9345-4 74a-9252-74a7704ac738 Apr 23 19:49:54.517 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Set Active after no repair Apr 23 19:49:54.517 INFO Notify all downstairs, region set compare is done. 
Apr 23 19:49:54.517 INFO [2] received reconcile message Apr 23 19:49:54.517 INFO [2] All repairs completed, exit Apr 23 19:49:54.518 INFO [2] Starts cmd_loop Apr 23 19:49:54.518 INFO [0] received reconcile message Apr 23 19:49:54.518 INFO [0] All repairs completed, exit Apr 23 19:49:54.518 INFO [0] Starts cmd_loop Apr 23 19:49:54.518 INFO [1] received reconcile message Apr 23 19:49:54.518 INFO [1] All repairs completed, exit Apr 23 19:49:54.518 INFO [1] Starts cmd_loop Apr 23 19:49:54.550 INFO accepted connection, remote_addr: 172.20.3.73:49567, local_addr: 172.20.3.73:800 0 Apr 23 19:49:54.550 INFO request completed, response_code: 101, uri: /instance/serial?most_recent=16384, method: GET, req_id: d7764276-570d-42b4-8817-c7fd49b97065, remote_addr: 172.20.3.73:49567, local_addr: 17 2.20.3.73:8000 Apr 23 19:49:54.954 INFO ramfb change, config: Config { addr: 1050902528, fourcc: 875713112, flags: 0, wi dth: 800, height: 600, stride: 3200 }, state: valid, component: ramfb Apr 23 19:49:54.954 INFO notifying, component: ramfb Apr 23 19:49:54.954 INFO pixel format set to fourcc=0x34325258, component: vnc-server Apr 23 19:50:49.474 WARN [0] downstairs disconnected Apr 23 19:50:49.474 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Gone missing, transition from Active to Offline Apr 23 19:50:49.474 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 connection to 172.20.3.73:2210 closed, looper: 0 Apr 23 19:50:49.474 INFO [0] 172.20.3.73:2210 task reports connection:false Apr 23 19:50:49.474 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Active Active Apr 23 19:50:49.474 INFO [0] 172.20.3.73:2210 task reports offline Apr 23 19:50:49.474 WARN [1] downstairs disconnected Apr 23 19:50:49.474 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Gone missing, transition from Active to Offline Apr 23 19:50:49.475 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 connection to 172.20.3.73:2220 closed, looper: 1 Apr 23 19:50:49.475 INFO [1] 172.20.3.73:2220 task reports connection:false Apr 23 19:50:49.475 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Offline Active Apr 23 19:50:49.475 INFO [1] 172.20.3.73:2220 task reports offline Apr 23 19:50:49.475 WARN [2] downstairs disconnected Apr 23 19:50:49.475 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Gone missing, transition from Active to Offline Apr 23 19:50:49.475 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 connection to 172.20.3.73:2230 closed, looper: 2 Apr 23 19:50:49.475 INFO [2] 172.20.3.73:2230 task reports connection:false Apr 23 19:50:49.475 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Offline Offline Apr 23 19:50:49.475 INFO [2] 172.20.3.73:2230 task reports offline Apr 23 19:50:50.475 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 looper connected, looper: 1 Apr 23 19:50:50.475 INFO [1] Proc runs for 172.20.3.73:2220 in state Offline Apr 23 19:50:50.475 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 looper connected, looper: 0 Apr 23 19:50:50.475 INFO [1] client re-new 0 jobs since flush 1307, : downstairs Apr 23 19:50:50.476 INFO [0] Proc runs for 172.20.3.73:2210 in state Offline Apr 23 19:50:50.476 INFO [0] client re-new 0 jobs since flush 1307, : downstairs Apr 23 19:50:50.476 INFO [1] upstairs guest_io_ready=TRUE, promote! session 96f8ff7c-9345-474a-9252-74a77 04ac738 Apr 23 19:50:50.476 INFO [0] upstairs guest_io_ready=TRUE, promote! 
session 96f8ff7c-9345-474a-9252-74a77 04ac738 Apr 23 19:50:50.476 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 looper connected, looper: 2 Apr 23 19:50:50.476 INFO [2] Proc runs for 172.20.3.73:2230 in state Offline Apr 23 19:50:50.476 INFO [2] client re-new 0 jobs since flush 1307, : downstairs Apr 23 19:50:50.476 INFO [1] downstairs client at 172.20.3.73:2220 has UUID 471548f8-e726-65f7-c31c-a62f5 70393ad Apr 23 19:50:50.476 INFO [1] Got region def RegionDefinition { block_size: 512, extent_size: Block { valu e: 64000, shift: 9 }, extent_count: 64, uuid: 471548f8-e726-65f7-c31c-a62f570393ad, encrypted: false } Apr 23 19:50:50.476 INFO Returning client:1 UUID:471548f8-e726-65f7-c31c-a62f570393ad matches Apr 23 19:50:50.476 INFO [1] send last flush ID to this DS: 1307 Apr 23 19:50:50.476 INFO [0] downstairs client at 172.20.3.73:2210 has UUID 14173ed5-e3b9-4bd6-d63d-ed81b 791c86c Apr 23 19:50:50.476 INFO [0] Got region def RegionDefinition { block_size: 512, extent_size: Block { valu e: 64000, shift: 9 }, extent_count: 64, uuid: 14173ed5-e3b9-4bd6-d63d-ed81b791c86c, encrypted: false } Apr 23 19:50:50.476 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Offline Offline Apr 23 19:50:50.476 INFO Returning client:0 UUID:14173ed5-e3b9-4bd6-d63d-ed81b791c86c matches Apr 23 19:50:50.476 INFO [0] send last flush ID to this DS: 1307 Apr 23 19:50:50.476 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Offline Offline Apr 23 19:50:50.476 INFO [1] replied this last flush ID: 1307 Apr 23 19:50:50.476 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Offline Offline Offline ds_transition to Replay Apr 23 19:50:50.476 INFO [1] Transition from Offline to Replay Apr 23 19:50:50.476 INFO [2] upstairs guest_io_ready=TRUE, promote! 
session 96f8ff7c-9345-474a-9252-74a77 04ac738 Apr 23 19:50:50.476 INFO [1] 172.20.3.73:2220 task reports connection:true Apr 23 19:50:50.476 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Transition from Replay to Active Apr 23 19:50:50.476 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Offline Replay Offline ds_transition to Active Apr 23 19:50:50.477 INFO [1] Transition from Replay to Active Apr 23 19:50:50.477 INFO [1] Starts cmd_loop Apr 23 19:50:50.477 INFO [0] replied this last flush ID: 1307 Apr 23 19:50:50.477 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Active Offline Apr 23 19:50:50.477 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Offline Active Offline ds_transition to Replay Apr 23 19:50:50.477 INFO [0] Transition from Offline to Replay Apr 23 19:50:50.477 INFO [2] downstairs client at 172.20.3.73:2230 has UUID bf4a2dfd-8dfb-6a57-d95e-9a9d2 bcab7b2 Apr 23 19:50:50.477 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Transition from Replay to Active Apr 23 19:50:50.477 INFO [2] Got region def RegionDefinition { block_size: 512, extent_size: Block { valu e: 64000, shift: 9 }, extent_count: 64, uuid: bf4a2dfd-8dfb-6a57-d95e-9a9d2bcab7b2, encrypted: false } Apr 23 19:50:50.477 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Replay Active Offline ds_transition to Active Apr 23 19:50:50.477 INFO [0] Transition from Replay to Active Apr 23 19:50:50.477 INFO [0] Starts cmd_loop Apr 23 19:50:50.477 INFO [0] 172.20.3.73:2210 task reports connection:true Apr 23 19:50:50.477 INFO Returning client:2 UUID:bf4a2dfd-8dfb-6a57-d95e-9a9d2bcab7b2 matches Apr 23 19:50:50.477 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Active Active Offline Apr 23 19:50:50.477 INFO [2] send last flush ID to this DS: 1307 Apr 23 19:50:50.477 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Active Active Offline Apr 23 19:50:50.477 INFO [2] replied this last flush ID: 1307 Apr 23 19:50:50.477 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Active Active Offline ds_transition to Replay Apr 23 19:50:50.477 INFO [2] Transition from Offline to Replay Apr 23 19:50:50.477 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Transition from Replay to Active Apr 23 19:50:50.477 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Active Active Replay ds_transition to Active Apr 23 19:50:50.477 INFO [2] Transition from Replay to Active Apr 23 19:50:50.477 INFO [2] Starts cmd_loop Apr 23 19:50:50.477 INFO [2] 172.20.3.73:2230 task reports connection:true Apr 23 19:50:50.477 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Active Active Active Apr 23 19:50:51.477 WARN [2] flow control end Apr 23 19:50:51.477 WARN [0] flow control end Apr 23 19:50:51.477 WARN [1] flow control end Apr 23 20:25:59.443 INFO accepted connection, remote_addr: 172.20.3.73:38177, local_addr: 172.20.3.73:800 0 Apr 23 20:25:59.444 INFO Requested state Reboot via API, vcpu_tasks: pause component: vm_controller Apr 23 20:25:59.444 INFO Queuing external request, disposition: Enqueue, request: Reboot, component: exte rnal_request_queue Apr 23 20:25:59.444 INFO controller: pause entities request completed, response_code: 204, uri: /instance/state, method: PUT, req_id: 74ff3739-a87b-4e89-a194 -ab2af451f080, remote_addr: 172.20.3.73:38177, local_addr: 172.20.3.73:8000 Apr 23 20:25:59.444 INFO State worker handling event, event: External(Reboot), component: controller: res et entities vm_state_worker 
Apr 23 20:25:59.444 INFO Resetting instance, component: vm_state_worker Apr 23 20:25:59.444 INFO vCPU paused, vcpu: 0, component: vcpu_tasks Apr 23 20:25:59.444 INFO vCPU paused, vcpu: 1, component: vcpu_tasks Apr 23 20:25:59.445 INFO vCPU paused, vcpu: 2, component: vcpu_tasks Apr 23 20:25:59.445 INFO vCPU paused, vcpu: 3, component: vcpu_tasks Apr 23 20:25:59.445 INFO Sending pause request to lpc-bhyve-atpic, component: vm_controller Apr 23 20:25:59.445 INFO Sending pause request to lpc-bhyve-atpit, component: vm_controller Apr 23 20:25:59.445 INFO Sending pause request to lpc-bhyve-hpet, component: vm_controller Apr 23 20:25:59.445 INFO Sending pause request to lpc-bhyve-ioapic, component: vm_controller reset vcpus Apr 23 20:25:59.445 INFO Sending pause request to lpc-bhyve-rtccontroller: resume entities , component: vm_controller Apr 23 20:25:59.445 INFO Sending pause request to bhyve-vcpu-0, vcpu_tasks: resume all component: vm_controller Apr 23 20:25:59.445 INFO Sending pause request to bhyve-vcpu-1, component: vm_controller Apr 23 20:25:59.445 INFO Sending pause request to bhyve-vcpu-2, component: vm_controller Apr 23 20:25:59.445 INFO Sending pause request to bhyve-vcpu-3, component: vm_controller Apr 23 20:25:59.445 INFO Sending pause request to chipset-i440fx, component: vm_controller Apr 23 20:25:59.445 INFO Sending pause request to pci-piix4-hb, component: vm_controller Apr 23 20:25:59.445 INFO Sending pause request to pci-piix3-lpc, component: vm_controller Apr 23 20:25:59.445 INFO Sending pause request to pci-piix3-pm, component: vm_controller Apr 23 20:25:59.446 INFO Sending pause request to lpc-bhyve-pmtimer, component: vm_controller Apr 23 20:25:59.446 INFO Sending pause request to lpc-uart-com2, component: vm_controller Apr 23 20:25:59.446 INFO Sending pause request to lpc-uart-com1, component: vm_controller Apr 23 20:25:59.446 INFO Sending pause request to lpc-uart-com3, component: vm_controller Apr 23 20:25:59.446 INFO Sending pause request to lpc-uart-com4, component: vm_controller Apr 23 20:25:59.446 INFO Sending pause request to lpc-ps2ctrl, component: vm_controller Apr 23 20:25:59.446 INFO Sending pause request to qemu-lpc-debug, component: vm_controller Apr 23 20:25:59.446 INFO Sending pause request to pci-virtio-block-0.17.0, component: vm_controller Apr 23 20:25:59.446 INFO Sending pause request to block-crucible-0cedae45-3d6e-4d90-b2cb-56f1a1a42a89, co mponent: vm_controller Apr 23 20:25:59.446 INFO Sending pause request to qemu-fwcfg, component: vm_controller Apr 23 20:25:59.446 INFO Sending pause request to qemu-ramfb, component: vm_controller Apr 23 20:25:59.446 INFO Waiting for entities to pause, component: vm_controller Apr 23 20:25:59.446 INFO Got paused future from entity lpc-bhyve-atpic, component: vm_controller Apr 23 20:25:59.446 INFO Got paused future from entity lpc-bhyve-atpit, component: vm_controller Apr 23 20:25:59.446 INFO Got paused future from entity lpc-bhyve-hpet, component: vm_controller Apr 23 20:25:59.446 INFO Got paused future from entity lpc-bhyve-ioapic, component: vm_controller Apr 23 20:25:59.446 INFO Got paused future from entity lpc-bhyve-rtc, component: vm_controller Apr 23 20:25:59.447 INFO Got paused future from entity bhyve-vcpu-0, component: vm_controller Apr 23 20:25:59.447 INFO Got paused future from entity bhyve-vcpu-1, component: vm_controller Apr 23 20:25:59.447 INFO Got paused future from entity bhyve-vcpu-2, component: vm_controller Apr 23 20:25:59.447 INFO Got paused future from entity bhyve-vcpu-3, component: 
vm_controller Apr 23 20:25:59.447 INFO Got paused future from entity pci-piix4-hb, component: vm_controller Apr 23 20:25:59.447 INFO Got paused future from entity pci-piix3-lpc, component: vm_controller Apr 23 20:25:59.447 INFO Got paused future from entity pci-piix3-pm, component: vm_controller Apr 23 20:25:59.447 INFO Got paused future from entity lpc-bhyve-pmtimer, component: vm_controller Apr 23 20:25:59.447 INFO Got paused future from entity chipset-i440fx, component: vm_controller Apr 23 20:25:59.447 INFO Got paused future from entity lpc-uart-com2, component: vm_controller Apr 23 20:25:59.447 INFO Got paused future from entity lpc-uart-com1, component: vm_controller Apr 23 20:25:59.447 INFO Got paused future from entity lpc-uart-com3, component: vm_controller Apr 23 20:25:59.447 INFO Got paused future from entity lpc-uart-com4, component: vm_controller Apr 23 20:25:59.447 INFO Got paused future from entity lpc-ps2ctrl, component: vm_controller Apr 23 20:25:59.447 INFO Got paused future from entity qemu-lpc-debug, component: vm_controller Apr 23 20:25:59.447 INFO Got paused future from entity block-crucible-0cedae45-3d6e-4d90-b2cb-56f1a1a42a8 9, component: vm_controller Apr 23 20:25:59.448 INFO Got paused future from entity pci-virtio-block-0.17.0, component: vm_controller Apr 23 20:25:59.448 INFO Got paused future from entity qemu-fwcfg, component: vm_controller Apr 23 20:25:59.448 INFO Got paused future from entity qemu-ramfb, component: vm_controller Apr 23 20:25:59.448 INFO Sending reset request to lpc-bhyve-atpic, component: vm_controller Apr 23 20:25:59.448 INFO Sending reset request to lpc-bhyve-atpit, component: vm_controller Apr 23 20:25:59.448 INFO Sending reset request to lpc-bhyve-hpet, component: vm_controller Apr 23 20:25:59.448 INFO Sending reset request to lpc-bhyve-ioapic, component: vm_controller Apr 23 20:25:59.448 INFO Sending reset request to lpc-bhyve-rtc, component: vm_controller Apr 23 20:25:59.448 INFO Sending reset request to bhyve-vcpu-0, component: vm_controller Apr 23 20:25:59.448 INFO Sending reset request to bhyve-vcpu-1, component: vm_controller Apr 23 20:25:59.448 INFO Sending reset request to bhyve-vcpu-2, component: vm_controller Apr 23 20:25:59.448 INFO Sending reset request to bhyve-vcpu-3, component: vm_controller Apr 23 20:25:59.448 INFO Sending reset request to chipset-i440fx, component: vm_controller Apr 23 20:25:59.448 INFO Sending reset request to pci-piix4-hb, component: vm_controller Apr 23 20:25:59.448 INFO Sending reset request to pci-piix3-lpc, component: vm_controller Apr 23 20:25:59.448 INFO Sending reset request to pci-piix3-pm, component: vm_controller Apr 23 20:25:59.448 INFO Sending reset request to lpc-bhyve-pmtimer, component: vm_controller Apr 23 20:25:59.449 INFO Sending reset request to lpc-uart-com2, component: vm_controller Apr 23 20:25:59.449 INFO Sending reset request to lpc-uart-com1, component: vm_controller Apr 23 20:25:59.449 INFO Sending reset request to lpc-uart-com3, component: vm_controller Apr 23 20:25:59.449 INFO Sending reset request to lpc-uart-com4, component: vm_controller Apr 23 20:25:59.449 INFO Sending reset request to lpc-ps2ctrl, component: vm_controller Apr 23 20:25:59.449 INFO Sending reset request to qemu-lpc-debug, component: vm_controller Apr 23 20:25:59.449 INFO Sending reset request to pci-virtio-block-0.17.0, component: vm_controller Apr 23 20:25:59.449 INFO Sending reset request to block-crucible-0cedae45-3d6e-4d90-b2cb-56f1a1a42a89, co mponent: vm_controller Apr 23 20:25:59.449 INFO Sending 
reset request to qemu-fwcfg, component: vm_controller Apr 23 20:25:59.449 INFO Sending reset request to qemu-ramfb, component: vm_controller Apr 23 20:25:59.449 INFO Resetting vCPU 0, component: vm_controller Apr 23 20:25:59.449 INFO Resetting BSP vCPU 0, component: vm_controller Apr 23 20:25:59.449 INFO Resetting vCPU 1, component: vm_controller Apr 23 20:25:59.449 INFO Resetting vCPU 2, component: vm_controller Apr 23 20:25:59.449 INFO Resetting vCPU 3, component: vm_controller Apr 23 20:25:59.449 INFO Sending resume request to lpc-bhyve-atpic, component: vm_controller Apr 23 20:25:59.449 INFO Sending resume request to lpc-bhyve-atpit, component: vm_controller Apr 23 20:25:59.450 INFO Sending resume request to lpc-bhyve-hpet, component: vm_controller Apr 23 20:25:59.450 INFO Sending resume request to lpc-bhyve-ioapic, component: vm_controller Apr 23 20:25:59.450 INFO Sending resume request to lpc-bhyve-rtc, component: vm_controller Apr 23 20:25:59.450 INFO Sending resume request to bhyve-vcpu-0, component: vm_controller Apr 23 20:25:59.450 INFO Sending resume request to bhyve-vcpu-1, component: vm_controller Apr 23 20:25:59.450 INFO Sending resume request to bhyve-vcpu-2, component: vm_controller Apr 23 20:25:59.450 INFO Sending resume request to bhyve-vcpu-3, component: vm_controller Apr 23 20:25:59.450 INFO Sending resume request to chipset-i440fx, component: vm_controller Apr 23 20:25:59.450 INFO Sending resume request to pci-piix4-hb, component: vm_controller Apr 23 20:25:59.450 INFO Sending resume request to pci-piix3-lpc, component: vm_controller Apr 23 20:25:59.450 INFO Sending resume request to pci-piix3-pm, component: vm_controller Apr 23 20:25:59.450 INFO Sending resume request to lpc-bhyve-pmtimer, component: vm_controller Apr 23 20:25:59.450 INFO Sending resume request to lpc-uart-com2, component: vm_controller Apr 23 20:25:59.450 INFO Sending resume request to lpc-uart-com1, component: vm_controller Apr 23 20:25:59.450 INFO Sending resume request to lpc-uart-com3, component: vm_controller Apr 23 20:25:59.450 INFO Sending resume request to lpc-uart-com4, component: vm_controller Apr 23 20:25:59.450 INFO Sending resume request to lpc-ps2ctrl, component: vm_controller Apr 23 20:25:59.451 INFO Sending resume request to qemu-lpc-debug, component: vm_controller Apr 23 20:25:59.451 INFO Sending resume request to pci-virtio-block-0.17.0, component: vm_controller Apr 23 20:25:59.451 INFO Sending resume request to block-crucible-0cedae45-3d6e-4d90-b2cb-56f1a1a42a89, c omponent: vm_controller Apr 23 20:25:59.451 INFO Sending resume request to qemu-fwcfg, component: vm_controller Apr 23 20:25:59.451 INFO Sending resume request to qemu-ramfb, component: vm_controller Apr 23 20:25:59.451 INFO vCPU released from hold, vcpu: 0, component: vcpu_tasks Apr 23 20:25:59.451 INFO vCPU released from hold, vcpu: 1, component: vcpu_tasks Apr 23 20:25:59.451 INFO State worker handled event, outcome: Continue, component: vm_state_worker Apr 23 20:25:59.451 INFO vCPU released from hold, vcpu: 3, component: vcpu_tasks Apr 23 20:25:59.451 INFO vCPU released from hold, vcpu: 2, component: vcpu_tasks Apr 23 20:25:59.863 INFO ramfb change, config: Config { addr: 1050902528, fourcc: 875713112, flags: 0, wi dth: 800, height: 600, stride: 3200 }, state: valid, component: ramfb Apr 23 20:25:59.863 INFO notifying, component: ramfb Apr 23 20:25:59.863 INFO pixel format set to fourcc=0x34325258, component: vnc-server Apr 23 20:26:11.811 INFO accepted connection, remote_addr: 172.20.3.73:56055, local_addr: 
172.20.3.73:800 0 Apr 23 20:26:11.812 self.paused = false vcpu_tasks: pause all INFO Requested state Stop via API, component: vm_controller Apr 23 20:26:11.812 INFO Queuing external request, disposition: Enqueuecontroller: pause entities , request: Stop, component: external_request_queue Apr 23 20:26:11.813 INFO request completed, response_code: 204, uri: /instance/state, method: PUT, req_id : 6b6babc1-66ea-4444-9a29-13a1331cf4f3, remote_addr: 172.20.3.73:56055, local_addr: 172.20.3.73:8000 Apr 23 20:26:11.813 INFO State worker handling event, event: External(Stop), component: vm_state_worker Apr 23 20:26:11.813 INFO Stopping instance, component: vm_state_worker Apr 23 20:26:11.813 INFO vCPU paused, vcpu: 0, component: vcpu_tasks Apr 23 20:26:11.813 INFO vCPU paused, vcpu: 1, component: vcpu_tasks Apr 23 20:26:11.813 INFO vCPU paused, vcpu: 2, component: vcpu_tasks Apr 23 20:26:11.813 INFO vCPU paused, vcpu: 3, component: vcpu_tasks Apr 23 20:26:11.813 INFO Sending pause request to lpc-bhyve-atpic, component: vm_controller Apr 23 20:26:11.813 INFO Sending pause request to lpc-bhyve-atpit, component: vm_controller Apr 23 20:26:11.813 INFO Sending pause request to lpc-bhyve-hpet, component: vm_controller Apr 23 20:26:11.813 INFO Sending pause request to lpc-bhyve-ioapic, component: vm_controller Apr 23 20:26:11.814 INFO Sending pause request to lpc-bhyve-rtc, component: vm_controller Apr 23 20:26:11.814 INFO Sending pause request to bhyve-vcpu-0, component: vm_controller Apr 23 20:26:11.814 INFO Sending pause request to bhyve-vcpu-1, component: vm_controller Apr 23 20:26:11.814 INFO Sending pause request to bhyve-vcpu-2, component: vm_controller Apr 23 20:26:11.814 INFO Sending pause request to bhyve-vcpu-3, component: vm_controller Apr 23 20:26:11.814 INFO Sending pause request to chipset-i440fx, component: vm_controller Apr 23 20:26:11.814 INFO Sending pause request to pci-piix4-hb, component: vm_controller Apr 23 20:26:11.814 INFO Sending pause request to pci-piix3-lpc, component: vm_controller Apr 23 20:26:11.814 INFO Sending pause request to pci-piix3-pm, component: vm_controller Apr 23 20:26:11.814 INFO Sending pause request to lpc-bhyve-pmtimer, component: vm_controller Apr 23 20:26:11.814 INFO Sending pause request to lpc-uart-com2, component: vm_controller Apr 23 20:26:11.814 INFO Sending pause request to lpc-uart-com1, component: vm_controller Apr 23 20:26:11.814 INFO Sending pause request to lpc-uart-com3, component: vm_controller Apr 23 20:26:11.814 INFO Sending pause request to lpc-uart-com4, component: vm_controller Apr 23 20:26:11.814 INFO Sending pause request to lpc-ps2ctrl, component: vm_controller Apr 23 20:26:11.814 INFO Sending pause request to qemu-lpc-debug, component: vm_controller Apr 23 20:26:11.814 INFO Sending pause request to pci-virtio-block-0.17.0, component: vm_controller Apr 23 20:26:11.815 INFO Sending pause request to block-crucible-0cedae45-3d6e-4d90-b2cb-56f1a1a42a89, co mponent: vm_controller Apr 23 20:26:23.437 INFO accepted connection, remote_addr: 172.20.3.73:54058, local_addr: 172.20.3.73:800 0 Apr 23 20:26:23.438 INFO Requested state Run via API, component: vm_controller Apr 23 20:26:23.438 INFO Queuing external request, disposition: Deny(HaltPending), request: Start, compon ent: external_request_queue Apr 23 20:26:23.439 INFO request completed, error_message_external: Instance operation failed: Failed to queue requested state change: Instance is preparing to stop, error_message_internal: Instance operation f ailed: Failed to queue requested state 
change: Instance is preparing to stop, response_code: 400, uri: /i nstance/state, method: PUT, req_id: da972575-1e38-46c4-9b06-88fff4c227de, remote_addr: 172.20.3.73:54058, local_addr: 172.20.3.73:8000 Apr 23 20:26:54.935 WARN [2] downstairs disconnected Apr 23 20:26:54.936 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Gone missing, transition from Active to Offline Apr 23 20:26:54.936 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 connection to 172.20.3.73:2230 closed, looper: 2 Apr 23 20:26:54.936 INFO [2] 172.20.3.73:2230 task reports connection:false Apr 23 20:26:54.936 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Active Active Offline Apr 23 20:26:54.936 INFO [2] 172.20.3.73:2230 task reports offline Apr 23 20:26:54.936 WARN [0] downstairs disconnected Apr 23 20:26:54.936 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Gone missing, transition from Active to Offline Apr 23 20:26:54.936 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 connection to 172.20.3.73:2210 closed, looper: 0 Apr 23 20:26:54.936 INFO [0] 172.20.3.73:2210 task reports connection:false Apr 23 20:26:54.937 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Active Offline Apr 23 20:26:54.937 INFO [0] 172.20.3.73:2210 task reports offline Apr 23 20:26:54.937 WARN [1] downstairs disconnected Apr 23 20:26:54.937 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Gone missing, transition from Active to Offline Apr 23 20:26:54.937 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 connection to 172.20.3.73:2220 closed, looper: 1 Apr 23 20:26:54.937 INFO [1] 172.20.3.73:2220 task reports connection:false Apr 23 20:26:54.937 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Offline Offline Apr 23 20:26:54.937 INFO [1] 172.20.3.73:2220 task reports offline Apr 23 20:26:55.937 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 looper connected, looper: 2 Apr 23 20:26:55.937 INFO [2] Proc runs for 172.20.3.73:2230 in state Offline Apr 23 20:26:55.937 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 looper connected, looper: 1 Apr 23 20:26:55.938 INFO [2] client re-new 2 jobs since flush 1570, : downstairs Apr 23 20:26:55.938 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 looper connected, looper: 0 Apr 23 20:26:55.938 INFO [1] Proc runs for 172.20.3.73:2220 in state Offline Apr 23 20:26:55.938 INFO [1] client re-new 2 jobs since flush 1570, : downstairs Apr 23 20:26:55.938 INFO [0] Proc runs for 172.20.3.73:2210 in state Offline Apr 23 20:26:55.938 INFO [0] client re-new 2 jobs since flush 1570, : downstairs Apr 23 20:26:55.938 INFO [2] upstairs guest_io_ready=TRUE, promote! session 96f8ff7c-9345-474a-9252-74a77 04ac738 Apr 23 20:26:55.938 INFO [0] upstairs guest_io_ready=TRUE, promote! session 96f8ff7c-9345-474a-9252-74a77 04ac738 Apr 23 20:26:55.938 INFO [1] upstairs guest_io_ready=TRUE, promote! 
session 96f8ff7c-9345-474a-9252-74a77 04ac738 Apr 23 20:26:55.938 INFO [2] downstairs client at 172.20.3.73:2230 has UUID bf4a2dfd-8dfb-6a57-d95e-9a9d2 bcab7b2 Apr 23 20:26:55.938 INFO [2] Got region def RegionDefinition { block_size: 512, extent_size: Block { valu e: 64000, shift: 9 }, extent_count: 64, uuid: bf4a2dfd-8dfb-6a57-d95e-9a9d2bcab7b2, encrypted: false } Apr 23 20:26:55.938 INFO Returning client:2 UUID:bf4a2dfd-8dfb-6a57-d95e-9a9d2bcab7b2 matches Apr 23 20:26:55.939 INFO [2] send last flush ID to this DS: 1570 Apr 23 20:26:55.939 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Offline Offline Apr 23 20:26:55.939 INFO [0] downstairs client at 172.20.3.73:2210 has UUID 14173ed5-e3b9-4bd6-d63d-ed81b 791c86c Apr 23 20:26:55.939 INFO [0] Got region def RegionDefinition { block_size: 512, extent_size: Block { valu e: 64000, shift: 9 }, extent_count: 64, uuid: 14173ed5-e3b9-4bd6-d63d-ed81b791c86c, encrypted: false } Apr 23 20:26:55.939 INFO Returning client:0 UUID:14173ed5-e3b9-4bd6-d63d-ed81b791c86c matches Apr 23 20:26:55.939 INFO [1] downstairs client at 172.20.3.73:2220 has UUID 471548f8-e726-65f7-c31c-a62f5 70393ad Apr 23 20:26:55.939 INFO [1] Got region def RegionDefinition { block_size: 512, extent_size: Block { valu e: 64000, shift: 9 }, extent_count: 64, uuid: 471548f8-e726-65f7-c31c-a62f570393ad, encrypted: false } Apr 23 20:26:55.939 INFO [0] send last flush ID to this DS: 1570 Apr 23 20:26:55.939 INFO Returning client:1 UUID:471548f8-e726-65f7-c31c-a62f570393ad matches Apr 23 20:26:55.939 INFO [1] send last flush ID to this DS: 1570 Apr 23 20:26:55.939 INFO [2] replied this last flush ID: 1570 Apr 23 20:26:55.939 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Offline Offline Apr 23 20:26:55.939 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Offline Offline Apr 23 20:26:55.939 INFO [0] replied this last flush ID: 1570 Apr 23 20:26:55.939 INFO [1] replied this last flush ID: 1570 Apr 23 20:26:55.939 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Offline Offline Offline ds_transition to Replay Apr 23 20:26:55.939 INFO [2] Transition from Offline to Replay Apr 23 20:26:55.939 INFO [2] 172.20.3.73:2230 task reports connection:true Apr 23 20:26:55.939 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Transition from Replay to Active Apr 23 20:26:55.939 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Offline Offline Replay ds_transition to Active Apr 23 20:26:55.940 INFO [2] Transition from Replay to Active Apr 23 20:26:55.940 INFO [2] Starts cmd_loop Apr 23 20:26:55.940 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Offline Active Apr 23 20:26:55.940 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Offline Offline Active ds_transition to Replay Apr 23 20:26:55.940 INFO [0] Transition from Offline to Replay Apr 23 20:26:55.940 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Replay Offline Active ds_transition to Replay Apr 23 20:26:55.940 INFO [1] Transition from Offline to Replay Apr 23 20:26:55.940 INFO [0] 172.20.3.73:2210 task reports connection:true Apr 23 20:26:55.940 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Replay Replay Active Apr 23 20:26:55.940 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Transition from Replay to Active Apr 23 20:26:55.940 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Replay Replay Active ds_transition to Active Apr 23 20:26:55.940 INFO [0] Transition from 
Replay to Active Apr 23 20:26:55.940 INFO [0] Starts cmd_loop Apr 23 20:26:55.940 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Transition from Replay to Active Apr 23 20:26:55.940 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Active Replay Active ds_transition to Active Apr 23 20:26:55.940 INFO [1] Transition from Replay to Active Apr 23 20:26:55.940 INFO [1] Starts cmd_loop Apr 23 20:26:55.940 INFO [1] 172.20.3.73:2220 task reports connection:true Apr 23 20:26:55.940 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Active Active Active Apr 23 20:26:56.940 WARN [1] flow control end Apr 23 20:26:56.941 WARN [2] flow control end Apr 23 20:26:56.941 WARN [0] flow control end Apr 23 20:26:56.955 INFO vcpu_tasks: exit all Sending pause request to qemu-fwcfg, component: vm_controller Apr 23 20:26:56.955 INFO Sending pause request to qemu-ramfbcontroller: halt entities , component: vm_controller publish steady state Apr 23 20:26:56.955 INFO Waiting for entities to pause, component: vm_controller Apr 23 20:26:56.955 INFO Got paused future from entity lpc-bhyve-atpic, component: vm_controller Apr 23 20:26:56.955 INFO Got paused future from entity lpc-bhyve-atpit, component: vm_controller Apr 23 20:26:56.956 INFO Got paused future from entity lpc-bhyve-hpet, component: vm_controller Apr 23 20:26:56.956 INFO Got paused future from entity lpc-bhyve-ioapic, component: vm_controller Apr 23 20:26:56.956 INFO Got paused future from entity lpc-bhyve-rtc, component: vm_controller Apr 23 20:26:56.956 INFO Got paused future from entity bhyve-vcpu-0, component: vm_controller Apr 23 20:26:56.956 INFO Got paused future from entity bhyve-vcpu-1, component: vm_controller Apr 23 20:26:56.956 INFO Got paused future from entity bhyve-vcpu-2, component: vm_controller Apr 23 20:26:56.957 INFO Got paused future from entity bhyve-vcpu-3, component: vm_controller Apr 23 20:26:56.957 INFO Got paused future from entity pci-piix4-hb, component: vm_controller Apr 23 20:26:56.957 INFO Got paused future from entity pci-piix3-lpc, component: vm_controller Apr 23 20:26:56.957 INFO Got paused future from entity pci-piix3-pm, component: vm_controller Apr 23 20:26:56.958 INFO Got paused future from entity lpc-bhyve-pmtimer, component: vm_controller Apr 23 20:26:56.958 INFO Got paused future from entity chipset-i440fx, component: vm_controller Apr 23 20:26:56.958 INFO Got paused future from entity lpc-uart-com2, component: vm_controller Apr 23 20:26:56.958 INFO Got paused future from entity lpc-uart-com1, component: vm_controller Apr 23 20:26:56.958 INFO Got paused future from entity lpc-uart-com3, component: vm_controller Apr 23 20:26:56.958 INFO Got paused future from entity lpc-uart-com4, component: vm_controller Apr 23 20:26:56.958 INFO Got paused future from entity lpc-ps2ctrl, component: vm_controller Apr 23 20:26:56.958 INFO Got paused future from entity qemu-lpc-debug, component: vm_controller Apr 23 20:26:56.959 INFO Got paused future from entity block-crucible-0cedae45-3d6e-4d90-b2cb-56f1a1a42a8 9, component: vm_controller Apr 23 20:26:56.959 INFO Got paused future from entity pci-virtio-block-0.17.0, component: vm_controller Apr 23 20:26:56.959 INFO Got paused future from entity qemu-fwcfg, component: vm_controller Apr 23 20:26:56.959 INFO Got paused future from entity qemu-ramfb, component: vm_controller Apr 23 20:26:56.959 INFO vCPU released from hold, vcpu: 0, component: vcpu_tasks Apr 23 20:26:56.959 INFO Exiting vCPU thread for CPU 0, vcpu: 0, component: vcpu_tasks Apr 23 20:26:56.959 
INFO vCPU released from hold, vcpu: 1, component: vcpu_tasks Apr 23 20:26:56.959 INFO Exiting vCPU thread for CPU 1, vcpu: 1, component: vcpu_tasks Apr 23 20:26:56.959 INFO vCPU released from hold, vcpu: 3, component: vcpu_tasks Apr 23 20:26:56.960 INFO vCPU released from hold, vcpu: 2, component: vcpu_tasks Apr 23 20:26:56.960 INFO Exiting vCPU thread for CPU 3, vcpu: 3, component: vcpu_tasks Apr 23 20:26:56.960 INFO Exiting vCPU thread for CPU 2, vcpu: 2, component: vcpu_tasks Apr 23 20:26:56.960 INFO Sending halt request to lpc-bhyve-atpic, component: vm_controller Apr 23 20:26:56.960 INFO Sending halt request to lpc-bhyve-atpit, component: vm_controller Apr 23 20:26:56.960 INFO Sending halt request to lpc-bhyve-hpet, component: vm_controller Apr 23 20:26:56.960 INFO Sending halt request to lpc-bhyve-ioapic, component: vm_controller Apr 23 20:26:56.960 INFO Sending halt request to lpc-bhyve-rtc, component: vm_controller Apr 23 20:26:56.960 INFO Sending halt request to bhyve-vcpu-0, component: vm_controller Apr 23 20:26:56.960 INFO Sending halt request to bhyve-vcpu-1, component: vm_controller Apr 23 20:26:56.960 INFO Sending halt request to bhyve-vcpu-2, component: vm_controller Apr 23 20:26:56.960 INFO Sending halt request to bhyve-vcpu-3, component: vm_controller Apr 23 20:26:56.960 INFO Sending halt request to chipset-i440fx, component: vm_controller Apr 23 20:26:56.960 INFO Sending halt request to pci-piix4-hb, component: vm_controller Apr 23 20:26:56.960 INFO Sending halt request to pci-piix3-lpc, component: vm_controller Apr 23 20:26:56.961 INFO Sending halt request to pci-piix3-pm, component: vm_controller Apr 23 20:26:56.961 INFO Sending halt request to lpc-bhyve-pmtimer, component: vm_controller Apr 23 20:26:56.961 INFO Sending halt request to lpc-uart-com2, component: vm_controller Apr 23 20:26:56.961 INFO Sending halt request to lpc-uart-com1, component: vm_controller Apr 23 20:26:56.961 INFO Sending halt request to lpc-uart-com3, component: vm_controller Apr 23 20:26:56.961 INFO Sending halt request to lpc-uart-com4, component: vm_controller Apr 23 20:26:56.961 INFO Sending halt request to lpc-ps2ctrl, component: vm_controller Apr 23 20:26:56.961 INFO Sending halt request to qemu-lpc-debug, component: vm_controller Apr 23 20:26:56.961 INFO Sending halt request to pci-virtio-block-0.17.0, component: vm_controller Apr 23 20:26:56.961 INFO Sending halt request to block-crucible-0cedae45-3d6e-4d90-b2cb-56f1a1a42a89, com ponent: vm_controller Apr 23 20:26:56.961 INFO Sending halt request to qemu-fwcfg, component: vm_controller Apr 23 20:26:56.961 INFO Sending halt request to qemu-ramfb, component: vm_controller Apr 23 20:26:56.961 INFO State worker handled event, outcome: Exit, component: vm_state_worker Apr 23 20:26:56.961 INFO State worker exiting, component: vm_state_worker Apr 23 20:26:56.961 INFO stopping VNC server, component: vnc-server Apr 23 20:26:56.962 INFO Dropping server's VM controller reference, weak_refs: 1, strong_refs: 1 Apr 23 20:26:56.962 INFO Dropping VM controller, component: vm_controller Apr 23 20:26:56.962 INFO Terminating serial task, component: serial task, uri: /instance, method: PUT, re q_id: e74b5039-eedb-451c-8aff-f0e6c670545e, remote_addr: 172.20.3.73:54475, local_addr: 172.20.3.73:8000 Apr 23 20:26:56.962 INFO Returning from serial task, component: serial task, uri: /instance, method: PUT, req_id: e74b5039-eedb-451c-8aff-f0e6c670545e, remote_addr: 172.20.3.73:54475, local_addr: 172.20.3.73:80 00 Apr 23 20:26:57.455 INFO accepted connection, 
remote_addr: 172.20.3.73:61958, local_addr: 172.20.3.73:800 0 Apr 23 20:26:57.456 INFO request completed, error_message_external: Internal Server Error, error_message_ internal: Server not initialized (no instance), response_code: 500, uri: /instance/state, method: PUT, re q_id: 40a97378-4272-4907-a44b-2320c0c7c1b2, remote_addr: 172.20.3.73:61958, local_addr: 172.20.3.73:8000 Apr 23 20:27:46.942 WARN [1] downstairs disconnected Apr 23 20:27:46.942 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Gone missing, transition from Active to Offline Apr 23 20:27:46.942 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 connection to 172.20.3.73:2220 closed, looper: 1 Apr 23 20:27:46.942 INFO [1] 172.20.3.73:2220 task reports connection:false Apr 23 20:27:46.942 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Active Offline Active Apr 23 20:27:46.942 INFO [1] 172.20.3.73:2220 task reports offline Apr 23 20:27:46.943 WARN [2] downstairs disconnected Apr 23 20:27:46.943 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Gone missing, transition from Active to Offline Apr 23 20:27:46.943 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 connection to 172.20.3.73:2230 closed, looper: 2 Apr 23 20:27:46.943 INFO [2] 172.20.3.73:2230 task reports connection:false Apr 23 20:27:46.943 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Active Offline Offline Apr 23 20:27:46.943 INFO [2] 172.20.3.73:2230 task reports offline Apr 23 20:27:46.943 WARN [0] downstairs disconnected Apr 23 20:27:46.943 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Gone missing, transition from Active to Offline Apr 23 20:27:46.943 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 connection to 172.20.3.73:2210 closed, looper: 0 Apr 23 20:27:46.943 INFO [0] 172.20.3.73:2210 task reports connection:false Apr 23 20:27:46.943 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Offline Offline Apr 23 20:27:46.943 INFO [0] 172.20.3.73:2210 task reports offline Apr 23 20:27:47.944 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 looper connected, looper: 2 Apr 23 20:27:47.944 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 looper connected, looper: 1 Apr 23 20:27:47.944 INFO [2] Proc runs for 172.20.3.73:2230 in state Offline Apr 23 20:27:47.944 INFO [2] client re-new 0 jobs since flush 1572, : downstairs Apr 23 20:27:47.944 INFO [1] Proc runs for 172.20.3.73:2220 in state Offline Apr 23 20:27:47.944 INFO [1] client re-new 0 jobs since flush 1572, : downstairs Apr 23 20:27:47.944 INFO [2] upstairs guest_io_ready=TRUE, promote! session 96f8ff7c-9345-474a-9252-74a77 04ac738 Apr 23 20:27:47.945 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 looper connected, looper: 0 Apr 23 20:27:47.945 INFO [0] Proc runs for 172.20.3.73:2210 in state Offline Apr 23 20:27:47.945 INFO [0] client re-new 0 jobs since flush 1572, : downstairs Apr 23 20:27:47.945 INFO [1] upstairs guest_io_ready=TRUE, promote! session 96f8ff7c-9345-474a-9252-74a77 04ac738 Apr 23 20:27:47.945 INFO [2] downstairs client at 172.20.3.73:2230 has UUID bf4a2dfd-8dfb-6a57-d95e-9a9d2 bcab7b2 Apr 23 20:27:47.945 INFO [2] Got region def RegionDefinition { block_size: 512, extent_size: Block { valu e: 64000, shift: 9 }, extent_count: 64, uuid: bf4a2dfd-8dfb-6a57-d95e-9a9d2bcab7b2, encrypted: false } Apr 23 20:27:47.945 INFO [0] upstairs guest_io_ready=TRUE, promote! 
session 96f8ff7c-9345-474a-9252-74a77 04ac738 Apr 23 20:27:47.945 INFO Returning client:2 UUID:bf4a2dfd-8dfb-6a57-d95e-9a9d2bcab7b2 matches Apr 23 20:27:47.945 INFO [2] send last flush ID to this DS: 1572 Apr 23 20:27:47.945 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Offline Offline Apr 23 20:27:47.945 INFO [1] downstairs client at 172.20.3.73:2220 has UUID 471548f8-e726-65f7-c31c-a62f5 70393ad Apr 23 20:27:47.945 INFO [1] Got region def RegionDefinition { block_size: 512, extent_size: Block { valu e: 64000, shift: 9 }, extent_count: 64, uuid: 471548f8-e726-65f7-c31c-a62f570393ad, encrypted: false } Apr 23 20:27:47.945 INFO [2] replied this last flush ID: 1572 Apr 23 20:27:47.945 INFO Returning client:1 UUID:471548f8-e726-65f7-c31c-a62f570393ad matches Apr 23 20:27:47.945 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Offline Offline Offline ds_transition to Replay Apr 23 20:27:47.945 INFO [2] Transition from Offline to Replay Apr 23 20:27:47.945 INFO [2] 172.20.3.73:2230 task reports connection:true Apr 23 20:27:47.945 INFO [1] send last flush ID to this DS: 1572 Apr 23 20:27:47.945 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Transition from Replay to Active Apr 23 20:27:47.946 INFO [2] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Offline Offline Replay ds_transition to Active Apr 23 20:27:47.946 INFO [2] Transition from Replay to Active Apr 23 20:27:47.946 INFO [0] downstairs client at 172.20.3.73:2210 has UUID 14173ed5-e3b9-4bd6-d63d-ed81b 791c86c Apr 23 20:27:47.946 INFO [2] Starts cmd_loop Apr 23 20:27:47.946 INFO [0] Got region def RegionDefinition { block_size: 512, extent_size: Block { valu e: 64000, shift: 9 }, extent_count: 64, uuid: 14173ed5-e3b9-4bd6-d63d-ed81b791c86c, encrypted: false } Apr 23 20:27:47.946 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Offline Active Apr 23 20:27:47.946 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Offline Active Apr 23 20:27:47.946 INFO Returning client:0 UUID:14173ed5-e3b9-4bd6-d63d-ed81b791c86c matches Apr 23 20:27:47.946 INFO [1] replied this last flush ID: 1572 Apr 23 20:27:47.946 INFO [0] send last flush ID to this DS: 1572 Apr 23 20:27:47.946 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Offline Offline Active ds_transition to Replay Apr 23 20:27:47.946 INFO [1] Transition from Offline to Replay Apr 23 20:27:47.946 INFO [1] 172.20.3.73:2220 task reports connection:true Apr 23 20:27:47.946 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Replay Active Apr 23 20:27:47.946 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Transition from Replay to Active Apr 23 20:27:47.946 INFO [1] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Offline Replay Active ds_transition to Active Apr 23 20:27:47.946 INFO [1] Transition from Replay to Active Apr 23 20:27:47.946 INFO [1] Starts cmd_loop Apr 23 20:27:47.946 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Offline Active Active Apr 23 20:27:47.946 INFO [0] replied this last flush ID: 1572 Apr 23 20:27:47.946 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Offline Active Active ds_transition to Replay Apr 23 20:27:47.946 INFO [0] Transition from Offline to Replay Apr 23 20:27:47.946 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Transition from Replay to Active Apr 23 20:27:47.946 INFO [0] 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 (96f8ff7c-9345-474a-9252-74a7704ac738) Replay Active Active ds_transition to Active Apr 23 
20:27:47.946 INFO [0] Transition from Replay to Active
Apr 23 20:27:47.946 INFO [0] Starts cmd_loop
Apr 23 20:27:47.946 INFO [0] 172.20.3.73:2210 task reports connection:true
Apr 23 20:27:47.946 INFO 0cedae45-3d6e-4d90-b2cb-56f1a1a42a89 Active Active Active
Apr 23 20:27:48.946 WARN [0] flow control end
Apr 23 20:27:48.946 WARN [2] flow control end
Apr 23 20:27:48.946 WARN [1] flow control end
```
gjcolombo commented 1 year ago

This is indeed the cousin of #302. First we try to pause all entities:

```
Apr 23 20:26:11.813 INFO State worker handling event, event: External(Stop), component: vm_state_worker
Apr 23 20:26:11.815 INFO Sending pause request to block-crucible-0cedae45-3d6e-4d90-b2cb-56f1a1a42a89, component: vm_controller
```

But the call to pause Crucible is synchronous, and Crucible is having a difficult time, so it doesn't pause right away. 12 seconds later, a request to start appears and is rejected because the VM is going away:

```
Apr 23 20:26:23.438 INFO Queuing external request, disposition: Deny(HaltPending), request: Start, component: external_request_queue
```
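In rough Rust pseudocode (not the actual propolis-server implementation; all type and field names here are illustrative), the queue's disposition check behaves something like this:

```rust
// Illustrative sketch of the behavior visible in the log line above,
// not the real external request queue code.
enum ExternalRequest {
    Start,
    Reboot,
    Stop,
}

#[derive(Debug)]
enum DenyReason {
    HaltPending,
}

enum Disposition {
    Enqueue,
    Deny(DenyReason),
}

struct RequestQueue {
    /// Set once a Stop request has been accepted; from then on the
    /// instance is "preparing to stop".
    halt_pending: bool,
}

impl RequestQueue {
    fn disposition_for(&self, req: &ExternalRequest) -> Disposition {
        match req {
            // Once a halt is in flight, a Start can never be honored,
            // which is what surfaces as the 400 above.
            ExternalRequest::Start if self.halt_pending => {
                Disposition::Deny(DenyReason::HaltPending)
            }
            _ => Disposition::Enqueue,
        }
    }
}
```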

33 seconds later, Crucible unsticks itself and finishes pausing, which lets the VM controller proceed to actually shut down the VM:

```
Apr 23 20:26:56.955 INFO vcpu_tasks: exit all
Sending pause request to qemu-fwcfg, component: vm_controller
Apr 23 20:26:56.955 INFO Sending pause request to qemu-ramfbcontroller: halt entities, component: vm_controller
Apr 23 20:26:56.955 INFO Waiting for entities to pause, component: vm_controller
```

After this, the instance is destroyed (i.e. no longer ensured), so attempting to run it fails with an "instance not present" error instead of an "invalid transition" error.
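Schematically (again illustrative, not the real handler), the 400-then-500 sequence falls out of the server holding at most one VM controller in an `Option`:

```rust
// Illustrative sketch only: how "preparing to stop" (400) differs from
// "no instance" (500) once teardown completes.
struct VmController; // stand-in for the real controller

struct Server {
    controller: Option<VmController>,
}

enum ApiError {
    BadRequest(String),    // surfaces as the 400 above
    InternalError(String), // surfaces as the 500 above
}

impl Server {
    // `halt_pending` is a simplification of the request queue state.
    fn put_state_run(&self, halt_pending: bool) -> Result<(), ApiError> {
        match &self.controller {
            // The VM still exists but a stop is in flight: the request
            // queue rejects the transition.
            Some(_) if halt_pending => Err(ApiError::BadRequest(
                "Instance is preparing to stop".to_string(),
            )),
            Some(_) => Ok(()),
            // Teardown finished and the controller was dropped: there is
            // no instance left to operate on.
            None => Err(ApiError::InternalError(
                "Server not initialized (no instance)".to_string(),
            )),
        }
    }
}
```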

gjcolombo commented 1 year ago

Note, however, that any long-running entity pause operation, whether synchronous (the call to pause the entity didn't return right away, as in this case) or asynchronous (the paused future takes forever to complete), will cause this sort of problem.
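Sketched in Rust (a hypothetical shape, not propolis's exact state driver code): the driver gathers one pause future per entity and awaits them all, so a single stuck entity wedges the whole stop/reboot:

```rust
use futures::future::{join_all, BoxFuture};

// Illustrative sketch: each entity hands back a future that resolves once
// it has quiesced, and the state driver waits for every one of them.
async fn wait_for_pause(paused_futures: Vec<BoxFuture<'static, ()>>) {
    // If any single future never resolves (and a synchronous pause that
    // never returns is worse still: we'd never even get here), this await
    // blocks the state worker indefinitely, and with it every subsequent
    // step of the stop or reboot.
    join_all(paused_futures).await;
}
```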

jordanhendricks commented 1 year ago

We've seen this issue crop up in dogfood, with instances being stuck stopping/rebooting (usually because crucible is stuck). One example is oxidecomputer/crucible#837.

jordanhendricks commented 1 year ago

I don't think this is a propolis problem per se: I think what we really need is for entities to forcefully shut themselves down when requested to. I presume that will require some changes on the crucible end, but I haven't looked into what that would entail yet.
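One hypothetical shape for that (a sketch of the proposal only; `force_stop` is not an existing propolis or crucible API):

```rust
use std::time::Duration;

// Hypothetical sketch: alongside the cooperative quiesce, an entity would
// be obligated to stop within a bound, abandoning in-flight work if it must.
trait Lifecycle {
    /// Cooperative quiesce; today this may block indefinitely.
    fn pause(&self);

    /// Hypothetical forced variant: stop within `deadline`, failing any
    /// outstanding I/O rather than waiting on it.
    fn force_stop(&self, deadline: Duration);
}
```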

leftwo commented 1 year ago

Crucible did have a bug that prevented it from completing the deactivation, but crucible should also handle getting the rug pulled out from under it (though cleanup may be required, which would increase future activation time). We should look at the interface that propolis is using to disconnect from crucible and see if there is some additional disconnect interface we need to add to support this.

I think the current behavior is to try to disconnect and wait (possibly forever) until the upstairs has a clean shutdown.

Would we want another behavior, with a timeout or something like it, where we bound the wait, have crucible pull the rug out from under itself if the timer elapses, and then return to propolis?
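Something along those lines might look like this (a hypothetical sketch; `deactivate_cleanly` and `tear_down_abruptly` are illustrative stand-ins, not real crucible interfaces):

```rust
use std::time::Duration;
use tokio::time::sleep;

// Hypothetical sketch of the bounded-wait idea: race a clean deactivation
// against a timer, and fall back to abrupt teardown if the timer wins.
async fn deactivate_with_deadline(deadline: Duration) {
    tokio::select! {
        _ = deactivate_cleanly() => {
            // Upstairs flushed and shut down normally.
        }
        _ = sleep(deadline) => {
            // Timer elapsed: pull the rug, accepting that the next
            // activation may need extra repair/cleanup work.
            tear_down_abruptly().await;
        }
    }
}

// Illustrative stand-ins for whatever disconnect interface crucible
// ends up exposing.
async fn deactivate_cleanly() {}
async fn tear_down_abruptly() {}
```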