TritonDataCenter / smartos-live


OS-7317 vmadm "remove_filesystems" is both broken and undocumented #888

Closed: sjorge closed this 4 years ago

sjorge commented 4 years ago

This PR replaces #863

Original work done by @jclulow, man page update + tests done by @sjorge
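
For context, these filesystem operations are driven through vmadm update with a JSON payload. The sketch below is based on the fields exercised by the test run further down (type, source, target, options); remove_filesystems matching entries by their target is assumed here, and $uuid is just a placeholder for the zone's UUID.

# add a lofs filesystem to a zone, then remove it again
echo '{
  "add_filesystems": [
    {"type": "lofs", "source": "/tmp", "target": "/var/tmp/global", "options": ["nodevice"]}
  ]
}' | vmadm update $uuid

echo '{"remove_filesystems": ["/var/tmp/global"]}' | vmadm update $uuid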

[root@00-0c-29-51-55-da /usr/vm/test]# ./runtest tests/test-update.js
# Running tests/test-update.js
Already have "imgapi" image source "https://images.joyent.com", no change
Already have "docker" image source "https://docker.io", no change
TAP version 13
# create VM
ok 1 created VM: 7b64f9ad-f4f7-466f-9046-bca2923a6d95
# add net0
ok 2 failed to set ips, was ["10.254.254.254/24"], expected ["10.254.254.254/24"]
ok 3 failed to set netmask, was "255.255.255.0", expected "255.255.255.0"
ok 4 failed to set nic_tag, was "external", expected "external"
ok 5 failed to set vlan_id, was 0, expected 0
ok 6 failed to set gateways, was ["10.254.254.1"], expected ["10.254.254.1"]
ok 7 failed to set mac, was "01:02:03:04:05:06", expected "01:02:03:04:05:06"
ok 8 failed to set interface, was "net0", expected "net0"
# add IPv6 to net0
ok 9 failed to set ips, was ["10.254.254.254/24","fd00::1/64","addrconf"], expected ["10.254.254.254/24","fd00::1/64","addrconf"]
ok 10 failed to set mac, was "01:02:03:04:05:06", expected "01:02:03:04:05:06"
# add net1 -- bad IP
ok 11 failed to add nic with invalid IP: Invalid IP for NIC: {"nic_tag":"external","vlan_id":0,"mac":"82:b3:5f:c6:74:97","physical":"net1","ips":["10.99.99.12,10.99.99.33,10.99.99.34/24"],"gateways":["10.254.254.1"]}
# add KVM-only property to zone
ok 12 VM has [1 vs. 1] nics
ok 13 allow_unfiltered_promisc is not set
# remove net0
ok 14 Successfully removed net0 from VM
# add net0 and net1
ok 15 failed to set ips, was ["10.254.254.254/24"], expected ["10.254.254.254/24"]
ok 16 failed to set gateways, was ["10.254.254.1"], expected ["10.254.254.1"]
ok 17 failed to set ips, was ["10.254.254.253/24"], expected ["10.254.254.253/24"]
ok 18 failed to set gateways, was ["10.254.254.1"], expected ["10.254.254.1"]
# remove net0 and net1
ok 19 Successfully removed net0 and net1 from VM
# add 3 nics, 2 non-private
ok 20 2nd NIC is primary: true
ok 21 updated VM: 7b64f9ad-f4f7-466f-9046-bca2923a6d95
# remove net1 -- 1st time
ok 22 Successfully removed net1 from VM
ok 23 2nd NIC is primary: true
ok 24 [{"interface":"net0","mac":"01:02:03:04:05:06","vlan_id":0,"nic_tag":"external","gateway":"10.254.254.1","gateways":["10.254.254.1"],"netmask":"255.255.255.0","ip":"10.254.254.254","ips":["10.254.254.254/24"]},{"interface":"net2","mac":"02:03:04:05:06:08","vlan_id":253,"nic_tag":"external","gateway":"169.254.169.1","gateways":["169.254.169.1"],"netmask":"255.255.255.0","ip":"169.254.169.253","ips":["169.254.169.253/24"],"primary":true}]
# remove net0 and net2 -- 1st time
ok 25 Successfully removed net0 and net2 from VM
# add 3 nics, 1 non-private
ok 26 2nd NIC is primary: true
ok 27 updated VM: 7b64f9ad-f4f7-466f-9046-bca2923a6d95
# remove net1 -- 2nd time
ok 28 Successfully removed net0 and net1 from VM
ok 29 1st NIC is primary: true
# remove net0 and net2 -- 2nd time
ok 30 Successfully removed net0 and net2 from VM
# add empty NICs
ok 31 (unnamed assert)
ok 32 (unnamed assert)
ok 33 (unnamed assert)
ok 34 Successfully added empty NICs to VM
ok 35 Successfully added empty NICs to VM
ok 36 Successfully added empty NICs to VM
ok 37 Successfully added empty NICs to VM
# remove empty, primary NIC
ok 38 (unnamed assert)
ok 39 (unnamed assert)
ok 40 (unnamed assert)
# add IPs to empty NIC
ok 41 (unnamed assert)
ok 42 (unnamed assert)
ok 43 (unnamed assert)
ok 44 Successfully removed empty NIC from VM
# remove IPs to make empty NIC
ok 45 (unnamed assert)
ok 46 (unnamed assert)
ok 47 (unnamed assert)
ok 48 Successfully removed "ip" from NIC
ok 49 Successfully removed "ips" from NIC
# clean up from empty tests
ok 50 (unnamed assert)
ok 51 (unnamed assert)
ok 52 Remove last empty NIC from VM
# add NIC with minimal properties
ok 53 failed reloading VM
ok 54 VM has 1 nics, expected: 1
ok 55 prop is expected: interface
ok 56 prop interface is not undefined
ok 57 prop is expected: mac
ok 58 prop mac is not undefined
ok 59 prop is expected: nic_tag
ok 60 prop nic_tag is not undefined
ok 61 prop is expected: ip
ok 62 prop ip is not undefined
ok 63 prop is expected: ips
ok 64 prop ips is not undefined
# set then unset simple properties
ok 65 alias is useless VM, expected: useless VM
ok 66 alias is undefined, expected: undefined
ok 67 billing_id is 9.99, expected: 9.99
ok 68 billing_id is undefined, expected: undefined
ok 69 hostname is hamburgerhelper, expected: hamburgerhelper
ok 70 hostname is undefined, expected: undefined
ok 71 owner_uuid is 36bf401a-28ef-11e1-b4a7-c344deb1a5d6, expected: 36bf401a-28ef-11e1-b4a7-c344deb1a5d6
ok 72 owner_uuid is undefined, expected: undefined
ok 73 package_name is really expensive package, expected: really expensive package
ok 74 package_name is undefined, expected: undefined
ok 75 package_version is XP, expected: XP
ok 76 package_version is undefined, expected: undefined
# update quota
ok 77 updated quota now: 13G vs 13G
# remove quota
ok 78 updated quota now: none vs none
# update ram 512
ok 79 vm.max_physical_memory: 512 expected: 512
ok 80 vm.max_locked_memory: 512 expected: 512
ok 81 vm.max_swap: 512 expected: 512
# update ram 128
ok 82 vm.max_physical_memory: 128 expected: 128
ok 83 vm.max_locked_memory: 128 expected: 128
ok 84 vm.max_swap: 256 expected: 256
# update ram 256
ok 85 vm.max_physical_memory: 256 expected: 256
ok 86 vm.max_locked_memory: 256 expected: 256
ok 87 vm.max_swap: 256 expected: 256
# update ram 64
ok 88 vm.max_physical_memory: 64 expected: 64
ok 89 vm.max_locked_memory: 64 expected: 64
ok 90 vm.max_swap: 256 expected: 256
# update ram 1024
ok 91 vm.max_physical_memory: 1024 expected: 1024
ok 92 vm.max_locked_memory: 1024 expected: 1024
ok 93 vm.max_swap: 1024 expected: 1024
# update max_swap (up)
ok 94 vm.max_swap: 1536 expected: 1536
ok 95 vm.tmpfs: 1024 expected: 1024
ok 96 vm.max_physical_memory: 1024 expected: 1024
ok 97 vm.max_locked_memory: 1024 expected: 1024
# update max_swap (down)
ok 98 vm.max_swap: 1024 expected: 1024
ok 99 vm.tmpfs: 1024 expected: 1024
ok 100 vm.max_physical_memory: 1024 expected: 1024
ok 101 vm.max_locked_memory: 1024 expected: 1024
# update max_physical_memory (up)
ok 102 vm.max_swap: 2048 expected: 2048
ok 103 vm.tmpfs: 2048 expected: 2048
ok 104 vm.max_physical_memory: 2048 expected: 2048
ok 105 vm.max_locked_memory: 2048 expected: 2048
# update max_physical_memory (down)
ok 106 vm.max_swap: 512 expected: 512
ok 107 vm.tmpfs: 512 expected: 512
ok 108 vm.max_physical_memory: 512 expected: 512
ok 109 vm.max_locked_memory: 512 expected: 512
# update max_locked_memory
ok 110 vm.max_swap: 512 expected: 512
ok 111 vm.tmpfs: 512 expected: 512
ok 112 vm.max_physical_memory: 512 expected: 512
ok 113 vm.max_locked_memory: 512 expected: 512
# update resolvers when empty
ok 114 resolvers after update: ["4.2.2.1","4.2.2.2"]
# update resolvers to empty when filled
ok 115 resolvers after update: []
# update resolvers to empty when empty
ok 116 resolvers after update: []
# update shm rctls
ok 117 max_msg_ids value before test: 4096
ok 118 max_sem_ids value before test: 4096
ok 119 max_shm_ids value before test: 4096
ok 120 max_shm_memory value before test: 256
ok 121 max_msg_ids value after test: 3333
ok 122 max_sem_ids value after test: 2332
ok 123 max_shm_ids value after test: 2345
ok 124 max_shm_memory value after test: 1234
# remove cpu_cap
ok 125 cpu_cap is 1600 to start
ok 126 cpu_cap is gone
# set low quota
ok 127 update quota=1: success
# fill up zoneroot
ok 128 expected short write
# get vmobj for full VM
ok 129 load VM: success
# bump max_physical_memory
ok 130 update max_physical_memory: success
# raise quota to 2
ok 131 update quota=2: success
# get vmobj for full VM after modifications
ok 132 load VM: success
ok 133 check max_physical_memory
ok 134 check quota
# attempt to modify unmodifiable properties
ok 135 load VM: success
ok 136 update unmodifiable VM property "brand" to "bogus-brand": success
ok 137 load VM: success
ok 138 value has not been modified (original: "joyent", found "joyent")
ok 139 update unmodifiable VM property "hvm" to "bogus-hvm": success
ok 140 load VM: success
ok 141 value has not been modified (original: false, found false)
ok 142 update unmodifiable VM property "last_modified" to "bogus-last-modified": success
ok 143 update unmodifiable VM property "server_uuid" to "00000000-0000-0000-0000-000000000000": success
ok 144 load VM: success
ok 145 value has not been modified (original: "564d0a56-64f5-ac53-2414-89acd25155da", found "564d0a56-64f5-ac53-2414-89acd25155da")
ok 146 update unmodifiable VM property "uuid" to "00000000-0000-0000-0000-000000000000": success
ok 147 load VM: success
ok 148 value has not been modified (original: "7b64f9ad-f4f7-466f-9046-bca2923a6d95", found "7b64f9ad-f4f7-466f-9046-bca2923a6d95")
ok 149 update unmodifiable VM property "zonename" to "bogus-zonename": success
ok 150 load VM: success
ok 151 value has not been modified (original: "7b64f9ad-f4f7-466f-9046-bca2923a6d95", found "7b64f9ad-f4f7-466f-9046-bca2923a6d95")
ok 152 unmodifiable properties: success
# attempt to remove and set zonecfg properties
ok 153 update VM property "cpu_shares" to undefined: success
ok 154 update VM property "cpu_shares" to 5: success
ok 155 update VM property "cpu_shares" to undefined: success
ok 156 update VM property "cpu_shares" to undefined: success
ok 157 update VM property "limit_priv" to "": success
ok 158 update VM property "limit_priv" to "default": success
ok 159 update VM property "limit_priv" to "default,dtrace_user": success
ok 160 update VM property "limit_priv" to "": success
ok 161 update VM property "limit_priv" to "": success
ok 162 update VM property "max_lwps" to undefined: success
ok 163 update VM property "max_lwps" to 5000: success
ok 164 update VM property "max_lwps" to undefined: success
ok 165 update VM property "max_lwps" to undefined: success
ok 166 update VM property "max_msg_ids" to undefined: success
ok 167 update VM property "max_msg_ids" to 5000: success
ok 168 update VM property "max_msg_ids" to undefined: success
ok 169 update VM property "max_msg_ids" to undefined: success
ok 170 update VM property "max_shm_ids" to undefined: success
ok 171 update VM property "max_shm_ids" to 5000: success
ok 172 update VM property "max_shm_ids" to undefined: success
ok 173 update VM property "max_shm_ids" to undefined: success
ok 174 update VM property "max_shm_memory" to undefined: success
ok 175 update VM property "max_shm_memory" to 5000: success
ok 176 update VM property "max_shm_memory" to undefined: success
ok 177 update VM property "max_shm_memory" to undefined: success
ok 178 update VM property "zfs_io_priority" to undefined: success
ok 179 update VM property "zfs_io_priority" to 50: success
ok 180 update VM property "zfs_io_priority" to undefined: success
ok 181 update VM property "zfs_io_priority" to undefined: success
ok 182 zonecfg properties: success
# add fs /var/tmp/global
ok 183 field type was set to "lofs"
ok 184 field source was set to "/tmp"
ok 185 field target was set to "/var/tmp/global"
ok 186 field options was set to ["nodevice"]
# remove fs /var/tmp/global
ok 187 Successfully removed filesystem from VM
# delete zone
ok 188 deleted VM: 7b64f9ad-f4f7-466f-9046-bca2923a6d95

1..188
# tests 188
# pass  188

# ok
#
# tests/test-update.js TEST COMPLETE IN 67 SECONDS, SUMMARY:
#
# PASS: 188 / 188
#

I couldn't get a full ./runtests to complete because both kvm and bhyve fail under VMware Fusion. I verified that they also fail on an unmodified PI. An ISO with the changes is available here: https://pkg.blackdot.be/extras/platform-20191224T122406Z.iso

Additional testing: I have been using the original change for a few months now, carrying it around as a patch along with a few other bits.

sjorge commented 4 years ago

@jclulow @jlevon @rzezeski ping

sjorge commented 4 years ago

Some small progress: the test I added runs, but it fails... the changes I made are probably not enough. I'll look at it some more later this week.

not ok 187 field options was set to ["nodevice"], but expected value is ["nodevice","ro"]
  ---
    type:    AssertionError
    message: field options was set to ["nodevice"], but expected value is ["nodevice","ro"]
    code:    ~
    errno:   ~
    file:    /usr/vm/node_modules/vminfod/client.js
    line:    292
    column:  13
    stack:
      - /usr/vm/test/tests/test-update.js:1527:27
      - /usr/vm/node_modules/VM.js:1695:9
      - /usr/vm/node_modules/vmload/index.js:689:13
      - /usr/vm/node_modules/vmload/index.js:699:13
      - IncomingMessage.resEnd (/usr/vm/node_modules/vminfod/client.js:292:13)
      - IncomingMessage.EventEmitter.emit (events.js:117:20)
      - _stream_readable.js:920:16
      - process._tickDomainCallback (node.js:459:13)
    wanted:  true
    found:   false
  ...
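
For reference, the failing assertion above corresponds to an update_filesystems payload roughly like the following (a sketch; the exact payload the test builds may differ, and matching the entry by its target is assumed):

echo '{
  "update_filesystems": [
    {"target": "/var/tmp/global", "options": ["nodevice", "ro"]}
  ]
}' | vmadm update $uuid
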
sjorge commented 4 years ago

Success! Will upload a new ISO and squash the commits.

# Running tests/test-update.js
Already have "imgapi" image source "https://images.joyent.com", no change
Already have "docker" image source "https://docker.io", no change
TAP version 13
# create VM
ok 1 created VM: aa8f953f-e503-e927-e77b-d2ce08449f61
# add net0
ok 2 failed to set ips, was ["10.254.254.254/24"], expected ["10.254.254.254/24"]
ok 3 failed to set netmask, was "255.255.255.0", expected "255.255.255.0"
ok 4 failed to set nic_tag, was "external", expected "external"
ok 5 failed to set vlan_id, was 0, expected 0
ok 6 failed to set gateways, was ["10.254.254.1"], expected ["10.254.254.1"]
ok 7 failed to set mac, was "01:02:03:04:05:06", expected "01:02:03:04:05:06"
ok 8 failed to set interface, was "net0", expected "net0"
# add IPv6 to net0
ok 9 failed to set ips, was ["10.254.254.254/24","fd00::1/64","addrconf"], expected ["10.254.254.254/24","fd00::1/64","addrconf"]
ok 10 failed to set mac, was "01:02:03:04:05:06", expected "01:02:03:04:05:06"
# add net1 -- bad IP
ok 11 failed to add nic with invalid IP: Invalid IP for NIC: {"nic_tag":"external","vlan_id":0,"mac":"22:ea:73:40:38:d3","physical":"net1","ips":["10.99.99.12,10.99.99.33,10.99.99.34/24"],"gateways":["10.254.254.1"]}
# add KVM-only property to zone
ok 12 VM has [1 vs. 1] nics
ok 13 allow_unfiltered_promisc is not set
# remove net0
ok 14 Successfully removed net0 from VM
# add net0 and net1
ok 15 failed to set ips, was ["10.254.254.254/24"], expected ["10.254.254.254/24"]
ok 16 failed to set gateways, was ["10.254.254.1"], expected ["10.254.254.1"]
ok 17 failed to set ips, was ["10.254.254.253/24"], expected ["10.254.254.253/24"]
ok 18 failed to set gateways, was ["10.254.254.1"], expected ["10.254.254.1"]
# remove net0 and net1
ok 19 Successfully removed net0 and net1 from VM
# add 3 nics, 2 non-private
ok 20 2nd NIC is primary: true
ok 21 updated VM: aa8f953f-e503-e927-e77b-d2ce08449f61
# remove net1 -- 1st time
ok 22 Successfully removed net1 from VM
ok 23 2nd NIC is primary: true
ok 24 [{"interface":"net0","mac":"01:02:03:04:05:06","vlan_id":0,"nic_tag":"external","gateway":"10.254.254.1","gateways":["10.254.254.1"],"netmask":"255.255.255.0","ip":"10.254.254.254","ips":["10.254.254.254/24"]},{"interface":"net2","mac":"02:03:04:05:06:08","vlan_id":253,"nic_tag":"external","gateway":"169.254.169.1","gateways":["169.254.169.1"],"netmask":"255.255.255.0","ip":"169.254.169.253","ips":["169.254.169.253/24"],"primary":true}]
# remove net0 and net2 -- 1st time
ok 25 Successfully removed net0 and net2 from VM
# add 3 nics, 1 non-private
ok 26 2nd NIC is primary: true
ok 27 updated VM: aa8f953f-e503-e927-e77b-d2ce08449f61
# remove net1 -- 2nd time
ok 28 Successfully removed net0 and net1 from VM
ok 29 1st NIC is primary: true
# remove net0 and net2 -- 2nd time
ok 30 Successfully removed net0 and net2 from VM
# add empty NICs
ok 31 (unnamed assert)
ok 32 (unnamed assert)
ok 33 (unnamed assert)
ok 34 Successfully added empty NICs to VM
ok 35 Successfully added empty NICs to VM
ok 36 Successfully added empty NICs to VM
ok 37 Successfully added empty NICs to VM
# remove empty, primary NIC
ok 38 (unnamed assert)
ok 39 (unnamed assert)
ok 40 (unnamed assert)
# add IPs to empty NIC
ok 41 (unnamed assert)
ok 42 (unnamed assert)
ok 43 (unnamed assert)
ok 44 Successfully removed empty NIC from VM
# remove IPs to make empty NIC
ok 45 (unnamed assert)
ok 46 (unnamed assert)
ok 47 (unnamed assert)
ok 48 Successfully removed "ip" from NIC
ok 49 Successfully removed "ips" from NIC
# clean up from empty tests
ok 50 (unnamed assert)
ok 51 (unnamed assert)
ok 52 Remove last empty NIC from VM
# add NIC with minimal properties
ok 53 failed reloading VM
ok 54 VM has 1 nics, expected: 1
ok 55 prop is expected: interface
ok 56 prop interface is not undefined
ok 57 prop is expected: mac
ok 58 prop mac is not undefined
ok 59 prop is expected: nic_tag
ok 60 prop nic_tag is not undefined
ok 61 prop is expected: ip
ok 62 prop ip is not undefined
ok 63 prop is expected: ips
ok 64 prop ips is not undefined
# set then unset simple properties
ok 65 alias is useless VM, expected: useless VM
ok 66 alias is undefined, expected: undefined
ok 67 billing_id is 9.99, expected: 9.99
ok 68 billing_id is undefined, expected: undefined
ok 69 hostname is hamburgerhelper, expected: hamburgerhelper
ok 70 hostname is undefined, expected: undefined
ok 71 owner_uuid is 36bf401a-28ef-11e1-b4a7-c344deb1a5d6, expected: 36bf401a-28ef-11e1-b4a7-c344deb1a5d6
ok 72 owner_uuid is undefined, expected: undefined
ok 73 package_name is really expensive package, expected: really expensive package
ok 74 package_name is undefined, expected: undefined
ok 75 package_version is XP, expected: XP
ok 76 package_version is undefined, expected: undefined
# update quota
ok 77 updated quota now: 13G vs 13G
# remove quota
ok 78 updated quota now: none vs none
# update ram 512
ok 79 vm.max_physical_memory: 512 expected: 512
ok 80 vm.max_locked_memory: 512 expected: 512
ok 81 vm.max_swap: 512 expected: 512
# update ram 128
ok 82 vm.max_physical_memory: 128 expected: 128
ok 83 vm.max_locked_memory: 128 expected: 128
ok 84 vm.max_swap: 256 expected: 256
# update ram 256
ok 85 vm.max_physical_memory: 256 expected: 256
ok 86 vm.max_locked_memory: 256 expected: 256
ok 87 vm.max_swap: 256 expected: 256
# update ram 64
ok 88 vm.max_physical_memory: 64 expected: 64
ok 89 vm.max_locked_memory: 64 expected: 64
ok 90 vm.max_swap: 256 expected: 256
# update ram 1024
ok 91 vm.max_physical_memory: 1024 expected: 1024
ok 92 vm.max_locked_memory: 1024 expected: 1024
ok 93 vm.max_swap: 1024 expected: 1024
# update max_swap (up)
ok 94 vm.max_swap: 1536 expected: 1536
ok 95 vm.tmpfs: 1024 expected: 1024
ok 96 vm.max_physical_memory: 1024 expected: 1024
ok 97 vm.max_locked_memory: 1024 expected: 1024
# update max_swap (down)
ok 98 vm.max_swap: 1024 expected: 1024
ok 99 vm.tmpfs: 1024 expected: 1024
ok 100 vm.max_physical_memory: 1024 expected: 1024
ok 101 vm.max_locked_memory: 1024 expected: 1024
# update max_physical_memory (up)
ok 102 vm.max_swap: 2048 expected: 2048
ok 103 vm.tmpfs: 2048 expected: 2048
ok 104 vm.max_physical_memory: 2048 expected: 2048
ok 105 vm.max_locked_memory: 2048 expected: 2048
# update max_physical_memory (down)
ok 106 vm.max_swap: 512 expected: 512
ok 107 vm.tmpfs: 512 expected: 512
ok 108 vm.max_physical_memory: 512 expected: 512
ok 109 vm.max_locked_memory: 512 expected: 512
# update max_locked_memory
ok 110 vm.max_swap: 512 expected: 512
ok 111 vm.tmpfs: 512 expected: 512
ok 112 vm.max_physical_memory: 512 expected: 512
ok 113 vm.max_locked_memory: 512 expected: 512
# update resolvers when empty
ok 114 resolvers after update: ["4.2.2.1","4.2.2.2"]
# update resolvers to empty when filled
ok 115 resolvers after update: []
# update resolvers to empty when empty
ok 116 resolvers after update: []
# update shm rctls
ok 117 max_msg_ids value before test: 4096
ok 118 max_sem_ids value before test: 4096
ok 119 max_shm_ids value before test: 4096
ok 120 max_shm_memory value before test: 256
ok 121 max_msg_ids value after test: 3333
ok 122 max_sem_ids value after test: 2332
ok 123 max_shm_ids value after test: 2345
ok 124 max_shm_memory value after test: 1234
# remove cpu_cap
ok 125 cpu_cap is 1600 to start
ok 126 cpu_cap is gone
# set low quota
ok 127 update quota=1: success
# fill up zoneroot
ok 128 expected short write
# get vmobj for full VM
ok 129 load VM: success
# bump max_physical_memory
ok 130 update max_physical_memory: success
# raise quota to 2
ok 131 update quota=2: success
# get vmobj for full VM after modifications
ok 132 load VM: success
ok 133 check max_physical_memory
ok 134 check quota
# attempt to modify unmodifiable properties
ok 135 load VM: success
ok 136 update unmodifiable VM property "brand" to "bogus-brand": success
ok 137 load VM: success
ok 138 value has not been modified (original: "joyent", found "joyent")
ok 139 update unmodifiable VM property "hvm" to "bogus-hvm": success
ok 140 load VM: success
ok 141 value has not been modified (original: false, found false)
ok 142 update unmodifiable VM property "last_modified" to "bogus-last-modified": success
ok 143 update unmodifiable VM property "server_uuid" to "00000000-0000-0000-0000-000000000000": success
ok 144 load VM: success
ok 145 value has not been modified (original: "564d0a56-64f5-ac53-2414-89acd25155da", found "564d0a56-64f5-ac53-2414-89acd25155da")
ok 146 update unmodifiable VM property "uuid" to "00000000-0000-0000-0000-000000000000": success
ok 147 load VM: success
ok 148 value has not been modified (original: "aa8f953f-e503-e927-e77b-d2ce08449f61", found "aa8f953f-e503-e927-e77b-d2ce08449f61")
ok 149 update unmodifiable VM property "zonename" to "bogus-zonename": success
ok 150 load VM: success
ok 151 value has not been modified (original: "aa8f953f-e503-e927-e77b-d2ce08449f61", found "aa8f953f-e503-e927-e77b-d2ce08449f61")
ok 152 unmodifiable properties: success
# attempt to remove and set zonecfg properties
ok 153 update VM property "cpu_shares" to undefined: success
ok 154 update VM property "cpu_shares" to 5: success
ok 155 update VM property "cpu_shares" to undefined: success
ok 156 update VM property "cpu_shares" to undefined: success
ok 157 update VM property "limit_priv" to "": success
ok 158 update VM property "limit_priv" to "default": success
ok 159 update VM property "limit_priv" to "default,dtrace_user": success
ok 160 update VM property "limit_priv" to "": success
ok 161 update VM property "limit_priv" to "": success
ok 162 update VM property "max_lwps" to undefined: success
ok 163 update VM property "max_lwps" to 5000: success
ok 164 update VM property "max_lwps" to undefined: success
ok 165 update VM property "max_lwps" to undefined: success
ok 166 update VM property "max_msg_ids" to undefined: success
ok 167 update VM property "max_msg_ids" to 5000: success
ok 168 update VM property "max_msg_ids" to undefined: success
ok 169 update VM property "max_msg_ids" to undefined: success
ok 170 update VM property "max_shm_ids" to undefined: success
ok 171 update VM property "max_shm_ids" to 5000: success
ok 172 update VM property "max_shm_ids" to undefined: success
ok 173 update VM property "max_shm_ids" to undefined: success
ok 174 update VM property "max_shm_memory" to undefined: success
ok 175 update VM property "max_shm_memory" to 5000: success
ok 176 update VM property "max_shm_memory" to undefined: success
ok 177 update VM property "max_shm_memory" to undefined: success
ok 178 update VM property "zfs_io_priority" to undefined: success
ok 179 update VM property "zfs_io_priority" to 50: success
ok 180 update VM property "zfs_io_priority" to undefined: success
ok 181 update VM property "zfs_io_priority" to undefined: success
ok 182 zonecfg properties: success
# add fs /var/tmp/global
ok 183 field type was set to "lofs"
ok 184 field source was set to "/tmp"
ok 185 field target was set to "/var/tmp/global"
ok 186 field options was set to ["nodevice"]
# set fs /var/tmp/global as readonly
ok 187 field options was set to ["nodevice","ro"]
ok 188 field target was set to "/var/tmp/global"
# remove fs /var/tmp/global
ok 189 Successfully removed filesystem from VM
# delete zone
ok 190 deleted VM: aa8f953f-e503-e927-e77b-d2ce08449f61

1..190
# tests 190
# pass  190

# ok
#
# tests/test-update.js TEST COMPLETE IN 58 SECONDS, SUMMARY:
#
# PASS: 190 / 190
#
sjorge commented 4 years ago

@jlevon commits squashed; update_filesystems is working now and has a simple test.

New ISO: https://pkg.blackdot.be/extras/platform-20200107T130814Z.iso

sjorge commented 4 years ago

I did a full test run after fixing some of the issues that would hang the CN (related to vmx nesting).

The results can be found here: https://pkg.blackdot.be/extras/vmtest.1578429901.11881.tar.gz

#  TEST COMPLETE IN 4086 SECONDS, SUMMARY:
#
# PASS: 5063 / 5071
# FAIL: 8 / 5071
#
#  ** FAILED TESTS **
#  /usr/vm/test/tests/test-update-bhyve.js
#  /usr/vm/test/tests/test-bhyve-pci_slot.js
#
# log files available in: /tmp/vmtest.1578429901.11881
#

I did run test-update-bhyve.js again separately and it ran fine.

# Running ./tests/test-update-bhyve.js
Already have "imgapi" image source "https://images.joyent.com", no change
Already have "docker" image source "https://docker.io", no change
TAP version 13
# create bhyve VM
ok 1 created VM: 3444cd8b-b467-40e6-b541-96616b087be9
# add disk1 to bhyve VM
ok 2 models of disk0 and disk1 match [virtio,virtio]
# start bhyve VM
ok 3 error starting VM
# add net0 to bhyve VM
ok 4 failed to set primary, was true, expected true
ok 5 failed to set model, was "e1000", expected "e1000"
ok 6 failed to set ips, was ["10.254.254.254/24"], expected ["10.254.254.254/24"]
ok 7 failed to set netmask, was "255.255.255.0", expected "255.255.255.0"
ok 8 failed to set nic_tag, was "external", expected "external"
ok 9 failed to set vlan_id, was 0, expected 0
ok 10 failed to set gateways, was ["10.254.254.1"], expected ["10.254.254.1"]
ok 11 failed to set mac, was "00:02:03:04:05:06", expected "00:02:03:04:05:06"
ok 12 failed to set interface, was "net0", expected "net0"
# update nic model on bhyve VM
ok 13 obj.nics[0].model: virtio expected: virtio
# add net1 to bhyve VM
ok 14 models of net0 and net1 match [virtio,virtio]
# remove net0 from bhyve VM
ok 15 Successfully removed net0 from VM
# remove net1 from bhyve VM
ok 16 Successfully removed net0 from VM
# add net0 and net1 to bhyve VM
ok 17 failed to set primary, was true, expected true
ok 18 failed to set model, was "virtio", expected "virtio"
ok 19 failed to set ips, was ["10.254.254.254/24"], expected ["10.254.254.254/24"]
ok 20 failed to set netmask, was "255.255.255.0", expected "255.255.255.0"
ok 21 failed to set nic_tag, was "external", expected "external"
ok 22 failed to set vlan_id, was 0, expected 0
ok 23 failed to set gateways, was ["10.254.254.1"], expected ["10.254.254.1"]
ok 24 failed to set mac, was "00:02:03:04:05:06", expected "00:02:03:04:05:06"
ok 25 failed to set interface, was "net0", expected "net0"
ok 26 failed to set model, was "virtio", expected "virtio"
ok 27 failed to set ips, was ["10.254.254.253/24"], expected ["10.254.254.253/24"]
ok 28 failed to set netmask, was "255.255.255.0", expected "255.255.255.0"
ok 29 failed to set nic_tag, was "external", expected "external"
ok 30 failed to set vlan_id, was 253, expected 253
ok 31 failed to set gateways, was ["10.254.254.1"], expected ["10.254.254.1"]
ok 32 failed to set mac, was "02:03:04:05:06:07", expected "02:03:04:05:06:07"
ok 33 failed to set interface, was "net1", expected "net1"
# remove net0 and net1 from bhyve VM
ok 34 Successfully removed net0 and net1 from VM
# add 3 NICs to bhyve VM
ok 35 Successfully 3 NICs to VM
# remove net1 from bhyve VM -- 2nd time
ok 36 Successfully removed net1
# restart bhyve VM
ok 37 stopping VM
ok 38 starting VM
ok 39 VM is running after restart: running
# set simple properties on bhyve VM
ok 40 alias is useless VM (string), expected: useless VM (string)
ok 41 billing_id is 9.99 (string), expected: 9.99 (string)
ok 42 hostname is hamburgerhelper (string), expected: hamburgerhelper (string)
ok 43 owner_uuid is 36bf401a-28ef-11e1-b4a7-c344deb1a5d6 (string), expected: 36bf401a-28ef-11e1-b4a7-c344deb1a5d6 (string)
ok 44 bhyve_extra_opts is -c sockets=1,cores=2,threads=2 (string), expected: -c sockets=1,cores=2,threads=2 (string)
ok 45 bootrom is uefi (string), expected: uefi (string)
# update bhyve VM flexible_disk_size
ok 46 VM has 11264 != 11264
# update ram 2048 (1)
ok 47 vm.ram: 2048 expected: 2048
ok 48 vm.max_physical_memory: 3328 expected: 3328
ok 49 vm.max_locked_memory: 3328 expected: 3328
ok 50 vm.max_swap: 3328 expected: 3328
# update ram 1024 (2)
ok 51 vm.ram: 1024 expected: 1024
ok 52 vm.max_physical_memory: 2304 expected: 2304
ok 53 vm.max_locked_memory: 2304 expected: 2304
ok 54 vm.max_swap: 2304 expected: 2304
# update ram 1024 (3)
ok 55 vm.ram: 1024 expected: 1024
ok 56 vm.max_physical_memory: 2304 expected: 2304
ok 57 vm.max_locked_memory: 2304 expected: 2304
ok 58 vm.max_swap: 2304 expected: 2304
# update mixed mem properties on BHYVE VM
ok 59 vm.ram: 2048 expected: 2048
ok 60 vm.max_locked_memory: 2304 expected: 2304
ok 61 vm.max_swap: 2304 expected: 2304
ok 62 vm.max_physical_memory: 2304 expected: 2304
# update bhyve VM max_swap
ok 63 vm.max_swap: 4096 expected: 4096
ok 64 vm.max_physical_memory: 2304 expected: 2304
ok 65 vm.max_locked_memory: 2304 expected: 2304
# update bhyve VM max_swap to lower value than RAM
ok 66 vm.max_swap: 2304 expected: 2304
ok 67 vm.max_physical_memory: 2304 expected: 2304
ok 68 vm.max_locked_memory: 2304 expected: 2304
# update max_physical_memory to RAM + 256
ok 69 vm.max_swap: 2304 expected: 2304
ok 70 vm.max_physical_memory: 2304 expected: 2304
ok 71 vm.max_locked_memory: 2304 expected: 2304
# update max_physical_memory to RAM + 1024
ok 72 vm.max_swap: 3072 expected: 3072
ok 73 vm.max_physical_memory: 3072 expected: 3072
ok 74 vm.max_locked_memory: 3072 expected: 3072
# update max_swap to RAM - 64
ok 75 vm.max_swap: 3072 expected: 3072
# update max_locked_memory to high value
ok 76 vm.max_swap: 3072 expected: 3072
ok 77 vm.max_physical_memory: 3072 expected: 3072
ok 78 vm.max_locked_memory: 3072 expected: 3072
# set vnc_port=-1
ok 79 vm.vnc_port: -1 expected: -1
# enable / disable compression
ok 80 disk0 has compression off: off
ok 81 disk1 has compression off: off
ok 82 disk0 has compression=gzip: gzip
ok 83 disk1 has compression=gzip: gzip
ok 84 disk0 has compression=off: off
ok 85 disk1 has compression=off: off
# remove bhyve disks
ok 86 disk0 has a path: /dev/zvol/rdsk/zones/3444cd8b-b467-40e6-b541-96616b087be9/disk0
ok 87 disk1 has a path: /dev/zvol/rdsk/zones/3444cd8b-b467-40e6-b541-96616b087be9/disk1
ok 88 VM.update failed to remove disks: updates to disks are only allowed when state is "stopped", currently: running (running)
ok 89 VM.stop
ok 90 loaded VM after stop
ok 91 VM is stopped.
ok 92 removed disk: undefined
ok 93 ensure dataset no longer exists
ok 94 loaded VM after delete
ok 95 no disks member for final_obj
ok 96 disks list empty: []
# delete bhyve VM
ok 97 deleted VM: 3444cd8b-b467-40e6-b541-96616b087be9
# test 100%/10% refreservation, change to 50%/75%
ok 98 created VM: 69ac17d1-1973-6b60-8d02-918fb2d6bffd
ok 99 updating VM: success
ok 100 load VM: success
ok 101 disk 0 has correct refreservation: 2560/2560
ok 102 disk 1 has correct refreservation: 768/768
ok 103 deleted VM: 69ac17d1-1973-6b60-8d02-918fb2d6bffd

1..103
# tests 103
# pass  103

# ok
#
# ./tests/test-update-bhyve.js TEST COMPLETE IN 16 SECONDS, SUMMARY:
#
# PASS: 103 / 103
#

test-bhyve-pci_slot.js took three attempts before it passed without timeouts, but it does pass.

# Running ./tests/test-bhyve-pci_slot.js
Already have "imgapi" image source "https://images.joyent.com", no change
Already have "docker" image source "https://docker.io", no change
TAP version 13
# Verify disk.*.pci_slot are populated by VM.configure
ok 1 VM created with uuid acb4bb03-093f-c2dc-e0d6-e572b3e7194a
ok 2 Skipping damage - nothing to do
ok 3 Skipping update - nothing to do
ok 4 VM updated with operator script
ok 5 VM started
ok 6 zone stopped after running operator script
ok 7 VM loaded uuid acb4bb03-093f-c2dc-e0d6-e572b3e7194a
ok 8 Checking disk /dev/zvol/rdsk/zones/acb4bb03-093f-c2dc-e0d6-e572b3e7194a/disk0
ok 9 matching prop: image_uuid=462d1d03-8457-e134-a408-cf9ea2b9be96
ok 10 matching prop: boot=true
ok 11 matching prop: model=virtio
ok 12 matching prop: pci_slot=0:4:0
ok 13 disk /dev/zvol/rdsk/zones/acb4bb03-093f-c2dc-e0d6-e572b3e7194a/disk0 found
ok 14 Checking disk /dev/zvol/rdsk/zones/acb4bb03-093f-c2dc-e0d6-e572b3e7194a/disk1
ok 15 matching prop: image_uuid=undefined
ok 16 matching prop: model=virtio
ok 17 matching prop: pci_slot=0:4:1
ok 18 disk /dev/zvol/rdsk/zones/acb4bb03-093f-c2dc-e0d6-e572b3e7194a/disk1 found
ok 19 PCI slots occupied: 0:4:0 0:4:1
# Verify cdrom is in PCI slot 3:0
ok 20 VM created with uuid 5972a11a-6e26-c2c2-8cb3-8e323c5e69cd
ok 21 Skipping damage - nothing to do
ok 22 Skipping update - nothing to do
ok 23 Skipping guest PCI slot check
ok 24 Skipping guest start
ok 25 VM loaded uuid 5972a11a-6e26-c2c2-8cb3-8e323c5e69cd
ok 26 Checking disk /dev/zvol/rdsk/zones/5972a11a-6e26-c2c2-8cb3-8e323c5e69cd/disk0
ok 27 matching prop: image_uuid=462d1d03-8457-e134-a408-cf9ea2b9be96
ok 28 matching prop: boot=true
ok 29 matching prop: model=virtio
ok 30 matching prop: pci_slot=0:4:0
ok 31 disk /dev/zvol/rdsk/zones/5972a11a-6e26-c2c2-8cb3-8e323c5e69cd/disk0 found
ok 32 Checking disk /usr/share/bhyve/uefi-rom.bin
ok 33 matching prop: image_uuid=undefined
ok 34 matching prop: model=ahci
ok 35 matching prop: pci_slot=0:3:0
ok 36 matching prop: media=cdrom
ok 37 disk /usr/share/bhyve/uefi-rom.bin found
# Verify 8 disks automatically assigned properly
ok 38 VM created with uuid 4a963014-5807-ee11-cacd-ea7952fb04fe
ok 39 Skipping damage - nothing to do
ok 40 Skipping update - nothing to do
ok 41 VM updated with operator script
ok 42 VM started
ok 43 zone stopped after running operator script
ok 44 VM loaded uuid 4a963014-5807-ee11-cacd-ea7952fb04fe
ok 45 Checking disk /dev/zvol/rdsk/zones/4a963014-5807-ee11-cacd-ea7952fb04fe/disk0
ok 46 matching prop: image_uuid=462d1d03-8457-e134-a408-cf9ea2b9be96
ok 47 matching prop: boot=true
ok 48 matching prop: model=virtio
ok 49 matching prop: pci_slot=0:4:0
ok 50 disk /dev/zvol/rdsk/zones/4a963014-5807-ee11-cacd-ea7952fb04fe/disk0 found
ok 51 Checking disk /dev/zvol/rdsk/zones/4a963014-5807-ee11-cacd-ea7952fb04fe/disk1
ok 52 matching prop: pci_slot=0:4:1
ok 53 disk /dev/zvol/rdsk/zones/4a963014-5807-ee11-cacd-ea7952fb04fe/disk1 found
ok 54 Checking disk /dev/zvol/rdsk/zones/4a963014-5807-ee11-cacd-ea7952fb04fe/disk2
ok 55 matching prop: pci_slot=0:4:2
ok 56 disk /dev/zvol/rdsk/zones/4a963014-5807-ee11-cacd-ea7952fb04fe/disk2 found
ok 57 Checking disk /dev/zvol/rdsk/zones/4a963014-5807-ee11-cacd-ea7952fb04fe/disk3
ok 58 matching prop: pci_slot=0:4:3
ok 59 disk /dev/zvol/rdsk/zones/4a963014-5807-ee11-cacd-ea7952fb04fe/disk3 found
ok 60 Checking disk /dev/zvol/rdsk/zones/4a963014-5807-ee11-cacd-ea7952fb04fe/disk4
ok 61 matching prop: pci_slot=0:4:4
ok 62 disk /dev/zvol/rdsk/zones/4a963014-5807-ee11-cacd-ea7952fb04fe/disk4 found
ok 63 Checking disk /dev/zvol/rdsk/zones/4a963014-5807-ee11-cacd-ea7952fb04fe/disk5
ok 64 matching prop: pci_slot=0:4:5
ok 65 disk /dev/zvol/rdsk/zones/4a963014-5807-ee11-cacd-ea7952fb04fe/disk5 found
ok 66 Checking disk /dev/zvol/rdsk/zones/4a963014-5807-ee11-cacd-ea7952fb04fe/disk6
ok 67 matching prop: pci_slot=0:4:6
ok 68 disk /dev/zvol/rdsk/zones/4a963014-5807-ee11-cacd-ea7952fb04fe/disk6 found
ok 69 Checking disk /dev/zvol/rdsk/zones/4a963014-5807-ee11-cacd-ea7952fb04fe/disk7
ok 70 matching prop: pci_slot=0:4:7
ok 71 disk /dev/zvol/rdsk/zones/4a963014-5807-ee11-cacd-ea7952fb04fe/disk7 found
ok 72 PCI slots occupied: 0:4:0 0:4:1 0:4:2 0:4:3 0:4:4 0:4:5 0:4:6 0:4:7
# Verify create time assignments are sticky
ok 73 VM created with uuid e83a558a-fb0a-6238-83ec-d4efe8882485
ok 74 Skipping damage - nothing to do
ok 75 Skipping update - nothing to do
ok 76 VM updated with operator script
ok 77 VM started
ok 78 zone stopped after running operator script
ok 79 VM loaded uuid e83a558a-fb0a-6238-83ec-d4efe8882485
ok 80 Checking disk /dev/zvol/rdsk/zones/e83a558a-fb0a-6238-83ec-d4efe8882485/disk0
ok 81 matching prop: image_uuid=462d1d03-8457-e134-a408-cf9ea2b9be96
ok 82 matching prop: boot=true
ok 83 matching prop: model=virtio
ok 84 matching prop: pci_slot=0:4:0
ok 85 disk /dev/zvol/rdsk/zones/e83a558a-fb0a-6238-83ec-d4efe8882485/disk0 found
ok 86 Checking disk /dev/zvol/rdsk/zones/e83a558a-fb0a-6238-83ec-d4efe8882485/disk1
ok 87 matching prop: pci_slot=0:4:5
ok 88 disk /dev/zvol/rdsk/zones/e83a558a-fb0a-6238-83ec-d4efe8882485/disk1 found
ok 89 Checking disk /dev/zvol/rdsk/zones/e83a558a-fb0a-6238-83ec-d4efe8882485/disk2
ok 90 matching prop: pci_slot=0:4:6
ok 91 disk /dev/zvol/rdsk/zones/e83a558a-fb0a-6238-83ec-d4efe8882485/disk2 found
ok 92 Checking disk /dev/zvol/rdsk/zones/e83a558a-fb0a-6238-83ec-d4efe8882485/disk3
ok 93 matching prop: pci_slot=0:4:7
ok 94 disk /dev/zvol/rdsk/zones/e83a558a-fb0a-6238-83ec-d4efe8882485/disk3 found
ok 95 PCI slots occupied: 0:4:0 0:4:5 0:4:6 0:4:7
# Verify alternate slot schemes are allowed
ok 96 VM created with uuid 1744e9c1-d931-4a9a-ba33-df51a800f477
ok 97 Skipping damage - nothing to do
ok 98 Skipping update - nothing to do
ok 99 VM updated with operator script
ok 100 VM started
ok 101 zone stopped after running operator script
ok 102 VM loaded uuid 1744e9c1-d931-4a9a-ba33-df51a800f477
ok 103 Checking disk /dev/zvol/rdsk/zones/1744e9c1-d931-4a9a-ba33-df51a800f477/disk0
ok 104 matching prop: pci_slot=0:4:0
ok 105 disk /dev/zvol/rdsk/zones/1744e9c1-d931-4a9a-ba33-df51a800f477/disk0 found
ok 106 Checking disk /dev/zvol/rdsk/zones/1744e9c1-d931-4a9a-ba33-df51a800f477/disk1
ok 107 matching prop: pci_slot=4:1
ok 108 disk /dev/zvol/rdsk/zones/1744e9c1-d931-4a9a-ba33-df51a800f477/disk1 found
ok 109 Checking disk /dev/zvol/rdsk/zones/1744e9c1-d931-4a9a-ba33-df51a800f477/disk2
ok 110 matching prop: pci_slot=5
ok 111 disk /dev/zvol/rdsk/zones/1744e9c1-d931-4a9a-ba33-df51a800f477/disk2 found
ok 112 PCI slots occupied: 0:4:0 0:4:1 0:5:0
# Verify holes are filled
ok 113 VM created with uuid aedfc2b6-4c98-e435-d8b4-cb5bf7fdfa5e
ok 114 Skipping damage - nothing to do
ok 115 VM updated
ok 116 VM updated with operator script
ok 117 VM started
ok 118 zone stopped after running operator script
ok 119 VM loaded uuid aedfc2b6-4c98-e435-d8b4-cb5bf7fdfa5e
ok 120 Checking disk /dev/zvol/rdsk/zones/aedfc2b6-4c98-e435-d8b4-cb5bf7fdfa5e/disk0
ok 121 matching prop: pci_slot=0:4:0
ok 122 disk /dev/zvol/rdsk/zones/aedfc2b6-4c98-e435-d8b4-cb5bf7fdfa5e/disk0 found
ok 123 Checking disk /dev/zvol/rdsk/zones/aedfc2b6-4c98-e435-d8b4-cb5bf7fdfa5e/disk1
ok 124 matching prop: pci_slot=0:4:2
ok 125 disk /dev/zvol/rdsk/zones/aedfc2b6-4c98-e435-d8b4-cb5bf7fdfa5e/disk1 found
ok 126 Checking disk /dev/zvol/rdsk/zones/aedfc2b6-4c98-e435-d8b4-cb5bf7fdfa5e/disk2
ok 127 matching prop: pci_slot=0:4:1
ok 128 disk /dev/zvol/rdsk/zones/aedfc2b6-4c98-e435-d8b4-cb5bf7fdfa5e/disk2 found
ok 129 PCI slots occupied: 0:4:0 0:4:1 0:4:2
# Verify VM.start performs static assignment
ok 130 VM created with uuid 92e58037-fe93-4333-99f6-d0f0b78e0281
ok 131 damaging zone
ok 132 zone config succeeded
ok 133 damaged VM loaded
ok 134 Checking damage
ok 135 damage done and verified
ok 136 Skipping update - nothing to do
ok 137 VM updated with operator script
ok 138 VM started
ok 139 zone stopped after running operator script
ok 140 VM loaded uuid 92e58037-fe93-4333-99f6-d0f0b78e0281
ok 141 Checking disk /dev/zvol/rdsk/zones/92e58037-fe93-4333-99f6-d0f0b78e0281/disk0
ok 142 matching prop: pci_slot=0:4:0
ok 143 disk /dev/zvol/rdsk/zones/92e58037-fe93-4333-99f6-d0f0b78e0281/disk0 found
ok 144 Checking disk /dev/zvol/rdsk/zones/92e58037-fe93-4333-99f6-d0f0b78e0281/disk1
ok 145 matching prop: pci_slot=0:5:0
ok 146 disk /dev/zvol/rdsk/zones/92e58037-fe93-4333-99f6-d0f0b78e0281/disk1 found
ok 147 PCI slots occupied: 0:4:0 0:5:0
# Conflict during create
ok 148 error detected
ok 149 VM load should fail
# Multiple boot disks
ok 150 error detected
ok 151 VM load should fail
# Conflict during update
ok 152 VM created with uuid 5440046a-d4ef-4c9c-8d85-8429f5ed885e
ok 153 update should not succeed
ok 154 conflict detected
# No squatters on 0:0:0: hostbridge
ok 155 error detected
ok 156 VM load should fail
# No squatters on 0:0:1: hostbridge
ok 157 error detected
ok 158 VM load should fail
# No squatters on 0:6:0: nics
ok 159 error detected
ok 160 VM load should fail
# No squatters on 0:6:1: nics
ok 161 error detected
ok 162 VM load should fail
# No squatters on 0:30:0: fbuf
ok 163 error detected
ok 164 VM load should fail
# No squatters on 0:30:1: fbuf
ok 165 error detected
ok 166 VM load should fail
# No squatters on 0:31:0: lpc
ok 167 error detected
ok 168 VM load should fail
# No squatters on 0:31:1: lpc
ok 169 error detected
ok 170 VM load should fail

1..170
# tests 170
# pass  170

# ok
#
# ./tests/test-bhyve-pci_slot.js TEST COMPLETE IN 259 SECONDS, SUMMARY:
#
# PASS: 170 / 170
#

In addition to that, I also added, updated, and removed some filesystems manually on a VM I created.
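
The result of each manual step can be verified with something like the following (a sketch; $uuid is a placeholder for the VM's UUID):

vmadm get $uuid | json filesystems
# expect a single lofs entry for /var/tmp/global after the add/update steps, and an empty list after the remove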

@jlevon any other test data needed?

jlevon commented 4 years ago

"make check" is failing, can you please fix that up? (NB: don't force push!)

sjorge commented 4 years ago

Extra commit pushed... luckily it was only in the test file.

[root@proton ~/smartos-live]# make check
make[1]: Entering directory '/root/smartos-live/src'
/root/smartos-live/src/dockerinit
gmake[2]: Entering directory '/root/smartos-live/src/dockerinit'
/root/smartos-live/src/dockerinit/..//../tools/cstyle -P src/dockerexec.c src/dockerinit.c src/docker-common.c
gmake[2]: Leaving directory '/root/smartos-live/src/dockerinit'
/root/smartos-live/src/routeinfo
gmake[2]: Entering directory '/root/smartos-live/src/routeinfo'
/root/smartos-live/src/routeinfo/..//../tools/cstyle -cPp main.c
gmake[2]: Leaving directory '/root/smartos-live/src/routeinfo'
/root/smartos-live/src/lx_hook_postnet
gmake[2]: Entering directory '/root/smartos-live/src/lx_hook_postnet'
/root/smartos-live/src/lx_hook_postnet/..//../tools/cstyle -P main.c
gmake[2]: Leaving directory '/root/smartos-live/src/lx_hook_postnet'
==> Running cstyle...
/root/smartos-live/src/bootparams.c
/root/smartos-live/src/cryptpass.c
/root/smartos-live/src/disk_size.c
/root/smartos-live/src/fswatcher.c
/root/smartos-live/src/measure_terminal.c
/root/smartos-live/src/nomknod.c
/root/smartos-live/src/smartdc/bin/qemu-exec.c
/root/smartos-live/src/removable_disk.c
/root/smartos-live/src/vmunbundle.c
/root/smartos-live/src/zfs_recv.c
/root/smartos-live/src/zfs_send.c
/root/smartos-live/src/smartdc/lib/sdc-on-tty.c
/root/smartos-live/src/sysinfo_mod.c
/root/smartos-live/src/sysevent.c

C files ok!

==> Running JavaScriptLint...
/root/smartos-live/src/filewait.js
/root/smartos-live/src/node_modules/system.js
/root/smartos-live/src/node_modules/onlyif.js
/root/smartos-live/src/node_modules/net-boot-config.js
/root/smartos-live/src/net-boot-config
/root/smartos-live/src/vm/sbin/add-userscript.js
/root/smartos-live/src/vm/sbin/metadata.js
/root/smartos-live/src/vm/sbin/vmadm.js
/root/smartos-live/src/vm/sbin/vmadmd.js
/root/smartos-live/src/vm/sbin/vmdf.js
/root/smartos-live/src/vm/sbin/vminfo.js
/root/smartos-live/src/vm/sbin/vminfod.js
/root/smartos-live/src/vm/node_modules/diff.js
/root/smartos-live/src/vm/node_modules/dladm.js
/root/smartos-live/src/vm/node_modules/expander.js
/root/smartos-live/src/vm/node_modules/fswatcher.js
/root/smartos-live/src/vm/node_modules/hrtime.js
/root/smartos-live/src/vm/node_modules/ip.js
/root/smartos-live/src/vm/node_modules/nic.js
/root/smartos-live/src/vm/node_modules/proptable.js
/root/smartos-live/src/vm/node_modules/utils.js
/root/smartos-live/src/vm/node_modules/VM.js
/root/smartos-live/src/vm/node_modules/qmp.js
/root/smartos-live/src/vm/node_modules/queue.js
/root/smartos-live/src/vm/node_modules/openonerrlogger.js
/root/smartos-live/src/vm/node_modules/vmload/dump-datasets.js
/root/smartos-live/src/vm/node_modules/vmload/dump-json.js
/root/smartos-live/src/vm/node_modules/vmload/dump-vmobjs.js
/root/smartos-live/src/vm/node_modules/vmload/dump-zoneadm.js
/root/smartos-live/src/vm/node_modules/vmload/dump-zoneinfo.js
/root/smartos-live/src/vm/node_modules/vmload/dump-zonexml.js
/root/smartos-live/src/vm/node_modules/vmload/index.js
/root/smartos-live/src/vm/node_modules/vmload/vmload-datasets.js
/root/smartos-live/src/vm/node_modules/vmload/vmload-json.js
/root/smartos-live/src/vm/node_modules/vmload/vmload-utils.js
/root/smartos-live/src/vm/node_modules/vmload/vmload-xml.js
/root/smartos-live/src/vm/node_modules/vmload/vmload-zoneadm.js
/root/smartos-live/src/vm/node_modules/vmload/vmload-zoneinfo.js
/root/smartos-live/src/vm/node_modules/vminfod/client.js
/root/smartos-live/src/vm/node_modules/vminfod/vminfod.js
/root/smartos-live/src/vm/node_modules/vminfod/zonewatcher.js
/root/smartos-live/src/vm/node_modules/vminfod/zpoolwatcher.js
/root/smartos-live/src/vm/node_modules/sysevent-stream.js
/root/smartos-live/src/vm/node_modules/zonecfg.js
/root/smartos-live/src/vm/node_modules/zoneevent.js
/root/smartos-live/src/img/lib/IMG.js
/root/smartos-live/src/img/lib/cli.js
/root/smartos-live/src/img/lib/common.js
/root/smartos-live/src/img/lib/configuration.js
/root/smartos-live/src/img/lib/database.js
/root/smartos-live/src/img/lib/errors.js
/root/smartos-live/src/img/lib/imgadm.js
/root/smartos-live/src/img/lib/magic.js
/root/smartos-live/src/img/lib/upgrade.js
/root/smartos-live/src/img/sbin/imgadm
/root/smartos-live/src/img/test/IMG.test.js
/root/smartos-live/src/img/test/basics.test.js
/root/smartos-live/src/img/test/create.test.js
/root/smartos-live/src/img/test/docker.test.js
/root/smartos-live/src/img/test/import.test.js
/root/smartos-live/src/img/test/update.test.js
/root/smartos-live/src/vm/common/nictag.js
/root/smartos-live/src/vm/tests/common.js
/root/smartos-live/src/vm/tests/test-alias.js
/root/smartos-live/src/vm/node_modules/nodeunit-plus/index.js
/root/smartos-live/src/vm/tests/test-bhyve-disk-resize.js
/root/smartos-live/src/vm/tests/test-bhyve-pci.js
/root/smartos-live/src/vm/tests/test-bhyve-pci_slot.js
/root/smartos-live/src/vm/tests/test-cleanup-on-failure.js
/root/smartos-live/src/vm/tests/test-create-filesystems.js
/root/smartos-live/src/vm/tests/test-create.js
/root/smartos-live/src/vm/tests/test-defaults.js
/root/smartos-live/src/vm/tests/test-disk-uuid.js
/root/smartos-live/src/vm/tests/test-docker.js
/root/smartos-live/src/vm/tests/test-firewall.js
/root/smartos-live/src/vm/tests/test-fswatcher.js
/root/smartos-live/src/vm/tests/test-hrtime.js
/root/smartos-live/src/vm/tests/test-indestructible.js
/root/smartos-live/src/vm/tests/test-internal_metadata_namespaces.js
/root/smartos-live/src/vm/tests/test-info.js
/root/smartos-live/src/vm/tests/test-lastexited.js
/root/smartos-live/src/vm/tests/test-openonerrlogger.js
/root/smartos-live/src/vm/tests/test-queue.js
/root/smartos-live/src/vm/tests/test-quota.js
/root/smartos-live/src/vm/tests/test-reboot.js
/root/smartos-live/src/vm/tests/test-reprovision.js
/root/smartos-live/src/vm/tests/test-send-recv.js
/root/smartos-live/src/vm/tests/test-snapshots.js
/root/smartos-live/src/vm/tests/test-spoof-opts.js
/root/smartos-live/src/vm/tests/test-sysinfo.js
/root/smartos-live/src/vm/tests/test-tmpfs.js
/root/smartos-live/src/vm/tests/test-update.js
/root/smartos-live/src/vm/tests/test-update-kvm.js
/root/smartos-live/src/vm/tests/test-update-bhyve.js
/root/smartos-live/src/vm/tests/test-vrrp-nics.js
/root/smartos-live/src/vm/tests/test-vminfod.js
/root/smartos-live/src/vm/tests/test-vminfod-zonewatcher.js
/root/smartos-live/src/vm/tests/test-vminfod-zonewatcher-overflow.js
/root/smartos-live/src/vm/tests/test-vminfod-zpoolwatcher.js
/root/smartos-live/src/vm/lib/metadata/agent.js
/root/smartos-live/src/vm/lib/metadata/common.js
/root/smartos-live/src/vm/lib/metadata/crc32.js
/root/smartos-live/src/disklayout.js
/root/smartos-live/src/mkzpool.js
/root/smartos-live/src/smartdc/lib/ntp_config.js
/root/smartos-live/src/node_modules/disklayout.js

0 error(s), 0 warnings(s)

==> Running jsstyle...
(for file in filewait.js node_modules/{system,onlyif,net-boot-config}.js net-boot-config vm/sbin/*.js vm/node_modules/diff.js vm/node_modules/dladm.js vm/node_modules/expander.js vm/node_modules/fswatcher.js vm/node_modules/hrtime.js vm/node_modules/ip.js vm/node_modules/nic.js vm/node_modules/proptable.js vm/node_modules/utils.js vm/node_modules/VM.js vm/node_modules/qmp.js vm/node_modules/queue.js vm/node_modules/openonerrlogger.js vm/node_modules/vmload/*.js vm/node_modules/vminfod/*.js vm/node_modules/sysevent-stream.js vm/node_modules/zonecfg.js vm/node_modules/zoneevent.js img/lib/*.js img/sbin/imgadm img/test/*.test.js vm/common/nictag.js vm/tests/common.js vm/tests/test-alias.js vm/tests/test-bhyve-disk-resize.js vm/tests/test-bhyve-pci.js vm/tests/test-bhyve-pci_slot.js vm/tests/test-cleanup-on-failure.js vm/tests/test-create-filesystems.js vm/tests/test-create.js vm/tests/test-defaults.js vm/tests/test-disk-uuid.js vm/tests/test-docker.js vm/tests/test-firewall.js vm/tests/test-fswatcher.js vm/tests/test-hrtime.js vm/tests/test-indestructible.js vm/tests/test-internal_metadata_namespaces.js vm/tests/test-info.js vm/tests/test-lastexited.js vm/tests/test-openonerrlogger.js vm/tests/test-queue.js vm/tests/test-quota.js vm/tests/test-reboot.js vm/tests/test-reprovision.js vm/tests/test-send-recv.js vm/tests/test-snapshots.js vm/tests/test-spoof-opts.js vm/tests/test-sysinfo.js vm/tests/test-tmpfs.js vm/tests/test-update.js vm/tests/test-update-kvm.js vm/tests/test-update-bhyve.js vm/tests/test-vrrp-nics.js vm/tests/test-vminfod.js vm/tests/test-vminfod-zonewatcher.js vm/tests/test-vminfod-zonewatcher-overflow.js vm/tests/test-vminfod-zpoolwatcher.js vm/lib/metadata/*.js; do \
        echo /root/smartos-live/src/$file; \
        /root/smartos-live/src/../tools/jsstyle/jsstyle -o indent=4,strict-indent=1,doxygen,unparenthesized-return=0,continuation-at-front=1,leading-right-paren-ok=1 $file; \
        [[ $? == "0" ]] || exit 1; \
done)
/root/smartos-live/src/filewait.js
/root/smartos-live/src/node_modules/system.js
/root/smartos-live/src/node_modules/onlyif.js
/root/smartos-live/src/node_modules/net-boot-config.js
/root/smartos-live/src/net-boot-config
/root/smartos-live/src/vm/sbin/add-userscript.js
/root/smartos-live/src/vm/sbin/metadata.js
/root/smartos-live/src/vm/sbin/vmadm.js
/root/smartos-live/src/vm/sbin/vmadmd.js
/root/smartos-live/src/vm/sbin/vmdf.js
/root/smartos-live/src/vm/sbin/vminfo.js
/root/smartos-live/src/vm/sbin/vminfod.js
/root/smartos-live/src/vm/node_modules/diff.js
/root/smartos-live/src/vm/node_modules/dladm.js
/root/smartos-live/src/vm/node_modules/expander.js
/root/smartos-live/src/vm/node_modules/fswatcher.js
/root/smartos-live/src/vm/node_modules/hrtime.js
/root/smartos-live/src/vm/node_modules/ip.js
/root/smartos-live/src/vm/node_modules/nic.js
/root/smartos-live/src/vm/node_modules/proptable.js
/root/smartos-live/src/vm/node_modules/utils.js
/root/smartos-live/src/vm/node_modules/VM.js
/root/smartos-live/src/vm/node_modules/qmp.js
/root/smartos-live/src/vm/node_modules/queue.js
/root/smartos-live/src/vm/node_modules/openonerrlogger.js
/root/smartos-live/src/vm/node_modules/vmload/dump-datasets.js
/root/smartos-live/src/vm/node_modules/vmload/dump-json.js
/root/smartos-live/src/vm/node_modules/vmload/dump-vmobjs.js
/root/smartos-live/src/vm/node_modules/vmload/dump-zoneadm.js
/root/smartos-live/src/vm/node_modules/vmload/dump-zoneinfo.js
/root/smartos-live/src/vm/node_modules/vmload/dump-zonexml.js
/root/smartos-live/src/vm/node_modules/vmload/index.js
/root/smartos-live/src/vm/node_modules/vmload/vmload-datasets.js
/root/smartos-live/src/vm/node_modules/vmload/vmload-json.js
/root/smartos-live/src/vm/node_modules/vmload/vmload-utils.js
/root/smartos-live/src/vm/node_modules/vmload/vmload-xml.js
/root/smartos-live/src/vm/node_modules/vmload/vmload-zoneadm.js
/root/smartos-live/src/vm/node_modules/vmload/vmload-zoneinfo.js
/root/smartos-live/src/vm/node_modules/vminfod/client.js
/root/smartos-live/src/vm/node_modules/vminfod/vminfod.js
/root/smartos-live/src/vm/node_modules/vminfod/zonewatcher.js
/root/smartos-live/src/vm/node_modules/vminfod/zpoolwatcher.js
/root/smartos-live/src/vm/node_modules/sysevent-stream.js
/root/smartos-live/src/vm/node_modules/zonecfg.js
/root/smartos-live/src/vm/node_modules/zoneevent.js
/root/smartos-live/src/img/lib/IMG.js
/root/smartos-live/src/img/lib/cli.js
/root/smartos-live/src/img/lib/common.js
/root/smartos-live/src/img/lib/configuration.js
/root/smartos-live/src/img/lib/database.js
/root/smartos-live/src/img/lib/errors.js
/root/smartos-live/src/img/lib/imgadm.js
/root/smartos-live/src/img/lib/magic.js
/root/smartos-live/src/img/lib/upgrade.js
/root/smartos-live/src/img/sbin/imgadm
/root/smartos-live/src/img/test/IMG.test.js
/root/smartos-live/src/img/test/basics.test.js
/root/smartos-live/src/img/test/create.test.js
/root/smartos-live/src/img/test/docker.test.js
/root/smartos-live/src/img/test/import.test.js
/root/smartos-live/src/img/test/update.test.js
/root/smartos-live/src/vm/common/nictag.js
/root/smartos-live/src/vm/tests/common.js
/root/smartos-live/src/vm/tests/test-alias.js
/root/smartos-live/src/vm/tests/test-bhyve-disk-resize.js
/root/smartos-live/src/vm/tests/test-bhyve-pci.js
/root/smartos-live/src/vm/tests/test-bhyve-pci_slot.js
/root/smartos-live/src/vm/tests/test-cleanup-on-failure.js
/root/smartos-live/src/vm/tests/test-create-filesystems.js
/root/smartos-live/src/vm/tests/test-create.js
/root/smartos-live/src/vm/tests/test-defaults.js
/root/smartos-live/src/vm/tests/test-disk-uuid.js
/root/smartos-live/src/vm/tests/test-docker.js
/root/smartos-live/src/vm/tests/test-firewall.js
/root/smartos-live/src/vm/tests/test-fswatcher.js
/root/smartos-live/src/vm/tests/test-hrtime.js
/root/smartos-live/src/vm/tests/test-indestructible.js
/root/smartos-live/src/vm/tests/test-internal_metadata_namespaces.js
/root/smartos-live/src/vm/tests/test-info.js
/root/smartos-live/src/vm/tests/test-lastexited.js
/root/smartos-live/src/vm/tests/test-openonerrlogger.js
/root/smartos-live/src/vm/tests/test-queue.js
/root/smartos-live/src/vm/tests/test-quota.js
/root/smartos-live/src/vm/tests/test-reboot.js
/root/smartos-live/src/vm/tests/test-reprovision.js
/root/smartos-live/src/vm/tests/test-send-recv.js
/root/smartos-live/src/vm/tests/test-snapshots.js
/root/smartos-live/src/vm/tests/test-spoof-opts.js
/root/smartos-live/src/vm/tests/test-sysinfo.js
/root/smartos-live/src/vm/tests/test-tmpfs.js
/root/smartos-live/src/vm/tests/test-update.js
/root/smartos-live/src/vm/tests/test-update-kvm.js
/root/smartos-live/src/vm/tests/test-update-bhyve.js
/root/smartos-live/src/vm/tests/test-vrrp-nics.js
/root/smartos-live/src/vm/tests/test-vminfod.js
/root/smartos-live/src/vm/tests/test-vminfod-zonewatcher.js
/root/smartos-live/src/vm/tests/test-vminfod-zonewatcher-overflow.js
/root/smartos-live/src/vm/tests/test-vminfod-zpoolwatcher.js
/root/smartos-live/src/vm/lib/metadata/agent.js
/root/smartos-live/src/vm/lib/metadata/common.js
/root/smartos-live/src/vm/lib/metadata/crc32.js
(for file in disklayout.js mkzpool.js smartdc/lib/ntp_config.js node_modules/disklayout.js; do \
        echo /root/smartos-live/src/$file; \
        /root/smartos-live/src/../tools/jsstyle/jsstyle  $file; \
        [[ $? == "0" ]] || exit 1; \
done)
/root/smartos-live/src/disklayout.js
/root/smartos-live/src/mkzpool.js
/root/smartos-live/src/smartdc/lib/ntp_config.js
/root/smartos-live/src/node_modules/disklayout.js
gmake[2]: Entering directory '/root/smartos-live/src/fw'

==> Running JavaScriptLint...
/root/smartos-live/src/fw/lib/cli.js
/root/smartos-live/src/fw/lib/filter.js
/root/smartos-live/src/fw/lib/fw.js
/root/smartos-live/src/fw/lib/fwadm.js
/root/smartos-live/src/fw/lib/ipf.js
/root/smartos-live/src/fw/lib/locker.js
/root/smartos-live/src/fw/lib/pipeline.js
/root/smartos-live/src/fw/lib/rvm.js
/root/smartos-live/src/fw/lib/util/errors.js
/root/smartos-live/src/fw/lib/util/log.js
/root/smartos-live/src/fw/lib/util/obj.js
/root/smartos-live/src/fw/lib/util/vm.js
/root/smartos-live/src/fw/sbin/fwadm
/root/smartos-live/src/fw/test/unit/add.test.js
/root/smartos-live/src/fw/test/unit/fw.test.js
/root/smartos-live/src/fw/test/unit/global.test.js
/root/smartos-live/src/fw/test/unit/icmp.test.js
/root/smartos-live/src/fw/test/unit/ipsec.test.js
/root/smartos-live/src/fw/test/unit/list.test.js
/root/smartos-live/src/fw/test/unit/log.test.js
/root/smartos-live/src/fw/test/unit/owner.test.js
/root/smartos-live/src/fw/test/unit/priority.test.js
/root/smartos-live/src/fw/test/unit/remote-targets.test.js
/root/smartos-live/src/fw/test/unit/remote-vms.test.js
/root/smartos-live/src/fw/test/unit/stats.test.js
/root/smartos-live/src/fw/test/unit/status.test.js
/root/smartos-live/src/fw/test/unit/tags.test.js
/root/smartos-live/src/fw/test/unit/update.test.js
/root/smartos-live/src/fw/test/unit/validate.test.js
/root/smartos-live/src/fw/test/unit/vms.test.js
/root/smartos-live/src/fw/test/unit/wildcards.test.js
/root/smartos-live/src/fw/test/integration/enable-disable.test.js
/root/smartos-live/src/fw/test/integration/examples.test.js
/root/smartos-live/src/fw/test/integration/in-zone-enabled.test.js
/root/smartos-live/src/fw/test/integration/ipsec.test.js
/root/smartos-live/src/fw/test/lib/common.js
/root/smartos-live/src/fw/test/lib/fw.js
/root/smartos-live/src/fw/test/lib/helpers.js
/root/smartos-live/src/fw/test/lib/log.js
/root/smartos-live/src/fw/test/lib/mocks.js
/root/smartos-live/src/fw/test/lib/vm.js

0 error(s), 0 warnings(s)

==> Running jsstyle...
/root/smartos-live/src/fw/lib/cli.js
/root/smartos-live/src/fw/lib/filter.js
/root/smartos-live/src/fw/lib/fw.js
/root/smartos-live/src/fw/lib/fwadm.js
/root/smartos-live/src/fw/lib/ipf.js
/root/smartos-live/src/fw/lib/locker.js
/root/smartos-live/src/fw/lib/pipeline.js
/root/smartos-live/src/fw/lib/rvm.js
/root/smartos-live/src/fw/lib/util/errors.js
/root/smartos-live/src/fw/lib/util/log.js
/root/smartos-live/src/fw/lib/util/obj.js
/root/smartos-live/src/fw/lib/util/vm.js
/root/smartos-live/src/fw/sbin/fwadm
/root/smartos-live/src/fw/test/unit/add.test.js
/root/smartos-live/src/fw/test/unit/fw.test.js
/root/smartos-live/src/fw/test/unit/global.test.js
/root/smartos-live/src/fw/test/unit/icmp.test.js
/root/smartos-live/src/fw/test/unit/ipsec.test.js
/root/smartos-live/src/fw/test/unit/list.test.js
/root/smartos-live/src/fw/test/unit/log.test.js
/root/smartos-live/src/fw/test/unit/owner.test.js
/root/smartos-live/src/fw/test/unit/priority.test.js
/root/smartos-live/src/fw/test/unit/remote-targets.test.js
/root/smartos-live/src/fw/test/unit/remote-vms.test.js
/root/smartos-live/src/fw/test/unit/stats.test.js
/root/smartos-live/src/fw/test/unit/status.test.js
/root/smartos-live/src/fw/test/unit/tags.test.js
/root/smartos-live/src/fw/test/unit/update.test.js
/root/smartos-live/src/fw/test/unit/validate.test.js
/root/smartos-live/src/fw/test/unit/vms.test.js
/root/smartos-live/src/fw/test/unit/wildcards.test.js
/root/smartos-live/src/fw/test/integration/enable-disable.test.js
/root/smartos-live/src/fw/test/integration/examples.test.js
/root/smartos-live/src/fw/test/integration/in-zone-enabled.test.js
/root/smartos-live/src/fw/test/integration/ipsec.test.js
/root/smartos-live/src/fw/test/lib/common.js
/root/smartos-live/src/fw/test/lib/fw.js
/root/smartos-live/src/fw/test/lib/helpers.js
/root/smartos-live/src/fw/test/lib/log.js
/root/smartos-live/src/fw/test/lib/mocks.js
/root/smartos-live/src/fw/test/lib/vm.js

JS style ok!

==> Check man page line lengths...
man/fwadm.1m.md
man/fwrule.5.md
gmake[2]: Leaving directory '/root/smartos-live/src/fw'

JS style ok!
make[1]: Leaving directory '/root/smartos-live/src'
sjorge commented 4 years ago

All good this time round, on the first go!

# PASS: 9 / 9
#
#
#  TEST COMPLETE IN 2328 SECONDS, SUMMARY:
#
# PASS: 5159 / 5159
#
# log files available in: /tmp/vmtest.1578498709.4616
#

https://pkg.blackdot.be/extras/vmtest.1578498709.4616.tar.xz

jlevon commented 4 years ago

I ran this on my HN, and got this:

# add fs /var/tmp/global
Uncaught TypeError: Cannot read property 'length' of undefined

FROM
/usr/vm/test/tests/test-update.js:1488:43
/usr/vm/node_modules/VM.js:1695:9
/usr/vm/node_modules/vmload/index.js:689:13
/usr/vm/node_modules/vmload/index.js:699:13
IncomingMessage.resEnd (/usr/vm/node_modules/vminfod/client.js:292:13)
IncomingMessage.EventEmitter.emit (events.js:117:20)
_stream_readable.js:920:16
process._tickDomainCallback (node.js:459:13)
#
# /usr/vm/test/tests/test-update.js TEST COMPLETE IN 367 SECONDS, SUMMARY:
#
# FAIL: 1 / ?
#
# New core files
# new core: /zones/global/cores/core.node.354741

It looks like the add failed for some reason (unknown), but then we try to look at obj.filesystems, which I guess doesn't exist?

sjorge commented 4 years ago

Interesting. What happens if you manually apply the add, update and remove payloads against a base64 (or other non-lx/bhyve/kvm) zone?

add_fs_test.json

{
        "add_filesystems": [
            {
                "type": "lofs",
                "source": "/tmp",
                "target": "/var/tmp/global",
                "options": ["nodevice"]
            }
        ]
}

update_fs_test.json

{
        "update_filesystems": [
            {
                "options": ["nodevice", "ro"],
                "target": "/var/tmp/global"
            }
        ]
}

remove_fs_test.json

{
      "remove_filesystems": ["/var/tmp/global"]
}
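
For reference, this is roughly how I would apply them by hand against such a zone. A sketch only: <uuid> is a placeholder for an existing joyent-brand test zone, and vmadm update takes the payload file via -f.

# sketch only: <uuid> stands in for the zone under test
vmadm update <uuid> -f add_fs_test.json
vmadm get <uuid> | json filesystems     # should show the lofs mount
vmadm update <uuid> -f update_fs_test.json
vmadm get <uuid> | json filesystems     # options should now include "ro"
vmadm update <uuid> -f remove_fs_test.json
vmadm get <uuid> | json filesystems     # should be empty again
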
sjorge commented 4 years ago
#  TEST COMPLETE IN 2267 SECONDS, SUMMARY:
#
# PASS: 5159 / 5159
#
# log files available in: /tmp/vmtest.1578574900.4613
#

https://pkg.blackdot.be/extras/vmtest.1578574900.4613.tar.xz

Running a loop now as mentioned on irc.

sjorge commented 4 years ago

strange, very strange...

Uncaught TypeError: Cannot read property 'length' of undefined

FROM
/usr/vm/test/tests/test-update.js:1490:60
/usr/vm/node_modules/VM.js:1695:9
/usr/vm/node_modules/vmload/index.js:689:13
/usr/vm/node_modules/vmload/index.js:699:13
IncomingMessage.resEnd (/usr/vm/node_modules/vminfod/client.js:292:13)
IncomingMessage.EventEmitter.emit (events.js:117:20)
_stream_readable.js:920:16
process._tickDomainCallback (node.js:459:13)
#
# ./tests/test-update.js TEST COMPLETE IN 55 SECONDS, SUMMARY:
#
# FAIL: 1 / ?
#
# New core files
# new core: /zones/global/cores/core.node.84821

They are using the update tests!

[root@00-0c-29-51-55-da /usr/vm/test]# grep -A 1 '=== undefined' tests/test-update.js
                } else if (obj.filesystems === undefined
                            || obj.filesystems.length !== 1) {
--
                } else if (obj.filesystems === undefined
                            || obj.filesystems.length !== 1) {

So we have an obj.filesystems that does not have a length property :O

It does look like the filesystem was added correctly

[root@00-0c-29-51-55-da /usr/vm/test]# vmadm get e25837d2-1551-6a27-c37d-c1c3ffcbf00d | json filesystems
[
  {
    "source": "/tmp",
    "target": "/var/tmp/global",
    "type": "lofs",
    "options": [
      "nodevice"
    ]
  }
]
sjorge commented 4 years ago

My understanding of vminfod is not great, ... but

/usr/vm/node_modules/VM.js:1695:9
/usr/vm/node_modules/vmload/index.js:689:13
/usr/vm/node_modules/vmload/index.js:699:13

Looks like a bunch of callbacks to read the data from vminfod... which seems to return the vmobj without a filesystems block? Or are we perhaps hitting some racy bit, with vminfod not having updated yet?
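
If it really is vminfod lagging behind the update, one way a test could tolerate it is to poll instead of loading the VM once. A rough sketch, not the actual test code: waitForFilesystems is an illustrative helper name, and it assumes the VM.load(uuid, callback) form the existing tests already use.

// Hypothetical workaround sketch, not the actual test code: poll VM.load()
// until vminfod reports the expected number of filesystems, or give up.
var VM = require('/usr/vm/node_modules/VM');

function waitForFilesystems(uuid, expected, timeoutMs, cb) {
    var deadline = Date.now() + timeoutMs;

    (function poll() {
        VM.load(uuid, function (err, obj) {
            if (err) {
                cb(err);
                return;
            }
            var fss = obj.filesystems || [];
            if (fss.length === expected) {
                cb(null, fss);
                return;
            }
            if (Date.now() > deadline) {
                cb(new Error('timed out: expected ' + expected
                    + ' filesystems, saw ' + fss.length));
                return;
            }
            setTimeout(poll, 100); // retry until vminfod catches up
        });
    })();
}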

sjorge commented 4 years ago

Tests still pass

#  TEST COMPLETE IN 2331 SECONDS, SUMMARY:
#
# PASS: 5157 / 5157
#
# log files available in: /tmp/vmtest.1578587691.42222

Skipping the upload of the test result tarball; will run the loop now to see it fail.

sjorge commented 4 years ago

After a good 1h30 of looping I got it to fail...

This time it did not trip over vmobj.filesystems being non-existent; instead it existed with 1 filesystem where we expected 0... but when inspecting the leftover zone, there was no filesystem present!

Looks like vminfod also logged it as correctly removed

  {
    "name": "vminfod",
    "hostname": "00-0c-29-51-55-da",
    "pid": 3962,
    "level": 30,
    "ev": {
      "type": "modify",
      "date": "2020-01-09T19:32:24.750Z",
      "zonename": "e83a371c-b3d9-e8be-b1a0-ff42e735a795",
      "uuid": "e83a371c-b3d9-e8be-b1a0-ff42e735a795",
      "vm": {
        "zonename": "e83a371c-b3d9-e8be-b1a0-ff42e735a795",
        "autoboot": true,
        "brand": "joyent",
        "limit_priv": "",
        "v": 1,
        "create_timestamp": "2020-01-09T19:31:20.569Z",
        "image_uuid": "01b2c898-945f-11e1-a523-af1afbe22822",
        "dns_domain": "local",
        "do_not_inventory": true,
        "nics": [
          {
            "interface": "net0",
            "mac": "01:02:03:04:05:06",
            "nic_tag": "admin",
            "ip": "dhcp",
            "ips": [
              "dhcp"
            ]
          }
        ],
        "max_sem_ids": 2332,
        "max_physical_memory": 1536,
        "max_locked_memory": 1536,
        "max_swap": 1536,
        "tmpfs": 1536,
        "uuid": "e83a371c-b3d9-e8be-b1a0-ff42e735a795",
        "zone_state": "running",
        "zonepath": "/zones/e83a371c-b3d9-e8be-b1a0-ff42e735a795",
        "hvm": false,
        "zoneid": 671,
        "zonedid": 1258,
        "last_modified": "2020-01-09T19:32:24.000Z",
        "resolvers": [

        ],
        "firewall_enabled": false,
        "server_uuid": "564d0a56-64f5-ac53-2414-89acd25155da",
        "platform_buildstamp": "20200109T143428Z",
        "state": "running",
        "boot_timestamp": "2020-01-09T19:31:48.000Z",
        "init_restarts": 0,
        "pid": 52340,
        "customer_metadata": {

        },
        "internal_metadata": {

        },
        "routes": {

        },
        "tags": {

        },
        "quota": 2,
        "zfs_root_recsize": 131072,
        "zfs_filesystem": "zones/e83a371c-b3d9-e8be-b1a0-ff42e735a795",
        "zpool": "zones",
        "snapshots": [

        ]
      },
      "changes": [
        {
          "prettyPath": "filesystems",
          "path": [
            "filesystems"
          ],
          "action": "removed",
          "oldValue": [
            {
              "source": "/tmp",
              "target": "/var/tmp/global",
              "type": "lofs",
              "options": [
                "nodevice",
                "ro"
              ]
            }
          ]
        }
      ]
    },
    "msg": "emitting \"modify\" event (1 VMs total)",
    "time": "2020-01-09T19:32:24.750Z",
    "v": 0
  }

I don't think the newly added tests are 'broken', but it's a bit beyond my depth to debug. It looks like vminfod is returning stale info during the tests, with the filesystem either still present or not yet present 🤷‍♂

sjorge commented 4 years ago

I'm not 100% sure, but it looks like vmload did fetch the zone data... and that is the event BEFORE the change reported in the previous comment

  {
    "name": "vminfod",
    "hostname": "00-0c-29-51-55-da",
    "pid": 3962,
    "level": 30,
    "req": {
      "method": "GET",
      "url": "/vms/e83a371c-b3d9-e8be-b1a0-ff42e735a795",
      "headers": {
        "user-agent": "vmload/index - 00-0c-29-51-55-da/51611 (/usr/vm/node_modules/nodeunit/bin/nodeunit)",
        "host": "127.0.0.1:9090",
        "connection": "keep-alive"
      },
      "remoteAddress": "127.0.0.1",
      "remotePort": 40356
    },
    "msg": "HTTP request",
    "time": "2020-01-09T19:32:24.750Z",
    "v": 0
  }

They share the same timestamp, but the log order is this entry first and then the entry from the previous comment :/

sjorge commented 4 years ago

I think it is a race issue in vminfod :(

On the failed run:

I believe the failure of the test is 'correct' in this case because it got the data for the zone BEFORE the change was registered in vminfod

On a clean run:

sjorge commented 4 years ago

@jlevon with a 1000ms wait I did not hit any test failures in test-update.js; currently running with 100ms, which I think should be fine. Will report back tomorrow.

In the vminfod races I saw, 2 messages were swapped on the same timestamp.
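
For context, the wait amounts to something like this in the test. A minimal sketch of the idea, not the exact diff: uuid, VM and the assertions stand in for the surrounding test context.

// Minimal sketch: give vminfod a moment to register the filesystem change
// before loading the VM back for the check.
setTimeout(function () {
    VM.load(uuid, function (err, obj) {
        // ... existing assertions on obj.filesystems go here ...
    });
}, 100); // 1000ms also worked; 100ms appears to be enough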

sjorge commented 4 years ago

@jlevon it ran fine for my whole work day with the 100ms delay too.

sjorge commented 4 years ago

I've added a comment; I still need to spin a new build. I will do so tonight.

sjorge commented 4 years ago

Cool, gmake check works without doing a build first

vm/tests/test-update.js: 1507: line > 80 characters
vm/tests/test-update.js: 1551: line > 80 characters
vm/tests/test-update.js: 1552: line > 80 characters

will fix this

jlevon commented 4 years ago

Hi sjorge, once you've confirmed you're happy with your final build and test, I'll add IA and merge!

sjorge commented 4 years ago

@jlevon ran the tests for about 1h, looks good.

I also manually created some VMs and added, updated and removed some lofs mounts. LGTM