home-assistant / core

:house_with_garden: Open source home automation that puts local control and privacy first.
https://www.home-assistant.io

ruckus_unleashed: integration "Failed to set up" on 2023.6.0 #94264

Closed - pcmoore closed this issue 10 months ago

pcmoore commented 1 year ago

The problem

After upgrading my HA instance to v2023.6.0 yesterday, my Ruckus Unleashed integration stopped working with a "Failed to set up" error on the integration page; reloading the integration, restarting HA, and rebooting the entire device did not resolve the problem.

If I enable debug logging for the integration, reload the integration, and check the logs, I see the following two entries, which appear relevant.

The first entry is coming from the Ruckus integration itself:

Logger: homeassistant.config_entries
Source: components/ruckus_unleashed/__init__.py:31
First occurred: June 7, 2023 at 10:13:26 PM (3 occurrences)
Last logged: 10:09:47 AM

Error setting up entry mesh-XXXXXX for ruckus_unleashed
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/config_entries.py", line 387, in async_setup
    result = await component.async_setup_entry(hass, self)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/ruckus_unleashed/__init__.py", line 31, in async_setup_entry
    ruckus = await Ruckus.create(
             ^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/pyruckus/__init__.py", line 44, in create
    await ruckus.connect()
  File "/usr/local/lib/python3.11/site-packages/pyruckus/__init__.py", line 50, in connect
    result = await ssh.login(
             ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/pyruckus/RuckusSSH.py", line 48, in login
    i = await self.expect(login_regex_array, timeout=login_timeout, async_=True)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/pexpect/spawnbase.py", line 340, in expect
    return self.expect_list(compiled_pattern_list,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/pexpect/spawnbase.py", line 366, in expect_list
    from ._async import expect_async
  File "/usr/local/lib/python3.11/site-packages/pexpect/_async.py", line 6, in <module>
    @asyncio.coroutine
     ^^^^^^^^^^^^^^^^^
AttributeError: module 'asyncio' has no attribute 'coroutine'
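
(For context: Python 3.11 removed the long-deprecated @asyncio.coroutine decorator, which pexpect's _async.py still used at this point, so pyruckus fails as soon as it tries to log in. A minimal sketch of the removed generator-based style versus the native-coroutine style that replaces it; the function below is illustrative only, not pexpect's actual code:)

import asyncio

# Old generator-based style, removed in Python 3.11 (what pexpect/_async.py used):
#
#     @asyncio.coroutine
#     def do_work():
#         yield from asyncio.sleep(1)
#
# Native-coroutine equivalent that still works on 3.11+:
async def do_work() -> None:
    await asyncio.sleep(1)

asyncio.run(do_work())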

The second entry doesn't appear to be directly related to the Ruckus integration, but it looks similar to other outstanding GH issues for the integration:

Logger: homeassistant.util.async_
Source: util/async_.py:166
First occurred: 10:00:24 AM (4 occurrences)
Last logged: 10:09:47 AM

Detected blocking call to sleep inside the event loop. This is causing stability issues. Please report issue for config doing blocking calls at homeassistant/components/config/config_entries.py, line 112: await hass.config_entries.async_reload(entry_id)

What version of Home Assistant Core has the issue?

core-2023.6.0

What was the last working version of Home Assistant Core?

core-2023.5.4

What type of installation are you running?

Home Assistant OS

Integration causing the issue

Ruckus Unleashed

Link to integration documentation on our website

https://www.home-assistant.io/integrations/ruckus_unleashed

Diagnostics information

No response

Example YAML snippet

No response

Anything in the logs that might be useful for us?

No response

Additional information

No response

lanrat commented 1 year ago

@ms264556 I saw your work on the Ruckus firmware and took a note to look through it later when I have time.

The tests are already committed. Feel free to take a look whenever you have time. I'll continue to think on it to see if I can make it work and commit/push regularly.

Thanks.

lanrat commented 1 year ago

@ms264556 I updated the tests a bit more, but it seems that since you initially wrote it, aioruckus has had some changes that make it hard to fully mock the class without opening sockets, especially around login() and the POST requests.

Do you know if there is an easier way to mock responses? Or would you be willing to update the library to better support mocking?

ms264556 commented 1 year ago

For sure if you have a good plan/link for best-practices mocking then I'll have a go.

But you should definitely be able to mock [sub?]adequately as-is. I whacked the following classes into my test __init__.py:-

# Imports assumed for this snippet; exact aioruckus module paths may vary.
from typing import Any, List

import aiohttp
from aioruckus import AjaxSession, RuckusApi
from aioruckus.const import ConfigItem

class MockSession(AjaxSession):
    """Mock Session."""
    def __init__(
        self,
        websession: aiohttp.ClientSession,
        host: str,
        username: str,
        password: str,
        auto_cleanup_websession=False,
    ) -> None:
        super().__init__(websession, host, username, password, auto_cleanup_websession)
        self.mock_results = {}
        self._api = MockApi(self)

    async def __aenter__(self) -> "AjaxSession":
        return self

    async def __aexit__(self, *exc: Any) -> None:
        pass

    async def login(self) -> None:
        pass

    async def close(self) -> None:
        pass

    @property
    def api(self) -> "MockApi":
        return self._api

    @classmethod
    def async_create(cls, host: str, username: str, password: str) -> "AjaxSession":
        return MockSession(None, host, username, password, True)

    async def get_conf_str(self, item: ConfigItem, timeout: int | None = None) -> str:
        return self.mock_results[item.value]

class MockApi(RuckusApi):
    """Mock API."""
    def __init__(self, session: MockSession):
        self.session = session

    async def get_active_clients(self, interval_stats: bool = False) -> List:
        if interval_stats:
            raise NotImplementedError(self)
        else:
            result_text = self.session.mock_results["active-client-stats1"]
            return self._ruckus_xml_unwrap(result_text, ["client"])

then I was able to throw in some quick'n'dirty mocks for a few methods:-

async with MockSession.async_create("<my ZD ip>", "<my ZD user>", "<my ZD password>") as session:

    session.mock_results[ConfigItem.WLANSVC_LIST.value] = '<?xml version="1.0" encoding="utf-8"?><!DOCTYPE ajax-response><ajax-response><response type="object" id="wlansvc-list.0.5"><wlansvc-list><wlansvc name="MyFirstTestSSID" ssid="MyFirstTestSSID" description="" ofdm-rate-only="true" bss-minrate="11" tx-rate-config="4" authentication="open" encryption="wpa23-mixed" do-802-11w="1" is-guest="false" max-clients-per-radio="100" usage="user" acctsvr-id="0" acct-upd-interval="5" do-802-11d="enabled" do-wmm-ac="disabled" option82="0" option82-opt1="0" option82-opt2="0" option82-opt150="0" option82-opt151="0" option82-areaName="" force-dhcp="0" force-dhcp-timeout="10" dis-dgaf="0" parp="1" authstats="1" sta-info-extraction="1" enable-type="0" idle-timeout="true" max-idle-timeout="3000" called-station-id-type="0" policy-id="" policy6-id="" ci-whitelist-id="0" client-isolation="disabled" acl-id="1" pool-id="" vlan-id="1" https-redirection="disabled" local-bridge="1" dhcpsvr-id="0" bgscan="1" balance="1" band-balance="1" devicepolicy-id="" precedence-id="1" role-based-access-ctrl="false" client-flow-log="disabled" export-client-log="false" wifi6="true" dtim-period="2" directed-mbc="0" transient-client-mgnt="0" id="1"><rrm neighbor-report="enabled" /><smartcast mcast-filter="disabled" /><avp-policy avp-enabled="disabled" avpdeny-id="0" /><urlfiltering-policy urlfiltering-enabled="disabled" urlfiltering-id="0" /><qos uplink-preset="DISABLE" downlink-preset="DISABLE" perssid-uplink-preset="0" perssid-downlink-preset="0" /><wpa cipher="aes" x-sae-passphrase="mypassphrase" sae-passphrase="mypassphrase" x-passphrase="mypassphrase" passphrase="mypassphrase" dynamic-psk="disabled" /><queue-priority voice="0" video="2" data="4" background="6" /><wlan-schedule value="0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0" /></wlansvc><wlansvc name="RuckusWifi" ssid="RuckusWifi" description="" ofdm-rate-only="true" bss-minrate="0" tx-rate-config="4" authentication="open" encryption="wpa2" do-802-11w="1" is-guest="false" max-clients-per-radio="100" usage="autonomous" acctsvr-id="0" acct-upd-interval="10" do-802-11d="enabled" do-wmm-ac="disabled" option82="0" option82-opt1="0" option82-opt2="0" option82-opt150="0" option82-opt151="0" option82-areaName="" force-dhcp="0" force-dhcp-timeout="10" dis-dgaf="0" parp="0" authstats="0" sta-info-extraction="1" enable-type="0" idle-timeout="true" max-idle-timeout="300" called-station-id-type="0" policy-id="" policy6-id="" ci-whitelist-id="0" client-isolation="disabled" acl-id="1" pool-id="" vlan-id="1" https-redirection="disabled" local-bridge="1" dhcpsvr-id="0" bgscan="1" balance="1" band-balance="1" fast-bss="disabled" devicepolicy-id="" precedence-id="1" role-based-access-ctrl="false" client-flow-log="disabled" export-client-log="false" dtim-period="3" directed-mbc="0" transient-client-mgnt="0" id="2" wifi6="true"><rrm neighbor-report="disabled" /><smartcast mcast-filter="disabled" /><avp-policy avp-enabled="disabled" avpdeny-id="0" /><urlfiltering-policy urlfiltering-enabled="disabled" urlfiltering-id="0" /><qos uplink-preset="DISABLE" downlink-preset="DISABLE" perssid-uplink-preset="0" perssid-downlink-preset="0" /><wpa cipher="aes" x-passphrase="mypassphrase" passphrase="mypassphrase" dynamic-psk="disabled" /><queue-priority voice="0" video="2" data="4" background="6" /><wlan-schedule value="0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0:0x0" 
/></wlansvc></wlansvc-list></response></ajax-response>'
    session.mock_results[ConfigItem.AP_LIST.value] = '<?xml version="1.0" encoding="utf-8"?><!DOCTYPE ajax-response><ajax-response><response type="object" id="ap-list.0.5"><ap-list><ap mac="8c:7a:15:3e:01:02" application-reboot="2" user-reboot="0" push-reset-reboot="0" kernel-panic-reboot="0" watchdog-reboot="0" powercycle-reboot="1" reboot-reason="power cycle detect" reboot-detail="" rejoin-reason="Heartbeat Loss" mesh-last-good-ssid="67457921423000304" x-mesh-last-good-psk="xLgkZhXhaE-Io5p7YUwbSNxmgUX68xBwWagWCg_5osPGLBGfIfw1AvcaJHH3ouc" id="1" name="8c:7a:15:3e:01:02" devname="MyTestAp" model="r650" description="" location="" coordinate_source="0" gps="-33.52647,102.59308" group-id="7" ipmode="*" as-is="true" as-is-ipv6="true" bonjour-check="false" x-psk="" mesh-enabled="true" mesh-mode="*" max-hops="*" led-off="true" usb-installed="true" usb-port="true" working-radio="0" approved="true" poe-mode-setting="2" port-setting="*" support-11ac="true" poe-mode="2" last-seen="1680750614" ip="192.168.1.65" netmask="255.255.255.0" gateway="192.168.1.1" dns1="192.168.1.1" dns2="" ipv6-addr="fc00::1" ipv6-plen="7" ipv6-gateway="" ipv6-dns1="" ipv6-dns2="" version="10.5.1.0" build-version="240" strong-cert="normal" config-state="3" serial="302139001234" tunnel-mode="2" udp-port="12223" ext-ip="192.168.1.65" ext-ipv6="fc00::1" ext-port="12223" ext-family="2" support-11ax="true" auth-mode="psk" blocked="false" mesh-last-good-psk="xLgkZhXhaE-Io5p7YUwbSNxmgUX68xBwWagWCg_5osPGLBGfIfw1AvcaJHH3ouc" psk=""><radio radio-type="11na" ieee80211-radio-type="a/n" radio-id="1" channel="*" channel_seg2="*" tx-power="*" wmm-ac="*" prot-mode="*" vap-enabled="*" wlangroup-id="*" channel-select="*" enabled="1" channelization="*" /><radio radio-type="11ng" ieee80211-radio-type="g/n" radio-id="0" channel="*" channel_seg2="*" tx-power="*" wmm-ac="*" prot-mode="*" vap-enabled="*" wlangroup-id="*" channel-select="*" enabled="1" channelization="*" /><adv-mesh apply-acl="false" /><ports port-num="2" acctsvr-id="0" authsvr-id="0" mac-auth="false" supplicant="mac"><port id="1" enabled="true" tunnel="false" opt82="false" uplink="access" untag="1" members="1" guestvlan="" dvlan="false" dot1x="disabled" /><port id="2" enabled="true" tunnel="false" opt82="false" uplink="access" untag="1" members="1" guestvlan="" dvlan="false" dot1x="disabled" /></ports><venue-names /><bonjourfencing enable="*" policy="*" /></ap><ap mac="24:79:2a:3d:01:02" last-seen="1638930560" ip="192.168.1.2" netmask="255.255.255.0" gateway="192.168.1.1" dns1="202.180.64.10" dns2="202.180.64.11" ipv6-addr="2404:4404:2735:4e00:2679:2aff:fe3d:d70" ipv6-plen="64" ipv6-gateway="fe80::faca:59ff:febf:8ff1" ipv6-dns1="" ipv6-dns2="" application-reboot="17" user-reboot="0" push-reset-reboot="0" kernel-panic-reboot="0" watchdog-reboot="0" powercycle-reboot="24" reboot-reason="target fail detect" reboot-detail="" rejoin-reason="AP Restart" mesh-last-good-ssid="67457921423000304" x-mesh-last-good-psk="xLgkZhXhaE-Io5p7YUwbSNxmgUX68xBwWagWCg_5osPGLBGfIfw1AvcaJHH3ouc" ext-ip="119.224.68.105" ext-port="12223" ext-ipv6="2404:4404:2735:4e00:2679:2aff:fe3d:d70" ext-family="2" tunnel-mode="2" version="10.5.0.0" build-version="212" strong-cert="updated" id="2" name="24:79:2a:3d:01:02" devname="MyOtherTestAp" model="r600" description="" location="44a The Test Road" coordinate_source="0" gps="-33.582834,102.567143" group-id="4" ipmode="*" by-dhcp="true" as-is="false" as-is-ipv6="true" bonjour-check="false" x-psk="" mesh-enabled="true" mesh-mode="*" max-hops="*" led-off="*" 
usb-installed="false" working-radio="0" approved="true" port-setting="*" support-11ac="true" config-state="3" poe-mode="0" serial="241703001234" udp-port="42530" auth-mode="psk" cband-chann="*" cband-license="*" mesh-last-good-psk="xLgkZhXhaE-Io5p7YUwbSNxmgUX68xBwWagWCg_5osPGLBGfIfw1AvcaJHH3ouc" psk=""><radio radio-type="11na" radio-id="1" channel="*" tx-power="*" wlangroup-id="*" wmm-ac="*" vap-enabled="*" channel-select="*" channelization="*" ieee80211-radio-type="a/n" enabled="1" channel_seg2="*" prot-mode="*" /><radio radio-type="11ng" radio-id="0" channel="*" tx-power="*" wlangroup-id="*" wmm-ac="*" vap-enabled="*" channel-select="*" channelization="*" ieee80211-radio-type="g/n" enabled="1" prot-mode="*" /><adv-mesh apply-acl="false" channel="*" channelization="*" channel-select="*" /><ports port-num="2" acctsvr-id="0" authsvr-id="0" mac-auth="false" supplicant="mac" channel="*" channelization="*" channel-select="*"><port id="1" enabled="true" tunnel="false" opt82="false" uplink="access" untag="1" members="1" guestvlan="" dvlan="false" dot1x="disabled" igmp-snooping="disable" /><port id="2" enabled="true" tunnel="false" opt82="false" uplink="access" untag="1" members="1" guestvlan="" dvlan="false" dot1x="disabled" /></ports><venue-names channel="*" channelization="*" channel-select="*" /><bonjourfencing enable="*" policy="*" channel="*" channelization="*" channel-select="*" /></ap></ap-list></response></ajax-response>'
    session.mock_results["active-client-stats1"] = '<?xml version="1.0" encoding="utf-8"?><!DOCTYPE ajax-response><ajax-response><response type="object" id="DEH"><apstamgr-stat><client mac="Rec:0e:c4:af:01:02" vap-mac="8c:7a:15:3e:01:02" vap-nasid="8C-7A-15-3E-01-02" wlan-id="1" ap-name="MyTestAp" status="1" ext-status="0" first-assoc="1687043257" vlan="1" called-station-id-type="0" ssid="MyFirstTestSSID" favourite="0" blocked="0" wlan="MyFirstTestSSID" role-id="0" channel="1" description="" dvcinfo-group="9" channelization="20" ieee80211-radio-type="g/n" radio-type-text="11ng" rssi="49" received-signal-strength="-47" noise-floor="-96" num-interval-stats="0" location="" auth-method="Open" acct-multi-session-id="8c7a153e21d8ec0ec4af09fd648e3cb9100a" acct-session-id="648d4188-a7c" ap="8c:7a:15:3e:01:02" dpsk-id="0" user="" ip="192.168.1.45" ipv6="" dvcinfo="Gaming" dvctype="Gaming" model="Playstation 2" hostname="ec:0e:c4:af:01:02" oldname="Rec:0e:c4:af:01:02" radio-type="11ng" rssi-level="excellent" encryption="WPA2" /><client mac="30:e3:7a:7d:01:02" vap-mac="8c:7a:15:3e:01:02" vap-nasid="8C-7A-15-3E-01-02" wlan-id="1" ap-name="MyTestAp" status="1" ext-status="0" first-assoc="1687040776" vlan="1" called-station-id-type="0" ssid="MyFirstTestSSID" favourite="0" blocked="0" wlan="MyFirstTestSSID" role-id="0" channel="112" description="" dvcinfo-group="1" channelization="80" ieee80211-radio-type="a/n/ac" radio-type-text="11ac" rssi="38" received-signal-strength="-58" noise-floor="-96" num-interval-stats="0" location="" auth-method="Open" acct-multi-session-id="8c7a153e21dc30e37a7df17c648e33080ff5" acct-session-id="648d4188-a7b" ap="8c:7a:15:3e:01:02" dpsk-id="0" user="" ip="192.168.1.92" ipv6="" dvcinfo="Windows" dvctype="Laptop" model="Microsoft Windows/Windows 10.0.0" hostname="My-Test-PC" oldname="My-Test-PC" radio-type="11ac" rssi-level="excellent" encryption="WPA2" /><client mac="1a:97:a7:bf:01:02" vap-mac="2c:c5:d3:46:01:02" vap-nasid="2C-C5-D3-46-01-02" wlan-id="7" ap-name="Mum" status="1" ext-status="0" first-assoc="1687061025" vlan="1" called-station-id-type="0" ssid="RuckusWifi" favourite="0" blocked="0" wlan="RuckusWifi" role-id="0" channel="40" description="" dvcinfo-group="3" channelization="80" ieee80211-radio-type="a/n/ac" radio-type-text="11ac" rssi="39" received-signal-strength="-66" noise-floor="-105" num-interval-stats="0" location="" auth-method="Open" acct-multi-session-id="2cc5d38685d81a97a7bf802e648e82211082" acct-session-id="64e8cf09-ee" ap="2c:c5:d3:06:01:02" dpsk-id="0" user="" ip="192.168.7.124" ipv6="" dvcinfo="Apple iOS" dvctype="Smartphone" model="iOS Phone" hostname="1a:97:a7:bf:01:02" oldname="1a:97:a7:bf:01:02" radio-type="11ac" rssi-level="excellent" encryption="WPA3" /></apstamgr-stat></response></ajax-response>'

    ruckus = session.api
    wlans = await ruckus.get_wlans()
    aps = await ruckus.get_aps()
    active_clients = await ruckus.get_active_clients()

(edited to add login and close methods, since you directly use these)

ms264556 commented 1 year ago

(I understand this would be much easier if I'd added an abstract base class for Stats providers as well as the existing Config provider ABC. I will put this on my long TODO list).
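
(A rough sketch of what such a stats-provider ABC might look like; purely illustrative, not aioruckus's actual class:)

from abc import ABC, abstractmethod
from typing import List

class RuckusStatsApi(ABC):
    """Hypothetical abstract surface for stats queries, to make mocking easier."""

    @abstractmethod
    async def get_active_clients(self, interval_stats: bool = False) -> List:
        """Return the currently associated clients."""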

lanrat commented 1 year ago

Thanks @ms264556, this helps a lot.

Next on my list is to figure out how to get MockSession to work with patch().
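
(One possible shape for that, sketched with unittest.mock.patch; the patch target string below is an assumption about where the integration imports AjaxSession from, so it would need to point at the real import location. MockSession is the class from the snippet above:)

from unittest.mock import patch

def patch_ruckus_session():
    # Hypothetical patch target; point it at wherever the ruckus_unleashed
    # integration actually resolves AjaxSession.async_create.
    return patch(
        "homeassistant.components.ruckus_unleashed.AjaxSession.async_create",
        side_effect=MockSession.async_create,
    )

# usage in a test:
#     with patch_ruckus_session():
#         ... set up the config entry ...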

lanrat commented 1 year ago

Actually, I think that adding tests to HASS to create and re-parse the XML is way too low-level.

The tests in this component should only check the functionality in the ruckus_unleashed component. We need a way to instantiate a version of your library with more straightforward mock data types, like lists and dicts.

ms264556 commented 1 year ago

Sure, so you can see that I overrode get_active_clients(). Just override the methods you're interested in to directly return the dicts and lists you want.
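
(For example, something along these lines; the returned field names are illustrative only, not the library's exact schema. MockApi is the class from the earlier snippet:)

class SimpleMockApi(MockApi):
    """Skip the XML round-trip and return plain Python structures."""

    async def get_aps(self) -> list:
        return [{"mac": "8c:7a:15:3e:01:02", "devname": "MyTestAp", "model": "r650"}]

    async def get_active_clients(self, interval_stats: bool = False) -> list:
        return [{"mac": "ec:0e:c4:af:01:02", "hostname": "My-Test-PC", "ap": "8c:7a:15:3e:01:02"}]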

faithless01 commented 1 year ago

Massive thank you to @lanrat and @ms264556 for pushing the aioruckus integration PR (98% there!). I'm struggling to follow the process, but am learning a lot watching you guys work. Thank you.

For anybody still relying on the current broken integration in the interim (perhaps with automations based on presence detection of Wi-Fi devices), you can still mend the broken implementation using @Bubbgump209's monkey-patched _async.py file above.

The solution from @andornaut to clobber the pexpect library with the master branch has not worked since the last OS update, producing a different set of error messages. (Maybe just me?)

Unfortunately, for those running HassOS there is a challenge gaining access to the 'broken' Python libraries. While this can be done relatively easily from a terminal shell, retyping the monkey-patched file without copy/paste can be tedious.

Copy/paste is straightforward over SSH, which can be accessed easily from your own host OS, as with a Core, Container, or Supervised installation of Home Assistant.

If, however, you are running the HassOS implementation natively on a Pi, or virtualised in VMware or Proxmox, the native SSH add-on drops you into its own Docker container with no access to the Core container, meaning no access to the 'broken' Python libraries. The following add-on from @adamoutler provides an SSH session to the host OS, where you can jump into the Home Assistant Core Docker container and copy/paste the monkey patch over the broken _async.py file: https://community.home-assistant.io/t/add-on-hassos-ssh-port-22222-configurator/264109

Again - thanks to those higher up the tree that are making this work.

andornaut commented 1 year ago

The solution from @andornaut to clobber the pexpect library with the master branch has not worked since the last OS update, producing a different set of error messages. (Maybe just me?)

This still works for me.

@faithless01 feel free to post your logs and install procedure if you want to troubleshoot.

faithless01 commented 1 year ago

Thanks @andornaut. No need to troubleshoot, as monkey-patching the _async.py file is currently working. I received the following errors when I clobbered pexpect with the master branch after the last core and OS updates:

RuntimeError: Detected blocking call to sleep inside the event loop. Use `await hass.async_add_executor_job()`; This is causing stability issues. Please report issue
2023-06-20 10:57:50.135 WARNING (MainThread) [homeassistant.config_entries] Config entry 'Mesh-Backbone' for ruckus_unleashed integration not ready yet: Could not establish connection to host; Retrying in background
2023-06-20 10:57:50.640 WARNING (MainThread) [homeassistant.util.async_] Detected blocking call to sleep inside the event loop. This is causing stability issues. Please report issue for hassio doing blocking calls at homeassistant/components/hassio/handler.py, line 534: request = await self.websession.request(
2023-06-20 10:57:57.334 ERROR (MainThread) [homeassistant] Error doing job: Exception in callback _UnixReadPipeTransport._call_connection_lost(OSError(5, 'I/O error'))
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/asyncio/events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "/usr/local/lib/python3.11/asyncio/unix_events.py", line 596, in _call_connection_lost
    self._pipe.close()
  File "/usr/local/lib/python3.11/site-packages/pexpect/pty_spawn.py", line 328, in close
    self.ptyproc.close(force=force)
  File "/usr/local/lib/python3.11/site-packages/ptyprocess/ptyprocess.py", line 403, in close
    time.sleep(self.delayafterclose)
  File "/usr/src/homeassistant/homeassistant/util/async_.py", line 166, in protected_loop_func
    check_loop(func, strict=strict)
  File "/usr/src/homeassistant/homeassistant/util/async_.py", line 123, in check_loop
    raise RuntimeError(
RuntimeError: Detected blocking call to sleep inside the event loop. Use `await hass.async_add_executor_job()`; This is causing stability issues. Please report issue
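
(For reference, the pattern Home Assistant's warning is asking for looks roughly like this; a sketch only, not the integration's actual code, with async_close_connection and connection being made-up names:)

# Off-load a blocking call (e.g. the time.sleep hidden inside
# pexpect/ptyprocess close()) onto the executor instead of the event loop.
async def async_close_connection(hass, connection) -> None:
    # hass.async_add_executor_job runs the blocking function in a worker thread
    await hass.async_add_executor_job(connection.close)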

Install procedure, as per previous working method:

ha > login
# docker exec -it homeassistant /bin/bash
homeassistant:/config# pip install https://github.com/pexpect/pexpect/archive/master.zip
homeassistant:/config# exit
# shutdown -r now

I'll try to clobber the pexpect library with the master branch again, after the next core/os release and see how it goes. Thanks again - fantastic community!

lanrat commented 1 year ago

If anyone wants a simpler solution to get this integration working until we have the test coverage needed to get my PR merged, you can clone my repo/branch and mount the ruckus_unleashed folder from my repo's components directory into your custom_components folder to have it override the built-in broken integration.

This is what I have been doing for the past few weeks and it's been incredibly stable.

faithless01 commented 1 year ago

You're a beast, @lanrat. I've raw-copied the files from your repo into my custom_components folder, and all appears well. Is there any way to tell if I'm running the custom version? Enabling debug only provides the following:

2023-06-25 04:35:41.758 DEBUG (MainThread) [homeassistant.components.ruckus_unleashed] Finished fetching ruckus_unleashed data in 0.666 seconds (success: True)

EDIT: I moved the /usr/src/homeassistant/homeassistant/components/ruckus_unleashed folder to my home folder and rebooted. I received the error: Setup failed for ruckus_unleashed: Integration not found. Once I moved the original ruckus_unleashed back and rebooted, all worked as before.

EDIT2: I trashed the device_tracker.py file in the custom_components folder to brute-force test whether I'm using your new engine, restarted, and everything is working as before ... I think I can safely say I'm not using your rewritten integration. I'll go research how to manually add a custom component :)

EDIT3: So, I replaced the original /usr/src/homeassistant/homeassistant/components/ruckus_unleashed folder with the contents from your repo, and received the following error: Setup failed for ruckus_unleashed: No setup or config entry setup function defined.

Time for me to undo my attempts and work with what little I understand at the moment. It feels like I'm trying to repair a pocket watch with a sledgehammer :)

EDIT4: I promise I'll internalise my journey going forward and leave the dialogue to those with talent. However, to clarify my failings, the root cause of my problems was mistyping __init__.py as _init_.py. Upon correcting the mistake, the logs show the following:

2023-06-25 05:33:15.807 WARNING (SyncWorker_4) [homeassistant.loader] We found a custom integration ruckus_unleashed which has not been tested by Home Assistant. This component might cause stability problems, be sure to disable it if you experience issues with Home Assistant
2023-06-25 05:33:15.808 ERROR (SyncWorker_4) [homeassistant.loader] The custom integration 'ruckus_unleashed' does not have a version key in the manifest file and was blocked from loading. See https://developers.home-assistant.io/blog/2021/01/29/custom-integration-changes#versions for more details

Followed shortly after by:

2023-06-25 05:34:18.055 DEBUG (MainThread) [homeassistant.components.ruckus_unleashed] Finished fetching ruckus_unleashed data in 0.658 seconds (success: True)

EDIT5: Wooo! Apologies, but I persist with my sad rambling in the vain hope that some other muppet's frustrated Google search results in a simple forehead slap and not hours of d*cking around with their slightly blunter sledgehammer. So, as per the error thrown above, I appended a version number to the custom_components/ruckus_unleashed/manifest.json file:

  "requirements": ["aioruckus==0.30", "xmltodict==0.13.0"],
  "version": "0.0.1"
}

...and voila - I was hit with masses of errors for trashing the device_tracker.py file earlier. I re-copied the device_tracker.py file, and after a final reboot a yellow "Custom Integration" icon appeared on my Ruckus Unleashed integration. I have added a hub, with the ability to add more hubs! :) (The text input boxes inside the Add Hub popup were not labelled ip/user/pw, but that's probably just me mistyping something.) My device trackers are all responding perfectly. Thank you!

lanrat commented 1 year ago

@faithless01 I'm glad you figured it out.

As you found out the hard way, I forgot to mention you will need to add "version": "0.0.1" to manifest.json.

Coder84619 commented 1 year ago

Any ETA on a working version?

faithless01 commented 1 year ago

Hi @Coder84619 - in lieu of a legitimate response I can offer: the rewritten integration has been submitted as a PR and passed 49/51 tests. My head is too far underwater to provide insight into the failed tests, but I believe it may be political rather than technical ... possibly regarding the ownership of the code base (?!?). I believe the former owner/writer/creator of the integration no longer uses Ruckus devices and is keen to hand over ownership. Under a strict "do not hassle the talent" policy, correspondence is via HA bots. We wait.

If, like myself, your HA automations require device tracker presence to trigger events, there are a couple of ways to make the old integration work: Bubbgump209's surgical rewrite of the offending _async.py file in the pexpect library, or andornaut's upgrade of the entire pexpect library to a later version. Preferably though, you can spare 10-15 minutes and follow lanrat's suggestion of pulling down the new code into your custom_components folder to test the new integration that will (hopefully) soon be released to us all once it passes the final tests.

I'm new to much of this, but fortunately have some time to commit to bridging the knowledge gap between those who create and myself, who consumes. If you're short on time but need the integration up and running, lay out your scenario and I will try to help you get things moving until it all becomes a simple click-and-consume integration once more.

faithless01 commented 1 year ago

If anybody is testing the new aioruckus core of ruckus_unleashed and missing the UI labels in the integration, copy the translations folder from /usr/src/homeassistant/homeassistant/components/ruckus_unleashed/translations to /config/custom_components/ruckus_unleashed/translations, then reboot.

faithless01 commented 1 year ago

Perhaps it's just my kludged installation, but my aioruckus integration is not discovering any devices on my slave access points.

ms264556 commented 1 year ago

Perhaps it's just my kludged installation, but my aioruckus integration is not discovering any devices on my slave access points.

I'll put some time into this tomorrow. I'll add a note to specifically trace through this situation.

faithless01 commented 1 year ago

Apologies - it appears the devices on my slave were being added with their entities disabled. Possibly prior state? Again, huge apologies for riding this change of Integration as a personal learning experience in a public forum. Ignore any 'features' I discover on my journey unless corroborated by at least one other member. Thanks again for your hard work!

pcmoore commented 12 months ago

I really want to thank everyone that has put time and effort into reworking and testing the ruckus_unleashed integration, this is one of the many things that makes HA and the concept of Open Source home automation so wonderful. You all have my thanks and respect.

However, it has been a month now, complete with a new monthly HA release, and it is unclear if, or when, this will be resolved in the main HA repository. From what I've read there remain some test failures, but I get the impression these failures are rather minor, and surely the proposed fixes/PR are better than the completely broken integration that currently lives in the main HA branch.

Does anyone know what needs to be done to move this forward?

faithless01 commented 12 months ago

Hey @pcmoore, in the interim it appears the custom_components folder is unaffected by core updates. I have just updated to 2023.7 and the new integration persists without intervention.

Edit: If you have the native SSH add-on installed, you can copy the files via the web interface.

Edit2: Once the PR is successful, we just have to delete the custom_components/ruckus_unleashed folder.

Bubbgump209 commented 12 months ago

FWIW, the aioruckus integration seems great here. It was a seamless drop-in replacement and survived the update to 2023.7 just fine. For the 5 minutes of effort, yeah, I'd very much encourage folks to install it and rest easy until the rejigger is accepted.

pcmoore commented 12 months ago

Thanks for all the updates on the workarounds, but I'm most interested in resolving the outstanding test issues with the PR so we can get this fixed in the main HA branch. I don't see any progress towards resolving the review comments in the PR, and I was just wondering what plans exist to move the PR forward?

lanrat commented 12 months ago

@pcmoore the PR I started is missing the tests required in order for the updated component to be merged.

Unfortunately, I'm not familiar enough with the testing framework, or with creating the correct types of mocks required to make good automated tests, which is what's blocking the PR at this time. I do plan to revisit this, but I'm currently busy with other things and can't promise I will get to it.

If you or anyone else has the time and experience to add the tests please do! As far as I can tell, that's all that's blocking this.

pcmoore commented 12 months ago

I would love to help, but sadly Python is largely outside my area of expertise. Maybe I can stir something up on the PR ...

toxic0berliner commented 11 months ago

I'm trying it out and it looks like something's wrong with the UI at least: [screenshot]

Home Assistant 2023.7.2
Supervisor 2023.07.1
Operating System 10.3
Frontend (user interface): 20230705.1 - latest

That said, I think it found everything that's on my Wi-Fi very quickly; it even turned some of them into entities when it found matching entities already, and it did automagically set the room these entities are in... hass magic happening again.

Took me less than 10 minutes to do; I don't know why it took me weeks between seeing your comments and trying it out. Thanks a lot to @lanrat for the dev work and @faithless01 for the quick guide.

FYI, just in case, my strings.json in the custom component has this:

{
  "config": {
    "step": {
      "user": {
        "data": {
          "host": "[%key:common::config_flow::data::host%]",
          "username": "[%key:common::config_flow::data::username%]",
          "password": "[%key:common::config_flow::data::password%]"
        }
      }
    },
    "error": {
      "cannot_connect": "[%key:common::config_flow::error::cannot_connect%]",
      "invalid_auth": "[%key:common::config_flow::error::invalid_auth%]",
      "unknown": "[%key:common::config_flow::error::unknown%]"
    },
    "abort": {
      "already_configured": "[%key:common::config_flow::abort::already_configured_device%]"
    }
  }
}

faithless01 commented 11 months ago

Glad it's working @toxic0berliner :) I had a similar issue with the UI labelling, which appears to be a "feature" of our implementation being a custom_component rather than a regular component, requiring a translations folder instead of the strings.json file: https://developers.home-assistant.io/docs/internationalization/custom_integration/

To fix the UI labelling: create a folder called "translations" here: /config/custom_components/ruckus_unleashed/translations, then create a file called en.json and copy in the following:

{
    "config": {
        "abort": {
            "already_configured": "Device is already configured"
        },
        "error": {
            "cannot_connect": "Failed to connect",
            "invalid_auth": "Invalid authentication",
            "unknown": "Unexpected error"
        },
        "step": {
            "user": {
                "data": {
                    "host": "Host",
                    "password": "Password",
                    "username": "Username"
                }
            }
        }
    }
}

Alternatively, if you have access to the root OS and can hop into the HassOS Core Docker container (docker exec -it homeassistant /bin/bash), you can copy all the translation files, for all languages, directly from the original Ruckus Unleashed component: /usr/src/homeassistant/homeassistant/components/ruckus_unleashed/translations

homeassistant:/usr/src/homeassistant/homeassistant/components/ruckus_unleashed/translations# ls -al
total 136
drwxr-xr-x    2 root     root          4096 Jul 14 02:45 .
drwxr-xr-x    1 root     root          4096 Jul 14 03:09 ..
-rw-r--r--    1 root     root          1098 Jul 14 02:44 bg.json
-rw-r--r--    1 root     root           570 Jul 14 02:44 ca.json
-rw-r--r--    1 root     root           615 Jul 14 02:44 cs.json
-rw-r--r--    1 root     root           541 Jul 14 02:44 da.json
-rw-r--r--    1 root     root           560 Jul 14 02:44 de.json
-rw-r--r--    1 root     root          1217 Jul 14 02:44 el.json
-rw-r--r--    1 root     root           528 Jul 14 02:44 en.json
-rw-r--r--    1 root     root           568 Jul 14 02:44 es.json
-rw-r--r--    1 root     root           536 Jul 14 02:44 et.json
-rw-r--r--    1 root     root           574 Jul 14 02:44 fr.json
-rw-r--r--    1 root     root           866 Jul 14 02:44 he.json
-rw-r--r--    1 root     root           624 Jul 14 02:44 hu.json
-rw-r--r--    1 root     root           550 Jul 14 02:44 id.json
-rw-r--r--    1 root     root           555 Jul 14 02:44 it.json
-rw-r--r--    1 root     root           731 Jul 14 02:44 ja.json
-rw-r--r--    1 root     root          1066 Jul 14 02:44 ka.json
-rw-r--r--    1 root     root           775 Jul 14 02:44 ko.json
-rw-r--r--    1 root     root           554 Jul 14 02:44 lb.json
-rw-r--r--    1 root     root           528 Jul 14 02:44 nb.json
-rw-r--r--    1 root     root           546 Jul 14 02:44 nl.json
-rw-r--r--    1 root     root           635 Jul 14 02:44 pl.json
-rw-r--r--    1 root     root           563 Jul 14 02:44 pt-BR.json
-rw-r--r--    1 root     root           589 Jul 14 02:44 pt.json
-rw-r--r--    1 root     root           571 Jul 14 02:44 ro.json
-rw-r--r--    1 root     root          1125 Jul 14 02:44 ru.json
-rw-r--r--    1 root     root           604 Jul 14 02:44 sk.json
-rw-r--r--    1 root     root           407 Jul 14 02:44 sl.json
-rw-r--r--    1 root     root           561 Jul 14 02:44 sv.json
-rw-r--r--    1 root     root           601 Jul 14 02:44 tr.json
-rw-r--r--    1 root     root          1083 Jul 14 02:44 uk.json
-rw-r--r--    1 root     root           605 Jul 14 02:44 zh-Hans.json
-rw-r--r--    1 root     root           623 Jul 14 02:44 zh-Hant.json

[screenshot]

toxic0berliner commented 11 months ago

For those wanting to do it from the docker host when it's HAOS, here is a quick and dirty way :

langs=$(docker exec homeassistant /bin/bash -c "ls /usr/src/homeassistant/homeassistant/components/ruckus_unleashed/translations/")
for l in $langs; do docker cp homeassistant:/usr/src/homeassistant/homeassistant/components/ruckus_unleashed/translations/$l /mnt/data/supervisor/homeassistant/custom_components/ruckus_unleashed/translations/ ; done

worked for me, thanks !

Coder84619 commented 11 months ago

For those wanting to do it from the docker host when it's HAOS, here is a quick and dirty way :

langs=$(docker exec homeassistant /bin/bash -c "ls /usr/src/homeassistant/homeassistant/components/ruckus_unleashed/translations/")
for l in $langs; do docker cp homeassistant:/usr/src/homeassistant/homeassistant/components/ruckus_unleashed/translations/$l /mnt/data/supervisor/homeassistant/custom_components/ruckus_unleashed/translations/ ; done

worked for me, thanks !

Doesn't work for me. I get a zsh "parse error near 'for'".

ms264556 commented 11 months ago

Hi @lanrat, I've ported over the existing integration's tests, apart from one which seems to be testing old HA functionality. Hopefully that's enough to get this accepted.

Can you have a look at my branch here, and apply the changes to your branch?

lanrat commented 11 months ago

@ms264556 Thanks a ton for working on the tests!

I'll dig into your branch soon and update the PR!

toxic0berliner commented 11 months ago

For those wanting to do it from the docker host when it's HAOS, here is a quick and dirty way :

langs=$(docker exec homeassistant /bin/bash -c "ls /usr/src/homeassistant/homeassistant/components/ruckus_unleashed/translations/")
for l in $langs; do docker cp homeassistant:/usr/src/homeassistant/homeassistant/components/ruckus_unleashed/translations/$l /mnt/data/supervisor/homeassistant/custom_components/ruckus_unleashed/translations/ ; done

worked for me, thanks !

Doesn't work for me. I get a 'zsh parse error near 'for'

You're doing something wrong; you must have edited something, since HAOS does not set zsh as your shell. This is intended to be run on the HAOS machine itself, which only has /bin/sh (the shell you reach using SSH on port 22222).

lanrat commented 11 months ago

@ms264556 I'm looking over your changes in that branch now. Are you sure that's the right branch? It's still using the old aioruckus library and not your newer pyruckus library.

The diff between my PR and your branch makes it look like your code is based on the much older core code and not my newer version. Am I looking at the wrong thing?

ms264556 commented 11 months ago

aioruckus is my new AJAX library and pyruckus is the old pexpect one.

lanrat commented 11 months ago

You are correct. I was looking at the wrong branch. The new tests seem to work great.

Coder84619 commented 11 months ago

Any update on pushing a working version out?

lanrat commented 11 months ago

@Coder84619 please refer to the progress on the ongoing PR: https://github.com/home-assistant/core/pull/94835

Flash3dee commented 10 months ago

Almost there :) I was missing this great integration.

Coder84619 commented 9 months ago

I'm on 2023.9.3 and I'm still seeing this integration failing to setup. When will it make it into the production release?

faithless01 commented 9 months ago

Hi @Coder84619, I believe it was implemented in 2023.9.2. I have removed the custom component I pulled from lanrat and am using the integration that has been merged into the 2023.9.2 code branch, with no issues. No tricks or messing around; it just worked. You may have to remove your old integration, reboot, and add the new integration(?). I've just updated to 2023.9.3 with no issues. If you enable debug logging and reload your Ruckus integration, what errors are you seeing?

Bubbgump209 commented 9 months ago

I'm on 2023.9.3 and I'm still seeing this integration failing to setup. When will it make it into the production release?

Indeed, working here on 2023.9.3. Be sure HA has HTTP access to the master AP. The old integration used SSH; now the Ruckus API is used via HTTP.
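
(If you want to sanity-check that HTTP access outside HA first, here is a quick script mirroring the usage pattern shown earlier in this thread; the import path is an assumption, and the host/credentials are placeholders:)

import asyncio

from aioruckus import AjaxSession  # import path assumed

async def check(host: str, username: str, password: str) -> None:
    # Same pattern as the mock example above: create a session, then query the API.
    async with AjaxSession.async_create(host, username, password) as session:
        aps = await session.api.get_aps()
        print(f"HTTP/AJAX connection OK; {len(aps)} APs visible")

# asyncio.run(check("192.168.1.1", "admin", "secret"))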

Coder84619 commented 9 months ago

@Bubbgump209 @faithless01 Thanks guys...I deleted my AP, re-added it, and now all is well.

ms264556 commented 9 months ago

Sorry team. I changed the AP unique id in one PR and then fixed this in a second PR.

But they only took my first PR into this month's release.

So I guess when the 2nd PR goes in next week, you'll need to delete and re-add the AP again.
