DeFlanko opened this issue 2 years ago
My apologies, I forgot to get around to expanding the readme to cover that. I'm going by memory (and I don't have a Netgear modem anymore, and I don't know if it works on combo modem+router units). Out of curiosity which model modem do you have?
I'll do my best to help you, and hopefully once we get the steps right I can update the readme.
First, are you able to go to 192.168.100.1 and log into your modem, then look at the signal levels manually? You will need to know your username and password to put into this script, and you will need to observe how many channels your ISP provisions upstream and downstream so you can create the correct sensors later. For guidance on the signal levels, please refer to the Netgear knowledge base: https://kb.netgear.com/24311/Power-level-guidelines-for-a-NETGEAR-cable-modem-router
Are you running Home Assistant HassOS, or did you install it on an existing Linux computer/Pi with Docker running Home Assistant? It should be possible to do both ways, but the process will be slightly different.
I have a Netgear CM1200 that I own, so I have the login/password already, and I am using the Home Assistant VM that they provide, on my day-to-day Windows box, currently running OS 7.6.
I do see that if I go to 192.168.100.1 and right-click on "Cable Connection", it takes me to https://192.168.100.1/DocsisStatus.htm, which looks to be the page your scrape references.
I'm not that code savvy yet, so I'll defer to you on that. I look forward to working with you!
[image: image.png]
[image: image.png]
It would also be great if the scrape could capture the event log page too: https://192.168.100.1/eventLog.htm
[image: image.png]
I'll have to take more notes when I'm home from work, but I will say I no longer have a Netgear modem, so I don't know that I'd get much done adding support for event logs and such. It's something I've pondered for my Arris modem too. I've been having TONS of ISP issues, which is why I went down this route, and at one point I had both modems because the ISP tried to blame my modem and I replaced it. But my story isn't relevant to getting you up and running.

My difficulty with capturing logs is that I haven't figured out a good way to make them accessible: I can't be notified as entries appear in real time, there can be anywhere from zero to unlimited log entries between scrapes, and when the modem reboots its clock starts at January 1, 1970, so using dates as a key could make it hard to read too. The scraping method I used is very dumb, just a lot of find/replace on HTML tags, which seemed to get things into a good-enough format to shove out over MQTT one line at a time.
It sounds like you're doing basically the same thing I am, running "Home Assistant OS" in a VM, so I'll probably have to document the "hacks" I did to get it all working. I'm not familiar with all the ins and outs of containers, so I just transplanted a few files it seemed to need and then had it run the "scrape" using an automation.
The "install" is roughly (by memory, without my system to reference) -Get on by SSH or Samba -Make a folder in /config to stick stuff -Copy some binaries and libraries for MQTT into a folder -Copy the files I posted on github over -Add the YAML to your configuration.yaml or packages directory (I use package includes myself for organizing my stuff) -Modify the YAML for the MQTT sensors according to your modem (your ISP may provision more or fewer channels downstream and upstream, while the scrape "publishes" them all, I've not found a way to automated create sensors from MQTT data) -Build yourself some fancy lovelace stuffs and/or notifications based on problems
I'll try and write up something more coherent tonight but that's a rough idea.
Ok, give this a try... there's not an automated script, but it should work to make the changes with either the SSH or Samba add-ons, depending on your preference. Please let me know whether it works, and where you had issues, because this is the first time I know of someone else attempting to set it up, and I'd like to get correct directions into my README if you're successful.
I'm going to guide you thru setting these up in the same places I chose, but you could pick different locations if you wish.
Prerequisite: MQTT configured and working on Home Assistant
1. Create the following folders:
   - /config/bin
   - /config/bin/mosquitto_deps
   - /config/bin/mosquitto_deps/lib
   - /config/netgear_modem_signal_scraper
2. Copy the dependency files into place (I used SSH with the `cp from_path to_path` command; see the sketch after this list):
   - From: /usr/bin/mosquitto_pub
     To: /config/bin/mosquitto_deps/mosquitto_pub
   - From: /usr/lib/libmosquitto.so.1
     To: /config/bin/mosquitto_deps/lib/libmosquitto.so.1
   - From: /usr/lib/libcares.so.2
     To: /config/bin/mosquitto_deps/lib/libcares.so.2
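As a concrete sketch of steps 1-2 in one go (the library paths can vary by install; see the ldd discussion further down this thread if yours differ):

```bash
# Create the folders and copy the MQTT publisher plus its libraries into /config
mkdir -p /config/bin/mosquitto_deps/lib /config/netgear_modem_signal_scraper
cp /usr/bin/mosquitto_pub     /config/bin/mosquitto_deps/mosquitto_pub
cp /usr/lib/libmosquitto.so.1 /config/bin/mosquitto_deps/lib/libmosquitto.so.1
cp /usr/lib/libcares.so.2     /config/bin/mosquitto_deps/lib/libcares.so.2
```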
3. Download my netgear_signal_dump.sh script off GitHub (https://github.com/mmiller7/Netgear_modem_scrape) and place it in /config/netgear_modem_signal_scraper/
4. Edit the netgear_signal_dump.sh file and modify the lines at the top:
The "baseURL" should be your modem's IP address, which is normally 192.168.100.1
# Default modem_password is "password"
modem_username="admin"
modem_password="password"
baseURL='http://192.168.100.1'
mqtt_broker="192.168.1.221" mqtt_username="your_mqtt_username_here" mqtt_password="your_mqtt_password_here" mqtt_topic="homeassistant/sensor/modemsignals"
Optional:
At this point, if you wish, it should be possible to do an initial test. Open an MQTT explorer/browser, then in the SSH add-on run `/config/netgear_modem_signal_scraper/netgear_signal_dump.sh`; it should scrape the modem and publish new topics. One of the published topics should be `homeassistant/sensor/modemsignals/login`, which reports the success or failure of the script's modem-login process while scraping the data.
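For example, a minimal test might look like this (a sketch; it assumes mosquitto_sub is available in the SSH add-on alongside mosquitto_pub, and uses the placeholder broker values from above):

```bash
# In one SSH session, watch everything the scraper publishes
mosquitto_sub -h 192.168.1.221 -u your_mqtt_username_here -P your_mqtt_password_here \
  -t 'homeassistant/sensor/modemsignals/#' -v

# In a second SSH session, run the scrape
/config/netgear_modem_signal_scraper/netgear_signal_dump.sh
```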
5. Now, you will need to configure Home Assistant to connect to the sensors and run the script. For this, I recommend using [packages](https://www.home-assistant.io/docs/configuration/packages/) to split out the large amount of configuration for organization.
- Create a new folder to store your package YAML files. `/config/packages_yaml`
- Add an include line to your /config/configuration.yaml to read the packages (think of this as many stand-alone configuration.yaml files with custom names, which you can group into subfolders any way you like):
homeassistant:
  packages: !include_dir_named packages_yaml
6. Download my sample YAML file off GitHub and place it somewhere in the packages_yaml folder we set up in step 5:
- /config/packages_yaml/network_monitoring/netgear_modem_signal.yaml
7. Log into your modem's signal status page (typically 192.168.100.1) manually and take note of how many channels you have downstream and upstream, and their numbers. You will need this information to set up matching sensors that read off MQTT and expose the values to Home Assistant as entities. You must set this up yourself, because the combination of channels your modem is provisioned for varies by ISP and subscription plan. My example has 32 downstream QAM channels, 1 downstream OFDM, 4 upstream QAM, and 0 upstream OFDM channels.
8. Modify the automation in the example netgear_modem_signal.yaml file to adjust when the signal scrape is triggered. This is near the top, around the line `# Run the test on startup, and hourly`. In my example I also have a number of "ping sensors" (set up elsewhere, beyond the scope of this guide) that immediately read the modem signals when there is a change in connectivity, even if the number of minutes has not elapsed.
9. Modify the example netgear_modem_signal.yaml file to match the channels your modem is provisioned for. This is below the line `# Generated sensors below` in the file, and it is very repetitive.
The fields are split into several sections of interest for you to review/update (a single-sensor sketch follows this list):
- Downstream `MQTT Inputs` (1-32 are QAM, 33 is a single OFDM)
- Downstream `Averages` (provides easy min/avg/max across all 33 sample channels for each type of data)
- Upstream `MQTT Inputs` (1-4 QAM, no OFDMA)
- Upstream `Averages` (provides easy min/avg/max across all 4 sample channels)
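As a rough illustration, a single per-channel sensor in that file looks something like this (a sketch only; the topic and field names here are assumptions, and the layout follows the platform-style MQTT sensor format Home Assistant used at the time, so copy the real names and structure from the example file in the repo):

```yaml
sensor:
  - platform: mqtt
    name: "modem_downstream_1"
    state_topic: "homeassistant/sensor/modemsignals/downstream_1"  # assumed topic name
    value_template: "{{ value_json.Power }}"
    unit_of_measurement: "dBmV"
```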
10. Go to the Home Assistant control panel and validate your configuration. If there are any errors, review those files before restarting Home Assistant.
11. Restart Home Assistant so it loads all the new changes.
12. Observe that the new sensors load in and populate. A good one to start looking at is `sensor.cable_modem_web_ui_login_status` which should indicate the success or failure of the scrape script logging into the modem. If this reports a value of "success" then you should have signal data stored in the other sensor fields ready for use in dashboards and automations.
I spent the better part of the day installing and configuring it. What a great piece of code. Paired with a smart plug, this has the potential to keep everything running nice and smooth. I have a couple of questions.
Upstream OFDMA Channels

| Channel | Lock Status | Modulation / Profile ID | Channel ID | Frequency | Power |
| ------- | ----------- | ----------------------- | ---------- | --------- | ------ |
| 1 | Not Locked | Unknown | 0 | 0 Hz | 0 dBmV |
| 2 | Not Locked | Unknown | 0 | 0 Hz | 0 dBmV |

Current System Time: Sat Apr 09 20:12:40 2022
System Up Time: 465:21:01
I appreciate all the work. This is a great automation idea! Thanks man
I'd need you to upload a copy of the full HTML source of DocsisStatus.asp that it spits out (preferably ONLY that page, without the "frames", if you do view-source in a web browser) so I can compare and test. I no longer have a Netgear modem to test with myself, but I could try and see how hard it looks to grab the uptime.
As for the sensors, none of my MQTT stuff shows up in the integration, only as "entities", so I assumed that's normal.
Thank you!!
Edited: (removed un-formatted code since corrected file was attached to later comment)
Sorry, I didn't notice what that looked like until I saw the preview. Attached (without frames) is the DocsisStatus page, renamed to .log.
Thanks, I'll try and look into how that can be pulled out with the rest of the data. It may be the weekend before I get to it; we'll have to see.
at least on my CM700, it uses .htm instead of .asp pages, so I've been trying to modify that. I've found that
curl -u admin:password http://192.168.100.1/DocsisStatus.htm -b /dev/null --anyauth
will spit out the webpage into HA's terminal, if that makes any of this code easier
That's interesting... I know a lot of things USED TO use basic HTTP authentication where you could do that. I think my Arris used to be similar before a firmware update made it really painful: the "normal" HTTP auth you're using suddenly started returning a login form instead. I remember the Netgear CM1100 being especially painful because the sequence of the pages seemed to matter a lot more than on others I've messed with (hence the stupid if/else recursion nonsense I had to do). On some you could even have a one-liner without login to fetch just the channel status, which was nice. Arris modems now also require HTTPS (with SSL), not plain HTTP, which is annoying, and I was surprised Netgear didn't use SSL while passing a password across on the one I had for a bit.
I expect it's firmware-version-dependent, with some undocumented changelog, as to which way works. And you have no say, because the DOCSIS spec lets the ISP manage your modem's firmware for security reasons: they may or may not update it, and they may or may not notify you. With my ISP, I have to explicitly ask whether they have any supported firmware update for my modem, and the tech on the phone might or might not understand the question.
If you come up with something different that works on that model, feel free to fork the code, but I recommend noting the exact model # and firmware revision your modem is known to work with, because of the differences. You may also need to tweak the example YAML depending on how many channels that model supports (e.g. if it doesn't support any OFDM channels).
Yeah, it's just that one-liner to spit out the DocsisStatus.htm page. Now I just need to figure out what's different between my modem's page and yours. It isn't returning any useful information past "login success": no errors, no MQTT sensor data, just login success. Good thing I've got no HTML experience, lol.
replace log with .mhtml
Yeah, Spectrum uses a different firmware version on the CM700 vs the CM1100, as documented here https://kb.netgear.com/000036375/What-s-the-latest-firmware-version-of-my-NETGEAR-cable-modem-or-modem-router
I've made a modification which should publish an additional MQTT topic under homeassistant/sensor/modemsignals/system_up_time
which will contain the SystemUpTime field from that status page.
If you could grab the new netgear_signal_dump.sh from my github and give that a try, I've added a couple lines that pull out the SystemUpTime and should now publish that to MQTT. https://github.com/mmiller7/Netgear_modem_scrape/blob/main/netgear_signal_dump.sh
You'll also need to create a new MQTT sensor to pull in that value in the netgear_modem_signal.yaml
(ignore the typo in my comment, you can see the green portion I added)
https://github.com/mmiller7/Netgear_modem_scrape/commit/548614a6f3754480aca8e5251401790e1b2b969d
The new script is pulling in the system up time! One tiny issue: I can't figure out how to parse the data to make it a time or numeric value. The value is displayed like this: { "SystemUpTime": "79:52:31" }. Is it possible to drop the text and the { }? I can't figure out a way to trigger an automation based on that value. Or could you point me in the right direction? Also, thank you for doing this, I really appreciate it.
Could you post what the uptime sensor looks like and what it's showing in Developer Tools in Home Assistant for the sensor value?

That looks like the right JSON format, so I will try and look again and see if I missed or added too many quotes, assuming it's making it into the sensor value.
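In the meantime, one way to make that value numeric for automations (a sketch only, using the topic named earlier and the platform-style MQTT sensor format in use at the time) is a value_template that converts the HH:MM:SS string into seconds:

```yaml
sensor:
  - platform: mqtt
    name: "Cable Modem Uptime Seconds"
    state_topic: "homeassistant/sensor/modemsignals/system_up_time"
    unit_of_measurement: "s"
    value_template: >-
      {% set p = value_json.SystemUpTime.split(':') %}
      {{ p[0] | int * 3600 + p[1] | int * 60 + p[2] | int }}
```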
Are you still using these scrapers with Home Assistant? If so, could you update your code? Home Assistant required a pretty hefty MQTT change. I followed the docs and modified files but it stopped working. Any info would be appreciated. Thank you.
I only use the Arris one, but it looks like I forgot to update either one after the big changes they did. I'll try and merge that in this weekend, but I won't have any way to test this Netgear one after updating the code. Hopefully, since it's just changes to the HA YAML, it won't be a big deal; that part was almost identical for both. I remember the change was quite painful because it totally broke my workflow and organization.
I will happily test it. I appreciate it so much.
I attempted to update the git repo with a copy of the Arris YAML that I know works, hopefully tweaked to incorporate the few changes I remember from the Netgear.
Also keep in mind you may need to tweak it anyway if your modem has a different quantity of channels, to break out all the sensors. My example goes with the 33 downstream and 4 upstream channels that my ISP provisions. Hopefully that is just copy-pasting and changing the number if you need to add more.
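For reference, the "hefty MQTT change" mentioned above was Home Assistant moving MQTT sensors from the platform-style layout to entries under a top-level `mqtt:` key. Roughly (the sensor name and topic below are placeholders):

```yaml
# Old platform-style layout (pre-change)
sensor:
  - platform: mqtt
    name: "modem_upstream_1"
    state_topic: "homeassistant/sensor/modemsignals/upstream_1"
    value_template: "{{ value_json.Power }}"

# New layout (post-change): same keys, nested under mqtt:
mqtt:
  sensor:
    - name: "modem_upstream_1"
      state_topic: "homeassistant/sensor/modemsignals/upstream_1"
      value_template: "{{ value_json.Power }}"
```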
Well, we have partial success. I have 32 downstream and 8 upstream channels. I modified the YAML by deleting downstream channel 33 in all locations and adding upstream channels 5-8 in all locations. I get information back on three lines.
sensor.cable_modem_system_up_time displays: { "SystemUpTime": "45:51:40" }
sensor.modem_startup displays: { "status": "HonorMDD", "comment": "IPv6only" }
sensor.modem_upstream_1 displays: { "Channel": 1, "LockStatus": "Locked", "Modulation": "ATDMA", "ChannelID": 1, "Frequency": 35600000, "Power": 44.8 } (friendly_name: modem_upstream-1)
Everything else is unknown. Does it matter about the OFDM upstream and downstream channels? It looks like they're in the scraper...
Thanks!
Where are you seeing that, do you have an MQTT explorer you're using? Also, what model modem is it? I think I had a CM1100 when I made this.
I wonder if your firmware happens to use a different output format and would need different parsing to find the data or something. If you can use an MQTT explorer app, I'd like to know whether the data actually gets published or whether it fails.
I could not cp /usr/lib/libmosquitto.so.1 to /config/bin/mosquitto_deps/lib/libmosquitto.so.1, or cp /usr/lib/libcares.so.2 to /config/bin/mosquitto_deps/lib/libcares.so.2; it says the /usr/lib/*.so files were not found. I have MQTT from HA installed and running.
I'm sorry, I'm just getting back to this.
I don't know if it varies depending on install type - I'm on HassOS.
Give this a try in an SSH terminal on your Home Assistant. First, figure out the path to the pub command on your install:
[core-ssh]$ which mosquitto_pub
/usr/bin/mosquitto_pub
Now, using the path you found above, look at the linked libraries; this will show you where all the library files it requires to run live. These are the things you need to copy over to make sure they are all accessible... there's possibly a better way with containers, but I don't know it myself.
[core-ssh]$ ldd /usr/bin/mosquitto_pub
/lib/ld-musl-x86_64.so.1 (0x7f3df988a000)
libmosquitto.so.1 => /usr/lib/libmosquitto.so.1 (0x7f3df9862000)
libc.musl-x86_64.so.1 => /lib/ld-musl-x86_64.so.1 (0x7f3df988a000)
libssl.so.1.1 => /lib/libssl.so.1.1 (0x7f3df97e1000)
libcrypto.so.1.1 => /lib/libcrypto.so.1.1 (0x7f3df955f000)
libcares.so.2 => /usr/lib/libcares.so.2 (0x7f3df9549000)
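If you'd rather not hard-code the paths, something like this should copy whatever ldd reports (a sketch based on the output above; adjust if your install prints different locations):

```bash
# Copy mosquitto_pub plus every library ldd lists for it into /config
mkdir -p /config/bin/mosquitto_deps/lib
cp "$(which mosquitto_pub)" /config/bin/mosquitto_deps/mosquitto_pub
for lib in $(ldd "$(which mosquitto_pub)" | awk '{print $3}' | grep '^/'); do
  cp "$lib" /config/bin/mosquitto_deps/lib/
done
```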
I used your ldd command and found how to do it:
cp /usr/lib/x86_64-linux-gnu/libmosquitto.so.1 /config/bin/mosquitto_deps/lib/libmosquitto.so.1
But I still didn't see libcares.so.2.
Wait... found it using the Terminal add-on, and not the terminal from the Visual Studio Code plugin... odd.
Moving forward... I now get this when running your .sh:
I assume (1) that's the IP of my HASS install where MQTT is installed? And (2) the username/password from the Integrations > core-mosquitto broker options (3) settings?
I can confirm that the homeassistant username and password can connect with MQTT Explorer.
So I'm not sure why the .sh is reporting back "MQTT-Pub Error!"
I figured it out: the mosquitto_pub binary, if copied from the Visual Studio Code instance, is the wrong kind(?).
You 100% have to copy these files from PuTTY for the .sh to work.
Ah, yeah, glad you figured it out. I did them all thru the SSH add-on over SSH, like it sounds like you ultimately did.
I'm not familiar with the visual studio code addon so I can't be sure it has the same versions of the binaries and libraries.
It does, however, say login failed when I review it in MQTT Explorer. Which I think has to do with this line?
I do have a CM1200, and I found the page I'm looking for to be https://192.168.100.1/DocsisStatus.htm by a series of right-clicks and/or view-page-source and view-iframe-source.
It's what I'm using to edit my configuration.yaml and the netgear_signal_dump.sh. I just discovered that it has a terminal feature; I'm sticking to PuTTY now that I know it's not reliable.
I guess I could have used F12 as well... Either way, the info I have in the .sh is the correct login for the cable modem.
So this sounds silly, but try logging out in the web browser, reboot the modem, and try again. I feel like I had issues where the modem couldn't cope with 2 clients at once with the cookies. I think it also would perpetually reject every attempt after about 3 tries until a reboot.
It does sound like you've got it set up the way it should be.
I don't remember the exact flow, but I recall there was some insanity where you couldn't just jump to the right page, which is why I had to have it go thru that if/else set that steps thru several dumb intermediate pages. But yeah, I think you're on the right track. The "pageDive" thing is recursion, because it sometimes has to do several redirects before it logs in successfully; I also set a limit on how many times it can recurse (to avoid infinite loops).
You can also add some echo "thing is $variable" lines in the .sh and run it in the SSH window to help debug what it's doing. At one point I was printing whole HTML pages out to the terminal to try and wrap my head around what wasn't working.
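Another option (just standard shell tooling, nothing specific to this script) is to let the shell trace every command for you:

```bash
# Trace the whole run, printing each command with its expanded arguments
bash -x /config/netgear_modem_signal_scraper/netgear_signal_dump.sh

# Or trace only one section by wrapping it inside the script
set -x   # start tracing
# ...the curl/login section you're debugging...
set +x   # stop tracing
```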
I miss the old days when modems just gave you a basic signal report without any authentication on the first HTTP request.
OK, I might be in too deep on this one...
First, I added the following echoes like you said... that's cool (this is maybe day 4 of me using any of this stuff...):
echo "Modem login: $1"
echo "Fetch: $toFetch"
echo "refer: $referPage"
echo "depth: $depth"
echo "trying: ${baseURL}${toFetch}"
echo "rsid: $retailSessionId"
echo "webtoken: $webToken"
which gave me:
curl 'https://192.168.100.1/' \
-H 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9' \
-H 'Accept-Language: en-US,en;q=0.9' \
-H 'Authorization: Basic YWRtaW46UDAwbHBAcnR5IQ==' \
-H 'Cache-Control: no-cache' \
-H 'Connection: keep-alive' \
-H 'Cookie: XSRF_TOKEN=12333518' \
-H 'DNT: 1' \
-H 'Pragma: no-cache' \
-H 'Sec-Fetch-Dest: document' \
-H 'Sec-Fetch-Mode: navigate' \
-H 'Sec-Fetch-Site: none' \
-H 'Sec-Fetch-User: ?1' \
-H 'Upgrade-Insecure-Requests: 1' \
-H 'User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36' \
-H 'sec-ch-ua: "Not?A_Brand";v="8", "Chromium";v="108", "Google Chrome";v="108"' \
-H 'sec-ch-ua-mobile: ?0' \
-H 'sec-ch-ua-platform: "Windows"' \
--compressed
which I then modified to be:
if [[ "$depth" -ge 5 ]]; then
loginStatus "max_redirect_exceeded"
else
# Fetch the page
if [ "$referPage" != "" ]; then
referURL="${baseURL}${referPage}"
fi
# Decide which request to make (if page requires post data)
if [ "$toFetch" == "/goform/GenieLogin" ]; then
data=$(curl --connect-timeout 5 -v -s -e "$referURL" -b "$cookie_path" -c "$cookie_path" -X POST -H 'Content-Type: application/x-www-form-urlencoded' --data-urlencode "loginUsername=$modem_username" --data-urlencode "loginPassword=$modem_password" --data "login=1" --data "webToken=$webToken" "${baseURL}${toFetch}" 2>&1)
exitCode=$?
elif [ "$toFetch" == "/goform/MultiLogin" ]; then
data=$(curl --connect-timeout 5 -v -s -e "$referURL" -b "$cookie_path" -c "$cookie_path" -X POST -H 'Content-Type: application/x-www-form-urlencoded' --data "yes=yes" --data "Act=yes" --data "RetailSessionId=$retailSessionId" --data "webToken=$webToken" "${baseURL}${toFetch}" 2>&1)
exitCode=$?
else
data=$(curl "$referURL" -H 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9' -H 'Accept-Language: en-US,en;q=0.9' -H 'Authorization: Basic YWRtaW46UDAwbHBAcnR5IQ==' -H 'Cache-Control: no-cache' -H 'Connection: keep-alive' -H 'Cookie: XSRF_TOKEN='$webToken'' -H 'DNT: 1' -H 'Pragma: no-cache' -H 'Sec-Fetch-Dest: document' -H 'Sec-Fetch-Mode: navigate' -H 'Sec-Fetch-Site: none' -H 'Sec-Fetch-User: ?1' -H 'Upgrade-Insecure-Requests: 1' -H 'User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36' -H 'sec-ch-ua: "Not?A_Brand";v="8", "Chromium";v="108", "Google Chrome";v="108"' -H 'sec-ch-ua-mobile: ?0' -H 'sec-ch-ua-platform: "Windows"' --compressed)
exitCode=$?
#data=$(curl --connect-timeout 10 -v -s -e "$referURL" -b "$cookie_path" -c "$cookie_path" "${baseURL}${toFetch}" 2>&1)
exitCode=$?
fi
and that gave me a really weird output in PuTTY:
[core-ssh ~]$ /config/netgear_modem_signal_scraper/netgear_signal_dump.sh
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.se/docs/sslcerts.html
curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.se/docs/sslcerts.html
curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.
Modem login: failed_retrying
Fetch:
refer:
depth:
trying: https://192.168.100.1/
rsid:
webtoken:
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.se/docs/sslcerts.html
curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.
curl: (3) URL using bad/illegal format or missing URL
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.se/docs/sslcerts.html
curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.se/docs/sslcerts.html
curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.se/docs/sslcerts.html
curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.se/docs/sslcerts.html
curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.se/docs/sslcerts.html
curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.se/docs/sslcerts.html
curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.se/docs/sslcerts.html
curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.se/docs/sslcerts.html
curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.se/docs/sslcerts.html
curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.
Modem login: failed
Fetch:
refer:
depth: 0
trying: https://192.168.100.1/
rsid:
webtoken:
I'm not sure where to go from here.
Ohhhh, wait a second. It's using HTTPS now? Huh... my Netgear modem's firmware was only HTTP back when I wrote that script.
I bet you need some extra flags; that error from curl is saying it looked at the modem's self-signed cert and rejected it as unknown. I wonder if that modification is really all you actually need?
For curl, there are 2 flags you are probably interested in: -s makes it not print that extra junk in the terminal with the "download progress" (once you get it working), and --insecure tells it to ignore security errors for the SSL certificate. I have to use --insecure on my new Arris modems because they use HTTPS, whereas when I did the Netgear one it was plain HTTP.
Try adding --insecure to all the curl commands and see if that fixes the error.
If they changed it to HTTPS it's possible they redid a bunch of other things too, but if you're lucky that is the only tweak you will need... fingers crossed.
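As a quick sanity check (a sketch; 192.168.100.1 is the modem IP used throughout this thread), you can confirm the TLS side works before worrying about the login flow:

```bash
# Should print an HTTP status code (200, a redirect, etc.) instead of an SSL error
curl -s --insecure -o /dev/null -w '%{http_code}\n' https://192.168.100.1/
```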
Adding --insecure to all of your curl calls nets me this:
[core-ssh ~]$ /config/netgear_modem_signal_scraper/netgear_signal_dump.sh
Modem login: failed_retrying
Fetch:
refer:
depth:
trying: https://192.168.100.1/
rsid:
webtoken:
Modem login: failed
Fetch:
refer:
depth: 0
trying: https://192.168.100.1/
rsid:
webtoken:
[core-ssh ~]$
Adding it to the curl line I wrote nets me this:
[core-ssh ~]$ /config/netgear_modem_signal_scraper/netgear_signal_dump.sh
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (56) OpenSSL SSL_read: Connection reset by peer, errno 104
Modem login: failed_retrying
Fetch:
refer:
depth:
trying: https://192.168.100.1/
rsid:
webtoken:
rm: can't remove '/config/netgear_modem_signal_scraper/netgear_signal_dump.sh.cookie': No such file or directory
curl: (3) URL using bad/illegal format or missing URL
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (56) OpenSSL SSL_read: Connection reset by peer, errno 104
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (56) OpenSSL SSL_read: Connection reset by peer, errno 104
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (56) OpenSSL SSL_read: Connection reset by peer, errno 104
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (56) OpenSSL SSL_read: Connection reset by peer, errno 104
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (56) OpenSSL SSL_read: Connection reset by peer, errno 104
Modem login: failed
Fetch:
refer:
depth: 0
trying: https://192.168.100.1/
rsid:
webtoken:
I'm pretty sure the Basic token I hardcoded is part of the issue. So how do we capture the new token on every connection and plug it in? I'm also not sure whether that $webToken is in the right spot.
data=$(curl "$referURL" --insecure -H 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9' -H 'Accept-Language: en-US,en;q=0.9' -H 'Authorization: Basic YWRtaW46UDAwbHBAcnR5IQ==' -H 'Cache-Control: no-cache' -H 'Connection: keep-alive' -H 'Cookie: XSRF_TOKEN='$webToken'' -H 'DNT: 1' -H 'Pragma: no-cache' -H 'Sec-Fetch-Dest: document' -H 'Sec-Fetch-Mode: navigate' -H 'Sec-Fetch-Site: none' -H 'Sec-Fetch-User: ?1' -H 'Upgrade-Insecure-Requests: 1' -H 'User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36' -H 'sec-ch-ua: "Not?A_Brand";v="8", "Chromium";v="108", "Google Chrome";v="108"' -H 'sec-ch-ua-mobile: ?0' -H 'sec-ch-ua-platform: "Windows"' --compressed)
Interesting: when I leave your line in and mine commented out,
data=$(curl --insecure --connect-timeout 10 -v -s -e "$referURL" -b "$cookie_path" -c "$cookie_path" "${baseURL}${toFetch}" 2>&1)
it refreshes the cookie information:
# Netscape HTTP Cookie File
# https://curl.se/docs/http-cookies.html
# This file was generated by libcurl! Edit at your own risk.
192.168.100.1 FALSE / FALSE 0 XSRF_TOKEN 12333518
Right, so the HTTP header that it spits back containing 'Cookie: XSRF_TOKEN=12333518' is something that curl (with the right flags) already knows how to handle. You shouldn't need to worry about fetching, storing, and returning it yourself, as long as the curl command has the options telling it where to write and read the cookie file.
in curl, the "incoming" cookie is set with -c
-c, --cookie-jar <filename>
(HTTP) Specify to which file you want curl to write all cookies
after a completed operation
and then the "outgoing cookie" is set with -b
-b, --cookie <data|filename>
(HTTP) Pass the data to the HTTP server in the Cookie header.
We point both to the same file because we want to save the last cookie we got every time we get an updated one, and send it in the next request.
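A minimal sketch of that round-trip (the cookie-jar path here is just an example; the script uses its own $cookie_path):

```bash
cookie_jar=/tmp/modem_cookies.txt
# First request: any Set-Cookie from the modem (e.g. XSRF_TOKEN) is written to the jar by -c
curl -s --insecure -c "$cookie_jar" -o /dev/null https://192.168.100.1/
# Next request: -b sends the saved cookie back, and -c keeps the jar updated if it changes
curl -s --insecure -b "$cookie_jar" -c "$cookie_jar" https://192.168.100.1/DocsisStatus.htm
```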
If you're not already familiar with "man pages" (as in user-manual) this may help you understanding what every option in curl means (it can vary by version, but is typically very similar): https://linux.die.net/man/1/curl
I wonder if you could be hitting some of the difficulties I had in building it, where I had to reboot the modem between my tries in the web browser and the curl commands (and every 3-5 curl attempts) because I was somehow getting locked out during trial and error.
If you got the token in the cookie once, that suggests you are REALLY close to what it needs to be to work, and the remaining problem with it working repeatedly might be some minor confusion or corruption (e.g. the modem's webserver can't decide which client is valid).
I was heavily using the web-developer console to trace what the web forms submitted and what cookies were being sent to the browser in my effort to reverse-engineer it originally, and it was a PITA swapping back and forth between clients. I'm not an expert at web scraping, so I used a really crude mashup of grep, sed, and awk to crunch thru it with regular expressions, which could also be messed up if they changed the HTML tags around.
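To give a feel for what that kind of crunching looks like (illustrative only, not the script's actual expressions), given a saved copy of the status page like the one attached earlier in this thread:

```bash
# Strip HTML tags and collapse whitespace so the table rows become plain text
# lines you can grep/awk through
sed -e 's/<[^>]*>/ /g' -e 's/&nbsp;/ /g' DocsisStatus.htm | tr -s ' \t' ' '
```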
Reading the curl manual now. I think we need to capture the Basic token somehow.
I think I got somewhere... "-F" might be needed now.
[core-ssh ~]$ /config/netgear_modem_signal_scraper/netgear_signal_dump.sh
Modem login: failed_retrying
data:
Modem login: failed
data: * WARNING: failed to open cookie file "/config/netgear_modem_signal_scraper/netgear_signal_dump.sh.cookie"
* Trying 192.168.100.1:443...
* Connected to 192.168.100.1 (192.168.100.1) port 443 (#0)
* ALPN: offers h2
* ALPN: offers http/1.1
} [5 bytes data]
* TLSv1.3 (OUT), TLS handshake, Client hello (1):
} [512 bytes data]
* TLSv1.3 (IN), TLS handshake, Server hello (2):
{ [122 bytes data]
* TLSv1.3 (IN), TLS handshake, Encrypted Extensions (8):
{ [6 bytes data]
* TLSv1.3 (IN), TLS handshake, Certificate (11):
{ [787 bytes data]
* TLSv1.3 (IN), TLS handshake, CERT verify (15):
{ [136 bytes data]
* TLSv1.3 (IN), TLS handshake, Finished (20):
{ [52 bytes data]
* TLSv1.3 (OUT), TLS change cipher, Change cipher spec (1):
} [1 bytes data]
* TLSv1.3 (OUT), TLS handshake, Finished (20):
} [52 bytes data]
* SSL connection using TLSv1.3 / TLS_AES_256_GCM_SHA384
* ALPN: server did not agree on a protocol. Uses default.
* Server certificate:
* subject: C=US; O=Netgear Inc.; OU=China; CN=94:A6:7E:D8:A0:54
* start date: Dec 11 00:00:00 2020 GMT
* expire date: Dec 11 23:59:59 2040 GMT
* issuer: C=US; O=CableLabs; OU=DOCSIS; OU=D CA00003; CN=CableLabs Cable Modem Certificate Authority
* SSL certificate verify result: unable to get local issuer certificate (20), continuing anyway.
} [5 bytes data]
> POST /MultiLogin.asp HTTP/1.1
> Host: 192.168.100.1
> User-Agent: curl/7.83.1
> Accept: */*
> Referer: -i
> Content-Length: 153
> Content-Type: multipart/form-data; boundary=------------------------b965c022fd94e7ad
>
} [5 bytes data]
* We are completely uploaded and fine
{ [5 bytes data]
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
{ [217 bytes data]
* OpenSSL SSL_read: Connection reset by peer, errno 104
* Closing connection 0
} [5 bytes data]
* Hostname 192.168.100.1 was found in DNS cache
* Trying 192.168.100.1:443...
* Connected to 192.168.100.1 (192.168.100.1) port 443 (#1)
* ALPN: offers h2
* ALPN: offers http/1.1
* SSL re-using session ID
} [5 bytes data]
* TLSv1.3 (OUT), TLS handshake, Client hello (1):
} [565 bytes data]
* TLSv1.3 (IN), TLS handshake, Server hello (2):
{ [128 bytes data]
* TLSv1.3 (IN), TLS handshake, Encrypted Extensions (8):
{ [6 bytes data]
* TLSv1.3 (IN), TLS handshake, Finished (20):
{ [52 bytes data]
* TLSv1.3 (OUT), TLS change cipher, Change cipher spec (1):
} [1 bytes data]
* TLSv1.3 (OUT), TLS handshake, Finished (20):
} [52 bytes data]
* SSL connection using TLSv1.3 / TLS_AES_256_GCM_SHA384
* ALPN: server did not agree on a protocol. Uses default.
* Server certificate:
* subject: C=US; O=Netgear Inc.; OU=China; CN=94:A6:7E:D8:A0:54
* start date: Dec 11 00:00:00 2020 GMT
* expire date: Dec 11 23:59:59 2040 GMT
* issuer: C=US; O=CableLabs; OU=DOCSIS; OU=D CA00003; CN=CableLabs Cable Modem Certificate Authority
* SSL certificate verify result: unable to get local issuer certificate (20), continuing anyway.
} [5 bytes data]
> POST / HTTP/1.1
> Host: 192.168.100.1
> User-Agent: curl/7.83.1
> Accept: */*
> Referer: -i
> Content-Length: 153
> Content-Type: multipart/form-data; boundary=------------------------71f4df23461afe94
>
} [5 bytes data]
* We are completely uploaded and fine
{ [5 bytes data]
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
{ [217 bytes data]
* old SSL session ID is stale, removing
* OpenSSL SSL_read: Connection reset by peer, errno 104
* Closing connection 1
} [5 bytes data]
[core-ssh ~]$
Using this in the else statement:
data=$(curl --insecure --connect-timeout 10 -v -s -e -i "$referURL" -b "$cookie_path" -c "$cookie_path" -F "loginUsername=$modem_username;loginPassword=$modem_password" "${baseURL}${toFetch}" 2>&1)
I'm using the ELSE area because I tried this all morning, and finally realised that I can do these to tell me where it's landing in this IF statement:
#https://linux.die.net/man/1/curl
if [ "$toFetch" == "/goform/GenieLogin" ]; then
data=$(echo "its Genie Login Time!")
#data=$(curl --insecure --connect-timeout 5 -v -s -e "$referURL" -b "$cookie_path" -c "$cookie_path" -X POST -H 'Content-Type: application/x-www-form-urlencoded' --data-urlencode "loginUsername=$modem_username" --data-urlencode "loginPassword=$modem_password" --data "login=1" --data "webToken=$webToken" "${baseURL}${toFetch}" 2>&1)
exitCode=$?
elif [ "$toFetch" == "/goform/MultiLogin" ]; then
#data=$(curl --insecure --connect-timeout 5 -v -s -e "$referURL" -b "$cookie_path" -c "$cookie_path" -X POST -H 'Content-Type: application/x-www-form-urlencoded' --data "yes=yes" --data "Act=yes" --data "RetailSessionId=$retailSessionId" --data "webToken=$webToken" "${baseURL}${toFetch}" 2>&1)
data=$(echo "its multilogin Login Time!")
#data=$(curl --insecure --connect-timeout 5 -v -s -e "$referURL" -b "$cookie_path" -c "$cookie_path" -X POST -H 'Content-Type: multipart/form-data' --data-urlencode "loginUsername=$modem_username" --data-urlencode "loginPassword=$modem_password" "${baseURL}${toFetch}" 2>&1)
exitCode=$?
else
#get all segments to vars
#data_head=$(curl -k "$referURL" --dump-header "$headers" )
#data=$(curl "$referURL" --insecure -H "$accept_path" -H "$accept_lang_path" -H "$auth_path -H "$cache_control_path" -H "$connection_path" -H "$cookie_path" -H "$DNT_path" -H "$pragma_path" -H "$Sec_Fetch_Dest_path" -H "$Sec_Fetch_Mode_path" -H "$Sec_Fetch_Site_path" -H "$Sec_Fetch_User_path" -H "$Upgrade_Insecure_Requests_path" -H "$User_Agent_path" -H "$sec_ch_ua_path" -H "$sec_ch_ua_mobile_path" -H "$sec_ch_ua_platform_path" --compressed )
#data=$(curl "$referURL" --insecure -H 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9' -H 'Accept-Language: en-US,en;q=0.9' -H 'Authorization: Basic YWRtaW46UDAwbHBAcnR5IQ==' -H 'Cache-Control: no-cache' -H 'Connection: keep-alive' -H 'Cookie: XSRF_TOKEN='$webToken'' -H 'DNT: 1' -H 'Pragma: no-cache' -H 'Sec-Fetch-Dest: document' -H 'Sec-Fetch-Mode: navigate' -H 'Sec-Fetch-Site: none' -H 'Sec-Fetch-User: ?1' -H 'Upgrade-Insecure-Requests: 1' -H 'User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36' -H 'sec-ch-ua: "Not?A_Brand";v="8", "Chromium";v="108", "Google Chrome";v="108"' -H 'sec-ch-ua-mobile: ?0' -H 'sec-ch-ua-platform: "Windows"' --compressed)
data=$(curl --insecure --connect-timeout 10 -v -s -e -i "$referURL" -b "$cookie_path" -c "$cookie_path" -F "loginUsername=$modem_username;loginPassword=$modem_password" "${baseURL}${toFetch}" 2>&1)
#data=$(echo "no fetch met")
#data=$(curl -k --connect-timeout 10 -v -s -e -i "$referURL" -b "$cookie_path" -c "$cookie_path" "${baseURL}${toFetch}" 2>&1)
exitCode=$?
fi
I've also made a complete mess of that section.
I'm also trying your Arris .sh, and I modified a few things in hopes of it working too.
Added modem_ip="192.168.100.1"
in the modem details section. And then :
token=$(curl --connect-timeout 5 -s --insecure "https://${modem_ip}/goform/MultiLogin?${auth_hash}" -H 'Accept: */*' -H 'Content-Type: application/x-www-form-urlencoded; charset=utf-8' -H "Authorization: Basic ${auth_hash}" -H 'X-Requested-With: XMLHttpRequest' -H 'Cookie: HttpOnly: true, Secure: true')
as well as:
# This function fetches the HTML status page from the modem for parsing
function getResult () {
# Finally, we can request the page
result=$(curl -s --insecure "https://${modem_ip}" -H "Cookie: HttpOnly: true, Secure: true; credential=${token}")
}
but the status reported back into MQTT still says login failed.
If you're not already, rather than trying to use the 'data' field to debug, you can also just spam the whole script with echo "I am here" lines and run it manually from the SSH prompt; you should see each echo line print in your terminal as it tries to run. That may make it easier to observe.
You can also force it to stop instantly somewhere with a line like exit 1, so you can debug the "first part" of something and not keep going thru the whole output.
It's a pain in the butt how convoluted some of these modems make things. I think Netgear is a bit harder because it has the "frames inside of frames" thing to handle the same UI for routers and modem-only units, whereas Arris is just a single page for the modem.
I wonder if it's sharing some of the code with the Arris one I have now, which puts a "key" in a hidden web-form field that you have to pick up and pass along.
https://github.com/mmiller7/Arris_modem_scrape/blob/main/arris_signal_dump.sh
Everything works from either WinSCP or the SSH/Terminal add-on; however, when called from an automation, the following error is produced. I checked to make sure the script is executable. I also tried adding a second script call with sudo to see if it changed the error message (it did not). Fully up-to-date Home Assistant.
Anyone have any ideas? I'd appreciate any information. Thanks!
Logger: homeassistant.components.shell_command
Source: /usr/src/homeassistant/homeassistant/components/shell_command/__init__.py:117
Integration: Shell Command (documentation, issues)
First occurred: 12:50:19 PM (30 occurrences)
Last logged: 3:15:00 PM
Error running command: /config/netgear_signal_dump.sh
, return code: 127
Error running command: sudo '/config/netgear_modem_signal_scraper/netgear_signal_dump.sh'
, return code: 127
NoneType: None
I don't think HassOS supports 'sudo' so that one at least makes sense. Not sure about the other.
Maybe try checking whether the paths to the libraries for the MQTT binary changed and more need to be copied? I talked about how to debug that earlier in this thread, at this post.
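A return code of 127 from a shell generally means "command not found", so a few quick checks from the SSH add-on might narrow it down (a sketch, using the paths from earlier in this thread):

```bash
# Is the script where the shell_command expects it, and is it executable?
ls -l /config/netgear_modem_signal_scraper/netgear_signal_dump.sh
# Does its shebang point at a shell that exists where the command runs?
head -n1 /config/netgear_modem_signal_scraper/netgear_signal_dump.sh
# Are the copied mosquitto binary and libraries still in place?
ls -l /config/bin/mosquitto_deps/mosquitto_pub /config/bin/mosquitto_deps/lib/
```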
This is just to report that the locations of the mosquitto_pub binary and libraries are still the same in current Home Assistant. I use them in my adapted script for the Technicolor modem (at phrxmd/Technicolor_modem_scrape).
How do I install this?