NoPlayBack opened 2 months ago
This adapter gets the values when the API sends them. How would you force the websocket connection to get values every 30 s? Turn debug mode on and you can see all the values which are sent in which time frame.
No... no need to receive values every 30 sec. I have set debug mode and I see all 50 elements of the GPS position data.
The dict element "positions" in the received data includes a list with 50 entries. Each list position is again a dict and holds a pair of data, latitude and longitude GPS positions as float. The first pair is the last stored position, and the 50th pair is the one which was stored 50 * 30 sec ago. The structure looks like this: "positions":[{"latitude":45.0000000,"longitude":10.0000000},{"latitude":45.0000020,"longitude":10.0000020},... (data modified to stay anonymous ;-))
The structure is also described in the API-docs from Husqvarna https://developer.husqvarnagroup.cloud/apis/automower-connect-api?tab=openapi
I did not find the 30-second interval documented anywhere, but when I use the timestamp in "metadata":{"connected":true,"statusTimestamp":1727257706532}... and then check how many new GPS positions were stored in that list, I definitely see a spacing of 30 sec between consecutive GPS positions.
So if I could see the complete list of all 50 GPS positions and read the timestamp, one dataset sent by the API would give me the last 50 positions of the mower, covering the 30 sec * 50 positions = 25 minutes before that timestamp.
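To put the structure and the timing into a small example (field names as seen in the debug output, values anonymized; the 30-second spacing is my own observation, not something the docs guarantee):

```python
# Hypothetical excerpt of the "attributes" part of one dataset, as described above
attributes = {
    "positions": [
        {"latitude": 45.0000000, "longitude": 10.0000000},  # newest position
        {"latitude": 45.0000020, "longitude": 10.0000020},  # ~30 s older
        # ... 48 more entries, each assumed to be another 30 s older
    ],
    "metadata": {"connected": True, "statusTimestamp": 1727257706532},
}

status_ts = attributes["metadata"]["statusTimestamp"]  # milliseconds, belongs to positions[0]
newest = attributes["positions"][0]                     # last stored position
oldest_ts = status_ts - 49 * 30_000                     # ~24.5 min before status_ts, if the 30 s spacing holds
```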
> (…) So if I could see the complete list of all 50 GPS positions (…)
Please try 0.6.0-beta.x
Oops... what am I doing wrong?
    $ iobroker upgrade husqvarna-automower@0.6.0-beta.x --debug
    Adapter "husqvarna-automower" is not in the repository and cannot be updated.
    ERROR: Process exited with code 53
Uninstall the existing version, and install https://github.com/ice987987/ioBroker.husqvarna-automower/tree/v0.6.0-beta.x directly from GitHub on your non-production installation.
Yep... I see the data point now. It is the full list of 50 entries, each entry a dict with longitude and latitude of the last GPS data.
Not sure how to deal with that easily within the ioBroker and data storage environment... but for now, for further testing, I am happy... I will grab the data via script or maybe external Python logic, and start to use that data... will report back once I get familiar with it.
Is there an easy way to store all 50 latitudes and all 50 longitudes separately in a DB? I do not know an easy way except what I mentioned above, but if I am missing something here, I would be happy to learn it. Whatever, as said, for now I am happy and will start logging ;-) Will show the results here.
so here we go....
Any chance to improve that?
You could log `.positions.latitude` and `.positions.longitude` via ioBroker.sql and read the values from there.
What is your goal? You want to see the path of the robot, as far as I understand. Where and how would you visualize it? In my opinion you do not need the time, because the latest position is in each case the position where the robot stands. And you need to delete the path after some time, otherwise you have only lines of the robot...
I am already using "SQL logging" https://www.iobroker.net/#en/adapters/adapterref/iobroker.sql/README.md and storing data into an SQL DB in order to save history.
Yes, I want to see the path of the mower and not only the last known position... there are adapters and routines available to show a path in VIS on a Google Earth page. I can select a timeframe and then see the detailed path of the mower.
The trick is that the GPS data is obviously generated every 30 sec and stored into that positions dictionary within the whole payload, which contains the last 50 values. So one would only need to fetch data every 25 minutes and could still see the detailed path of the mower. Your adapter seems to fetch data every 10 minutes, so I would be happy to use that access and read the GPS array. Unfortunately, at startup of your adapter the complete array of 50 positions is received, but the updates only show the last entry of those 50 values, so that does not help. Is there a chance to also receive the complete array of 50 values when receiving the update of the data?
Background of all this is: I have a new installation of the mower and there are some tricky areas which the mower has to find, and some critical boundaries where the mower should cut the grass at the boundary but not drive too far over it... and there are walls where the mower would get scratches... so I am currently running trial and error with the value of how far the mower should go over the cable, and whether the mower should use its second blade unit to cut the edges.
> Unfortunately, at startup of your adapter, the complete array of 50 positions is received. The updates only show the last entry of that 50 values, so that does not help. Is there a chance to receive also the complete array of 50 values when receiving the update of the data?
For this question you have to reach out to Husqvarna directly.
I have now grabbed and transformed the wanted data for 1 hour... and made it visible in Google Earth. See the picture attached... that is what I would like to achieve: 122 GPS data points in approximately 1 hour. With this one can easily check how the area is covered by the mower. Using only the data currently provided by the adapter, I would have had just 6 GPS data points in the same timeframe.
Unfortunately the panel I was trying to use is currently not available in Grafana... it is on the way to getting published, so I have to wait for easy and automatic access.
So you are saying that with the kind of connection you establish with Husqvarna it is not possible to fetch the 50 GPS points in that array?
> So you are saying with that kind of connection you establish with Husqvarna it is not possible to fetch the 50 GPS point in that array?
Not via the websocket connection, as far as I understand.
You could possibly use the statistics call `.ACTIONS.REFRESHSTATISTICS`, or set an appropriate polling time in the instance settings, but there you are limited by the calls/month.
Hmmm, no idea what a "websocket connection" is... so right now I am running a Python script in a loop every 10 minutes. It requests an access_token, then does a requests.get on the mowers. With that I get the full-blown dict, including everything. Then I revoke the access_token again to stay clean... I currently do not want to handle the timeout of that access_token. After that I process the data, and if new GPS positions are found I store them in the DB accordingly.
Then we can close here, since you no longer use this adapter?
Yes and no... as my programming experience is very limited and the Python I wrote seems to work but is more "trial & error" style... my goal was to use your adapter for grabbing the data, because I feel that would be much safer in terms of stability. Yes, if you are sure that with your kind of connection it is not possible... and no, if you think this might be a feature which can be implemented and might be of use for other users as well. For now I am running your adapter and my Python in parallel to see and learn over the next days. I simply would have been happy to grab that data through your adapter without having a crude Python script running on the Raspberry ;-)
If you set `.ACTIONS.REFRESHSTATISTICS` -> true, do you get what you expect?
I did not try that one... not sure how to handle it, and also not sure if that would stay within the calls/month limit...
So now I also got it running in a "TrackMap" panel in Grafana... here one can select a time frame and check the positions the mower had in that timeframe.
It would be great if you could publish your detailed solution here or somewhere on GitHub, so that the community also has access to such a solution.
Using "TrackMap" plugin in Grafana. Currently not selectable the standard way but one can download the zip and install manually by placing the files into the plugin folder of Grafana.
Fetching the array of GPS positions is done with:
    import requests

    clientId = 'your ID here'
    clientSecret = 'your Secret here'

    # 1) Get an access token (client_credentials grant)
    r = requests.post('https://api.authentication.husqvarnagroup.dev/v1/oauth2/token',
                      data={"grant_type": 'client_credentials', "client_id": clientId, "client_secret": clientSecret},
                      timeout=60).json()

    # 2) Fetch all mowers (the response includes the "positions" array)
    headers = {"Authorization": "Bearer " + r['access_token'], "X-Api-Key": clientId, "Authorization-Provider": "husqvarna"}
    url = "https://api.amc.husqvarna.dev/v1/mowers"
    mowersResponse = requests.get(url, headers=headers, timeout=60).json()

    # 3) Revoke the token again
    headers = {"Authorization": "Bearer " + r['access_token']}
    r_revoke = requests.post('https://api.authentication.husqvarnagroup.dev/v1/oauth2/revoke',
                             data={"token": r['access_token']}, headers=headers, timeout=60)
These 3 requests get a token, fetch the mower data, and revoke the token again.
I am doing that in order to not deal with the timeout of the access_token... that is something I will implement later. In order to not create multiple logins I simply revoke the token after grabbing the data, and then, after 10 minutes, the loop is restarted.
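For later, a rough sketch of what reusing the token instead of revoking it could look like; this assumes the token response carries the standard OAuth2 `expires_in` field (in seconds), which I have not verified for the Husqvarna endpoint:

```python
import time
import requests

_token = {"value": None, "valid_until": 0.0}

def get_token(clientId, clientSecret):
    """Return a cached access token, fetching a new one shortly before it expires."""
    if _token["value"] is None or time.time() > _token["valid_until"] - 60:
        r = requests.post('https://api.authentication.husqvarnagroup.dev/v1/oauth2/token',
                          data={"grant_type": 'client_credentials',
                                "client_id": clientId, "client_secret": clientSecret},
                          timeout=60).json()
        _token["value"] = r['access_token']
        _token["valid_until"] = time.time() + r.get('expires_in', 3600)  # 'expires_in' assumed
    return _token["value"]
```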
`mowersResponse` then holds the data from the mower, including the list of 50 positions at `mowersResponse['data'][0]['attributes']['positions']`.
I then process all positions with:

    for positionlistitem in range(len(mowersResponse['data'][0]['attributes']['positions'])):
        latitude = mowersResponse['data'][0]['attributes']['positions'][positionlistitem]['latitude']
        longitude = mowersResponse['data'][0]['attributes']['positions'][positionlistitem]['longitude']
If there are new positions I store them with a timestamp. As timestamp I use `mowersResponse['data'][0]['attributes']['metadata']['statusTimestamp']`. The first position gets that timestamp, and as the 30 sec difference per position looks pretty stable, I subtract 30000 ms from the timestamp for each further unknown position.
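Roughly like this, as a simplified sketch of that step (`last_stored_ts` is a placeholder for the newest timestamp already saved in the DB, and the 30 s spacing is only assumed, not guaranteed by the API):

```python
attributes = mowersResponse['data'][0]['attributes']
status_ts = attributes['metadata']['statusTimestamp']  # ms, belongs to the newest entry

last_stored_ts = 0  # placeholder: newest timestamp already saved in the DB

new_rows = []
for i, pos in enumerate(attributes['positions']):
    ts = status_ts - i * 30000           # assume 30 s between consecutive positions
    if ts <= last_stored_ts:
        break                            # everything older is already in the DB
    new_rows.append((ts, pos['latitude'], pos['longitude']))

# new_rows now holds only the unknown positions, newest first, ready for the DB insert
```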
That's it... quick and dirty but it is up and running ...
Feature or idea description
Currently one longitude and latitude is shown as an object... so the position of the robot can be logged every 10 minutes, if the adapter polls for new values every 10 minutes (at least that is what I think it does?).
With one call the API provides an array of 50 longitude & latitude float values, which always seem to be 30 sec apart from each other. So each call, if made every 10 minutes, fetches the 50 positions from the last 25 minutes, which could be transformed into a constant GPS data flow with a position every 30 sec.
Any chance to get that implemented? I would love to view the path of the robot in that resolution...
Version of "husqvarna-automower"-adapter
0.5.0-beta.8
Additional information & file uploads
No response