mdeweerd / MetersToHA

Get Meter Data into Home Automation systems
GNU General Public License v3.0

install error #17

Open Amapem opened 7 months ago

Amapem commented 7 months ago

I get this error when I try to install the add-on on my HAOS:

Failed to install the add-on: The command '/bin/bash -o pipefail -c apk add --no-cache git py-urllib3 py3-colorama xvfb py3-pip xorg-server-xephyr chromium-chromedriver chromium py3-aiohttp py3-openssl py3-pysocks py3-wsproto py3-requests py3-sniffio py3-async_generator py3-sortedcontainers py3-attrs py3-outcome py3-trio py3-paho-mqtt && pip install --no-cache-dir 'urllib3>=1.24.2' 'colorama>=0.3.7' 'selenium>=3.14.1' 'PyVirtualDisplay>=0.2.4' 'requests>=2.23.0'' returned a non-zero code: 20

mdeweerd commented 7 months ago

Is the installation method the add-on?

Amapem commented 7 months ago

Is the installation method the add-on?

Yes

mdeweerd commented 7 months ago

OK. I looked into error 20 a bit and, without finding a definitive answer, it looks like the installation was interrupted.

In a terminal on HAOS (the Terminal + SSH add-on, for example) you can run ha supervisor logs - there may be more information there. You can also find traces under '.../config/logs', accessible from Settings > System > Logs:

(screenshot)

At the moment the image is built on the target machine - it may be taking too long / running out of memory / ... . You could also try "Rebuild".
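
For reference, a minimal sketch of that check from the Terminal + SSH add-on (only the command mentioned above; redirect or page the output as you prefer):

# Look in the Supervisor log for details behind the "returned a non-zero code: 20" failure
ha supervisor logs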

Amapem commented 7 months ago

(screenshot) I'm on a Proxmox VM; this is a screenshot because I can't copy and paste from the HAOS VM console.

Amapem commented 7 months ago

I tried with Docker:

root@Docker-host /home/MetersToHA# docker-compose run --rm meters-to-ha-debug-grdf
[+] Building 0.7s (7/7) FINISHED                                                                                                                                       docker:default
 => [meters-to-ha-debug-grdf internal] load build definition from DockerfileAlpine                                                                                               0.0s
 => => transferring dockerfile: 38B                                                                                                                                              0.0s
 => [meters-to-ha-debug-grdf internal] load .dockerignore                                                                                                                        0.0s
 => => transferring context: 2B                                                                                                                                                  0.0s
 => [meters-to-ha-debug-grdf internal] load metadata for docker.io/library/alpine:3.17                                                                                           0.7s
 => [meters-to-ha-debug-grdf 1/3] FROM docker.io/library/alpine:3.17@sha256:6e94b5cda2d6fd57d85abf81e81dabaea97a5885f919da676cc19d3551da4061                                     0.0s
 => CACHED [meters-to-ha-debug-grdf 2/3] RUN apk add --no-cache         py-urllib3         py3-colorama         xvfb         py3-pip         xorg-server-xephyr         chromiu  0.0s
 => CACHED [meters-to-ha-debug-grdf 3/3] RUN addgroup -g 1000 docker &&  adduser -D -u 1000 -G docker docker && apk add alpine-conf && setup-keymap fr fr                        0.0s
 => [meters-to-ha-debug-grdf] exporting to image                                                                                                                                 0.0s
 => => exporting layers                                                                                                                                                          0.0s
 => => writing image sha256:91b6727c7c9dbaa1d54359c205209b638f93a3ca9db105f22863e4ca23eef174                                                                                     0.0s
 => => naming to docker.io/library/meterstoha-meters-to-ha-debug-grdf                                                                                                            0.0s
DEBUG MODE ACTIVATED                                                       [WW] only use '--debug' for troubleshooting
Using /usr/bin/python3 Version 3.10.13                                     --Loading configuration file : ./config.json                                 --[OK] 
Start loading configuration                                                [--]
    "veolia" = "False"                                                     [OK] 
    "veolia_login" = "None"                                                [OK] 
    "veolia_password" = "None"                                             [OK] 
    "veolia_contract" = "None"                                             [OK] 
    "grdf" = "True"                                                        [OK] 
    "grdf_login" = "ludo.lesur@gmail.com"                                  [OK] 
    "grdf_password" = "******************"                                 [OK] 
    "grdf_pce" = "21462807485090"                                          [OK] 
    "screenshot" = "False"                                                 [OK] 
    "skip_download" = "False"                                              [OK] 
    "keep_output" = "True"                                                 [OK] 
"geckodriver" not found in config file, using default value                [WW] 
    "geckodriver" = "/workdir/apps/meters_to_ha/geckodriver"               [OK] 
"firefox" not found in config file, using default value                    [WW] 
    "firefox" = "/workdir/apps/meters_to_ha/firefox"                       [OK] 
"chromium" not found in config file, using default value                   [WW] 
    "chromium" = "/usr/bin/chromium-browser"                               [OK] 
"chromedriver" not found in config file, using default value               [WW] 
    "chromedriver" = "/usr/bin/chromedriver"                               [OK] 
    "chrome_version" = "None"                                              [OK] 
    "timeout" = "30"                                                       [OK] 
    "download_folder" = "/workdir/apps/meters_to_ha/"                      [OK] 
    "logs_folder" = "/workdir/apps/meters_to_ha/"                          [OK] 
    "2captcha_token" = "***********"                                       [OK] 
    "capmonster_token" = "***********"                                     [OK] 
    "captchaai_token" = "None"                                             [OK] 
"log_level" not found in config file, using default value                  [WW] 
    "log_level" = "INFO"                                                   [OK] 
End loading configuration                                                  [OK] 
Start Loading Home Assistant configuration                                 [--]
    "ha_server" = "http://192.168.0.241:8123"                              [OK] 
    "ha_token" = "***************************************************************************************************************************************************************************************"[OK] 
    "veolia_contract" = "None"                                             [OK] 
    "timeout" = "30"                                                       [OK] 
    "insecure" = "False"                                                   [OK] 
    "state_file" = "/workdir/apps/meters_to_ha/meters2ha_state.json"       [OK] 
End loading Home Assistant configuration                                   [OK] 
Check availability of "geckodriver"+"firefox" or "chromedriver"+"chromium" [~~] 
Found chromium binary                                                      [OK] 
[--]  Check Home Assistant connectivity                                          [OK] 
Try starting Chromium.                                                     [--] Add nix root user options.
Start virtual display (Chromium).                                          [EE] Xephyr program closed. command: ['Xephyr', '-br', '-screen', '1280x1024x24', '-displayfd', '4', '-resizeable'] stderr: b'\nXephyr cannot open host display. Is DISPLAY set?\n'
Close Browser                                                              [OK] 
Close Display                                                              [OK] 
Ended with error                                                           [EE] 
Traceback (most recent call last):
  File "/workdir/./apps/meters_to_ha/meters_to_ha.py", line 3336, in doWork
    crawler.init()
  File "/workdir/./apps/meters_to_ha/meters_to_ha.py", line 518, in init
    self.init_chromium()
  File "/workdir/./apps/meters_to_ha/meters_to_ha.py", line 733, in init_chromium
    self.__display.start()
  File "/usr/lib/python3.10/site-packages/pyvirtualdisplay/display.py", line 72, in start
    self._obj.start()
  File "/usr/lib/python3.10/site-packages/pyvirtualdisplay/abstractdisplay.py", line 149, in start
    self._start1_has_displayfd()
  File "/usr/lib/python3.10/site-packages/pyvirtualdisplay/abstractdisplay.py", line 197, in _start1_has_displayfd
    self.display = int(self._wait_for_pipe_text(rfd))
  File "/usr/lib/python3.10/site-packages/pyvirtualdisplay/abstractdisplay.py", line 297, in _wait_for_pipe_text
    raise XStartError(
pyvirtualdisplay.abstractdisplay.XStartError: Xephyr program closed. command: ['Xephyr', '-br', '-screen', '1280x1024x24', '-displayfd', '4', '-resizeable'] stderr: b'\nXephyr cannot open host display. Is DISPLAY set?\n'

mdeweerd commented 7 months ago

  1. In the first trace, the supervisor complains that there is no internet connection for various operations; that may also be the case for Meters2HA.
  2. For the Docker run: the "--debug" option is only meant for use with an 'X' server - I think I will change this historical parameter because it is too misleading. Don't use it; changing the trace level (log_level) to debug is possible, though.

Amapem commented 7 months ago

I didn't see where to change the trace level for the Docker setup.

Otherwise, here is what I get when I stop the container:

root@Docker-host /home/MetersToHA# docker-compose run --rm  meters-to-ha-grdf
^CTraceback (most recent call last):
  File "/workdir/./apps/meters_to_ha/meters_to_ha.py", line 3407, in <module>
    doWork()
  File "/workdir/./apps/meters_to_ha/meters_to_ha.py", line 3347, in doWork
    gazpar_file = crawler.get_gazpar_file()
  File "/workdir/./apps/meters_to_ha/meters_to_ha.py", line 1787, in get_gazpar_file
    if self.resolve_captcha2() is not None:
  File "/workdir/./apps/meters_to_ha/meters_to_ha.py", line 1168, in resolve_captcha2
    response = requests.get(url, timeout=10)
  File "/usr/lib/python3.10/site-packages/requests/api.py", line 73, in get
    return request("get", url, params=params, **kwargs)
  File "/usr/lib/python3.10/site-packages/requests/api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/lib/python3.10/site-packages/requests/sessions.py", line 587, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/lib/python3.10/site-packages/requests/sessions.py", line 701, in send
    r = adapter.send(request, **kwargs)
  File "/usr/lib/python3.10/site-packages/requests/adapters.py", line 489, in send
    resp = conn.urlopen(
  File "/usr/lib/python3.10/site-packages/urllib3/connectionpool.py", line 715, in urlopen
    httplib_response = self._make_request(
  File "/usr/lib/python3.10/site-packages/urllib3/connectionpool.py", line 467, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from
  File "/usr/lib/python3.10/site-packages/urllib3/connectionpool.py", line 462, in _make_request
    httplib_response = conn.getresponse()
  File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
    response.begin()
  File "/usr/lib/python3.10/http/client.py", line 318, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python3.10/http/client.py", line 279, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
  File "/usr/lib/python3.10/socket.py", line 705, in readinto
    return self._sock.recv_into(b)
  File "/usr/lib/python3.10/ssl.py", line 1307, in recv_into
    return self.read(nbytes, buffer)
  File "/usr/lib/python3.10/ssl.py", line 1163, in read
    return self._sslobj.read(len, buffer)
KeyboardInterrupt

If I let it run, it eventually stops:

root@Docker-host /home/MetersToHA# docker-compose run --rm  meters-to-ha-grdf
Traceback (most recent call last):
  File "/workdir/./apps/meters_to_ha/meters_to_ha.py", line 3347, in doWork
    gazpar_file = crawler.get_gazpar_file()
  File "/workdir/./apps/meters_to_ha/meters_to_ha.py", line 1945, in get_gazpar_file
    raise ValueError("No content")
ValueError: No content

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/workdir/./apps/meters_to_ha/meters_to_ha.py", line 3357, in doWork
    gazpar_file = crawler.get_gazpar_file()
  File "/workdir/./apps/meters_to_ha/meters_to_ha.py", line 1945, in get_gazpar_file
    raise ValueError("No content")
ValueError: No content

mdeweerd commented 7 months ago

When you stopped it, the process was evidently still running - it takes some time to finish, and there is also a retry attempt if a problem occurs.

For the meters_to_ha trace level option, you need to add a 'log_level' line to the JSON configuration file ("log_level":"debug"), but it doesn't change much because the main information is already in 'service.log'.
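
As a minimal sketch of that addition, assuming an otherwise working config.json like the one shown in the output above (every key and value except "log_level" is just a placeholder here):

{
    "grdf": true,
    "grdf_login": "user@example.com",
    "grdf_password": "********",
    "grdf_pce": "XXXXXXXXXXXXXX",
    "log_level": "debug"
}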

If I let it run, it eventually stops:

Probably "finit par stopper" ("ends up stopping") rather than "finit pas stopper" - and "No content" indicates that no content was retrieved. The 'service.log' contains more information.

Note: I have changed --debug to --display.

Amapem commented 7 months ago

Thanks. The Docker container is working now (see service.log):

2024-01-11 10:05:22,642 : OK : Check Home Assistant connectivity
2024-01-11 10:05:22,642 : -- : Try starting Chromium. Add nix root user options.
2024-01-11 10:05:22,712 : OK : Start virtual display (Chromium).
2024-01-11 10:05:22,712 : ~~ :  LOG LEVEL INFO:20:20 None
2024-01-11 10:05:23,896 : OK : Start the browser
2024-01-11 10:05:23,896 : OK :
2024-01-11 10:05:27,984 : OK : Waiting for cookie popup
2024-01-11 10:05:28,089 : OK : Click on deny
2024-01-11 10:05:28,481 : OK : Connexion au site GRDF
2024-01-11 10:05:29,151 : OK : Waiting for Password
2024-01-11 10:05:29,245 : OK : Waiting for Email
2024-01-11 10:05:30,337 : OK : Type Email
2024-01-11 10:05:30,543 : OK : Type Password
2024-01-11 10:05:30,552 : ~~ : Proceed with captcha resolution. 2Captcha https://2captcha.com/in.php?key=258bxxxxxxxxxxxxxxxxxxb34213fed&method=userrecaptcha&googlekey=6LfNHKgZAxxxxxxxxxxxxxxxxxB4bsyHy1&pageurl=https://login.monesp>
2024-01-11 10:05:30,679 : ~~ :  2Captcha Service response OK|75446344689
2024-01-11 10:05:30,679 : ~~ :  Wait 20s for 2Captcha
2024-01-11 10:05:50,817 : ~~ :  2Captcha Service response OK|03AFcWeA4vxxxxxxxxxxxxxxxxxBeUwo0J5Y-Qcl-hmtB0yET0Q__yFU5Ffv71kGuerYIncuSkLS-D6MfF2dtz9cwt7R-earNJZCu1Y0fcbo7vkrAaxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxUKTgbpTPqmMdfCqvWGKBl>
2024-01-11 10:05:52,858 : OK : Automatic resolution succeeded. Wait for Button xpath://input[@value='Connexion'].
2024-01-11 10:05:52,858 : ~~ : Wait before clicking (1.7s).
2024-01-11 10:05:54,673 : OK : Click on connexion
2024-01-11 10:06:01,589 : -- : End of wait after connexion. Get Data URL https://monespace.grdf.fr/api/e-conso/pce/consommation/informatives?dateDebut=2023-12-28&dateFin=2024-01-11&pceList[]=21462807485090Get Data Content. Writing '/workd>
2024-01-11 10:06:01,592 : -- :  From sensor.grdf_21462807485090_kwh: None
2024-01-11 10:06:01,594 : -- :  From sensor.gas_consumption_kwh: None
2024-01-11 10:06:01,599 : -- :  Previous None m3 None kWh 2024-01-04 10:06:01.589622+00:00 from None
2024-01-11 10:06:01,600 : -- :  New Total 80kWh (+80)
2024-01-11 10:06:01,600 : -- :  New Total 151kWh (+71)
2024-01-11 10:06:01,600 : -- :  New Total 235kWh (+84)
2024-01-11 10:06:01,600 : -- :  New Total 335kWh (+100)
2024-01-11 10:06:01,600 : -- :  New Total 441kWh (+106)
2024-01-11 10:06:01,604 : -- : update value is 2024-01-09T06:00:00+01:00: 10382 m³ - 441 kWh - 106 kWh {'entity_id': 'sensor.gas_consumption_m3', 'state': '10382', 'attributes': {'date_time': '2024-01-09T06:00:00+01:00', 'unit_of_measure>
2024-01-11 10:06:01,610 : -- :  {'entity_id': 'sensor.grdf_21462807485090_m3', 'state': '10382', 'attributes': {'date_time': '2024-01-09T06:00:00+01:00', 'unit_of_measurement': 'm³', 'device_class': 'gas', 'state_class': 'total_increasin>
2024-01-11 10:06:01,614 : -- :  {'entity_id': 'sensor.gas_daily_kwh', 'state': '106', 'attributes': {'date_time': '2024-01-09T06:00:00+01:00', 'unit_of_measurement': 'kWh', 'device_class': 'energy', 'state_class': 'measurement', 'last_che>
2024-01-11 10:06:01,617 : -- :  {'entity_id': 'sensor.grdf_21462807485090_daily_kwh', 'state': '106', 'attributes': {'date_time': '2024-01-09T06:00:00+01:00', 'unit_of_measurement': 'kWh', 'device_class': 'energy', 'state_class': 'measure>
2024-01-11 10:06:01,622 : Get_state_file /workdir/apps/meters_to_ha/meters2ha_state.json {'grdf': {'state': 441, 'attributes': {'date_time': '2024-01-09T06:00:00+01:00', 'unit_of_measurement': 'kWh', 'device_class': 'energy', 'state_class>
2024-01-11 10:06:01,622 : -- :  {'entity_id': 'sensor.gas_consumption_kwh', 'state': '441', 'attributes': {'date_time': '2024-01-09T06:00:00+01:00', 'unit_of_measurement': 'kWh', 'device_class': 'energy', 'state_class': 'total_increasing'>
2024-01-11 10:06:01,626 : -- :  {'entity_id': 'sensor.grdf_21462807485090_kwh', 'state': '441', 'attributes': {'date_time': '2024-01-09T06:00:00+01:00', 'unit_of_measurement': 'kWh', 'device_class': 'energy', 'state_class': 'total_increas>
2024-01-11 10:06:01,626 : OK :
2024-01-11 10:06:01,626 : OK :  Finished on success, cleaning up
2024-01-11 10:06:01,706 : OK : Close Browser
2024-01-11 10:06:01,725 : OK : Close Display

But the container stops once the data has been pushed. How can I run it regularly so the data stays up to date?

mdeweerd commented 7 months ago

That's good news 😃.

As things stand, you can put the launch command in a cron job on Unix, or register it as a task in the Task Scheduler on Windows (a sketch is given below).

The advantage is that it doesn't consume any resources outside of the few times a day it runs.
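
As a minimal crontab sketch, assuming the docker-compose.yml lives in /home/MetersToHA as in the output above (the schedule, path and log file are only examples to adapt to your setup):

# Collect GRDF data twice a day, at 06:30 and 18:30
30 6,18 * * * cd /home/MetersToHA && docker-compose run --rm meters-to-ha-grdf >> /var/log/meters_to_ha.log 2>&1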