ansys / pymeilisearch

MeiliSearch Deployment Synchronizer
https://pymeilisearch.docs.ansys.com/
MIT License

PyFluent dev documentation pymeilisearch is not working #90

Closed raph-luc closed 1 year ago

raph-luc commented 1 year ago

More details: https://github.com/ansys/pyfluent/issues/2093

See the PyFluent dev doc search, which doesn't work, compared to the stable version search, which does.

The nightly dev doc build is failing in the "Scrap the document and deploy it to pymeilisearch" step, with the following traceback:

```
Serving directory /home/runner/work/pyfluent/pyfluent/HTML-Documentation-tag-v23.2.0 at http://localhost:8000
Traceback (most recent call last):
  File "/home/runner/.local/bin/pymeilisearch", line 8, in <module>
    sys.exit(main())
  File "/usr/lib/python3/dist-packages/click/core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
  File "/usr/lib/python3/dist-packages/click/core.py", line 1053, in main
    rv = self.invoke(ctx)
  File "/usr/lib/python3/dist-packages/click/core.py", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/lib/python3/dist-packages/click/core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/lib/python3/dist-packages/click/core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "/home/runner/.local/lib/python3.10/site-packages/ansys/tools/meilisearch/cli.py", line 92, in upload
    local_host_scraping(index, template, location, port, stop_urls)
  File "/home/runner/.local/lib/python3.10/site-packages/ansys/tools/meilisearch/server.py", line 120, in local_host_scraping
    scrape_website(index_uid, templates, directory, port, stop_urls)
  File "/home/runner/.local/lib/python3.10/site-packages/ansys/tools/meilisearch/server.py", line 92, in scrape_website
    scrap_web_page(index_uid, urls, templates, stop_urls)
  File "/home/runner/.local/lib/python3.10/site-packages/ansys/tools/meilisearch/create_indexes.py", line 137, in scrap_web_page
    web_scraper.scrape_url(url, index_uid, templates, stop_urls)
  File "/home/runner/.local/lib/python3.10/site-packages/ansys/tools/meilisearch/scrapper.py", line 159, in scrape_url
    temp_config_file = self._load_and_render_template(url, template, index_uid, stop_urls)
  File "/home/runner/.local/lib/python3.10/site-packages/ansys/tools/meilisearch/scrapper.py", line 60, in _load_and_render_template
    render_template(
  File "/home/runner/.local/lib/python3.10/site-packages/ansys/tools/meilisearch/templates/__init__.py", line 79, in render_template
    if "localhost" in urls[0]:
IndexError: list index out of range
Error: The operation was canceled.
```

Any help figuring out what is going wrong would be appreciated.
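For context, the `IndexError` above happens because `render_template` indexes `urls[0]` without first checking that the list is non-empty, so a scrape that collects no URLs crashes with an opaque error. A minimal sketch of a guard, using a simplified hypothetical signature (the real `render_template` takes more parameters):

```python
def is_localhost_target(urls, index_uid):
    """Hypothetical guard: fail with a clear message instead of an
    IndexError when the scraper collected no URLs for the index."""
    if not urls:
        raise ValueError(
            f"No URLs collected for index '{index_uid}'; "
            "check that the served documentation directory is not empty."
        )
    # The original code indexed urls[0] unconditionally, which raised
    # "IndexError: list index out of range" when urls was empty.
    return "localhost" in urls[0]
```

This would turn the crash into an actionable message pointing at the empty directory or a misconfigured serve step.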

Revathyvenugopal162 commented 1 year ago

Thanks for reporting the issue. I will try to debug it and update you.

raph-luc commented 1 year ago

Same error in the release documentation build as well: https://github.com/ansys/pyfluent/actions/runs/6486384314/job/17618728562

raph-luc commented 1 year ago

@Revathyvenugopal162 that issue was fixed, but a new one cropped up:

Traceback

```
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/twisted/internet/defer.py", line 857, in _runCallbacks
    current.result = callback(  # type: ignore[misc]
  File "/home/runner/.local/lib/python3.10/site-packages/scraper/src/documentation_spider.py", line 178, in parse_from_start_url
    self.add_records(response, from_sitemap=False)
  File "/home/runner/.local/lib/python3.10/site-packages/scraper/src/documentation_spider.py", line 152, in add_records
    self.meilisearch_helper.add_records(records, response.url, from_sitemap)
  File "/home/runner/.local/lib/python3.10/site-packages/scraper/src/meilisearch_helper.py", line 123, in add_records
    self.meilisearch_index.add_documents(cleaned_records)
  File "/home/runner/.local/lib/python3.10/site-packages/meilisearch/index.py", line 381, in add_documents
    add_document_task = self.http.post(url, documents)
  File "/home/runner/.local/lib/python3.10/site-packages/meilisearch/_httprequests.py", line 73, in post
    return self.send_request(requests.post, path, body, content_type)
  File "/home/runner/.local/lib/python3.10/site-packages/meilisearch/_httprequests.py", line 57, in send_request
    return self.__validate(request)
  File "/home/runner/.local/lib/python3.10/site-packages/meilisearch/_httprequests.py", line 110, in __validate
    raise MeilisearchApiError(str(err), request) from err
meilisearch.errors.MeilisearchApiError: MeilisearchApiError. Error code: internal.
Error message: MDB_PANIC: Update of meta page failed or environment had fatal error
Error documentation: https://docs.meilisearch.com/errors#internal
Error type: internal
127.0.0.1 - - [17/Oct/2023 07:49:55] "GET /api/solver/_autosummary/settings/print_thread_clusters.html HTTP/1.1" 200 -
2023-10-17 07:49:56 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "/home/runner/.local/lib/python3.10/site-packages/meilisearch/_httprequests.py", line 107, in __validate
    request.raise_for_status()
  File "/home/runner/.local/lib/python3.10/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: https://backend.search.pyansys.com/indexes/pyfluent-vdev/documents

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/twisted/internet/defer.py", line 857, in _runCallbacks
    current.result = callback(  # type: ignore[misc]
  File "/home/runner/.local/lib/python3.10/site-packages/scraper/src/documentation_spider.py", line 178, in parse_from_start_url
    self.add_records(response, from_sitemap=False)
  File "/home/runner/.local/lib/python3.10/site-packages/scraper/src/documentation_spider.py", line 152, in add_records
    self.meilisearch_helper.add_records(records, response.url, from_sitemap)
  File "/home/runner/.local/lib/python3.10/site-packages/scraper/src/meilisearch_helper.py", line 123, in add_records
    self.meilisearch_index.add_documents(cleaned_records)
  File "/home/runner/.local/lib/python3.10/site-packages/meilisearch/index.py", line 381, in add_documents
    add_document_task = self.http.post(url, documents)
  File "/home/runner/.local/
```
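This second failure is different in kind: the scraper's `add_documents` call hit a transient 500 (`MDB_PANIC`, an internal error) from the Meilisearch backend, and a single failed request aborted the whole scrape. One way a client could soften this, sketched here in self-contained form with a stand-in exception class (the real code would catch `meilisearch.errors.MeilisearchApiError` instead), is to retry the upload a few times before giving up:

```python
class TransientIndexError(Exception):
    """Stand-in for meilisearch.errors.MeilisearchApiError (hypothetical)."""


def add_documents_with_retry(add_documents, documents, retries=3):
    """Call add_documents(documents), retrying on transient errors such as
    the internal MDB_PANIC error shown in the traceback above.

    Re-raises the last error if all attempts fail.
    """
    last_err = None
    for _ in range(retries):
        try:
            return add_documents(documents)
        except TransientIndexError as err:
            last_err = err
    raise last_err
```

Retrying only papers over the symptom, of course; an MDB_PANIC from the backend's LMDB storage usually points at a server-side problem (e.g. a full or unhealthy data volume) that needs investigating separately.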

See Nightly Development Documentation Build

More details: https://github.com/ansys/pyfluent/issues/2093

raph-luc commented 1 year ago

We have reverted to the previous search feature for now, more details in the issue tracker listed above