glottisfaun0000 closed this issue 6 months ago
although admin server logs don't show anything relevant
Could you check the contents of `logs/frontend-default.log`?
Ahh, I had just been checking `docker container logs sist2-admin`; this looks more informative.
```
{"stderr": "T0 [2024-04-03 00:18:27] [INFO main.c] Loaded index: [test2]\n"}
{"stderr": "T0 [2024-04-03 00:18:27] [INFO serve.c] Starting web server @ http://0.0.0.0:4090\n"}
{"stderr": "T0 [2024-04-03 00:18:32] [WARNING serve.c] ElasticSearch error during query (404)\n"}
{"stderr": "T0 [2024-04-03 00:18:32] [WARNING serve.c] {\n"}
{"stderr": "\t\"error\":\t{\n"}
{"stderr": "\t\t\"root_cause\":\t[{\n"}
{"stderr": "\t\t\t\t\"type\":\t\"index_not_found_exception\",\n"}
{"stderr": "\t\t\t\t\"reason\":\t\"no such index [sist2]\",\n"}
{"stderr": "\t\t\t\t\"resource.type\":\t\"index_or_alias\",\n"}
{"stderr": "\t\t\t\t\"resource.id\":\t\"sist2\",\n"}
{"stderr": "\t\t\t\t\"index_uuid\":\t\"_na_\",\n"}
{"stderr": "\t\t\t\t\"index\":\t\"sist2\"\n"}
{"stderr": "\t\t\t}],\n"}
{"stderr": "\t\t\"type\":\t\"index_not_found_exception\",\n"}
{"stderr": "\t\t\"reason\":\t\"no such index [sist2]\",\n"}
{"stderr": "\t\t\"resource.type\":\t\"index_or_alias\",\n"}
{"stderr": "\t\t\"resource.id\":\t\"sist2\",\n"}
{"stderr": "\t\t\"index_uuid\":\t\"_na_\",\n"}
{"stderr": "\t\t\"index\":\t\"sist2\"\n"}
{"stderr": "\t},\n"}
{"stderr": "\t\"status\":\t404\n"}
{"stderr": "} \n"}
```
(The same `index_not_found_exception` warning repeats three more times over the following second.)
Maybe it's not able to find the .sist2 file specified? Docker persistent storage issue?
I think what is happening is that `docker-compose down` will remove the Elasticsearch data. sist2 can't start because the Elasticsearch index is gone.
To add persistence, you can add this to your compose file:

```yaml
elasticsearch:
  volumes:
    - /path/to/elasticsearch/data/:/usr/share/elasticsearch/data
```
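For context, a minimal sketch of what the full service definition might look like with that bind mount in place. The image tag, `discovery.type` setting, and host path below are illustrative assumptions, not taken from the reporter's actual compose file:

```yaml
# Illustrative fragment only -- adjust image version and host path to your setup.
services:
  elasticsearch:
    image: elasticsearch:7.17.9
    environment:
      - discovery.type=single-node
    volumes:
      # Bind-mount the data directory so indices survive `docker-compose down`
      - /path/to/elasticsearch/data/:/usr/share/elasticsearch/data
```

A named volume (e.g. `esdata:/usr/share/elasticsearch/data`) would also work and sidesteps host-side permission issues, at the cost of the data living inside Docker's storage area.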
Awesome, I got it working with that. Initially the elasticsearch service wasn't able to write to the volume/directory, until I added a PGID and PUID to the environment and chown'd the directory to match. Thanks!
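For anyone hitting the same write-permission error: the official Elasticsearch image runs as UID/GID 1000 by default, so giving that user ownership of the host directory is usually enough. A rough sketch (the data path is a placeholder; verify the container's actual UID with `id` inside the container if unsure):

```shell
# Sketch only: DATA_DIR is a placeholder path, and 1000:1000 is the default
# user of the official elasticsearch image.
DATA_DIR="./esdata"
mkdir -p "$DATA_DIR"
# chown normally needs root; print a hint instead of failing hard
chown -R 1000:1000 "$DATA_DIR" 2>/dev/null \
  || echo "run: sudo chown -R 1000:1000 $DATA_DIR"
```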
Device Information (please complete the following information):
- OS: Debian 12
- Platform: Docker
- sist2 version: 3.3.4
- Elasticsearch version: 7.17.9
- Config: docker-compose.yml

Describe the bug

Steps To Reproduce (Please be specific!)

Expected behavior

Actual Behavior
After `docker compose down && docker compose up -d`, the new job becomes broken in the same way.

Additional context
Not sure if this is a permissions issue or what, but my installation is a pretty bog-standard Docker Compose run, so I'm surprised if nobody else runs into this. When in the state where the frontend can't see the last index, Admin > backends > elasticsearch still tests successfully. If this is expected behavior (you can't serve a frontend to browse an index created before the current run of the application), that seems highly limiting; I was hoping to use sist2 to index cold storage drives.
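When the frontend starts throwing these 404s, one quick way to confirm whether the index actually survived the restart is to list the indices Elasticsearch knows about. This assumes Elasticsearch is reachable on the default port 9200; adjust `ES_URL` to match your compose setup:

```shell
# Assumption: Elasticsearch is exposed at localhost:9200.
ES_URL="${ES_URL:-http://localhost:9200}"
curl -s "$ES_URL/_cat/indices?v" \
  || echo "Elasticsearch not reachable at $ES_URL"
```

If the `sist2` index is missing from the output after a restart, the data directory was not persisted.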
, the new job becomes broken in the same way.Additional context Not sure if this is a permissions issue or what, but my installation is a pretty bog standard Docker compose run so I'm surprised if nobody else runs into this. When in the state where the frontend can't see the last index, Admin > backends > elasticsearch still tests succesfully. If this is expected behavior (you can't serve a frontend to browse an index created before the current run of the application) that seems highly limiting, I was hoping to use sist2 to index cold storage drives.