power to the autonomous, generative agency peoples
First, create a repository on GitHub with the same name as this project, and then run the following commands:
git init -b main
git add .
git commit -m "init commit"
git remote add origin git@github.com:elonmusk/overlore.git
git push -u origin main
Finally, install the environment and the pre-commit hooks with:
make install
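If you prefer not to use Make, the install target conventionally wraps the following two commands (this is an assumption based on the standard cookiecutter-poetry layout; check the install target in this repository's Makefile to confirm):

poetry install
poetry run pre-commit install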
You are now ready to start development on your project! The CI/CD pipeline will be triggered when you open a pull request, merge to main, or create a new release.
To finalize the set-up for publishing to PyPI or Artifactory, see here. For activating the automatic documentation with MkDocs, see here. To enable the code coverage reports, see here.
From the root of the project, simply run poetry run lore-machine
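For example, a local test run that combines several of the flags listed below might look like the following (the address, port, and file paths are illustrative placeholders, not project defaults):

poetry run lore-machine --mock --address localhost --port 8766 --world_db ./world.db --logging_file ./lore-machine.log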
Flags:
-h, --help: show this help message and exit
--mock: Use mock data for GPT response instead of querying the API. (saves API calls)
--prompt: Run lore-machine in a prompt testing loop.
--prod: Run lore-machine in production mode.
-a ADDRESS, --address ADDRESS: Host address for ws connection
-p PORT, --port PORT: Host port for ws connection
-w WORLD_DB, --world_db WORLD_DB: location of the world db
-l LOGGING_FILE, --logging_file LOGGING_FILE: location of the logging file

The Overlore System runs as a plugin to Eternums. The current source of truth of our system design can be found in the docker compose file in our Eternums fork.
We are currently exploring the use of litefs to sync the world db maintained by the torii process to our lore machine process. In practice, this means that a world_db param can be passed to the lore machine on startup; keeping that database in sync with torii is the responsibility of litefs.
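To illustrate how this would fit together, assuming litefs exposes the replicated database as a regular file on a local mount (the mount path below is hypothetical), the lore machine would simply be started against the replica:

poetry run lore-machine --world_db /litefs/world.db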
Repository initiated with fpgmaas/cookiecutter-poetry.