There are 3 different data streams that need to be transferred between the backend and the Dune queries:
In the first version, all the data is stored in simple JSON files; later, we will consider building real databases. The data flows are driven by 2 different cron jobs. The first job updates and executes the queries for the appData and the main Dune query for the daily download every 30 minutes. Fifteen minutes later, a second job starts downloading the query results. The backend API continuously looks for new downloads from Dune in a maintenance loop, reads the new data, serves it via an API, and creates new appData-referral mappings.
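As a rough illustration of that maintenance loop, the sketch below polls a data folder for fresh JSON downloads and reloads them. The folder layout, file pattern, and polling interval are assumptions for illustration, not taken from the backend code:

import json
import time
from pathlib import Path

# Hypothetical location where the dune_api_scripts drop their downloads
# (configured via REFERRAL_DATA_FOLDER in the real setup).
DATA_FOLDER = Path("./data")

def maintenance_loop(poll_seconds=60):
    """Poll the data folder for new Dune downloads and reload them."""
    seen = {}
    while True:
        for path in DATA_FOLDER.glob("*.json"):
            mtime = path.stat().st_mtime
            if seen.get(path) != mtime:
                seen[path] = mtime
                with path.open() as f:
                    payload = json.load(f)
                # In the real backend this step would refresh the data served
                # by the API and rebuild the appData-referral mappings.
                print(f"reloaded {path.name} with {len(payload)} records")
        time.sleep(poll_seconds)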
Preparations:
cd dune_api_scripts
python3 -m venv env
source ./env/bin/activate
pip install -r requirements.txt
Setting the environment variables:
cp .env.example .env
and adjust the values.
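To verify the configuration is picked up, a minimal check like the sketch below can be run. The variable names are the ones that appear in the docker commands further down; the full set is whatever .env.example defines:

import os

# Sanity check that the expected variables from .env are set before
# running the fetch scripts.
for var in ("DUNE_USER", "DUNE_PASSWORD", "REFERRAL_DATA_FOLDER"):
    if os.environ.get(var) is None:
        raise SystemExit(f"missing environment variable: {var}")
    print(f"{var} is set")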
Pulling new query results:
python -m dune_api_scripts.store_query_result_all_distinct_app_data
python -m dune_api_scripts.store_query_result_for_todays_trading_data
python -m dune_api_scripts.store_query_result_for_entire_history_trading_data
The last command can take a while, as it downloads the entire trading history.
Alternatively, the scripts can also be run via docker:
docker build -t fetch_script -f ./docker/Dockerfile.binary .
docker run -e DUNE_PASSWORD=<pwd> -e DUNE_USER=alex@gnosis.pm -e REFERRAL_DATA_FOLDER=/usr/src/app/data/ -v ./data/:/usr/src/app/data -ti fetch_script /bin/sh
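To sanity-check the downloaded results, a short script like the sketch below lists what the fetch scripts wrote. The *.json glob and the ./data default are assumptions mirroring the docker command above, not guaranteed file names:

import json
import os
from pathlib import Path

# Folder the fetch scripts write into (REFERRAL_DATA_FOLDER in .env).
data_folder = Path(os.environ.get("REFERRAL_DATA_FOLDER", "./data"))

for path in sorted(data_folder.glob("*.json")):
    with path.open() as f:
        payload = json.load(f)
    size = len(payload) if isinstance(payload, (list, dict)) else "?"
    print(f"{path.name}: {size} top-level entries")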
Running the API with the data from user_data.json:
cargo run
and then check the local endpoint like this:
http://127.0.0.1:8080/api/v1/profile/0xa4a6ef5c494091f6aeca7fa28a04a219dd0f31b5
or
http://127.0.0.1:8080/api/v1/profile/0xe7207afc5cd57625b88e2ddbc4fe9de794a76b0f
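The endpoint can also be exercised from Python, as sketched below using the requests library (not necessarily part of this repository's requirements). The exact shape of the returned JSON depends on the backend:

import requests

# Any trader address works here; this is one of the example addresses above.
address = "0xa4a6ef5c494091f6aeca7fa28a04a219dd0f31b5"
url = f"http://127.0.0.1:8080/api/v1/profile/{address}"

response = requests.get(url, timeout=10)
response.raise_for_status()
# Printing the raw JSON is enough to confirm the endpoint is serving data.
print(response.json())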
Alternatively, the code can also be run via docker:
docker build -t gpdata -f docker/Dockerfile.binary .
docker run -ti -e DUNE_DATA_FOLDER='/usr/src/app/data' gpdata gpdata