weweswap / app-backend

Utils and APR backend
https://app-backend-production-676d.up.railway.app/docs
MIT License

Vault Aggregation and APR Calculation Service

Swagger documentation can be found at the /docs endpoint.

Current link: https://app-backend-production-676d.up.railway.app/docs

Overview

This service is designed to handle two primary tasks:

  1. Data Aggregation:

    • Listens for "RewardsConvertedToUsdc" events emitted by the fee manager contract and stores them in the database.
    • Aggregates vault data daily and stores historical vault information (e.g. TVL), based on the on-chain state of the vault contracts.
  2. API Endpoint:

    • Provides an API endpoint with vault information such as address, APR, accumulated fees per day, and accumulated incentives per day.
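As a rough sketch, a vault entry returned by the endpoint might look like the following TypeScript shape. The field names here are illustrative assumptions; the authoritative schema is the Swagger spec at /docs.

```typescript
// Illustrative shape of one vault entry; field names are assumptions,
// not taken from the actual service. See /docs for the real schema.
interface DailyAmount {
  date: string;      // ISO day, e.g. "2024-05-01"
  amountUsd: number; // value in USD (rewards are converted to USDC)
}

interface VaultInfo {
  address: string;                 // vault contract address
  apr: number;                     // annualized percentage rate, e.g. 12.5
  feesPerDay: DailyAmount[];       // accumulated fees per day
  incentivesPerDay: DailyAmount[]; // accumulated incentives per day
}

// Example value conforming to the sketch above.
const example: VaultInfo = {
  address: "0x0000000000000000000000000000000000000000",
  apr: 12.5,
  feesPerDay: [{ date: "2024-05-01", amountUsd: 104.2 }],
  incentivesPerDay: [{ date: "2024-05-01", amountUsd: 50.0 }],
};
```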

Features

1. Rewards Collection and Aggregation

2. Vault Information API

Project Structure

src/
├── aggregators/
│   ├── vault-aggregator/         # Handles daily vault data aggregation
│   └── events-aggregator/        # Handles listening to "RewardsConvertedToUsdc" events
├── api/
│   └── vaults/                   # API module for vault-related data, including APR calculations
├── blockchain-connectors/        # Manages blockchain connectivity
├── contract-connectors/          # Services for interacting with contracts (e.g. Arrakis, ERC-20, and Fee Manager)
├── database/                     # MongoDB schemas and database interaction services
├── price-oracles/                # Price-oracle service (CoinGecko API)
├── config/                       # Configuration settings for the service
├── shared/                       # Shared models, types, classes, enums, and utility functions
└── utils/                        # Various utility functions

Key Files

Setup and Installation

  1. Clone the Repository:

    git clone <repository-url>
    cd <project-directory>
  2. Install Dependencies:

    npm install
  3. Environment Variables:

    • Configure the necessary environment variables in a .env file. You’ll need:
      • Database connection URI (MongoDB)
      • Blockchain provider URLs (for EVM connectors)
      • Any other relevant config variables such as vault addresses.
  4. Run the Application:

    npm run start
  5. Running Tests:

    • Tests are included in the project, and you can run them with:
      npm run test
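A minimal `.env` sketch is shown below. Apart from `MONGODB_URI` (used again by the import script later), the variable names are illustrative assumptions; check `src/config/` for the names the service actually reads.

```
# MongoDB connection URI (also used by the whitelist import script)
MONGODB_URI=mongodb://localhost:27017/yourdbname

# EVM RPC endpoint for the blockchain connectors (illustrative name)
RPC_URL=https://your-rpc-provider.example

# Vault addresses to aggregate (illustrative name)
VAULT_ADDRESSES=0x...,0x...
```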

Usage

Vault Aggregation

The vault aggregation logic starts automatically when the service is launched; the vault-aggregator and events-aggregator modules run their jobs without manual intervention.

API

The vault information is exposed via the API; see the Swagger documentation at /docs for the available endpoints.

Importing Whitelist Data

To manage multiple whitelists for different projects, follow the steps below to import your JSON files into MongoDB. This process needs to be done only once for each whitelist file, as the data will be stored and accessible in the database thereafter.

1. Prepare Your JSON Files

```
src/
├── static/
│   ├── fomo.json
│   ├── projectA.json
│   ├── projectB.json
│   └── ... other JSON files
└── ... other directories
```


Each JSON file should contain an array of whitelist entries in the following format:

```jsonc
[
  {
    "value": [
      "0xAddress1",
      "384992472620497583"
    ],
    "proof": [
      "0xfc13c899b6516cf2dac5e27ecb0752e46e0ee419ad13d8b6c556d94ee8752ae2",
      "0x3b86523d566ffbd123f49de172f6b82cb9df34900acd7a2f8f4d2a913d24c0f9"
      // ... more proof entries
    ]
  }
  // ... more entries
]
```

2. Note on the /src/static Directory

    •   The /src/static directory is included in .gitignore due to the large size of JSON files. This means these files will not be tracked by Git and must be managed manually or through another method (e.g., deployment scripts).

3. Running the Import Script

The import process is designed to read all JSON files within the /src/static directory and populate the MongoDB database accordingly.

a. Ensure Your MongoDB Connection

    •   Verify that your MongoDB connection URI is correctly set in your .env file. Example:

MONGODB_URI=mongodb://localhost:27017/yourdbname

b. Execute the Import Command

    •   Run the following command to start the import process:

npm run import

Note: Ensure that the import script is defined in your package.json. If not, you can add it as follows:

```jsonc
// package.json
{
  "scripts": {
    // ... other scripts
    "import": "ts-node src/import.ts" // Adjust the path and command as needed
  }
}
```

    •   Example Output:

```
[ImportService] Reading data directory: /path/to/src/static
[ImportService] Reading data from /path/to/src/static/fomo.json...
[ImportService] Starting import of 1000 records from fomo.json...
[ImportService] Import from fomo.json completed. Inserted: 1000, Modified: 0
[ImportService] Reading data from /path/to/src/static/projectA.json...
[ImportService] Starting import of 1500 records from projectA.json...
[ImportService] Import from projectA.json completed. Inserted: 1500, Modified: 0
[ImportService] All data imports completed.
```

c. Verify the Import

    •   After running the import script, verify that the data has been successfully inserted into MongoDB.
    •   Using MongoDB Compass or the Mongo Shell:

```js
// Example using the Mongo shell
use yourdbname

db.whitelists.find({ project: "fomo" }).limit(5).pretty()
```

Expected Document Structure:

```js
{
  "_id": ObjectId("..."),
  "address": "0x7bcd8185b7f4171017397993345726e15457b1ee",
  "proof": [
    "0xfc13c899b6516cf2dac5e27ecb0752e46e0ee419ad13d8b6c556d94ee8752ae2",
    "0x3b86523d566ffbd123f49de172f6b82cb9df34900acd7a2f8f4d2a913d24c0f9",
    // ... more proof entries
  ],
  "project": "fomo"
}
```

4. Managing Future Whitelist Files

For future projects, simply add the new JSON files to the /src/static directory and run the import script again. Since each project is identified by the filename (e.g., projectB.json), the import script will handle them appropriately.

Example:

    1.  Add projectC.json to /src/static.
    2.  Run the import script:

npm run import

    3.  Verify the import in MongoDB.

5. Import Script Overview

Here’s a brief overview of how the import script works:

    •   File Location: The import script reads all .json files located in /src/static.
    •   Project Identification: Each JSON file's name (e.g., fomo.json) is used as the project identifier in the database.
    •   Data Mapping: For each entry in the JSON file:
        •   value[0] is mapped to the address field.
        •   The proof array is mapped to the proof field.
        •   The filename (without extension) is mapped to the project field.
    •   Database Operation: Utilizes bulk write operations to efficiently insert or update records in MongoDB.
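The mapping above can be sketched as follows. This is a simplified illustration, not the actual import script: the function names are invented, the entry shape follows the JSON format shown earlier, and upserting keyed on (address, project) is an assumption about how re-imports avoid duplicates.

```typescript
import { basename } from "path";

// Shape of one entry in a whitelist JSON file (see the format above).
interface WhitelistEntry {
  value: [string, string]; // [address, amount]
  proof: string[];
}

// Document shape stored in the whitelists collection.
interface WhitelistDoc {
  address: string;
  proof: string[];
  project: string;
}

// Map a parsed JSON file to database documents. The filename
// (without extension) becomes the project identifier.
function toDocuments(filePath: string, entries: WhitelistEntry[]): WhitelistDoc[] {
  const project = basename(filePath).replace(/\.json$/, "");
  return entries.map((e) => ({
    // Normalize to lowercase, matching the stored example document.
    address: e.value[0].toLowerCase(),
    proof: e.proof,
    project,
  }));
}

// Build MongoDB bulkWrite operations: upsert keyed on (address, project)
// so re-running the import updates rather than duplicates records.
function toBulkOps(docs: WhitelistDoc[]) {
  return docs.map((doc) => ({
    updateOne: {
      filter: { address: doc.address, project: doc.project },
      update: { $set: doc },
      upsert: true,
    },
  }));
}
```

The resulting array can be passed to the MongoDB driver's `collection.bulkWrite(...)` in a single round trip.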

## Further improvements
- Split up the data aggregator & API if API request volume grows
- Share common contract calls between the data aggregator & API
  - /contract-connectors and /api/lp/lp-data-provider have a lot of overlap