ethereum-optimism / op-analytics

Onchain Data, Utilities, References, and other Analytics on Optimism

Migrate Defillama TVL: Chain-Level & Breakdown API #939

Open MSilb7 opened 3 weeks ago

MSilb7 commented 3 weeks ago

Current State Defillama API pulls live in two places:

Note that we also scrape Defillama's normalizeChain file to get chain metadata (e.g., chainId, tags) for segmenting and joining with other data sources: https://github.com/DefiLlama/defillama-server/blob/master/defi/src/utils/normalizeChain.ts

All of these scripts carry a lot of legacy code and tech debt accumulated as needs have evolved over time.

We currently use the Defillama TVL scripts to:

1. Get the total TVL of all chains (chain-level API). We want to use Defillama's double-counting filters, so we use the chain-level API rather than summing the breakdown API (which would double count).
2. Break down TVL by app, category, token, token type, etc.

**Future State**

We should migrate these scripts to the new subcommands structure and simplify the data pulls so that we do no calculations at the ingestion layer: just pull data and store it.

**Steps**

Chain-Level

  1. Hit the https://api.llama.fi/v2/chains endpoint to get a list of chains
  2. Iterate through each chain via https://api.llama.fi/v2/historicalChainTvl/{chain} to get historical TVL
  3. Store results
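The chain-level steps above can be sketched roughly like this; `fetch_json`, `to_rows`, and the row shape are illustrative assumptions, not existing repo utilities:

```python
# Sketch of the chain-level pull against the public Defillama endpoints
# named in the steps above; function names and row shapes are illustrative.
import datetime
import json
import urllib.request


def fetch_json(url: str):
    """Fetch a URL and decode its JSON body."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


def to_rows(chain: str, series: list[dict]) -> list[dict]:
    """Flatten one chain's [{'date': epoch, 'tvl': float}, ...] series
    into storable rows with ISO dates (UTC)."""
    return [
        {
            "chain": chain,
            "dt": datetime.datetime.fromtimestamp(
                pt["date"], tz=datetime.timezone.utc
            ).date().isoformat(),
            "tvl": float(pt["tvl"]),
        }
        for pt in series
    ]


def pull_chain_level() -> list[dict]:
    """Step 1: list chains; step 2: pull each chain's historical TVL.
    Step 3 (storage) is left to the caller."""
    chains = fetch_json("https://api.llama.fi/v2/chains")
    rows: list[dict] = []
    for chain in chains:
        name = chain["name"]
        series = fetch_json(f"https://api.llama.fi/v2/historicalChainTvl/{name}")
        rows.extend(to_rows(name, series))
    return rows
```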

Full breakdown

  1. Hit the https://api.llama.fi/protocols endpoint to get a list of protocols & their metadata (we want name, slug, category, parent protocol at least)
  2. Iterate over each protocol (by slug) using https://api.llama.fi/protocol/{protocol}
  3. Store results

Note: This creates huge datasets, so we may need to push to BQ per protocol. We've also observed that some larger protocols (e.g. uniswap-v3) do not return data, possibly because of GitHub Actions memory limits, so TBD whether we'll need other utilities/solutions.

Note: take only the 00:00:00 timestamps to reduce dupes.
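A rough sketch of the breakdown pull, including the midnight-timestamp filter; the per-protocol storage hook and metadata field names are assumptions to verify against real responses:

```python
# Sketch of the protocol-breakdown pull. Field names like "parentProtocol"
# and the per-protocol storage hook are assumptions, not verified contracts.
import json
import urllib.request

SECONDS_PER_DAY = 86_400


def keep_midnight(series: list[dict]) -> list[dict]:
    """Keep only 00:00:00 UTC points to reduce duplicate same-day rows."""
    return [pt for pt in series if pt["date"] % SECONDS_PER_DAY == 0]


def fetch_json(url: str):
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


def pull_breakdown() -> None:
    """Iterate protocols by slug and store each one individually, so one
    huge protocol response never has to sit in memory with all the others."""
    protocols = fetch_json("https://api.llama.fi/protocols")
    for p in protocols:
        meta = {k: p.get(k) for k in ("name", "slug", "category", "parentProtocol")}
        detail = fetch_json(f"https://api.llama.fi/protocol/{meta['slug']}")
        # store_to_bq(meta, detail)  # hypothetical per-protocol sink (e.g. BigQuery)
```

Pushing per protocol is one way to sidestep the memory limits mentioned above, since nothing accumulates across iterations.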

Metadata

  1. Scrape it
  2. Store results
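A sketch of the scrape step, assuming normalizeChain.ts declares chains as `"Name": { ..., chainId: N, ... }` object literals; the regex would need adjusting if the file's actual shape differs:

```python
# Sketch of scraping chain metadata out of Defillama's normalizeChain.ts.
# The regex assumes entries shaped like `"Optimism": { ..., chainId: 10 }`;
# verify against the live file before relying on it.
import re
import urllib.request

RAW_URL = (
    "https://raw.githubusercontent.com/DefiLlama/defillama-server/"
    "master/defi/src/utils/normalizeChain.ts"
)


def parse_chain_ids(source: str) -> dict[str, int]:
    """Extract {chain name: chainId} pairs from the TypeScript source."""
    pattern = re.compile(r'"(?P<name>[^"]+)":\s*\{[^{}]*?chainId:\s*(?P<id>\d+)')
    return {m["name"]: int(m["id"]) for m in pattern.finditer(source)}


def scrape_metadata() -> dict[str, int]:
    """Fetch the raw TypeScript file and parse it; storage is the caller's job."""
    with urllib.request.urlopen(RAW_URL) as resp:
        return parse_chain_ids(resp.read().decode("utf-8"))
```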

**Next steps**

Once we have the raw data, we can build charts like "net flows" versus "token price movement" to surface higher-signal TVL movements driven by actual transfers.
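One way to separate net flows from price movement, as an illustrative sketch (not an established methodology in this repo): attribute `prev_tvl * price_return` of the day's TVL change to price, and call the remainder net flow.

```python
def net_flow(prev_tvl: float, curr_tvl: float, price_return: float) -> float:
    """Split a TVL change into a price-driven part and a residual 'net flow'.

    price_return is the token's fractional price change over the period,
    e.g. 0.05 for +5%. The price-driven part is prev_tvl * price_return;
    whatever change remains is attributed to actual transfers.
    """
    price_effect = prev_tvl * price_return
    return (curr_tvl - prev_tvl) - price_effect
```

For example, if TVL goes from 100 to 110 while the token gains 5%, roughly 5 of the 10-unit change is price and 5 is flow.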

We can also build dashboards to evaluate raw TVL and/or flows at the category level, using Defillama's categories and/or building our own.

We can also categorize tokens (e.g. liquid staking, wrapped bitcoin, gas tokens) to track TVL metrics by these token categories.

**Other considerations**

These endpoints give us token symbols but not contract addresses, so we may eventually need to request or map these tokens to contract addresses for future metrics such as "TVL of interoperable assets".

MSilb7 commented 2 weeks ago

DFL context on filtering out double counting; this should likely be applied as an intermediate model on top of the raw data:

https://gist.github.com/0xngmi/719ed3cf2929a53e48aad4be76167eb6