overlay-market / overlay-risk

Risk metrics for various data streams
MIT License

Uniswap v3 risk metrics #26

Closed (deepsp94 closed this 2 years ago)

deepsp94 commented 2 years ago

What's done:

  1. influx_univ3.py: Ingesting metrics and catching most exceptions
  2. influx_metrics_univ3.py: Calculating metrics using above data
  3. TWAP length of 1 minute using the Uniswap v3 oracle
  4. Window length of 1 minute is enforced with a tolerance of 10 seconds; both the window length and the tolerance are parameterized (see the sketch after this list)
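
For context on item 4, here is a minimal sketch of how a window-length check with a tolerance could look, assuming the window and tolerance are plain module-level constants. The names (`WINDOW`, `TOLERANCE`, `twap_from_tick_cumulatives`) are illustrative, not the identifiers used in the scripts; the TWAP itself follows the standard Uniswap v3 construction, where the average tick over the window is the difference in tick cumulatives divided by the elapsed time.

```python
# Illustrative sketch of the window-length check, not the actual script code.
WINDOW = 60       # requested TWAP window in seconds (1 minute)
TOLERANCE = 10    # allowed deviation from the requested window, in seconds


def twap_from_tick_cumulatives(tc_start: int, tc_end: int,
                               t_start: int, t_end: int) -> float:
    """Compute a Uniswap v3 TWAP from two (tickCumulative, timestamp) points,
    rejecting windows outside WINDOW +/- TOLERANCE."""
    elapsed = t_end - t_start
    if abs(elapsed - WINDOW) > TOLERANCE:
        raise ValueError(
            f"observed window of {elapsed}s is outside {WINDOW}s +/- {TOLERANCE}s"
        )
    avg_tick = (tc_end - tc_start) / elapsed
    # Price of token0 in terms of token1 (ignoring decimals) is 1.0001 ** tick
    return 1.0001 ** avg_tick
```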

What's left:

  1. Making influx_univ3.py a bit more resilient; it still faces some timeout issues (one possible approach is sketched below)
  2. Deploying these scripts on Heroku and checking how they run over an extended period of time
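
On the resilience point, one common way to handle intermittent timeouts is to wrap the flaky calls in a retry with exponential backoff. The sketch below is only an illustration of that idea, not what influx_univ3.py currently does; `fetch_price_cumulatives` and `pool_address` in the usage comment are hypothetical, and in practice the bare `Exception` would be narrowed to the specific timeout errors being hit.

```python
# Illustrative retry-with-backoff helper, not the actual script code.
import logging
import time

logger = logging.getLogger(__name__)


def with_retries(fn, max_attempts: int = 5, base_delay: float = 1.0):
    """Call fn(), retrying with exponential backoff if it raises."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception as e:  # narrow this to timeout errors in practice
            if attempt == max_attempts:
                raise
            delay = base_delay * 2 ** (attempt - 1)
            logger.warning("attempt %d failed (%s); retrying in %.1fs",
                           attempt, e, delay)
            time.sleep(delay)


# Hypothetical usage:
# cumulatives = with_retries(lambda: fetch_price_cumulatives(pool_address))
```
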
deepsp94 commented 2 years ago

Context regarding the work above (empty commits):

Heroku re-deploys an app only when there is a new commit, so a newly added buildpack can only be used after a re-deploy, which in turn requires a new commit. Hence the empty commits.

I needed to install gsl on the Heroku app in order to run pystable. There are some community gsl buildpacks available in the Elements marketplace and elsewhere on the internet, but they aren't actively maintained and are either buggy or install very old versions of gsl that aren't useful for running pystable.

The apt buildpack (https://github.com/heroku/heroku-buildpack-apt) coupled with the last commit on this branch (72f0b06) seems to have done the trick, i.e. it got pystable running on Heroku.

deepsp94 commented 2 years ago

@mikeyrf I have already deployed this branch to my project pipeline on Heroku and can confirm that the scripts run there. Descriptions of the important files/changes are below so you can prioritise them as you review the code.

  1. Aptfile: The config file required by the Heroku apt buildpack (https://elements.heroku.com/buildpacks/heroku/heroku-buildpack-apt), which is needed to install gsl.
  2. pyproject.toml:
    • Removed scipy because poetry is unable to install it
    • Added my fork of eth-brownie so that Alchemy's archive node can be used through Heroku without relying on the brownie networks add command
  3. sushi_quotes.json: A copy of quotes.json renamed with the prefix "sushi_" to make clear that these quotes are different from the Uniswap v3 ones. The original quotes.json file is retained so that older deployments don't stop running.
  4. influx_metrics.py: Fixed a bug related to column names in the original script
  5. influx_metrics_univ3.py: Calculates metrics from price cumulatives. The logical flow of the code is as follows (see the first sketch after this list):
    • Find the timestamp at which metrics were last recorded in the ovl_metrics_univ3 bucket; if metrics have never been recorded before, start from 30 days back.
    • Build the list of timestamps between find_start() and now
    • Get all price cumulatives for that list of timestamps
    • For each timestamp, calculate TWAPs and metrics. TWAPs are hourly and metrics have a 90-day lookback period.
  6. influx_univ3_1h.py: Gets price cumulatives from the archive node at 10-minute intervals. These are used in influx_metrics_univ3.py to calculate the 1h TWAP.
    • A batch of cumulatives is pulled from the archive node using multiple threads, converted to a pandas DataFrame, and pushed to Influx in one go. The entire batch, containing data for many days, is therefore written at once, which makes collecting weeks or months of data much faster (see the second sketch after this list).
    • The same script also supports real-time data collection. When real-time data is needed, the DataFrame pushed to Influx has just one row, corresponding to the latest/current timestamp.
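
For reference while reviewing, here is a rough sketch of the influx_metrics_univ3.py flow described in item 5. Only the ovl_metrics_univ3 bucket name, find_start(), and the 1h / 90d / 30d parameters come from the description above; the Flux query, the 10-minute step, and the callables passed into run() are illustrative assumptions rather than the actual code.

```python
# Illustrative sketch only, not the actual influx_metrics_univ3.py code.
from datetime import datetime, timedelta, timezone


def find_start(query_api, org: str) -> datetime:
    """Return the timestamp at which metrics were last recorded in the
    ovl_metrics_univ3 bucket, or 30 days ago if nothing has been recorded."""
    tables = query_api.query(
        'from(bucket: "ovl_metrics_univ3") |> range(start: -90d) |> last()',
        org=org,
    )
    for table in tables:
        for record in table.records:
            return record.get_time()
    return datetime.now(timezone.utc) - timedelta(days=30)


def run(query_api, org, fetch_cumulatives, compute_twap, compute_metrics,
        write_metrics, step=timedelta(minutes=10)):
    """The four callables are hypothetical stand-ins for the real logic."""
    # 1. Start from the last recorded metric (or 30 days back).
    t = find_start(query_api, org)
    now = datetime.now(timezone.utc)

    # 2. Build the list of timestamps between find_start() and now.
    timestamps = []
    while t <= now:
        timestamps.append(t)
        t += step

    # 3. Get all price cumulatives for that list of timestamps.
    cumulatives = fetch_cumulatives(timestamps)

    # 4. For each timestamp, calculate the hourly TWAP and the metrics
    #    with a 90-day lookback, then write them back to Influx.
    for ts in timestamps:
        twap = compute_twap(cumulatives, ts, window=timedelta(hours=1))
        metrics = compute_metrics(cumulatives, ts, lookback=timedelta(days=90))
        write_metrics(ts, twap, metrics)
```

And a sketch of the batch-write pattern described in item 6: pull a batch of cumulatives with multiple threads, build one pandas DataFrame, and push it to Influx in a single write. The worker count, measurement name, and fetch_cumulative callable are assumptions for illustration.

```python
# Illustrative sketch only, not the actual influx_univ3_1h.py code.
from concurrent.futures import ThreadPoolExecutor

import pandas as pd
from influxdb_client import InfluxDBClient
from influxdb_client.client.write_api import SYNCHRONOUS


def backfill(timestamps, fetch_cumulative, url, token, org, bucket):
    """Pull price cumulatives for many timestamps in parallel and write them
    to Influx in one go. fetch_cumulative(ts) -> dict is a hypothetical
    callable that queries the archive node for a single timestamp."""
    # Pull the whole batch from the archive node using multiple threads.
    with ThreadPoolExecutor(max_workers=8) as pool:
        rows = list(pool.map(fetch_cumulative, timestamps))

    # Convert the batch to a single DataFrame indexed by timestamp...
    df = pd.DataFrame(rows, index=pd.to_datetime(list(timestamps), utc=True))

    # ...and push the entire batch to Influx at once. In real-time mode the
    # DataFrame would contain just one row for the latest timestamp.
    with InfluxDBClient(url=url, token=token, org=org) as client:
        client.write_api(write_options=SYNCHRONOUS).write(
            bucket=bucket,
            record=df,
            data_frame_measurement_name="uniswap_v3_price_cumulatives",
        )
```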