If you are an academic writing a paper, feel free to reach out to Austin.
The wonderful folks at Allium have integrated with v3-polars as part of their DEX Analytics Portal grant.
You'll need to configure authentication with Allium. To do that:

- If you have access to the Allium app, get your credentials there.
- If you do not have access to the Allium app, email dexanalytics@allium.so.
After you get credentials, set them as the `ALLIUM_POLARSV3_QUERY_ID` and `ALLIUM_POLARSV3_API_KEY` environment variables.

Pro tip: you can use Python to set the env variables:
```python
import os

# replace the placeholder values with the credentials Allium gave you
os.environ['ALLIUM_POLARSV3_QUERY_ID'] = 'abcdefg'
os.environ['ALLIUM_POLARSV3_API_KEY'] = 'deadbeef'

# add the rest of your code here, like:
# state.v3Pool(...)
```
To use the GBQ database, you need to be authenticated with the Uniswap Labs backend. If you don't know how to do this, you should probably use another provider. However, if you think you should be authenticated and are not, run the command below:

```bash
gcloud auth login
```
To solve "GCP could not be imported" errors, try installing the bigquery library
pip install google-cloud-bigquery
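As a quick sanity check that your GCP credentials work (independent of v3-polars), you can run a trivial query with the client. Note that the client looks for application default credentials, so depending on your setup you may also need `gcloud auth application-default login` rather than plain `gcloud auth login`:

```python
from google.cloud import bigquery

# runs "SELECT 1" against BigQuery; if this succeeds, your auth is working
client = bigquery.Client()
rows = client.query('SELECT 1 AS ok').result()
print(list(rows))
```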
Create a new connector in v3/helpers/connectors using template.py. Integrate your connector into data_update.py under update_tables and make a PR!
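The authoritative interface is whatever template.py defines, but the general shape is a class that knows how to fetch rows for a given table and chain. A minimal hypothetical sketch (the class name, method name, and signature here are illustrative, not the real template):

```python
import polars as pl


class MyProviderConnector:
    """Hypothetical connector sketch -- mirror the actual interface in
    v3/helpers/connectors/template.py when writing a real one."""

    def __init__(self, api_key: str):
        # credentials for your data provider
        self.api_key = api_key

    def fetch(self, table: str, chain: str, from_block: int) -> pl.DataFrame:
        # query your provider for all rows of `table` on `chain`
        # at or after `from_block`, returned as a polars DataFrame
        raise NotImplementedError
```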
Pull and then read all ETH/USDC swaps on Arbitrum:

```python
from v3 import state

address = '0xc31e54c7a869b9fcbecc14363cf510d1c41fa443'

# if you're using Allium, pass update_from='allium' to v3Pool()
arb = state.v3Pool(address, 'arbitrum', update=True)
swaps = arb.swaps
```
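Since the library is built on polars, `swaps` should come back as a polars DataFrame, so the usual polars tooling applies if you want to eyeball what was pulled:

```python
# quick look at the data and a row count
print(swaps.head())
print(f'pulled {swaps.height} swaps')
```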
Working off the previous example, get the price of the Arbitrum pool every 15 minutes:

```python
# start of the price series as a Unix timestamp
# (1640995200 = 2022-01-01 00:00:00 UTC; pick whatever you need)
starting = 1640995200

priceArb = arb.getPriceSeries(starting, frequency='15m', gas=True)
```
Calculate the output of a 1 ETH swap at block 150000000:

```python
calldata = {
    'as_of': 150000000,  # simulate at this block height
    'tokenIn': '0x82aF49447D8a07e3bd95BD0d56f35241523fBab1',  # WETH on Arbitrum
    'swapIn': 1e18,  # 1 ETH, in wei
}

amt, _ = arb.swapIn(calldata)
```
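The returned amount is denominated in the output token's smallest unit. Assuming this is the ETH/USDC pool from above, the output token is USDC, which uses 6 decimals:

```python
# amt is in the output token's smallest unit; USDC has 6 decimals
print(f'1 ETH -> {amt / 1e6:,.2f} USDC')
```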