Open StayFoolisj opened 2 years ago
Hi. Are you using Alchemy's free plan? Some queries are too intense for the free plan.
We could add a rate limit for Alchemy free plans, but there isn't an exact conversion between Alchemy's rate limit units ("CUPS", compute units per second) and conventional rate limit units (requests per second, concurrent connections).
Yes, I'm using the free plan. Got your point, makes sense. I'm also a bit confused by their compute units when trying to figure out what I need.
My main purpose here is getting OHLCV data from DEXes, on a lot of markets. Let's say I want to get full historical data from 100 markets, each with an average of 500k swaps = 50M swaps in total. Do you have any reference on the cost from Alchemy or any other provider?
Yeah, I'm pretty sure that workload is too heavy for Alchemy's lower tier plans. With a smarter rate limiter it would work, by running queries over a long period of time, but ctc doesn't have a smart limiter that respects Alchemy's CUPS. For these types of queries you probably need a beefier plan, or to run your own archive node.
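A truly CUPS-aware limiter would need Alchemy's per-method compute-unit costs. As a rough stand-in, a plain token bucket capped at a conservative request rate can be sketched like this (the rate and cost values are assumptions for illustration, not a conversion from Alchemy's compute units):

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter sketch.

    Since CUPS don't map exactly to requests/second, `rate` here is a
    conservative guess you would tune for your plan, not an exact
    translation of Alchemy's compute units.
    """

    def __init__(self, rate: float, capacity: float):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def acquire(self, cost: float = 1.0) -> None:
        """Block until `cost` tokens are available, then consume them."""
        while True:
            now = time.monotonic()
            # Refill proportionally to elapsed time, capped at capacity.
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= cost:
                self.tokens -= cost
                return
            # Sleep just long enough for the deficit to refill.
            time.sleep((cost - self.tokens) / self.rate)
```

You would call `bucket.acquire()` before each RPC request; heavier calls (e.g. large `eth_getLogs` ranges) could pass a higher `cost` to approximate their bigger compute-unit price.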
Hi, ctc looks magnificent. With the following command I think I'm getting rate limited by Alchemy, as I'm getting 429'd: `ctc uniswap chart 0x88e6a0c2ddd26feeb64f039a2c41296fcb3f5640`
As you can see here. https://user-images.githubusercontent.com/42708283/193460468-32a6de08-b897-41af-ba10-cd08ada71f93.mov
From what I read, ctc is exceeding compute requests per second. Is there another archival node provider recommended to use instead, or is it possible to implement some of the suggestions found here: https://docs.alchemy.com/reference/throughput#retries
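The linked Alchemy page suggests retrying 429 responses with exponential backoff. A minimal sketch of that pattern, wrapped around any request function (the exception-based signaling and the delay values are assumptions for illustration, not ctc's or Alchemy's API):

```python
import random
import time

def with_backoff(call, max_retries: int = 5, base_delay: float = 0.5):
    """Retry `call` with exponential backoff plus jitter.

    `call` is assumed to raise an exception (e.g. on an HTTP 429
    response) when it gets rate limited; the final failure is re-raised
    once `max_retries` attempts are exhausted.
    """
    for attempt in range(max_retries):
        try:
            return call()
        except Exception:
            if attempt == max_retries - 1:
                raise
            # Wait 0.5s, 1s, 2s, ... with jitter to avoid retrying in
            # lockstep with other clients.
            time.sleep(base_delay * 2 ** attempt + random.uniform(0, 0.1))
```

For example, `with_backoff(lambda: fetch_swaps(pool))` would retry a hypothetical `fetch_swaps` helper a few times before giving up, which is often enough to ride out brief throughput spikes on lower-tier plans.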