bmoscon / cryptofeed

Cryptocurrency Exchange Websocket Data Feed Handler

Retrieving crypto data from complementary data aggregation services / Coingecko? #266

Closed yohplala closed 3 years ago

yohplala commented 4 years ago

Hi,

Currently, cryptofeed handles data coming from exchanges. Is there any objection to proposing new data feeds like CoinMarketCap, CoinGecko, Nomics...? From these data feeds, I would like to get complementary 'global' crypto data like circulating supply, global trading volumes, dominance...

Could related PRs be accepted in cryptofeed? If yes, what would be your recommendations? I see 3 main points:

Thanks for your feedback. Have a good day,

olibre commented 4 years ago

I am thinking about how we could handle this exciting new feature...

🤔

Maybe we could add a folder called aggregators next to exchanges. The global name for both could be platform: the source platform can be an aggregator or an exchange.

I would also rename exchange -> market, in reference to a marketplace, because some exchanges (Binance) provide several marketplaces (Binance US, Binance Jersey...).

We can also rename pair -> symbol: a Ticker Symbol is the short name of an instrument.

A symbol can be

@yohplala, please suggest some symbol names. For example, based on https://api.coingecko.com/api/v3/global we may have "BTC-MarketCap": 29919599.687056687
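As a sketch of that naming idea: a payload in the shape returned by the /global endpoint could be flattened into such symbol keys. The payload shape and the function name below are assumptions for illustration, not an implemented cryptofeed feature.

```python
def to_marketcap_symbols(global_data: dict) -> dict:
    """Map a /global-style payload, e.g. {'data': {'total_market_cap':
    {'btc': ...}}}, to '<COIN>-MarketCap' symbol keys."""
    caps = global_data.get('data', {}).get('total_market_cap', {})
    return {f"{coin.upper()}-MarketCap": value for coin, value in caps.items()}

sample = {'data': {'total_market_cap': {'btc': 29919599.687056687}}}
print(to_marketcap_symbols(sample))  # {'BTC-MarketCap': 29919599.687056687}
```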

yohplala commented 4 years ago

@olibre Thanks for providing first thoughts on this.

Hmm, I would not remove the notion of data_type.

Obviously, there can be several others, like all the on-chain indicators computed by Glassnode, but to be honest, the first three above are the ones that interest me the most at present time :)

@bmoscon, any objection regarding such features? More specifically, do you intend to restrict cryptofeed's use case to exchanges, or will you accept PRs extending the types of feeds to ones such as those given above?

Thanks for your feedback. Have a good day,

bmoscon commented 4 years ago

Why would you need to change anything? You can create a new class (DataFeed) that meets the minimum requirements for interfacing with the feed handler (it would need a subscribe method).
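Such a class could look something like the minimal sketch below. All names, the constructor signature, and the polling design are illustrative assumptions, not cryptofeed's actual interface; the only point from the comment above is that subscribe is the entry point the feed handler needs.

```python
import asyncio

class DataFeed:
    """Hypothetical minimal polling feed for a REST-only data source."""

    def __init__(self, pairs, channels, callbacks, interval=60.0):
        self.pairs = pairs          # e.g. ['BTC']
        self.channels = channels    # e.g. ['profile']
        self.callbacks = callbacks  # channel -> async callable
        self.interval = interval    # polling period in seconds

    async def fetch(self, pair):
        # Stand-in for an HTTP call to the aggregator's REST API.
        return {'pair': pair, 'market_cap': 0.0}

    async def subscribe(self):
        # Poll the REST endpoint periodically and dispatch to callbacks.
        while True:
            for pair in self.pairs:
                data = await self.fetch(pair)
                for channel in self.channels:
                    await self.callbacks[channel](feed='DataFeed', pair=pair, data=data)
            await asyncio.sleep(self.interval)
```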

yohplala commented 4 years ago

@bmoscon Thanks Bryant! So my understanding is that you have nothing against integrating such a new type of data feed. Great! I still have a new feature to work out for Cryptostore first; I should then be able to have a look at this topic in the coming weeks! Thanks again!

yohplala commented 3 years ago

Hi @bmoscon ,

I have made a first implementation. I am not sure how far along I am yet, as my objective is to be able to use Cryptostore on top of it. I am not sure the Redis callback is done yet.

Still, if you have any comments on the current status, I will gladly take them into account. The branch is here: https://github.com/yohplala/cryptofeed/tree/coingecko

I have been testing it with the following script, copied from demo.py.

from cryptofeed import FeedHandler
from cryptofeed.callback import Callback
from cryptofeed.defines import PROFILE
from cryptofeed.providers import Coingecko

async def profile(**kwargs):
    print(f"Profile Update for {kwargs['feed']}")
    print(kwargs['timestamp'])
    # Dump each profile field to a timestamped text file.
    path = '/home/me/Documents/code/draft/cryptofeed/data_{!s}.txt'.format(kwargs['timestamp'])
    with open(path, 'w+') as f:
        for k, v in kwargs['data'].items():
            f.write('{!s}: {!s}\r\n'.format(k, v))

def main():
    f = FeedHandler()
    f.add_feed(Coingecko(pairs=['BTC'], channels=[PROFILE],
                         callbacks={PROFILE: Callback(profile)}))

#    config = {TRADES: ['BTC-USDT', 'ETH-USDT'], L2_BOOK: ['BTC-USDT']}
#    f.add_feed(Huobi(config=config, callbacks={TRADES: TradeCallback(trade), L2_BOOK: BookCallback(book)}))

    f.run()

if __name__ == '__main__':
    main()

I get one error message at startup, but I also get it with other exchanges, so I am guessing it is 'normal'. Please advise if that is not the case. When starting the script I get:

2020-10-13 19:59:43,588 : ERROR : Unhandled exception
Traceback (most recent call last):
  File "/home/me/Documents/code/cryptofeed/cryptofeed/feedhandler.py", line 162, in run
    loop.run_forever()
  File "/home/me/miniconda3/lib/python3.7/asyncio/base_events.py", line 525, in run_forever
    raise RuntimeError('This event loop is already running')
RuntimeError: This event loop is already running

But then the loop goes on.

Thank you for your help and advice. Next steps will be to test with config, and to test with the Redis callback (my understanding is that I will have to create a new ProfileStream in redis.py).

Thanks again, Bests

bmoscon commented 3 years ago

Looks pretty good so far, and yes, you'd need to add your new callback to all the backends; there are quite a few :)

yohplala commented 3 years ago

Thanks Bryant. So you confirm the error message I am getting is OK? In that case, maybe it is not a 'real' error, and maybe it should be caught and only a warning issued? (Just thinking aloud: for a beginner like me, seeing this makes me wonder what is going wrong.)

bmoscon commented 3 years ago

I don't see anything in your code that would cause the error. That error looks like it's coming from something else running asyncio, like IPython or a Jupyter notebook. You can't run cryptofeed in something like that.
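For reference, this error is easy to reproduce in isolation: calling run_forever() on a loop that is already running (which is effectively what happens when cryptofeed is started inside an environment that already drives an asyncio loop) raises exactly this RuntimeError. A minimal sketch:

```python
import asyncio

async def main():
    # Inside a coroutine, the loop is already running, so trying to
    # start it again fails, just like nesting cryptofeed in a notebook.
    loop = asyncio.get_running_loop()
    try:
        loop.run_forever()
    except RuntimeError as exc:
        return str(exc)

print(asyncio.run(main()))  # This event loop is already running
```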

yohplala commented 3 years ago

Thanks Bryant. OK, I confirm I am running it in IPython (Spyder environment installed with miniconda). Thanks again Bryant!

yohplala commented 3 years ago

Hi @bmoscon, As far as I understand, it seems I am nearly done with this (GitHub branch up to date with my last modifications). There is one last callback for the Postgres backend that I have not completed.

I am not understanding why, for some channels, write is re-implemented (for example, line 63):

class FundingPostgres(PostgresCallback, BackendFundingCallback):
    default_table = FUNDING

    async def write(self, feed: str, pair: str, timestamp: float, receipt_timestamp: float, data: dict):
        await super().write(feed, pair, timestamp, receipt_timestamp, f"'{json.dumps(data)}'")

but not for others:

class OpenInterestPostgres(PostgresCallback, BackendOpenInterestCallback):
    default_table = OPEN_INTEREST

Also, two differences from the way other exchanges are implemented:

I will welcome any feedback on these 3 items, Bryant, if you have any. Thanks in advance,

yohplala commented 3 years ago

Implemented in PR #331