bmoscon / cryptofeed

Cryptocurrency Exchange Websocket Data Feed Handler

Store complete book in postgres from binance L2_BOOK #991

Closed KlaSun closed 1 year ago

KlaSun commented 1 year ago

I'm currently recording the orderbook snapshots and deltas to PostgreSQL with the Postgres backend for the Binance L2 book. This is working well, but in order to get the most recent orderbook I'd have to manually reconstruct it. I saw in another thread that Bryant mentioned there was an orderbook object that contained the full updated book, however I don't see this in my database.

Is there a way to achieve this? Code below:

from cryptofeed import FeedHandler
from cryptofeed.backends.postgres import CandlesPostgres, TradePostgres, LiquidationsPostgres, BookPostgres
from cryptofeed.defines import CANDLES, L2_BOOK, TRADES, LIQUIDATIONS
from cryptofeed.exchanges import BinanceFutures

postgres_cfg = {'host': '127.0.0.1', 'user': 'postgres', 'db': 'db', 'pw': 'pw'}

def main():
    config = {'log': {'filename': 'cryptofeed_trades.log', 'level': 'DEBUG', 'disabled': False}}
    f = FeedHandler(config=config)
    f.add_feed(BinanceFutures(
        channels=[L2_BOOK],
        symbols=['BTC-USDT-PERP', 'ETH-USDT-PERP'],
        callbacks={L2_BOOK: BookPostgres(snapshot_interval=7500, table='l2_book', **postgres_cfg)}))
    f.run()

if __name__ == '__main__':
    main()
bmoscon commented 1 year ago

There is no way to do what you want with the Postgres backend; you have to reconstruct it. You'll have a snapshot every 7,500 updates, so you don't need to reconstruct everything, just the time periods between snapshots that you are interested in.
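The reconstruction between snapshots can be sketched as follows. This is a minimal illustration, assuming each delta row carries a side, price, and size (with size 0 meaning the level was removed) and that deltas are replayed in the order they were written; it is not tied to the exact `BookPostgres` table schema.

```python
def reconstruct_book(snapshot, deltas):
    """Replay delta updates on top of a snapshot to recover the current book.

    snapshot: {'bid': {price: size, ...}, 'ask': {price: size, ...}}
    deltas:   iterable of (side, price, size) tuples in write order;
              size 0 removes the price level.
    """
    book = {side: dict(levels) for side, levels in snapshot.items()}
    for side, price, size in deltas:
        if size == 0:
            book[side].pop(price, None)   # level removed from the book
        else:
            book[side][price] = size      # level added or updated
    return book


# Hypothetical example: start from the latest snapshot, replay later deltas
snapshot = {'bid': {100.0: 1.0}, 'ask': {101.0: 2.0}}
deltas = [('bid', 100.0, 0), ('bid', 99.5, 3.0), ('ask', 101.0, 1.5)]
current = reconstruct_book(snapshot, deltas)
```

In practice, the snapshot and delta rows would be selected from the `l2_book` table ordered by timestamp, with only the rows after the most recent snapshot needing replay.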