Closed: Allen-Taylor closed this 1 month ago
Yes, I did compare it to logsSubscribe. It's faster and more reliable, since decoders for it are already available everywhere on GitHub and we would only need to decode the specific field marked in red in the screenshot below. But we would need some data that can only be found inside the full transaction and is not available in the logs.
For this specific repo, we would probably be better off decoding the data from the logs instead. Speed goes down by a few ms, but we would need to change the workers so they don't spawn per transaction and are fixed instead, or something like that.
I've been directly decoding the Program data from logsSubscribe at the "processed" commitment. Many of these scanners grab the txn signature and then query for info that way, but that costs time in the world of sniping.
The program log whose base64 payload starts with "vdt/" is the buy or sell data.
If it is a new mint, it is the Program data without the "vdt/" prefix.
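To make that distinction concrete, here is a small hypothetical helper (the function name is mine, not from any library) that routes a `Program data:` log line on the base64 prefix of its 8-byte event discriminator, assuming log lines shaped like the ones in the Python script further down:

```python
import base64


def classify_program_data(log_entry: str) -> tuple[str, bytes]:
    """Split a "Program data: <base64>" log line into (kind, raw bytes).

    "vdt/" is the base64 prefix of the trade event's discriminator;
    other payloads (e.g. new-mint create events) start differently,
    so they are lumped under "other" here.
    """
    payload = log_entry.split("Program data: ", 1)[1]
    kind = "trade" if payload.startswith("vdt/") else "other"
    return kind, base64.b64decode(payload)
```

The payload is plain base64 either way; only the discriminator bytes at the front tell you which event layout to apply.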
I've been pretty interested in benchmarking how long it takes to identify new trades and new mints for copy trading and sniping.
The fastest I've been able to scan is using the Helius Geyser endpoint with the transactionSubscribe at the "processed" commitment. I was able to snipe 4 seconds after the dev buy.
I'm not sure how to get faster than that other than having my own dedicated RPC.
There is one sniper on pump that buys in the same block as the dev. I'm very sure he is "back-running" the dev buy.
You can decode fastest if you have your own local validator. But if you need some kind of information that is not available in the logs, like detecting scam bots (which is actually my use case), decoding only the logs will be an issue; for this purpose, though, it's the best way. It seems you're trying to build a MEV bot, and this is the right path. For sending the txns, I recommend using JITO, since they mine the majority of the blocks, but you can have issues with the "level" of information, since I don't know if they allow you to listen from their RPCs. If I'm not wrong, the guy who buys in the same block owns an RPC and also has a validator; he probably sends his txns directly to the validator through a channel whose name I forgot, and he also listens at "processed" as you said, since sometimes his txns, as I saw in the past, fail due to an account failing to be created or something. His buys are not in a different block: they are in the same block as the creation.
I don't know if I helped in some way; I just dumped all the info I have.
Fastest checklist:
- I was able to get the snipe down to 2 seconds with Geyser, my own swap code, and running it from my VPS (all in Rust).
- I tried JITO to submit my transactions, but it actually made things slower, even with a generous tip.
- I think the only way to land in the same block is to own an RPC.
- I tried submitting to the TPU leader, but it is incredibly unreliable. Maybe I was doing it wrong, but I was getting all kinds of dropped transactions.
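For reference, a minimal sketch in Python of the Geyser-style subscription mentioned above. The method and field names are my best recollection of the Helius `transactionSubscribe` websocket API and may differ from the current docs, so treat them as assumptions to verify:

```python
import json

# pump.fun program id, as used in the logsSubscribe script below
PUMP_FUN_PROGRAM = "6EF8rrecthR5Dkzon8Nwu78hRvfCKubJ14M5uBEwF6P"


def build_transaction_subscribe(program_id: str) -> str:
    """Build a transactionSubscribe request.

    Field names assumed from the Helius enhanced-websocket docs;
    double-check them against the current API reference.
    """
    request = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "transactionSubscribe",
        "params": [
            {
                # only notify for txns touching this program; skip failed ones
                "failed": False,
                "accountInclude": [program_id],
            },
            {
                # "processed" is the earliest commitment, as discussed above
                "commitment": "processed",
                "transactionDetails": "full",
                "maxSupportedTransactionVersion": 0,
            },
        ],
    }
    return json.dumps(request)
```

You would send this string over the websocket the same way the logsSubscribe script sends its request.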
Thanks!
I decoded the "vdt/" data and got a series of bytes. How can I map a layout onto those bytes?
This is how i'm doing it in python.
```python
import asyncio
import websockets
import json
import base64
from solders.pubkey import Pubkey  # type: ignore
from construct import Struct, Padding, Int64ul, Flag, Bytes

WSS = ""

trade = Struct(
    Padding(8),
    "mint" / Bytes(32),
    "solAmount" / Int64ul,
    "tokenAmount" / Int64ul,
    "isBuy" / Flag,
    "user" / Bytes(32),
    "timestamp" / Int64ul,
    "virtualSolReserves" / Int64ul,
    "virtualTokenReserves" / Int64ul,
)


def format_trade(parsed_data, txn_sig):
    return {
        "mint": str(Pubkey.from_bytes(bytes(parsed_data.mint))),
        "sol_amount": parsed_data.solAmount / 10**9,
        "token_amount": parsed_data.tokenAmount / 10**6,
        "is_buy": parsed_data.isBuy,
        "user": str(Pubkey.from_bytes(bytes(parsed_data.user))),
        "timestamp": parsed_data.timestamp,
        "virtual_sol_reserves": parsed_data.virtualSolReserves,
        "virtual_token_reserves": parsed_data.virtualTokenReserves,
        "txn_sig": txn_sig,
    }


async def logs_subscribe():
    async with websockets.connect(WSS) as websocket:
        # Request to subscribe to logs mentioning the pump.fun program
        request = {
            "jsonrpc": "2.0",
            "id": 1,
            "method": "logsSubscribe",
            "params": [
                {"mentions": ["6EF8rrecthR5Dkzon8Nwu78hRvfCKubJ14M5uBEwF6P"]},
                {"commitment": "processed"},
            ],
        }
        await websocket.send(json.dumps(request))
        print("Subscribed to logs...")

        txn_sigs = set()
        while True:
            response = await websocket.recv()
            log_data = json.loads(response)
            value = log_data.get("params", {}).get("result", {}).get("value", {})
            txn_sig = value.get("signature", "")
            logs = value.get("logs", [])
            # Skip duplicate notifications for the same signature
            if txn_sig in txn_sigs:
                continue
            txn_sigs.add(txn_sig)
            if any("Program log: Instruction: Buy" in log_entry for log_entry in logs):
                if any("Program log: Instruction: Sell" in log_entry for log_entry in logs):
                    # Ignore BUMP bots (a buy and a sell in the same txn)
                    continue
            for log_entry in logs:
                if "Program data: vdt/" in log_entry:
                    program_data_base64 = log_entry.split("Program data: ")[1]
                    program_data_bytes = base64.b64decode(program_data_base64)
                    parsed_data = trade.parse(program_data_bytes)
                    trade_data = format_trade(parsed_data, txn_sig)
                    print(trade_data)


asyncio.run(logs_subscribe())
```
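To make the byte-to-field mapping in the question above concrete: the construct layout corresponds to fixed little-endian offsets in the decoded event. Here is a stdlib-only sketch of the same parse, with offsets derived from the field sizes in the construct definition (the u64 timestamp matches that definition, though on-chain it may really be a signed i64):

```python
import struct


def parse_trade_event(data: bytes) -> dict:
    """Manually unpack the 113-byte trade event.

    Layout (little-endian), mirroring the construct Struct above:
      0..8     event discriminator (skipped, like Padding(8))
      8..40    mint pubkey (32 bytes)
      40..48   solAmount (u64)
      48..56   tokenAmount (u64)
      56       isBuy (1-byte flag)
      57..89   user pubkey (32 bytes)
      89..97   timestamp (u64)
      97..105  virtualSolReserves (u64)
      105..113 virtualTokenReserves (u64)
    """
    mint = data[8:40]
    sol_amount, token_amount = struct.unpack_from("<QQ", data, 40)
    is_buy = bool(data[56])
    user = data[57:89]
    timestamp, vsol, vtok = struct.unpack_from("<QQQ", data, 89)
    return {
        "mint": mint,
        "sol_amount": sol_amount,
        "token_amount": token_amount,
        "is_buy": is_buy,
        "user": user,
        "timestamp": timestamp,
        "virtual_sol_reserves": vsol,
        "virtual_token_reserves": vtok,
    }
```

So "mapping a layout to the bytes" is just slicing fixed offsets: pubkeys are raw 32-byte slices you can feed to `Pubkey.from_bytes`, and the numeric fields are 8-byte little-endian unsigned integers.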
Will be closing this, as I don't think any more responses are coming in here.
Was lurking Github and found your repo.
I see you are using blockSubscribe, but per the docs it is considered unstable and the RPC needs `--rpc-pubsub-enable-block-subscription` enabled.
https://solana.com/docs/rpc/websocket/blocksubscribe
Have you compared your listener to Geyser or logsSubscribe?