Closed Jamiejoin closed 4 years ago
Hello, thank you for posting the issue. I was able to reproduce the error, and it seems to come from the default settings of `httpx.AsyncClient`: the `limits` argument in the constructor is by default set to

```python
DEFAULT_LIMITS = Limits(max_connections=100, max_keepalive_connections=20)
```
Here is my devised test (note the original loop with `if i == num_txs: break` queried one transaction too many, since the check ran after the append; slicing the list avoids that):

```python
import asyncio

import bitcoinrpc
from orjson import JSONDecodeError  # Error to watch out for

# RPC_ADDRESS, RPC_PORT, RPC_USER, RPC_PASSWORD are your node's credentials.
rpc = bitcoinrpc.BitcoinRPC(RPC_ADDRESS, RPC_PORT, RPC_USER, RPC_PASSWORD)

async def get_raw_txs_from_block(block_hash: str, num_txs: int):
    """
    Query exactly `num_txs` transactions from block `block_hash` concurrently.
    """
    block = await rpc.getblock(block_hash)
    tasks = [rpc.getrawtransaction(tx_hash) for tx_hash in block["tx"][:num_txs]]
    return await asyncio.gather(*tasks)

async def test_concurrent_connections_limit():
    """
    Test how many concurrent calls the BitcoinRPC client can handle.
    """
    block_262250 = "00000000000000053a73327c9f7a5cffe695e7d6e66d6dca06dc70c7faea7d2d"
    for i in (2, 10, 50, 100):
        try:
            await get_raw_txs_from_block(block_262250, i)
        except JSONDecodeError:
            print(f"{i} calls concurrently is way too much!")
        else:
            print(f"{i} calls concurrently is OK!")
```
Running it yields the following output:

```
>>> await test_concurrent_connections_limit()
2 calls concurrently is OK!
10 calls concurrently is OK!
50 calls concurrently is way too much!
100 calls concurrently is way too much!
```
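In the meantime, a client-side workaround is to cap the number of in-flight calls below the pool limit with a semaphore. A minimal sketch (the `fake_call` coroutine is a stand-in for `rpc.getrawtransaction`, which I did not want to hit in a self-contained example):

```python
import asyncio

async def gather_limited(coros, max_in_flight: int = 20):
    """Run coroutines concurrently, but never more than `max_in_flight` at once."""
    sem = asyncio.Semaphore(max_in_flight)

    async def guarded(coro):
        async with sem:
            return await coro

    # gather preserves the input order of results
    return await asyncio.gather(*(guarded(c) for c in coros))

async def demo():
    async def fake_call(i):  # stand-in for rpc.getrawtransaction(tx_hash)
        await asyncio.sleep(0)
        return i

    return await gather_limited([fake_call(i) for i in range(50)], max_in_flight=10)

results = asyncio.run(demo())
```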
So, I will look into it in the coming days. My idea is to expose the creation of the `httpx.AsyncClient` through `**kwargs` in the `BitcoinRPC.__init__` method.
Once again, thank you very much for pointing out the issue!
Best regards, Libor.
And as a side note, you can query all raw transactions in one call to `getblock`: just set the `verbosity` level to 2. :)

```python
block_262250 = "00000000000000053a73327c9f7a5cffe695e7d6e66d6dca06dc70c7faea7d2d"
await rpc.getblock(block_262250, verbosity=2)
```
Thank you very much. I will close the question first and try it, and I will reply with any results.
Is the error below caused by querying too fast?
The error always stops at this txid: `103e90d89a2695f8952b513b12c705850b6b91d6ee1a9040c87d2344a3e38ff9`