paulocheque opened 6 months ago

Hi, very interesting library!

Do you have any benchmark that measures the overhead of the decorator for async and for sync functions? Something like:

```python
import asyncio
import timeit

from universalasync import async_to_sync_wraps


def sync_func(): pass

async def async_func(): pass

@async_to_sync_wraps
async def wrapped_async_func(): pass

def run_async_func():
    asyncio.run(async_func())

def run_wrapped_async_func():
    asyncio.run(wrapped_async_func())


ITERATIONS = 10000

sync_time_1 = timeit.timeit(sync_func, number=ITERATIONS)
sync_time_2 = timeit.timeit(wrapped_async_func, number=ITERATIONS)
async_time_1 = timeit.timeit(run_async_func, number=ITERATIONS)
async_time_2 = timeit.timeit(run_wrapped_async_func, number=ITERATIONS)

print(f'Synchronous functions: {sync_time_1} vs {sync_time_2}')
print(f'Asynchronous functions: {async_time_1} vs {async_time_2}')
```

I got this error on the line `async_time_2 = timeit.timeit(run_wrapped_async_func, number=ITERATIONS)`:

```
ValueError: a coroutine was expected, got None
```

Overhead for the sync function: 0.00020570901688188314 vs 0.28296741703525186. If I calculated correctly, that is an overhead of about 137025%.

---

Hi, thanks! I did not benchmark it before, but it is used in our production systems. Overall, I think that in the native async case it should be almost a no-op, while in the sync case it will slow down depending on the context we are running in. I would appreciate any help in creating those benchmarks, and I will add them to our test suite.

PS: you can add

```python
from universalasync import async_to_sync_wraps
```

to the README.
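A note on the `ValueError` above: `asyncio.run()` requires a coroutine object, but when `wrapped_async_func()` is called from plain sync code, the decorator has presumably already driven the coroutine to completion and returned its result (`None` here), so `asyncio.run(None)` fails. The sketch below uses a minimal stdlib stand-in decorator (an assumption about how `async_to_sync_wraps` behaves, not the library's actual implementation) to show both the failure mode and a fix: in an async context, `await` the wrapped call instead of re-wrapping it in `asyncio.run()`.

```python
import asyncio
import functools

# Stand-in for universalasync.async_to_sync_wraps (an assumption, not the
# library's real code): run the coroutine to completion when called from
# sync code, but hand back the awaitable when a loop is already running.
def async_to_sync_wraps(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        coro = func(*args, **kwargs)
        try:
            asyncio.get_running_loop()
        except RuntimeError:
            # Sync context: no running loop, so drive the coroutine here.
            return asyncio.run(coro)
        # Async context: let the caller await the coroutine itself.
        return coro
    return wrapper

@async_to_sync_wraps
async def wrapped_async_func():
    return 42

# Sync context: the wrapper already ran the coroutine and returned its
# result. Passing that result to asyncio.run(), as the benchmark above
# does, is what raises "ValueError: a coroutine was expected, got None".
assert wrapped_async_func() == 42

# Async context: await the wrapped call directly instead of calling
# asyncio.run() on its return value.
async def main():
    return await wrapped_async_func()

assert asyncio.run(main()) == 42
```

With that change, `run_wrapped_async_func` in the benchmark could call `asyncio.run(main())`, so both timed paths set up an event loop and the comparison against `run_async_func` stays apples-to-apples.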