Closed OToNaShi-4 closed 2 years ago
Hi,
what values are you using in cls.mysql_max_size?
That value configures how many connections the pool should keep open at most.
With a pool max size of 10 I get to somewhere around 75 MB of memory use at the end (with some small modifications to your code) on Python 3.10.2; with a pool max size of 100 I arrive at somewhere around 250 MB, with a total of ~75 connections in the pool.
You can use Pool.clear() to close unused connections if that's useful for your application. This reduces it to about 200 MB.
Running explicit garbage collection afterwards (for testing purposes) brings me back to around the same size as with a pool max size of 10, around 75 MB.
import asyncio
import gc

import aiomysql


async def wrap(pool):
    print('+', end='', flush=True)
    async with pool.acquire() as conn:
        async with conn.cursor() as cur:
            await cur.execute('select * from task join task_sku tws on tws.task_id = task.id join sku s on tws.sku_id = s.id where task.id = 1')
            res = await cur.fetchall()
    print('.', end='', flush=True)


async def test():
    time = 0
    total = 8000
    pool = await aiomysql.create_pool(loop=asyncio.get_running_loop(), **{
        'host': 'localhost',
        'db': 'reptile_1',
        'user': 'testuser',
        'password': 'testpass',
        'minsize': 1,
        'maxsize': 100,
        'echo': True,
    })
    tasks = []
    while time <= total:
        tasks.append(asyncio.create_task(wrap(pool)))
        await asyncio.sleep(0)
        time += 1
    await asyncio.gather(*tasks)
    del tasks
    print('done')
    print(f'pool size: {pool.size}')
    await asyncio.sleep(10)
    await pool.clear()
    print(f'pool cleared, new size: {pool.size}')
    await asyncio.sleep(10)
    print(f'garbage collected {gc.collect()} items')
    while True:
        try:
            await asyncio.sleep(0)
        except KeyboardInterrupt:
            break


if __name__ == '__main__':
    loop = asyncio.new_event_loop()
    loop.run_until_complete(test())
Note that you should keep a reference to tasks if you want to prevent them from being unexpectedly garbage collected; see the note on https://docs.python.org/3/library/asyncio-task.html#asyncio.create_task.
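As a minimal sketch of the pattern those docs recommend (the names `background_tasks` and `work` are illustrative, not from the code above): keep a strong reference to each task in a set, and discard it once the task finishes.

```python
import asyncio

# Strong references keep the tasks alive; the event loop itself only
# holds weak references to running tasks.
background_tasks = set()


async def work(n):
    await asyncio.sleep(0)
    return n * 2


async def main():
    for i in range(5):
        task = asyncio.create_task(work(i))
        background_tasks.add(task)                         # keep a reference
        task.add_done_callback(background_tasks.discard)   # drop it when done
    # gather holds its own references from here on
    results = await asyncio.gather(*background_tasks)
    return sorted(results)


print(asyncio.run(main()))  # [0, 2, 4, 6, 8]
```

The `add_done_callback(set.discard)` idiom means the set never grows unboundedly, while still guaranteeing each task survives until completion.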
There are up to 80 connections on my server; after it runs for about 4 hours, memory use grows to nearly 500 MB, until the server runs out of memory. Maybe I should drop the max connection count a bit.
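A hedged sketch of how that tuning might look (the host, database, and credential values are placeholders copied from the test script above; `pool_recycle` is aiomysql's option for closing connections older than the given number of seconds, which can help return idle-connection memory over a long run):

```python
import aiomysql


async def make_pool():
    # Hypothetical settings for a memory-constrained server: a smaller
    # maxsize caps peak connection count, and pool_recycle periodically
    # replaces long-lived connections instead of keeping them forever.
    return await aiomysql.create_pool(
        host='localhost',
        db='reptile_1',
        user='testuser',
        password='testpass',
        minsize=1,
        maxsize=20,        # well below the ~80 connections observed
        pool_recycle=300,  # recycle connections idle longer than 5 minutes
    )
```

This is a configuration fragment, not a measured fix; the right `maxsize` depends on how many queries actually need to run concurrently.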
Describe the bug
After I run the same SQL using different connections over 8000 times, the memory use increases very fast and never drops. Am I using aiomysql in a wrong way?
aiomysql version: 0.1.0; Python version: 3.10.1; OS: macOS (M1); MySQL version: 8.0.27
Table Dump
reptile_1_localhost-2022_04_18_22_06_40-dump.sql.zip
To Reproduce
Expected behavior
the memory use drops after the SQL execution finishes
Logs/tracebacks
Python Version
aiomysql Version
PyMySQL Version
SQLAlchemy Version
OS
macOS (M1)
Database type and version
Additional context
No response
Code of Conduct