noirello / bonsai

Simple Python 3 module for LDAP, using libldap2 and winldap C libraries.
MIT License

async paged_search auto_page_acquire not working #47

Closed Kikkopanda closed 3 years ago

Kikkopanda commented 3 years ago

When I run the paged search with a synchronous client connection, I get all the results I'm expecting (1238):

def paged_search(*args, **kwargs):
    global client
    if not client:
        client = bonsai.LDAPClient(f"ldap://{settings['ldap']['host']}")
        client.set_server_chase_referrals(False)
        client.set_auto_page_acquire(True)
        client.set_credentials("SIMPLE", user=settings['ldap']['user'], 
                                        password=settings['ldap']['password'])
    final_result = []
    logger.info(f"Connecting to ldap://{settings['ldap']['host']}...")
    with client.connect(is_async=False, timeout=settings['ldap']['timeout']) as conn:
        result = conn.paged_search(*args, **kwargs)
        for r in result:
            final_result.append(r)
        logger.debug(len(final_result))

    return final_result
# Output
# DEBUG:aiohttp.server:1238

However, when I change the flags and functions to asynchronous, I only get the first page of results:

async def paged_search(*args, **kwargs):
    global client
    if not client:
        client = bonsai.LDAPClient(f"ldap://{settings['ldap']['host']}")
        client.set_server_chase_referrals(False)
        client.set_auto_page_acquire(True)
        client.set_credentials("SIMPLE", user=settings['ldap']['user'], 
                                        password=settings['ldap']['password'])

    final_result = []
    logger.info(f"Connecting to ldap://{settings['ldap']['host']}...")
    async with client.connect(is_async=True, timeout=settings['ldap']['timeout']) as conn:
        result = await conn.paged_search(*args, **kwargs)
        for r in result:
            final_result.append(r)
        logger.debug(len(final_result))

    return final_result
# Output
# DEBUG:aiohttp.server:1003

I suspect it's only returning the first page of results. I've also noticed that I'm getting an extra 3 empty entries, which I haven't been able to track down yet.

I did one last test based on a previous issue you resolved (#31), turning off auto_page_acquire and calling acquire_next_page manually, and that worked for me:

async def paged_search(*args, **kwargs):
    global client
    if not client:
        client = bonsai.LDAPClient(f"ldap://{settings['ldap']['host']}")
        client.set_server_chase_referrals(False)
        client.set_auto_page_acquire(False)
        client.set_credentials("SIMPLE", user=settings['ldap']['user'], 
                                        password=settings['ldap']['password'])

    final_result = []
    logger.info(f"Connecting to ldap://{settings['ldap']['host']}...")
    async with client.connect(is_async=True, timeout=settings['ldap']['timeout']) as conn:
        result = await conn.paged_search(*args, **kwargs)
        for r in result:
            final_result.append(r)
        logger.debug(len(final_result))
        msgid = result.acquire_next_page()
        while msgid is not None:
            result = await conn._evaluate(msgid)
            for r in result:
                final_result.append(r)
            msgid = result.acquire_next_page()
    logger.debug(len(final_result))
    return final_result
# Output
# DEBUG:aiohttp.server:1003
# DEBUG:aiohttp.server:1238

This was tested using bonsai 1.2.1 running in a docker container based on the image python:3.7.6-buster

noirello commented 3 years ago

Did you try to use async for?

...
async with client.connect(is_async=True, timeout=settings['ldap']['timeout']) as conn:
    result = await conn.paged_search(*args, **kwargs)
    async for r in result:
        final_result.append(r)
    logger.debug(len(final_result))
...
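For readers wondering why the plain for loop stops after the first page: with an async connection, fetching the next page is an awaitable network round trip, so only async for can drive the automatic page acquisition; a synchronous for loop can only walk the page that is already buffered. Here is a minimal mock sketching that mechanism (this is an illustration of the iteration pattern, not bonsai's actual internals; MockPagedResult and the page sizes are made up):

```python
import asyncio

class MockPagedResult:
    """Hypothetical stand-in for an async paged search result."""

    def __init__(self, pages):
        self._pages = pages                  # e.g. [page1_entries, page2_entries]
        self._buffer = list(pages[0])        # first page arrives pre-buffered
        self._page_idx = 0

    def __iter__(self):
        # Plain `for`: can only yield what is already buffered (first page).
        return iter(self._buffer)

    def __aiter__(self):
        return self

    async def __anext__(self):
        if not self._buffer:
            self._page_idx += 1
            if self._page_idx >= len(self._pages):
                raise StopAsyncIteration
            await asyncio.sleep(0)           # stands in for the next-page round trip
            self._buffer = list(self._pages[self._page_idx])
        return self._buffer.pop(0)

async def main():
    # 1238 total entries split across two pages, mirroring the numbers above.
    pages = [list(range(1000)), list(range(1000, 1238))]
    sync_count = sum(1 for _ in MockPagedResult(pages))   # first page only
    async_count = 0
    async for _ in MockPagedResult(pages):                # all pages
        async_count += 1
    return sync_count, async_count

sync_count, async_count = asyncio.run(main())
print(sync_count, async_count)   # 1000 1238
```

The sync loop sees 1000 entries (the buffered first page) while the async loop sees all 1238, matching the behavior reported in this issue.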
Kikkopanda commented 3 years ago

Hi noirello, that solved the problem. Thank you for helping me resolve this issue.