approximatelabs / lambdaprompt

λprompt - A functional programming interface for building AI systems
MIT License
374 stars · 22 forks

Running multiple queries simultaneously #5

Closed alvitawa closed 1 year ago

alvitawa commented 1 year ago

Super cool library!

However, I'm not succeeding at making multiple requests run simultaneously. Here is my (pseudo) code:

        import asyncio
        from lambdaprompt import AsyncGPT3Prompt

        GPT3PROMPT = AsyncGPT3Prompt("Test {{ question }}, answer: ")

        async def answer_question(question):
            r = await GPT3PROMPT(question=question)
            print("STEP")
            return r

        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)

        pending_questions = []
        for question in self.questions:
            pending_questions.append(loop.create_task(answer_question(question)))

        answers = []
        for answer in loop.run_until_complete(asyncio.gather(*pending_questions)):
            answers.append(answer)

        loop.close()

I can see that 'STEP' is printed sequentially, with a few seconds between prints, so the requests are not being sent at the same time. I am not super familiar with aiohttp, but I think this may be because the same session needs to be used for all the requests?

Is this not supported by lambdaprompt? Or should I be doing something differently?

(I could also use separate threads but I would prefer to do this concurrently)
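(For reference, here is a minimal, self-contained sketch of what true concurrency should look like, with a hypothetical `fake_request` coroutine standing in for the actual GPT-3 call: three one-second "requests" gathered together should finish in about one second total, not three.)

```python
import asyncio
import time

# Hypothetical stand-in for the real network call: each "request"
# just sleeps for one second. If the tasks truly run concurrently,
# the total wall time is ~1s; sequential execution would take ~3s.
async def fake_request(question):
    await asyncio.sleep(1)
    return f"answer to {question!r}"

async def main():
    start = time.monotonic()
    answers = await asyncio.gather(*[
        fake_request(q) for q in ["q1", "q2", "q3"]
    ])
    elapsed = time.monotonic() - start
    return answers, elapsed

answers, elapsed = asyncio.run(main())
print(f"{len(answers)} answers in {elapsed:.1f}s")
```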

bluecoconut commented 1 year ago

Hmm... there does seem to be something weird going on. I fought asyncio quite a bit when making this so the package had both direct and async calls... maybe somewhere along the way I broke the whole point of having async functions and didn't even notice! Haha.

I just reconfirmed with this code

from lambdaprompt import AsyncGPT3Prompt
import asyncio 

GPT3PROMPT = AsyncGPT3Prompt("Test {{ question }}, answer: ")

async def answer_question(question):
    print("Enter step")
    r = await GPT3PROMPT(question=question)
    print("STEP")
    return r

# top-level await works in Jupyter/IPython; in a plain script, wrap this in asyncio.run()
answers = await asyncio.gather(*[
    answer_question("What is the answer to life, the universe, and everything?"),
    answer_question("How much wood could a woodchuck chuck if a woodchuck could chuck wood?"),
    answer_question("What is the meaning of life?"),
])

that the "STEP" prints come out evenly spaced in time, one after another, rather than all at once.

I'll dig around for a little bit soon and see if I can figure this out; hopefully it's something simple... Also, if you look into it (there's not too much code) and notice any glaring errors, shout them out! I'm curious about resolving this ASAP.

bluecoconut commented 1 year ago

Okay, I figured it out. I was trying to be too clever in the code: I was trying to be very DRY in the `__call__` method, but there's just no way around actually using an `await`, which means I need to keep ~13 lines of repeated code around.

Here's the diff

https://github.com/approximatelabs/lambdaprompt/commit/3483858aa82e28b3b5d19fd966ca06490e025966
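The commit linked above has the real change; as an illustrative sketch of the bug class only (not the actual lambdaprompt code), compare a `__call__` that blocks on its own event loop with one that returns an awaitable:

```python
import asyncio
import time

async def _work(x):
    await asyncio.sleep(1)
    return x * 2

# Broken pattern (illustrative): __call__ blocks on a fresh event loop,
# so even if a caller tries to gather many calls, each one runs to
# completion before the next starts -- the calls are serialized.
class BlockingPrompt:
    def __call__(self, x):
        return asyncio.new_event_loop().run_until_complete(_work(x))

# Fixed pattern: __call__ is itself a coroutine function, so callers
# can await it and asyncio can interleave the pending sleeps.
class AsyncPrompt:
    async def __call__(self, x):
        return await _work(x)

async def main():
    p = AsyncPrompt()
    start = time.monotonic()
    results = await asyncio.gather(p(1), p(2), p(3))
    return results, time.monotonic() - start

results, elapsed = asyncio.run(main())
print(results, f"{elapsed:.1f}s")  # three 1s tasks finish in ~1s, not ~3s
```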

I just pushed it up, it should be (soon) available as lambdaprompt 0.3.5

So if you run `pip install -U lambdaprompt`, you should get the new version, and it should work for exactly your use case.

Please report back if the fix works for you! :)

alvitawa commented 1 year ago

Thanks a lot!

It works perfectly now :)