Strvm / meta-ai-api

Llama 3 API (MetaAI Reverse Engineered)

Making decisions on the previous response #14

Open bryt-frost opened 2 months ago

bryt-frost commented 2 months ago

Can there be a way to make decisions based on the previous response, or for it to remember the previous response?

ahew5505 commented 4 weeks ago

Here's one possible solution from a personal project of mine:

from meta_ai_api import MetaAI

ai = MetaAI()

conversation = True

while conversation:
    text = open('text.txt', 'a')   # open the log for appending (created on first run)
    text2 = open('text.txt', 'r')  # open the same file again in read mode

    query = input()

    if query:
        text.write("Me: " + query + "\n")  # append the new query to the log
        # Hand the AI the saved log as context, followed by the new question
        response = ai.prompt(
            message="The following text is the conversation to this point: \n"
                    + text2.read()
                    + "Please consider this text the conversation up to this point. "
                    + "This is my new question: " + query
        )
        print(response['message'])
        text.write("You: " + response['message'] + '\n')  # append the response to the log

    else:
        conversation = False

    text.close()
    text2.close()

The most important part of this solution is continuously saving each piece of the conversation to a text file. Beyond that, you can concatenate additional parameters or dialogue onto the prompt with no extra code! Obviously, this solution can have TONS of potential issues with latency and memory, especially in larger projects, since the whole log gets re-sent on every turn. However, a better solution is definitely above my experience level and pay grade ($0 with 2 semesters' worth of student loans :/ ).
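
If the log gets long, one way to rein in that cost is to send only the most recent part of the conversation instead of the whole file. Here's a rough sketch of that idea; the recent_history helper, the ask wrapper, and the MAX_LINES cutoff are names I made up for the example and are not part of meta_ai_api, only MetaAI, prompt, and response['message'] come from the library:

from meta_ai_api import MetaAI

ai = MetaAI()

MAX_LINES = 40  # hypothetical cutoff: how many recent log lines to send as context

def recent_history(path='text.txt', max_lines=MAX_LINES):
    # Return only the tail of the conversation log so the prompt stays small
    try:
        with open(path, 'r') as log:
            lines = log.readlines()
    except FileNotFoundError:
        return ''  # no history yet before the very first question
    return ''.join(lines[-max_lines:])

def ask(query, path='text.txt'):
    # Send the trimmed history plus the new question, then append both to the log
    history = recent_history(path)
    response = ai.prompt(
        message="The following text is the conversation to this point: \n"
                + history
                + "Please consider this text the conversation up to this point. "
                + "This is my new question: " + query
    )
    with open(path, 'a') as log:
        log.write("Me: " + query + "\n")
        log.write("You: " + response['message'] + "\n")
    return response['message']

Trimming to the last few lines keeps the prompt roughly constant in size, at the cost of the AI forgetting anything that has scrolled past the cutoff.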