kyb3r / emergent

An implementation of long term memory and external tools for LLMs
MIT License

Create an internal reasoning / search_memory tool #5

Open syntex01 opened 1 year ago

syntex01 commented 1 year ago

I think that the agent should speak to itself and reason about what topics would be helpful for it to know right now. We can then provide the closest match for each topic. In this internal chat, the bot could later also access various functions, like searching the web.

kyb3r commented 1 year ago

I read the Microsoft research paper on "Sparks of AGI" https://arxiv.org/abs/2303.12712. It seems like GPT-4 is a lot better at using tools and at reasoning. GPT-3.5 is hit or miss: sometimes it uses the tool provided in in-context examples, sometimes it does not. One thing I found helps a lot is providing an example conversation before the system message. GPT-3.5 tends to ignore the system message a lot.


messages = [
    # Few-shot example conversation, placed before the system message
    {"role": "user", "content": "I want to book an appointment with Dr Bob"},
    {"role": "assistant", "content": "AVAILABLE_TIMES('Dr Bob') -> …"},
    {"role": "assistant", "content": "Here are the available times for Dr Bob ..."},

    # System message comes after the example convo
    {"role": "system", "content": "You are a booking assistant blah blah ..."},
]
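For context, a minimal sketch of how a message list like this could be sent with the openai Python client that was current at the time (the legacy 0.x ChatCompletion interface); the model name, placeholder key, and follow-up user message are illustrative, not from this repo.

import openai  # legacy openai 0.x client, as used around the time of this thread

openai.api_key = "sk-..."  # placeholder key

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=messages + [
        {"role": "user", "content": "Can you book me in with Dr Bob tomorrow?"}
    ],
)

print(response["choices"][0]["message"]["content"])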
syntex01 commented 1 year ago

I also noticed this with other projects. GPT-4 is very responsive and rarely makes mistakes when given clear instructions. You can even tell it to execute certain functions when it sees the need for it (by identifying the function name in the output).
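A minimal sketch of that kind of output scanning, assuming the model is instructed to emit calls in a NAME('arg') style like the booking example above; the pattern and tool names are illustrative.

import re

# Functions the model is allowed to call; names are illustrative.
TOOLS = {"AVAILABLE_TIMES", "SEARCH_WEB"}

def find_tool_call(output: str):
    """Look for a TOOL_NAME('arg') pattern in the model's output."""
    match = re.search(r"\b([A-Z_]+)\('([^']*)'\)", output)
    if match and match.group(1) in TOOLS:
        return match.group(1), match.group(2)
    return None

print(find_tool_call("AVAILABLE_TIMES('Dr Bob') -> ..."))
# -> ('AVAILABLE_TIMES', 'Dr Bob')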

kyb3r commented 1 year ago

Take a look at this: https://openai.com/blog/chatgpt-plugins

syntex01 commented 1 year ago

That's really nice

kyb3r commented 1 year ago

This is actually insane. It looks like OpenAI fine-tuned models to learn how to use tools via their plugin specification.

For now, plugins are designed for calling backend APIs, but we are exploring plugins that can call client-side APIs as well.

Client-side APIs would benefit us; that would give us a simple way to make a tool for accessing memories.

syntex01 commented 1 year ago

Yeah, the advancements GPT-4 brought do allow for many things that were harder with 3.5. I am really happy to see the announcement about the plugins.

kyb3r commented 1 year ago

I have access to GPT-4 now and I have already tested out tools.

[screenshot: IMG_0242]

It works well, GPT-4 is smart :)

It should be easy to create a memory tool that takes in a query as an argument.
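As a rough illustration, a memory tool with that shape could be as simple as scoring stored snippets against the query; the storage format and word-overlap scoring here are assumptions for the sketch, not the implementation in this repo.

# Hypothetical memory store: a list of previously saved conversation snippets.
MEMORIES = [
    "We discussed the Hierarchical Memory Consolidation System (HMCS) in 2023.",
    "The user prefers concise answers.",
]

def search_memory(query: str, top_k: int = 3) -> list[str]:
    """Return the stored snippets that share the most words with the query.
    A real implementation would more likely use embeddings than word overlap."""
    query_words = set(query.lower().split())
    scored = sorted(
        MEMORIES,
        key=lambda m: len(query_words & set(m.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

print(search_memory("HMCS system"))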

syntex01 commented 1 year ago

Did you also get access to tools?

kyb3r commented 1 year ago

No, I just made my own implementation lol.

You define your own tools, and the ChatAgent class lets GPT-4 know about these tools and how to use them.

It works reliably with GPT-4. I finished implementing it in another (private) repo; I'll try to finish merging it into this one soon.
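A minimal sketch of what that kind of tool registration could look like; the ChatAgent interface below is illustrative only, not the actual class from this repo.

class ChatAgent:
    """Illustrative agent that advertises registered tools to the model
    via the system prompt; not the implementation in this repository."""

    def __init__(self):
        self.tools = {}

    def tool(self, func):
        """Register a function as a tool the model may call."""
        self.tools[func.__name__] = func
        return func

    def tool_descriptions(self) -> str:
        return "\n".join(
            f"- {name}: {func.__doc__}" for name, func in self.tools.items()
        )


agent = ChatAgent()

@agent.tool
def search_memory(query: str) -> str:
    """Search long term memory for snippets related to the query."""
    return "...matching memories..."

# The description block would be injected into the system message,
# so GPT-4 knows which tools exist and how to invoke them.
print(agent.tool_descriptions())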

kyb3r commented 1 year ago

Alrighty I got it to work!

You: Tell me about the HMCS system we discussed in the past 

FUNCTION CALL:  search_memory {'query': 'HMCS system'}

Agent: In the past, we discussed the Hierarchical Memory Consolidation System (HMCS), which was developed in 2023. HMCS is designed to enhance the memory capacity of large language models, particularly those that rely on natural language processing. This new memory system can improve the performance of AI models, making them more efficient and effective in processing natural language data. Note that HMCS is not related to Her Majesty's Canadian Ship, which is a prefix used for ships in the Canadian Navy.
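For reference, the loop behind a transcript like this one could look roughly as follows; the FUNCTION CALL parsing and the model-call helper are assumptions sketched for illustration, not the code merged into this repo.

import ast
import re

def run_agent_turn(user_input: str, chat_model, tools: dict) -> str:
    """One turn: let the model answer, execute any FUNCTION CALL it emits,
    then feed the tool result back for the final reply.
    `chat_model` is a hypothetical callable mapping a prompt to model text."""
    output = chat_model(user_input)

    match = re.match(r"FUNCTION CALL:\s*(\w+)\s*(\{.*\})", output)
    if match:
        name, raw_args = match.group(1), match.group(2)
        kwargs = ast.literal_eval(raw_args)    # e.g. {'query': 'HMCS system'}
        result = tools[name](**kwargs)         # e.g. search_memory(query=...)
        # Give the tool result back to the model so it can compose its answer.
        output = chat_model(f"Result of {name}: {result}\nUser asked: {user_input}")
    return output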
syntex01 commented 1 year ago

Ah, really nice :) Just so you know, I am currently skiing, so I can't implement my ideas right now. But I will be back in 6 days :) I will check the updates you make to the code, however.

kyb3r commented 1 year ago

Currently, this only works with GPT-4 (since it's smart enough to reason about when to use a tool and when not to).