jakethekoenig / llm-chat

A library of components for making llm chats

Build conversation for get_completion #113

Open jakethekoenig opened 1 month ago

jakethekoenig commented 1 month ago

Currently we only get the parent message, but we should also get its parent and so on, building up the full conversation. If the author matches an LLM model we should set role: "assistant"; otherwise we should use role: "user". A rough sketch of the traversal is below.
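
A minimal sketch of what this could look like, assuming a `Message` shape with `parentId` and `author` fields and a lookup helper `getMessageById`; the actual types and helpers in `messageHelpers.ts` may differ:

```typescript
// Hypothetical types; the real ones live in messageHelpers.ts and may differ.
interface Message {
  id: string;
  parentId: string | null;
  author: string;
  content: string;
}

interface ConversationEntry {
  role: 'assistant' | 'user';
  content: string;
}

// Assumed set of model names; in practice this would come from config or the DB.
const LLM_MODELS = new Set(['gpt-4o', 'claude-3-5-sonnet']);

// Walk up the parent chain from `message`, collecting each ancestor, then
// reverse so the conversation is ordered oldest-first for get_completion.
function buildConversation(
  message: Message,
  getMessageById: (id: string) => Message | undefined // assumed lookup helper
): ConversationEntry[] {
  const conversation: ConversationEntry[] = [];
  let current: Message | undefined = message;
  while (current) {
    conversation.push({
      role: LLM_MODELS.has(current.author) ? 'assistant' : 'user',
      content: current.content,
    });
    current = current.parentId ? getMessageById(current.parentId) : undefined;
  }
  return conversation.reverse();
}
```

The reverse at the end matters: we collect messages newest-first while walking up the chain, but the completion API expects the conversation oldest-first.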

jakethekoenig commented 1 month ago

@mentatbot can you fix this? I got lazy when writing the relevant logic in messageHelpers.ts. lmk if you have any questions.

mentatbot[bot] commented 1 month ago

I will start working on this issue