run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Question]: llama-index as a basic interface to openai #8546

Closed jyskim closed 12 months ago

jyskim commented 12 months ago

Question Validation

Question

Hi llama-index team,

I have a question about using llama-index purely as an interface to the OpenAI API, since the module offers a lot of helper functions that reduce code and make it easy to maintain. In this use case I don't need any document loading or connector setup; I just want to interface with ChatGPT and continue the chat with code like the following:

from llama_index.agent import OpenAIAgent

agent = OpenAIAgent....(... 'chatgpt-4' ...)

print(agent.chat("Tell me a funny story"))

print(agent.chat("What's the fun point in the story?"))

Doing the above with only the openai module requires quite a bit of code.

I searched around but couldn't easily find this basic use case (maybe it's too basic, or not a target use case for llama-index). Any advice would be appreciated. Thanks in advance.

dosubot[bot] commented 12 months ago

🤖

Hello @jyskim,

I'm Dosu, a bot here to assist you with your queries, help solve bugs, and guide you on your journey to becoming a contributor. While we wait for a human maintainer, I'm here to help!

I noticed that your question wasn't included in the original post. Could you please provide more details about the issue you're facing or the information you're seeking? This will help me assist you more effectively.

Looking forward to your response!


This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

logan-markewich commented 12 months ago

Check here: https://gpt-index.readthedocs.io/en/stable/examples/llm/openai.html

https://gpt-index.readthedocs.io/en/stable/api_reference/llms.html#llm-interface

jyskim commented 12 months ago

Hi Logan, thank you very much for your reply.

The example there doesn't seem to maintain the chat history or state. In the snippet below, the second call answers without the context of the previous chat:


from llama_index.llms import ChatMessage, OpenAI

messages = [
    ChatMessage(role="system", content="You are a pirate with a colorful personality"),
    ChatMessage(role="user", content="What is your name"),
]
resp = OpenAI().chat(messages)

# A fresh message list: the model has no memory of the first exchange.
messages = [ChatMessage(role="user", content="Impressive, what's your name again?")]
resp = OpenAI().chat(messages)

What I am looking for is the behavior of the example below, where the second chat is answered with knowledge of the previous one. Is this possible in llama-index purely with the OpenAI API, without loading or connecting to any other dataset or source? Thank you again.


from llama_hub.tools.wikipedia.base import WikipediaToolSpec
from llama_index.tools.tool_spec.load_and_search.base import LoadAndSearchToolSpec
from llama_index.agent import OpenAIAgent

wiki_tool = WikipediaToolSpec().to_tool_list()
load_and_search_wiki = LoadAndSearchToolSpec.from_defaults(wiki_tool[1]).to_tool_list()
agent = OpenAIAgent.from_tools(load_and_search_wiki, verbose=True)
print(agent.chat("Who won the NBA playoffs in 2023?"))
print(agent.chat("Who did they beat in the final?"))

logan-markewich commented 12 months ago

Those examples I linked show that you can just pass in the chat history. It's just a list, so you can maintain it yourself.

You could also use the simple chat engine, which automates the history tracking:

https://docs.llamaindex.ai/en/stable/examples/chat_engine/chat_engine_repl.html
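The "maintain the list yourself" pattern Logan describes can be sketched as follows. This is a minimal illustration of the idea, not llama-index API code: the `fake_llm_chat` stub stands in for a real `OpenAI().chat(messages)` call so the sketch runs without an API key, and the plain-dict `history` list plays the role of the `ChatMessage` list you would pass to the real LLM.

```python
# Sketch of manual chat-history tracking. A stub stands in for the LLM call
# so this runs offline; with llama-index you would instead pass the list of
# ChatMessage objects to OpenAI().chat(history) each turn.

def fake_llm_chat(messages):
    # Stand-in for the real LLM call: reports how much context it received.
    return f"(reply after seeing {len(messages)} messages)"

history = [
    {"role": "system", "content": "You are a pirate with a colorful personality"},
]

def chat(user_text):
    # Append the new user turn, send the WHOLE history, then record the reply.
    history.append({"role": "user", "content": user_text})
    reply = fake_llm_chat(history)  # real code: OpenAI().chat(history).message
    history.append({"role": "assistant", "content": reply})
    return reply

print(chat("What is your name?"))   # sees 2 messages (system + user)
print(chat("Say it again?"))        # sees 4 messages -> earlier context kept
```

Because every turn appends both the user message and the assistant reply, each call to the LLM carries the full conversation, which is exactly what `SimpleChatEngine` (linked above) automates for you.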