
🏖️ VacAIgent: Streamlit-Integrated AI Crew for Trip Planning

Forked and enhanced from the crewAI examples repository

Beach Vacation Scene ~ generated by GPT-4V

Introduction

VacAIgent leverages the CrewAI framework to automate and enhance the trip planning experience, integrating a user-friendly Streamlit interface. This project demonstrates how autonomous AI agents can collaborate and execute complex tasks efficiently, now with an added layer of interactivity and accessibility through Streamlit.

Check out the video below for a code walkthrough 👇

Watch the video

(Trip example originally developed by @joaomdmoura)

CrewAI Framework

CrewAI simplifies the orchestration of role-playing AI agents. In VacAIgent, these agents collaboratively decide on cities and craft a complete itinerary for your trip based on specified preferences, all accessible via a streamlined Streamlit user interface.
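
To show the general pattern, here is a minimal sketch of how a crew is assembled in crewAI: define agents, bind each to a task, and let the crew run them. The agent roles, task wording, and example values below are illustrative rather than taken from this repository, and exact Task fields vary across crewAI versions.

from crewai import Agent, Task, Crew

# Illustrative agents -- the real project defines its agents in a TripAgents class.
city_selector = Agent(
    role='City Selection Expert',
    goal='Pick the best city for the trip based on the traveler preferences',
    backstory='An analyst who compares destinations on weather, season, and cost.',
    verbose=True
)
itinerary_writer = Agent(
    role='Travel Concierge',
    goal='Turn the chosen city into a day-by-day itinerary',
    backstory='A seasoned planner who knows how to pace a vacation.',
    verbose=True
)

# Each task is bound to one agent; exact Task fields vary across crewAI versions.
select_city = Task(
    description='Choose one city for a 5-day trip in June for a traveler who loves food and museums.',
    expected_output='A single city with a short justification.',
    agent=city_selector
)
plan_trip = Task(
    description='Create a day-by-day itinerary for the selected city.',
    expected_output='A 5-day itinerary.',
    agent=itinerary_writer
)

# The crew orchestrates the agents and runs the tasks in order.
crew = Crew(agents=[city_selector, itinerary_writer], tasks=[select_city, plan_trip], verbose=True)
result = crew.kickoff()
print(result)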

Streamlit Interface

The introduction of Streamlit transforms this application into an interactive web app, allowing users to easily input their preferences and receive tailored travel plans.
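
As a rough illustration of how the interface hands user input to the agents, here is a sketch of the Streamlit side. The widget labels and the TripCrew wrapper are hypothetical placeholders, not this project's actual code.

import streamlit as st

st.title("🏖️ VacAIgent")

# Collect trip preferences from the user.
origin = st.text_input("Where are you traveling from?")
destination_hint = st.text_input("Cities or regions you're considering")
start_date = st.date_input("Trip start date")
end_date = st.date_input("Return date")
interests = st.text_area("Interests and trip details (food, hiking, museums, ...)")

if st.button("Plan my trip"):
    with st.spinner("Agents are researching and planning..."):
        # Hypothetical wrapper around the crewAI agents and tasks defined in this project.
        trip_crew = TripCrew(origin, destination_hint, start_date, end_date, interests)
        result = trip_crew.run()
    st.markdown(result)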

Running the Application

To experience the VacAIgent app:

Disclaimer: The application uses GPT-4 by default. Ensure you have access to OpenAI's API and be aware of the associated costs.
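
Before launching, the keys used by the model and the tools need to be available as environment variables. A minimal sketch, assuming a .env file and the python-dotenv package; OPENAI_API_KEY is the standard OpenAI variable, while the search and scraping keys shown are assumptions based on the upstream crewAI trip-planner example.

import os

from dotenv import load_dotenv  # provided by the python-dotenv package

# Example .env file (do not commit it):
#   OPENAI_API_KEY=sk-...
#   SERPER_API_KEY=...        # assumed: key for the internet-search tool
#   BROWSERLESS_API_KEY=...   # assumed: key for the website-scraping tool

load_dotenv()  # copies the .env entries into the process environment

assert os.getenv("OPENAI_API_KEY"), "Set OPENAI_API_KEY before running the app"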

Details & Explanation

Using GPT-3.5

To switch from GPT-4 to GPT-3.5, pass the llm argument to the agent constructor:

from crewai import Agent
from langchain.chat_models import ChatOpenAI

# Load gpt-3.5-turbo (see more OpenAI models at https://platform.openai.com/docs/models/gpt-4-turbo-and-gpt-4)
llm = ChatOpenAI(model='gpt-3.5-turbo')

class TripAgents:
    # ... existing methods
    # SearchTools and BrowserTools are defined elsewhere in this project.

    def local_expert(self):
        return Agent(
            role='Local Expert',
            goal='Provide insights about the selected city',
            tools=[SearchTools.search_internet, BrowserTools.scrape_and_summarize_website],
            llm=llm,  # use the GPT-3.5 model defined above
            verbose=True
        )
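
The same llm argument applies to the other agents defined in TripAgents; pass it in each constructor so the entire crew runs on GPT-3.5 rather than only the local expert.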

Using Local Models with Ollama

For enhanced privacy and customization, you can run the agents on local models served through Ollama:

Setting Up Ollama
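
Once the Ollama server is running locally with a model pulled, you can sanity-check the connection from Python before handing the model to any agents. A minimal sketch, assuming a locally available model named openhermes; the model name, and the .invoke() call on recent LangChain versions, are assumptions.

from langchain.llms import Ollama

# Assumes the Ollama server is already running locally (default: http://localhost:11434)
# and that a model named "openhermes" has been pulled; swap in whichever model you use.
llm = Ollama(model="openhermes")
print(llm.invoke("Suggest one city for a beach vacation in July."))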

Integrating Ollama with CrewAI

Pass the Ollama model to agents in the CrewAI framework:

from crewai import Agent
from langchain.llms import Ollama

# "agent" must be the name of a model available in your local Ollama instance.
ollama_model = Ollama(model="agent")

class TripAgents:
    # ... existing methods
    # SearchTools and BrowserTools are defined elsewhere in this project.

    def local_expert(self):
        return Agent(
            role='Local Expert',
            goal='Provide insights about the selected city',
            tools=[SearchTools.search_internet, BrowserTools.scrape_and_summarize_website],
            llm=ollama_model,  # the only change: point the agent at the local model
            verbose=True
        )
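
By default, LangChain's Ollama wrapper talks to a local Ollama server at http://localhost:11434; if your server runs elsewhere, its base_url parameter can point it at the right host.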

Benefits of Local Models

Running the crew on a local model keeps your trip data on your own machine, removes per-token API costs, and lets you swap in and customize different open-source models.

License

VacAIgent is open-sourced under the MIT License.