Medium tutorial codebase showcasing OpenAI Assistant streaming with FastAPI

FastAPI with OpenAI Assistant API Integration

Overview

This project demonstrates how to integrate FastAPI with OpenAI's Assistant API, utilizing Server-Sent Events (SSE) for real-time streaming of responses. The application allows creating interactive sessions with an OpenAI Assistant, handling real-time data streaming, and showcasing how asynchronous communication can enhance user interaction.
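As background on the transport, an SSE response is plain text: each event is one or more `field: value` lines terminated by a blank line. A minimal frame formatter shows the idea (this is an illustrative sketch, not the repository's actual streaming code, which lives in main.py):

```python
def sse_event(data: str, event: str = "") -> str:
    """Format one Server-Sent Events frame.

    A frame is one or more `field: value` lines followed by a blank
    line; browsers' EventSource and most SSE clients parse this format.
    """
    lines = []
    if event:
        # Optional named event type, e.g. "delta" for a token chunk.
        lines.append(f"event: {event}")
    lines.append(f"data: {data}")
    # The trailing blank line marks the end of the frame.
    return "\n".join(lines) + "\n\n"
```

On the server side, the endpoint would yield such frames from an async generator wrapped in FastAPI's StreamingResponse with `media_type="text/event-stream"`.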

Features

- Real-time streaming of Assistant responses over Server-Sent Events (SSE)
- Endpoints for retrieving the Assistant, creating threads, and chatting
- Asynchronous request handling with FastAPI

Getting Started

Prerequisites

- Python 3.8 or later, with pip
- An OpenAI API key and an Assistant created in the OpenAI dashboard

Installation

Clone the repository:

git clone https://github.com/xbreid/fastapi-assistant-streaming.git
cd fastapi-assistant-streaming

Install required packages:

pip install -r requirements.txt

Set up environment variables: Create a .env file in the project root directory and add your OpenAI API key and Assistant ID:

OPENAI_API_KEY=your_openai_api_key_here
OPENAI_ASSISTANT_ID=your_assistant_id_here
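At startup the application needs to read these two variables. A minimal sketch of that step in plain Python (the repository may instead use python-dotenv or pydantic-settings; the helper name `load_settings` is hypothetical):

```python
import os


def load_settings() -> dict:
    """Read the two required environment variables, failing fast if
    either is missing, so a misconfigured deployment errors at startup
    rather than on the first request."""
    settings = {}
    for key in ("OPENAI_API_KEY", "OPENAI_ASSISTANT_ID"):
        value = os.getenv(key)
        if not value:
            raise RuntimeError(f"Missing required environment variable: {key}")
        settings[key] = value
    return settings
```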

Running the Application

Start the FastAPI development server:

fastapi dev main.py

By default the server listens on http://localhost:8000.

Testing Endpoints

Check the health endpoint:

curl http://localhost:8000/

Get the Assistant:

curl http://localhost:8000/api/v1/assistant

Create a thread:

curl -X POST http://localhost:8000/api/v1/assistant/threads -H "Content-Type: application/json"

Send a message:

curl -N -X POST \
-H "Accept: text/event-stream" -H "Content-Type: application/json" \
-d '{"text": "Hello! Please introduce yourself", "thread_id": "thread_abc123" }' \
http://localhost:8000/api/v1/assistant/chat
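On the client side, the streamed body arrives as `data:` lines separated by blank lines. A small parser sketch (a hypothetical helper, not part of the repo; a real client might instead read the response incrementally with httpx or the sseclient library):

```python
def iter_sse_data(lines):
    """Yield the payload of each SSE event from an iterable of text lines.

    Consecutive `data:` lines belong to one event; a blank line marks
    the end of that event.
    """
    buffer = []
    for line in lines:
        if line.startswith("data:"):
            # Strip the field name and any single leading space.
            buffer.append(line[5:].lstrip())
        elif line == "" and buffer:
            yield "\n".join(buffer)
            buffer = []
    if buffer:
        # Flush a final event that was not followed by a blank line.
        yield "\n".join(buffer)
```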

Contributing

Contributions are always welcome!

License

Distributed under the MIT License. See LICENSE for more information.