irthomasthomas / undecidability


namuan/chat-circuit: Branch Out Your Conversations #858

Open ShellLM opened 1 month ago

ShellLM commented 1 month ago

namuan/chat-circuit: Branch Out Your Conversations

Features

Editor Features

Running the Application

To run this application, follow these steps:

  1. Generate the models configuration file:

       ollama list | tail -n +2 | awk '{print $1}' > models.conf

  2. Install dependencies:

       python3 -m pip install -r requirements.txt

  3. Run the application:

       python3 main.py
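The pipeline in step 1 drops the header row of `ollama list` and keeps only the first column (the model name). A minimal sketch of the same filtering logic in Python, using made-up sample output since the real list depends on your local Ollama install:

```python
# Sample `ollama list` output: a header row, then one model per line.
# The model names here are illustrative, not real output.
sample = """NAME             ID    SIZE
llama3:latest    abc   4.7GB
mistral:latest   def   4.1GB
"""

# Equivalent of `tail -n +2 | awk '{print $1}'`:
# skip the first line, then take the first whitespace-separated field.
names = [line.split()[0] for line in sample.splitlines()[1:] if line.strip()]
print(names)  # → ['llama3:latest', 'mistral:latest']
```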

Model Configuration

The available LLM models are loaded from models.conf in the current directory; see models.conf.example for the expected format.

The default model is the first one in that list.

You can also run this command to generate the models.conf file:

ollama list | tail -n +2 | awk '{print $1}' > models.conf

Note: If models.conf is not found, the application will use a default set of models.
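The loading behavior described above (one model name per line, first entry is the default, built-in fallback when the file is missing) might be sketched like this; the function name and fallback list are assumptions, not the application's actual code:

```python
from pathlib import Path

# Hypothetical fallback list; the real defaults live in the application.
DEFAULT_MODELS = ["llama3:latest"]

def load_models(path="models.conf"):
    """Read one model name per line; fall back to defaults if the file is missing."""
    conf = Path(path)
    if not conf.exists():
        return DEFAULT_MODELS
    models = [line.strip() for line in conf.read_text().splitlines() if line.strip()]
    return models or DEFAULT_MODELS

models = load_models()
default_model = models[0]  # the first entry in the list is the default model
```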

Suggested labels

None

ShellLM commented 1 month ago

Related content

  - #459 similarity score: 0.89
  - #443 similarity score: 0.88
  - #418 similarity score: 0.88
  - #656 similarity score: 0.87
  - #499 similarity score: 0.87
  - #762 similarity score: 0.86