## About/Goals
Ollama is an AI model management tool that lets users install and run custom large language models locally.
The project aims to:
- [x] Create a Discord bot that utilizes Ollama to chat with users!
- [ ] User Preferences on Chat
- [x] Message Persistence on Channels and Threads
- [x] Threads
- [x] Channels
- [x] Containerization with Docker
- [x] Slash Commands Compatible
- [x] Generated Token Length Handling for >2000
- [x] Token Length Handling of any message size
- [ ] User vs. Server Preferences
- [ ] Redis Caching
- [x] Administrator Role Compatible
- [ ] Multi-User Chat Generation (Multiple users chatting at the same time)
- [ ] Automatic and Manual model pulling through the Discord client
- [ ] Allow others to create their own models personalized for their own servers!
- [ ] Documentation on creating your own LLM
- [ ] Documentation on web scraping and cleaning
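One of the completed goals above is handling generated replies longer than Discord's 2000-character message limit. A minimal chunking sketch (a hypothetical helper, not the project's actual implementation) shows the basic idea:

```typescript
// splitMessage: break a long model reply into chunks that fit
// within Discord's 2000-character message limit.
function splitMessage(text: string, limit = 2000): string[] {
    const chunks: string[] = []
    for (let i = 0; i < text.length; i += limit) {
        chunks.push(text.slice(i, i + limit))
    }
    // Always return at least one (possibly empty) message.
    return chunks.length ? chunks : ['']
}
```

A real implementation would likely split on whitespace or sentence boundaries so words are not cut mid-chunk, but the length invariant is the same.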
## Environment Setup
- Clone this repo using `git clone https://github.com/kevinthedang/discord-ollama.git` or just use GitHub Desktop to clone the repo.
- You will need a `.env` file in the root of the project directory with the bot's token. A `.env.sample` is provided as a reference for the required environment variables.
  - For example, `CLIENT_TOKEN = [Bot Token]`
- Please refer to the docs for bot setup.
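The `.env` file is plain `KEY = value` pairs. A small parser sketch (a hypothetical helper, not part of this project, which typically relies on a loader like dotenv) illustrates what gets read from each line at startup:

```typescript
// parseEnvLine: split a "KEY = value" line into [key, value],
// trimming whitespace and skipping comments and malformed lines.
function parseEnvLine(line: string): [string, string] | null {
    if (line.trimStart().startsWith('#')) return null // comment line
    const idx = line.indexOf('=')
    if (idx < 0) return null // no key/value separator
    const key = line.slice(0, idx).trim()
    const value = line.slice(idx + 1).trim()
    return key ? [key, value] : null
}
```

For example, `parseEnvLine('CLIENT_TOKEN = [Bot Token]')` yields the `CLIENT_TOKEN` key the bot needs to log in.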
## Resources
- NodeJS
  - This project runs on `lts/hydrogen`.
  - To run dev in `ts-node`/`nodemon`, using `v18.18.2` is recommended.
  - To run dev with `tsx`, you can use `v20.10.0` or earlier.
  - This project supports any NodeJS version above `16.x.x` to only allow ESModules.
- Ollama
> [!CAUTION]
> `v18.x.x` or `lts/hydrogen` will not run properly with `npm run dev-mon`. It is recommended to just use `npm run dev-tsx` for development. The nodemon version will likely be removed in a future update.
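The Node version constraint described above is commonly enforced with an `engines` field in `package.json`. The version range below is an assumption based on the "above `16.x.x`" note; the repo's actual manifest may differ:

```json
{
  "engines": {
    "node": ">=16.0.0"
  }
}
```

With this in place, `npm install` warns (or fails, with `engine-strict=true`) when run under an unsupported Node version.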
## Acknowledgement
discord-ollama © 2023 by Kevin Dang is licensed under CC BY 4.0