rustformers / llmcord

A Discord bot, written in Rust, that generates responses using the LLaMA language model.

GNU General Public License v3.0 · 91 stars · 14 forks
Issues
#24 · Slash commands never show up and aren't accessible · by KyleStiers · opened 8 months ago · 1 comment
#23 · [Feature Request] Support InternLM · by vansinhu · opened 1 year ago · 1 comment
#22 · Support for GPT-2 · by pabl-o-ce · closed 1 year ago · 3 comments
#21 · Support embeds as a response type · by philpax · opened 1 year ago · 3 comments
#20 · Add Embed message and increase the MESSAGE_CHUNK_SIZE to 4096 · by pabl-o-ce · closed 1 year ago · 5 comments
#19 · invalid model · by ice-ux · opened 1 year ago · 1 comment
#18 · Post a link to the start of a queued generation when it starts · by philpax · opened 1 year ago · 0 comments
#17 · Cancel a queued generation · by philpax · opened 1 year ago · 0 comments
#16 · Add configurable and customisable feed prompt buttons · by philpax · opened 1 year ago · 0 comments
#15 · Per-command input fields · by philpax · opened 1 year ago · 0 comments
#14 · Per-command models · by philpax · opened 1 year ago · 0 comments
#13 · Add Docker support · by pabl-o-ce · closed 1 year ago · 0 comments
#12 · Delete button · by philpax · opened 1 year ago · 0 comments
#11 · Replying to messages to continue generation · by philpax · opened 1 year ago · 0 comments
#10 · Discord message character limits · by cadaeix · closed 1 year ago · 0 comments
#9 · Config settings for specified commands · by cadaeix · opened 1 year ago · 0 comments
#8 · Error · by dillfrescott · closed 1 year ago · 4 comments
#7 · Prompt presets · by philpax · opened 1 year ago · 1 comment
#6 · Panic during send breaks generation for everyone · by philpax · closed 1 year ago · 0 comments
#5 · Don't show the initial prompt in Alpaca mode · by philpax · closed 1 year ago · 0 comments
#4 · Disable `hallucinate` when using an Alpaca model · by philpax · closed 1 year ago · 0 comments
#3 · Hint for prompt parameter in `alpaca` is wrong · by philpax · closed 1 year ago · 0 comments
#2 · [Feature Request] Show prompt in messages · by slotthhy · closed 1 year ago · 0 comments
#1 · Ability to cancel generation · by slotthhy · closed 1 year ago · 0 comments