mattermost / mattermost-plugin-ai

Mattermost Copilot plugin supporting multiple LLMs
https://mattermost.com/copilot
Apache License 2.0

Add ability for LLM to use files from content extraction. #61

Closed (crspeller closed 1 month ago)

crspeller commented 1 year ago

Currently the LLM does not have access to file contents, even though the MM server already extracts them.

This is not straightforward: you need to balance what the LLM will actually pay attention to against the LLM's context limit. Many files will not fit in the context at all.
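
For illustration, the sizing question could be framed like this; `estimateTokens` and the ~4 characters-per-token heuristic are placeholders, not actual plugin code:

```go
package main

// estimateTokens is a crude stand-in for a real tokenizer: roughly 4
// characters per token for English prose. A real implementation would use
// the target model's tokenizer to count tokens exactly.
func estimateTokens(extracted string) int {
	return len(extracted) / 4
}
```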

a3957273 commented 4 months ago

With models like gpt-4o, Mistral Large, and Llama 3.1 now having 128k context windows, there might be value in providing a 'simplistic' version of this feature with no context awareness. It could simply fail with a useful error message when the context length is exceeded.

Informal user testing with the linked (now closed) PR suggests only a small percentage of requests hit that context limit. It could be a setting to enable or disable, in case other users are on models with smaller context windows.
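
For illustration, the 'simplistic' version might look roughly like this; `buildFileContext`, the `enabled` flag, and the error wording are placeholders, not actual plugin code:

```go
package main

import (
	"fmt"
	"strings"
)

// estimateTokens: same crude ~4 chars/token heuristic sketched above.
func estimateTokens(s string) int { return len(s) / 4 }

// buildFileContext attaches all extracted file contents, or fails with a
// clear, user-facing error when they exceed the budget. `enabled` stands in
// for the proposed setting that lets admins turn the feature off on models
// with smaller context windows.
func buildFileContext(extractedContents []string, enabled bool, tokenBudget int) (string, error) {
	if !enabled {
		return "", nil // feature disabled; send the prompt without files
	}
	joined := strings.Join(extractedContents, "\n\n")
	if estimateTokens(joined) > tokenBudget {
		return "", fmt.Errorf("attached files exceed the model's context window (~%d tokens); try fewer or smaller files", tokenBudget)
	}
	return joined, nil
}
```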

crspeller commented 4 months ago

@a3957273 Agreed. A lot has changed with regard to context lengths since I closed #71. When the context window is large, it would be valuable to simply do our best to fit the files.
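
A best-effort version could look roughly like the following; the in-order inclusion and truncation policy here are placeholders, not a settled design:

```go
package main

import "strings"

// estimateTokens: same crude ~4 chars/token heuristic sketched above.
func estimateTokens(s string) int { return len(s) / 4 }

// bestEffortFileContext includes extracted file contents in order until the
// token budget runs out, truncating the file that crosses the limit and
// dropping the rest, rather than erroring out.
func bestEffortFileContext(extractedContents []string, tokenBudget int) string {
	var parts []string
	remaining := tokenBudget
	for _, content := range extractedContents {
		if remaining <= 0 {
			break
		}
		if tokens := estimateTokens(content); tokens > remaining {
			// ~4 chars/token: cut roughly at the remaining budget.
			content = content[:remaining*4] + "\n[truncated]"
			remaining = 0
		} else {
			remaining -= tokens
		}
		parts = append(parts, content)
	}
	return strings.Join(parts, "\n\n")
}
```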