nomic-ai / pygpt4all

Officially supported Python bindings for llama.cpp + gpt4all
https://nomic-ai.github.io/pygpt4all/
MIT License

Upgraded the module to meet gpt4all-ui backend requirements #82

Closed ParisNeo closed 1 year ago

ParisNeo commented 1 year ago

Added the ability to stop the generation process at any time. Also added a function to count the number of tokens in a prompt. GPT4All-ui needs these to optimize the generation process.
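For illustration, here is a minimal sketch of how a UI could use these two hooks together. The import path, method names, and parameters below (`Model`, `tokenize`, `generate`, `new_text_callback`, the model path) are assumptions made for the sketch, not the confirmed pygpt4all API.

```python
# Minimal sketch only: the import path and the names Model, tokenize,
# generate, and new_text_callback are assumptions for illustration
# and may not match the released pygpt4all API exactly.
from pyllamacpp.model import Model  # assumed import path

stop_requested = False  # a UI "Stop" button would flip this to True


def on_new_text(token_text: str) -> bool:
    """Called for every generated token; returning False asks the
    backend to stop generation early (assumed callback contract)."""
    print(token_text, end="", flush=True)
    return not stop_requested


model = Model(model_path="./models/ggml-model-q4_0.bin")  # hypothetical path

prompt = "Once upon a time"
# Assumed token-counting helper: knowing the prompt's token count lets
# the UI budget the context window before generation starts.
n_prompt_tokens = len(model.tokenize(prompt))
print(f"Prompt uses {n_prompt_tokens} tokens")

model.generate(prompt, new_text_callback=on_new_text, n_predict=128)
```

The point of the sketch is the division of labor: the token counter runs before generation so the UI can size the context, and the per-token callback gives the UI a place to cancel a long generation without killing the process.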

Thanks a lot.

abdeladim-s commented 1 year ago

Thanks @ParisNeo

abdeladim-s commented 1 year ago

@ParisNeo as I told you, I think you can do the same with version 1.0.7. Have you tried it?