mit-han-lab / TinyChatEngine: On-Device LLM Inference Library (MIT License)
https://mit-han-lab.github.io/TinyChatEngine/
Support new features #67 (Closed)
RaymondWang0 closed this 1 year ago

RaymondWang0 commented 1 year ago:
- Support new models
- Enable memorization of previous contexts
- Enable a configurable number of threads (see the sketch after this list)
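As a rough illustration of the last item, here is a minimal sketch of how a user-configurable thread count might be threaded through a generation call. The `GenerationConfig` struct, `n_threads` field, `generate` function, and `--threads` flag are hypothetical placeholders, not TinyChatEngine's actual API; the worker loop only stands in for whatever per-token compute the engine parallelizes.

```cpp
// Hypothetical sketch: exposing a configurable thread count.
// None of these names come from TinyChatEngine itself.
#include <cstdlib>
#include <iostream>
#include <string>
#include <thread>
#include <vector>

struct GenerationConfig {
    // 0 or a negative value means "auto": use the hardware concurrency.
    int n_threads = 0;
};

// Placeholder for per-thread work (e.g., a slice of a matrix multiply).
void worker(int tid) {
    (void)tid;  // ... compute one chunk of the forward pass ...
}

void generate(const std::string& prompt, const GenerationConfig& cfg) {
    int n = cfg.n_threads > 0
                ? cfg.n_threads
                : static_cast<int>(std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    pool.reserve(n);
    for (int t = 0; t < n; ++t) pool.emplace_back(worker, t);
    for (auto& th : pool) th.join();
    std::cout << "Generated reply to \"" << prompt << "\" using "
              << n << " threads\n";
}

int main(int argc, char** argv) {
    GenerationConfig cfg;
    // Allow e.g. `./chat --threads 4` to override the automatic default.
    for (int i = 1; i + 1 < argc; ++i) {
        if (std::string(argv[i]) == "--threads") {
            cfg.n_threads = std::atoi(argv[i + 1]);
        }
    }
    generate("Hello!", cfg);
    return 0;
}
```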