mit-han-lab / TinyChatEngine

TinyChatEngine: On-Device LLM Inference Library
https://mit-han-lab.github.io/TinyChatEngine/
MIT License

problem while running make chat #85

Open Imran2708 opened 11 months ago

Imran2708 commented 11 months ago

Hi @RaymondWang0, I'm trying to run this on Windows (CPU), and the prerequisites have been met. The model used is LLaMA2_7B_chat_awq_int4 with --QM QM_x86. I'm getting an error when running 'make chat -j'. Please advise how to resolve this. Thanks, Imran B

[screenshot of the build error]
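For context, a rough sketch of the setup flow that leads to this build step (the clone step and the download script path are assumed from the repo README; only the model name, --QM value, and 'make chat -j' come from the report above):

```
# Clone the repo and move into the LLM build directory
# (directory layout assumed from the repo README)
git clone --recursive https://github.com/mit-han-lab/TinyChatEngine
cd TinyChatEngine/llm

# Download the quantized model for x86 CPUs
# (model name and --QM value as reported in this issue)
python tools/download_model.py --model LLaMA2_7B_chat_awq_int4 --QM QM_x86

# Build the chat demo -- this is the step that fails here
make chat -j
```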
plasm0r commented 10 months ago

I'm having a similar problem:

[screenshot of a similar build error]

plasm0r commented 10 months ago

This pull request looks promising: https://github.com/mit-han-lab/TinyChatEngine/pull/81

plasm0r commented 10 months ago

I tested the pull request, and it fixes this issue.
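For anyone else hitting this before the PR is merged, one way to try it locally is to fetch the PR head from GitHub and rebuild (the local branch name `pr-81` below is arbitrary):

```
# Fetch PR #81 into a local branch and switch to it
git fetch origin pull/81/head:pr-81
git checkout pr-81

# Rebuild the chat demo
make chat -j
```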