Stooovie opened 5 months ago
Tried a StableLM model with llama inference. This is the answer to "hi!":

{ ========== [EXPL]
| 1) {prompt}
int main(void) {
define PRINT0(x) cout << #x ": " << x << endl
//PRINTTEST test_pair<short int, string> a; cout << "sizeof(std::pair <char, std::basic_string, long, short>) = "; printtest1(a);
}

I have not found any way to get any output that isn't total BS.
Hi. Try disabling Mmap and correcting the prompt template.
This issue is related to https://github.com/guinmoon/LLMFarm/issues/91.
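For reference, a minimal sketch of what checking both suggestions could look like outside the app, assuming the file is a StableLM-Tuned-Alpha conversion (those models were trained with `<|USER|>`/`<|ASSISTANT|>` special tokens) and that you have a llama.cpp build at hand; the model filename is a placeholder:

```sh
# Sketch only: run the same GGUF through llama.cpp's CLI (older builds name
# the binary `main`), with memory-mapping disabled (--no-mmap) and an
# explicit StableLM-style chat template in the prompt.
./main -m ./stablelm-model.gguf --no-mmap \
  -p "<|USER|>hi!<|ASSISTANT|>"
```

If this produces coherent text, the garbage inside LLMFarm most likely comes from the prompt template or the Mmap setting rather than from the model file itself.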