[Open] Stooovie opened this issue 2 weeks ago
Tried a StableLM model with llama inference. This is the answer to "hi!":

```
{ ========== [EXPL] | 1) {prompt}
int main(void) {
//PRINTTEST test_pair<short int, string> a; cout << "sizeof(std::pair <char, std::basic_string , long, short>) = "; printtest1(a); }
```
I have not found any way to get any output that wouldn't be total BS.
Hi. Try disabling mmap and using the correct prompt template for the model.
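For reference, in llama.cpp-style runners mmap can be disabled with the `--no-mmap` flag, and the tuned StableLM chat models expect special role tokens around each turn. A minimal sketch of building such a prompt (the exact token names are per the StableLM-Tuned-Alpha model card; treat the helper function itself as hypothetical):

```python
# Hypothetical helper: wrap a user message in StableLM-Tuned role tokens.
# The tuned StableLM models were trained on turns delimited by
# <|SYSTEM|>, <|USER|>, and <|ASSISTANT|> tokens; a bare prompt like
# "hi!" without these tends to produce unrelated output.
def stablelm_prompt(user_msg: str) -> str:
    return f"<|USER|>{user_msg}<|ASSISTANT|>"

print(stablelm_prompt("hi!"))
```

With a template like this the model is asked to complete the assistant turn, instead of free-running from an unformatted string.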