ivanstepanovftw opened 1 year ago
CC: @dranger003
Hm, interesting. Probably `ne` has to be padded to an 8-byte boundary:
```c
// n-dimensional tensor
struct ggml_tensor {
    enum ggml_type    type;
    enum ggml_backend backend;

    int     n_dims;
    char    padne[4];          // TMP
    int64_t ne[GGML_MAX_DIMS]; // number of elements
    size_t  nb[GGML_MAX_DIMS]; // stride in bytes:
                               // nb[0] = sizeof(type)
                               // nb[1] = nb[0] * ne[0] + padding
                               // nb[i] = nb[i-1] * ne[i-1]
```
It did not help; changing `ne` back to `int` does. But now I am facing other issues.
Following #24, the WASM-compiled library does not work. Neither f32 nor fresh q4_0 models work.
Here is what I am getting in the console: `Uncaught (in promise) RuntimeError: Aborted(alignment fault)`. The exact line that fails before the `SAFE_HEAP_STORE_i64_8_8` call is ggml.c:4632.

Chrome console output:
```
[C/C++ DevTools Support (DWARF)] Loading debug symbols for wasm://wasm/017aede6...
index.html?_ijt=8885d1g3pefs1slvbkk1nbfo9s&_ij_reload=RELOAD_ON_SAVE:93 Writing model to filesystem... because: No such file or directory
bert.wasm.js:1415 bert_load_from_file: loading model from '/ggml-model-q4_0.bin' - please wait ...
bert.wasm.js:1415 bert_load_from_file: n_vocab = 30522
bert.wasm.js:1415 bert_load_from_file: n_max_tokens = 512
bert.wasm.js:1415 bert_load_from_file: n_embd = 384
bert.wasm.js:1415 bert_load_from_file: n_intermediate = 1536
bert.wasm.js:1415 bert_load_from_file: n_head = 12
bert.wasm.js:1415 bert_load_from_file: n_layer = 6
bert.wasm.js:1415 bert_load_from_file: f16 = 2
[C/C++ DevTools Support (DWARF)] Loaded debug symbols for wasm://wasm/017aede6, found 567 source file(s)
bert.wasm.js:1415 bert_load_from_file: ggml ctx size = 12.26 MB
bert.wasm.js:581 Aborted(alignment fault)
abort @ bert.wasm.js:581
alignfault @ bert.wasm.js:365
$SAFE_HEAP_STORE_i64_8_8 @ 017aede6:0x129fa4
$ggml_new_tensor_impl @ ggml.c:4632
$ggml_new_tensor @ ggml.c:4667
$ggml_new_tensor_2d @ ggml.c:4683
$bert_load_from_file @ bert.cpp:495
$embind_init_bert()::$_0::operator()(std::__2::basic_string
```