Mozilla-Ocho / llamafile

Distribute and run LLMs with a single file.
https://llamafile.ai

update GGML_HIP_UMA #473

Closed by Djip007 1 week ago

Djip007 commented 2 weeks ago

Add a UMA config for higher speed, like in https://github.com/ggerganov/llama.cpp/pull/7414, but with 2 changes:
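For context, GGML_HIP_UMA switches the ggml HIP backend to unified (managed) memory so an APU can use system RAM as device memory, and the referenced llama.cpp PR also advises the allocation as coarse-grained for speed. A minimal sketch of that allocation path, assuming HIP's hipMallocManaged / hipMemAdvise APIs; the function name and error handling are illustrative, not the exact llamafile diff:

```c
#include <hip/hip_runtime.h>

// Sketch of a GGML_HIP_UMA-style allocation: use managed (unified) memory so an
// APU can map system RAM, then advise coarse-grained coherence for speed.
// Illustrative only -- this does not mirror the real ggml/llamafile code exactly.
static hipError_t uma_device_malloc(void ** ptr, size_t size, int device) {
    hipError_t err = hipMallocManaged(ptr, size);
    if (err == hipSuccess) {
        // Coarse-grained coherence skips fine-grained synchronization overhead
        // on APUs; exposed by ROCm via hipMemAdvise.
        err = hipMemAdvise(*ptr, size, hipMemAdviseSetCoarseGrain, device);
    }
    return err;
}
```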

Another change is to look for 'hipcc' on Linux instead of 'amdclang++'.
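Roughly, the compiler probe on Linux would then check for hipcc on PATH (or under the usual ROCm prefix) rather than amdclang++. A hedged sketch; the helper name, the PATH check, and the /opt/rocm location are assumptions, not the actual llamafile lookup logic:

```c
#include <stdlib.h>
#include <unistd.h>

// Hypothetical helper: prefer `hipcc` when probing for a ROCm compiler on Linux.
static int have_hipcc(void) {
    if (system("command -v hipcc >/dev/null 2>&1") == 0)
        return 1;                                     // hipcc found on PATH
    return access("/opt/rocm/bin/hipcc", X_OK) == 0;  // common ROCm install path
}
```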

(a possible solution for https://github.com/Mozilla-Ocho/llamafile/issues/439 / https://github.com/Mozilla-Ocho/llamafile/discussions/468)