mastoca closed this 2 weeks ago
Thank you!
> idk if u use this
No, I don't use ollama: I only ever used ollama for a few days when I first made this flake, but I quickly lost interest due to the rather disappointing state of LLMs at the moment. I also used it to try a new big model (like llama3.1) a few times, but I just don't find the capabilities of current models (even the best ones) to be particularly useful, except very rarely.
I appreciate the PR, though, since I know there are a few people (such as yourself) who still use this flake. I'm not entirely sure why, since it isn't updated any more often than (or even as often as) the ollama package in Nixpkgs, but I'm glad it's still useful.
It is unfortunate that ROCm is broken, though. Hopefully a fix is found soon.
> No, I don't use ollama: I only ever used ollama for a few days when I first made this flake, but I quickly lost interest due to the rather disappointing state of LLMs at the moment. I also used it to try a new big model (like llama3.1) a few times, but I just don't find the capabilities of current models (even the best ones) to be particularly useful, except very rarely.
I agree. The hype is nowhere near the utility.
> I appreciate the PR, though, since I know there are a few people (such as yourself) who still use this flake. I'm not entirely sure why, since it isn't updated any more often than (or even as often as) the ollama package in Nixpkgs, but I'm glad it's still useful.
> It is unfortunate that ROCm is broken, though. Hopefully a fix is found soon.
The Makefiles for ROCm look like they also need some TLC, e.g. the way they look for hipcc. I'll look at them when I get some time.
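For what it's worth, the kind of lookup I have in mind is roughly this; it's a hypothetical sketch of a hipcc probe, not the actual ollama Makefile, and the ROCM_PATH variable and the /opt/rocm default are assumptions on my part:

```make
# Hypothetical sketch, not the actual ollama Makefile: probe for hipcc.
# Prefer an explicit ROCM_PATH, then fall back to whatever hipcc is on PATH.
ROCM_PATH ?= /opt/rocm
HIPCC := $(or $(wildcard $(ROCM_PATH)/bin/hipcc),$(shell command -v hipcc 2>/dev/null))

ifeq ($(HIPCC),)
$(warning hipcc not found; ROCm objects will not be built)
endif
```

Hardcoded /opt/rocm-style paths are exactly what tends to break under Nix, where hipcc lives somewhere in the store, so the PATH fallback is the part that matters here.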
idk if u use this... I still do; so this is just fyi ...enjoy