-
I used the default scripts in the README to prune Vicuna-7B and got the pruned output `pytorch.bin`,
but I cannot reload `pytorch.bin` using `.from_pretrained()`.
What should I do to run inference?
…
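One likely cause: after pruning, the layer shapes no longer match the original `config.json`, and `.from_pretrained()` rebuilds the architecture from that config before loading weights, so the shapes mismatch. Checkpoints saved as a whole model object are typically reloaded with `torch.load` instead. Below is a minimal stdlib analogy of that difference; the class and names here are illustrative, not the pruner's actual API:

```python
import pickle

# Toy stand-in for a pruned network: the layer sizes no longer match the
# original config, so rebuilding from config (what .from_pretrained() does)
# would hit shape mismatches. Serializing the *whole object* preserves the
# pruned shapes instead.

class TinyModel:
    def __init__(self, hidden):
        self.hidden = hidden
        self.weights = [0.0] * hidden

model = TinyModel(hidden=4096)     # original architecture
model.hidden = 3072                # "prune" the model in place
model.weights = model.weights[:3072]

blob = pickle.dumps(model)         # analogous to torch.save(model, "pytorch.bin")
restored = pickle.loads(blob)      # analogous to torch.load("pytorch.bin")
print(restored.hidden)             # → 3072, the pruned shape survives
```

With a real checkpoint the equivalent would be loading the file with `torch.load(..., map_location="cpu")` and using the returned object directly, rather than going through `.from_pretrained()`; check what structure the pruning script actually saved before assuming it is a plain model object.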
-
I am currently trying to use the `llama-node@0.0.28` library with [Deno](https://deno.land). However, when I execute my program, I am not getting the expected prompt result. Instead, the Rust backend …
-
I was following the steps given in the README. I'm new to Rust, so I have no idea how to get through this one; I tried googling but found nothing.
```
Compiling ggml-sys v0.2.0-dev (D:\textgen\cria\llm\cra…
-
### Description of the Bug
The creative tab of ICBM crashes the game.
### Reproduction
1. Enter creative mode
2. Open the ICBM tab
### Version
0.1.8 (beta)
### Relevant log output
```shell
---- Minecraft Crash R…
-
Is there any way to provide a history of prompts (like ChatGPT with GPT-4) and tell llm-cli to return JSON?
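I don't know of built-in flags in llm-cli for either of these, so a common workaround is a small wrapper script: replay the conversation into each prompt, instruct the model to answer in JSON, and validate the output with a JSON parser. A sketch of that pattern, with a stubbed `generate` standing in for the actual subprocess call to llm-cli (all names here are illustrative):

```python
import json

history = []  # list of (speaker, text) turns, replayed into every prompt

def build_prompt(history, user_msg):
    # Replay prior turns so the model sees the conversation, then instruct
    # it to answer with JSON only.
    lines = [f"{who}: {text}" for who, text in history]
    lines.append(f"User: {user_msg}")
    lines.append("Assistant (reply with a single JSON object only):")
    return "\n".join(lines)

def generate(prompt):
    # Stub standing in for e.g. subprocess.run([...llm-cli invocation...]).
    return '{"answer": "42"}'

def ask(user_msg):
    prompt = build_prompt(history, user_msg)
    raw = generate(prompt)
    reply = json.loads(raw)          # validate: the model must return JSON
    history.append(("User", user_msg))
    history.append(("Assistant", raw))
    return reply

print(ask("What is the answer?"))    # → {'answer': '42'}
```

In practice the `json.loads` call doubles as a retry signal: if parsing fails, re-prompt the model rather than returning malformed output.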
-
### System Info
OS Version:
Distributor ID: Ubuntu
Description: Ubuntu 20.04.3 LTS
Release: 20.04
Codename: focal
8× A100 GPUs
### Information
- [X] Docker
- [ ] The CLI directly
### Tasks
…
-
# Weekly GitHub Trending! (2023/07/17 ~ 2023/07/24)
## Python trending: 10 repos
### [facebookresearch](https://github.com/facebookresearch) / [llama](https://github.com/facebookresearch/llama)
LLaMA モ…
-
### Bug Description
Hello, thanks for your good application.
I am connecting to a rental server via SSH. The shell is csh, and sudo is not available.
Python 3 is already installed.
When I do % pytho…
-
Not sure if we should consider this out of scope, but `bloomz.cpp` is a fork of `llama.cpp` that's capable of inference with the BLOOM family of models. The changes don't look very large, so there's r…
-
If I attempt to port your binding to Node.js but still keep the original llama-rs crate in pure Rust, how should I solve the package-naming conflict?
Would it be better if you rename this to llama-binding o…
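One option that avoids renaming either project: Cargo can alias a dependency via the `package` key (stable since Rust 1.31), so the binding crate can depend on the original llama-rs under a different local name. A sketch of the `Cargo.toml` entry, assuming a local checkout (the path is illustrative):

```toml
[dependencies]
# The local name "llama_backend" is arbitrary; the real package name
# stays "llama-rs", so neither crate has to be renamed.
llama_backend = { package = "llama-rs", path = "../llama-rs" }
```

Rust code in the binding crate would then refer to the crate as `llama_backend::…`, sidestepping the conflict entirely.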