-
### Describe the bug
This notebook requires use of a filter dictionary to obtain the information associated with building the custom model. Despite the efforts discussed below, I am not able to successfull…
-
There are 4 samples in the reference HF output that have no output other than the EOS token.
```
>>> df = pd.read_pickle("06062024_mixtral_15k_v4.pkl")
>>> df[df['tok_ref_output_len'] == 1]
datase…
```
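A minimal sketch of how those EOS-only rows could be filtered out before any accuracy or length statistics are computed. Only the `tok_ref_output_len` column name is taken from the snippet above; the DataFrame here is a synthetic stand-in, not the real pickle:

```python
import pandas as pd

# Synthetic stand-in for the pickled reference-output DataFrame;
# only the 'tok_ref_output_len' column is assumed from the snippet above.
df = pd.DataFrame({
    "dataset": ["a", "b", "c", "d", "e"],
    "tok_ref_output_len": [1, 7, 1, 12, 5],
})

# A reference-output length of 1 means the output is just the EOS token,
# so isolate those rows and drop them from the working set.
eos_only = df[df["tok_ref_output_len"] == 1]
df_clean = df[df["tok_ref_output_len"] > 1].reset_index(drop=True)

print(len(eos_only))   # 2 EOS-only samples in this synthetic frame
print(len(df_clean))   # 3 remaining samples
```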
-
On my old 4-core computer, I got roughly 2.1 t/s with the openorca-chat model. When I upgraded to a 6-core PC, the speed doubled; other models saw the same speedup. Then two things happened at the …
-
For AdaptLLM, where can we find the training code? Only inference code is provided.
-
I followed this guide: https://access.cknowledge.org/playground/?action=install
And then I used: cm pull repo mlcommons@cm4mlops --branch=dev
Ran this command: cmr "run-mlperf inference _find-per…
-
How are you downloading the Mistral-7B-OpenOrca model? I keep getting this error:
OSError: Incorrect path_or_model_id: '/media/2nvme/llm/Mistral-7B-OpenOrca'. Please provide either the path to a local f…
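That `OSError` from transformers usually means the directory either does not exist or is missing `config.json`, so `from_pretrained` can use it neither as a local snapshot nor as a Hub repo id. A quick diagnostic sketch (the `resolve_model_source` helper is hypothetical, not part of transformers) to check the path before loading:

```python
import os
import tempfile

def resolve_model_source(path_or_id: str) -> str:
    """Hypothetical helper: classify what from_pretrained would see.

    Returns 'local' for a directory containing config.json, raises for a
    directory with no config.json (likely an incomplete snapshot), and
    returns 'hub' for anything else (treated as a Hub repo id).
    """
    if os.path.isdir(path_or_id):
        if os.path.isfile(os.path.join(path_or_id, "config.json")):
            return "local"
        raise FileNotFoundError(
            f"{path_or_id} exists but contains no config.json; "
            "the model snapshot is probably incomplete"
        )
    return "hub"

# Demo with a throwaway directory standing in for the real model path:
with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, "config.json"), "w").close()
    print(resolve_model_source(d))                            # local
print(resolve_model_source("Open-Orca/Mistral-7B-OpenOrca"))  # hub
```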
-
I attempted to download mistral-7b-openorca.Q4_0.gguf multiple times. The download completed, but then different error states occurred randomly:
* the download button returned to its original configu…
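When downloads fail unpredictably like this, one generic check is whether the file on disk matches the size and checksum published alongside the model. A small sketch; the expected values used in the demo are illustrative placeholders, not the real ones for this GGUF file:

```python
import hashlib
import os

def verify_download(path: str, expected_size: int, expected_md5: str) -> bool:
    """Return True if the file matches the published size and MD5 digest."""
    if not os.path.isfile(path) or os.path.getsize(path) != expected_size:
        return False
    h = hashlib.md5()
    with open(path, "rb") as f:
        # Hash in 1 MiB chunks so multi-GB model files fit in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest() == expected_md5

# Demo with a tiny stand-in file (placeholder values, not the real GGUF):
with open("demo.bin", "wb") as f:
    f.write(b"hello gguf")
print(verify_download("demo.bin", 10, hashlib.md5(b"hello gguf").hexdigest()))
# True
```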
-
Hello mlcommons team,
I want to run the "Automated command to run the benchmark via MLCommons CM" (from the example: https://github.com/mlcommons/inference/tree/master/language/llama2-70b), but I a…
-
Some datasets*, for example [vivym/midjourney-messages](https://huggingface.co/datasets/vivym/midjourney-messages) and [Open-Orca/OpenOrca](https://huggingface.co/datasets/Open-Orca/OpenOrca) are not …