-
Hi! I've been using EleutherAI's LM Evaluation Harness and I'd like to also run some code tasks using your Big Code Evaluation Harness. We need the scores for each sample in each benchmark…
-
In [`35c62c6`](https://github.com/hjstrauss/MonitorMySites/commit/35c62c6b19efd6bf672f2323858fed2420f6d4ef), LM Bogen WSV (https://www.lmbogenwsv.de) was **down**:
- HTTP code: 0
- Response time: 0 m…
-
Running `sensors-detect` as root does not find any fans. Running `cat /sys/class/thermal/cooling_device*/type` shows no fans. `fancontrol` complains there is no fancontrol configuration while `fanco…
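For debugging this kind of report, one more place worth checking is the hwmon sysfs tree directly: if no kernel driver binds to the board's fan controller (a common reason `sensors-detect` comes up empty), no `fan*_input` files will exist there at all. A minimal sketch, assuming a standard Linux hwmon layout:

```shell
# Sketch: enumerate fan tachometer inputs exposed through the Linux hwmon
# sysfs interface. If no driver for the fan controller is loaded, the glob
# matches nothing and we fall through to the "not found" message.
found=0
for f in /sys/class/hwmon/hwmon*/fan*_input; do
    [ -e "$f" ] || continue        # glob did not match anything
    found=1
    # each hwmon device directory has a "name" file identifying the chip
    chip=$(cat "$(dirname "$f")/name" 2>/dev/null || echo unknown)
    printf '%s (%s): %s RPM\n' "$f" "$chip" "$(cat "$f")"
done
[ "$found" -eq 1 ] || echo "no fan*_input files found under /sys/class/hwmon"
```

If this prints nothing but the fallback message, the problem is a missing or unbound kernel driver rather than a `fancontrol` configuration issue.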
-
It would be great to be able to chat with the currently loaded model via the CLI, much as you can in the main LM Studio app.
-
Impressive work!
I have a few questions about LM_Cocktail for model merging:
1. For embedding models, suppose I have two vertical-domain scenarios. What is the difference between fine-tuning a separate model for each scenario and then merging the two with LM_Cocktail, versus first combining the training data from both scenarios and fine-tuning a single model? What are the pros and cons of each approach? What core problem does LM_Cocktail solve, and what advantage does model merging have over merging the data and then fine-tuning?
2. Regarding your intro…
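For context on question 1: merging of this kind typically boils down to a weighted average of the two fine-tuned models' parameters. A minimal sketch with plain numpy (toy tensor names, not the actual LM_Cocktail API):

```python
import numpy as np

def merge_state_dicts(sd_a, sd_b, alpha=0.5):
    """Linearly interpolate two state dicts with matching keys and shapes:
    merged = alpha * A + (1 - alpha) * B."""
    assert sd_a.keys() == sd_b.keys(), "models must share an architecture"
    return {k: alpha * sd_a[k] + (1 - alpha) * sd_b[k] for k in sd_a}

# toy example: two "fine-tuned" weight sets for the same tiny model
sd_a = {"linear.weight": np.ones((2, 2)), "linear.bias": np.zeros(2)}
sd_b = {"linear.weight": np.zeros((2, 2)), "linear.bias": np.ones(2)}
merged = merge_state_dicts(sd_a, sd_b, alpha=0.5)
print(merged["linear.weight"])  # 0.5 everywhere
```

The practical trade-off: merging two specialist models keeps each domain's fine-tuning independent (and lets you reweight `alpha` later without retraining), while training once on the combined data lets the model share features across domains but couples the two scenarios together.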
-
Hi guys, in generation.py I noticed the following code snippet. It looks like LoRA is not used for inference at all; is there anything I missed? Thank you
```
unwanted_prefix = "_orig_mod.…
-
Right now, the state of LLMs involves a lot of "prompt engineering" that the user needs to do, not only to elicit the correct response but sometimes also the correct style of response (i.e…
-
```
# apt install -y lm-sensors
# sensors
```
```
coretemp-isa-0000
Adapter: ISA adapter
Package id 0: +45.0°C (high = +62.0°C, crit = +72.0°C)
Core 0: +44.0°C (high = +62.0°C, crit …
-
### Mandatory information:
**Full path to the case(s) of the experiment on NIRD**
/projects/NS2345K/oyvinds/cases
**experiment_id**
ssp245SST-piAer
**model_id**
NorESM2-LM
**CASENAME(s)…
-
Hi,
Following the example at https://pbiecek.github.io/DALEX/reference/plot.model_performance_explainer.html, if you rearrange the order of arguments from plot(mp_rf, mp_glm, mp_lm, geom = "boxplo…