-
First, I want to say THANK YOU for making this project possible. It's amazing how many possibilities open up thanks to this community :)
I want to run llama2 on my iPhone, however most of the iPhones…
-
Below is my code. If the text is too long, the aspect and sentiment are not extracted. Please correct me if I am doing something wrong.
```
from pyabsa import AspectTermExtraction as ATEPC,…
```
-
### 🚀 The feature, motivation and pitch
Sparse Causal Flash Attention as implemented [here](https://github.com/epfml/dynamic-sparse-flash-attention) and described in [this paper](https://arxiv.org/abs…
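For context on the baseline being accelerated, here is a minimal dense causal-attention reference in NumPy. This is only an illustrative sketch of what the kernel computes; the dynamic-sparse variant in the linked paper additionally drops query/key pairs before the softmax, which this sketch does not do.

```python
import numpy as np

def causal_attention(q, k, v):
    """Dense O(n^2) causal attention reference.

    q, k, v: (n, d) arrays. Each position may only attend to itself
    and earlier positions (the causal mask).
    """
    n, d = q.shape
    scores = q @ k.T / np.sqrt(d)                 # (n, n) similarity scores
    mask = np.tril(np.ones((n, n), dtype=bool))   # lower triangle: no future tokens
    scores = np.where(mask, scores, -np.inf)      # masked entries get zero weight
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

Note that position 0 can only attend to itself, so its output equals `v[0]` exactly; that invariant is a quick sanity check for any causal implementation.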
-
Hi there.
I’ve run the training code in this repository for 25k out of the 100k batches and achieved a validation loss of around 1.28, or perplexity of 3.59. After this, the training loss continues…
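The two quoted numbers are consistent with each other: for a mean per-token cross-entropy loss in nats, perplexity is exp(loss), and exp(1.28) ≈ 3.60, matching the ~3.59 reported. A one-line check:

```python
import math

def perplexity(nll: float) -> float:
    """Perplexity from a mean per-token negative log-likelihood in nats."""
    return math.exp(nll)
```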
-
I love the idea of this extension. Could you add an endpoint setting in the options?
That way I could change, say, the OpenAI base URL from https://api.openai.com/v1/chat/completions to something els…
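A sketch of what such a setting could look like internally (the function and setting names are hypothetical, not from the extension's code): join a user-configured base URL with the endpoint path, so any OpenAI-compatible server can be substituted.

```python
from urllib.parse import urljoin

def chat_completions_url(base_url: str) -> str:
    """Build the chat-completions endpoint from a configurable base URL.

    `base_url` is an illustrative user setting; a local proxy or any
    OpenAI-compatible server could be substituted for api.openai.com.
    """
    return urljoin(base_url.rstrip("/") + "/", "chat/completions")
```

For example, `chat_completions_url("http://localhost:8000/v1")` would target a local server instead of the default endpoint.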
-
Hi! Great work :)
I have a question regarding the [loss weighting](https://github.com/imoneoi/openchat/blob/master/ochat/training_deepspeed/train.py#L144) implemented in the repository. Do I unde…
-
First, thank you to the author team for your contribution! Your approach has been very inspiring to me. While reproducing it, I ran into a few questions:
1. When a low-frequency word that carries a tag is masked as unk, could the resulting [tag, unk] pair hurt the final overall performance?
2. I simply trained the Language Model, but looking at the results I cannot tell whether they are good or bad. My background is CV, so having suddenly switched to NLP I may be missing something. I hope the authors can help me with these questions.
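To make question 1 concrete, here is a hedged sketch of the masking step being asked about: replacing low-frequency words with an unk token while keeping their tags. The threshold, token name, and data shape are illustrative, not taken from the repository under discussion.

```python
from collections import Counter

def mask_rare_tokens(tagged, min_count=2, unk="<unk>"):
    """Replace low-frequency words with `unk` while preserving their tags.

    `tagged` is a list of (word, tag) pairs. Words seen fewer than
    `min_count` times become `unk`, producing the [tag, unk] pairs
    whose effect on final performance is being asked about.
    """
    counts = Counter(w for w, _ in tagged)
    return [(w if counts[w] >= min_count else unk, t) for w, t in tagged]
```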
-
**Describe the Bug**
When running a scrape that has actions, an Enter key action after typing into a search input doesn't do anything on sites like Amazon, Meetup, and Eventbrite. A click action on t…
-
When I use interpolate-ngram to interpolate two models by CM or GLI with perplexity optimization, I get the following faults:
1st:
```
interpolate-ngram -lm "model1.lm, model2.lm" -smoothing ModKN -inte…
```
-
I have a background unigram model (bg.arpa), some additional training data (train.txt), and some dev text (dev.txt). I want to create an interpolated unigram that optimizes the perplexity of dev.tx…
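As a self-contained illustration of the technique (a toy sketch, not the interpolate-ngram tool itself): linearly interpolate two unigram distributions and grid-search the mixture weight that minimizes dev-set perplexity.

```python
import math

def interpolate_ppl(lam, bg, fg, dev):
    """Dev-set perplexity of the mixture lam*fg + (1-lam)*bg.

    bg, fg: dicts mapping word -> probability; dev: list of dev tokens.
    """
    nll = -sum(math.log(lam * fg[w] + (1 - lam) * bg[w]) for w in dev)
    return math.exp(nll / len(dev))

def best_lambda(bg, fg, dev, grid=101):
    """Grid-search the interpolation weight minimizing dev perplexity."""
    return min((i / (grid - 1) for i in range(grid)),
               key=lambda lam: interpolate_ppl(lam, bg, fg, dev))
```

Real toolkits optimize the weight with EM rather than a grid, but the objective — minimize perplexity of the held-out dev text — is the same.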