-
Hello,
I ran the BERT example on an AMD MI250X with the command:
python3 examples/03_bert/benchmark_ait.py --batch-size 32 --seq-length 512 --encoders-only false
However, it aborted with the following er…
-
Another weird error I hit when running in Colab:
```
in ()
----> 1 from fast_bert.data_cls import BertDataBunch
      2 from fast_bert.learner_cls import BertLearner
      3 from fast_bert.metrics …
```
-
### 🐛 Describe the bug
Thanks for the new torch release!
I am using torch nested tensors in the BetterTransformer implementation. https://pytorch.org/blog/a-better-transformer-for-fast-transformer-enco…
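For intuition on why nested tensors help encoder speed, here is a back-of-envelope sketch in plain Python (the sequence lengths are hypothetical): a padded dense batch stores `batch * max_len` slots, while a nested (ragged) layout stores only the real tokens.

```python
# Hypothetical token counts for a batch of 4 variable-length sequences.
seq_lens = [512, 37, 120, 64]

padded_elems = len(seq_lens) * max(seq_lens)  # dense [batch, max_len] layout
nested_elems = sum(seq_lens)                  # ragged layout, no pad tokens
overhead = 1 - nested_elems / padded_elems    # fraction of wasted slots

print(padded_elems, nested_elems, f"{overhead:.0%}")
```

With this batch, roughly two thirds of the padded layout is pad tokens, which is the compute a nested-tensor fast path can skip.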
-
## Key points from the September 3 meeting
**GPT-3**: a language model proposed by OpenAI that outperforms the BERT language model
**This project examines a fake-news detection system built with BERT**
* On hold
- Review examples of using BERT in Colab
- Visualizing the similarity between words in the BERT system…
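The word-similarity idea above is usually computed as cosine similarity between embedding vectors. A minimal sketch with toy 3-dimensional vectors (the words and values are illustrative stand-ins, not real BERT embeddings):

```python
import math

def cosine_similarity(u, v):
    # cos(u, v) = (u . v) / (|u| |v|)
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy "embeddings": related words point in similar directions.
vectors = {
    "news":    [0.9, 0.1, 0.3],
    "article": [0.8, 0.2, 0.4],
    "banana":  [0.1, 0.9, 0.2],
}

for w1, w2 in [("news", "article"), ("news", "banana")]:
    print(f"{w1} ~ {w2}: {cosine_similarity(vectors[w1], vectors[w2]):.3f}")
```

A visualization would then plot these pairwise similarities, e.g. as a heatmap.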
-
**Describe the bug**
I was expecting a compressed & faster BERT model after running the BERT ZeroQuant example in DeepSpeedExamples. However, the clean model isn't any smaller (still 417.7 MB) or fas…
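For context on why a ZeroQuant-style pass should shrink a checkpoint: INT8 weight quantization replaces 4-byte floats with 1-byte integers plus a scale, so large tensors shrink roughly 4x. A generic sketch of symmetric per-tensor INT8 quantization (not DeepSpeed's actual pipeline, and the weight values are made up):

```python
import struct

def quantize_int8(weights):
    """Symmetric per-tensor INT8 quantization: w ~ scale * q, q in [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return scale, q

weights = [0.5, -1.2, 0.03, 0.9, -0.77, 1.27]
scale, q = quantize_int8(weights)

fp32_bytes = len(struct.pack(f"{len(weights)}f", *weights))
int8_bytes = len(struct.pack(f"{len(q)}b", *q)) + 4  # + 4 bytes for the fp32 scale

print(fp32_bytes, int8_bytes)
```

If the exported file is still the original size, a common cause is that the fake-quantized weights were saved back in fp32 rather than packed as INT8.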
-
Hi,
When I load the tokenizer of the `paraphrase-multilingual-MiniLM-L12-v2` model via the Hugging Face model hub:
```
tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/paraphrase-…
```
-
Hello,
I am new to all of this and was trying to follow and run the RAG notebooks on Google Colab. When run on Colab, the 5th notebook in the RAG series (Fast Start) throws the following error. In …
-
[ColBERT: Efficient and Effective Passage Search via Contextualized Late Interaction over BERT](https://dl.acm.org/doi/10.1145/3397271.3401075)
## Abstract
Recent progress in Natural Language Un…
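The late-interaction scoring ColBERT introduces (MaxSim) can be sketched in a few lines: each query token embedding is matched against its most similar document token embedding, and the maxima are summed. The toy 2-D unit vectors below are illustrative stand-ins for ColBERT's 128-d BERT token embeddings:

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def maxsim_score(query_embs, doc_embs):
    """ColBERT late interaction: sum over query tokens of the max
    similarity against any document token embedding."""
    return sum(max(dot(q, d) for d in doc_embs) for q in query_embs)

# Toy 2-D unit embeddings (illustrative only).
query = [[1.0, 0.0], [0.0, 1.0]]
doc_a = [[0.9, math.sqrt(1 - 0.81)], [0.0, 1.0]]  # close to both query tokens
doc_b = [[-1.0, 0.0], [0.6, -0.8]]                # far from the query

print(maxsim_score(query, doc_a), maxsim_score(query, doc_b))
```

Because document token embeddings can be precomputed and indexed, this interaction is far cheaper at query time than running full query-document cross-attention.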
-
I was wondering whether fast-bert supports, or would in the future support, pandas DataFrames as the train, test, and validation input, rather than only CSV files.
-
I used bert-base as the base model and retrained my long-bert with a length of 1536. Then I compared it against the inference speed of the original bert-base-1536. After a lot of testing, I found …
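Speed comparisons like this are sensitive to measurement methodology (warmup, repeat count, outliers). A minimal timing harness along these lines is a reasonable baseline; `fake_forward` is a stand-in workload, not a real model call:

```python
import statistics
import time

def benchmark(fn, *, warmup=3, repeats=10):
    """Median wall-clock latency of fn(), with warmup runs excluded."""
    for _ in range(warmup):
        fn()
    samples = []
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

# Stand-in workload; replace with the real model forward pass.
def fake_forward():
    return sum(i * i for i in range(10_000))

latency = benchmark(fake_forward)
print(f"median latency: {latency * 1e3:.2f} ms")
```

Reporting the median over repeated runs, rather than a single timing, avoids being misled by first-run compilation or caching effects.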