-
### Project Name
RAG MongoDB food pairing
### Description
I am learning how to use Azure and RAG, but I would like to apply my knowledge to building a RAG for food and drink pairing (especia…
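A minimal sketch of the retrieval step such a pairing RAG might use, with a toy in-memory store and cosine similarity standing in for a real vector database such as MongoDB Atlas Vector Search (all documents and embeddings below are illustrative, not real data):

```python
import math

# Toy "collection" of pairing notes; in a real setup these would live in
# MongoDB with embeddings precomputed by an embedding model (e.g. on Azure).
DOCS = [
    {"text": "Oysters pair well with dry Champagne", "vec": [0.9, 0.1, 0.0]},
    {"text": "Blue cheese pairs well with port wine", "vec": [0.1, 0.9, 0.2]},
    {"text": "Dark chocolate pairs well with stout", "vec": [0.0, 0.2, 0.9]},
]

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, k=2):
    """Return the k document texts most similar to the query embedding."""
    ranked = sorted(DOCS, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["text"] for d in ranked[:k]]

# A query embedding close to the "oysters" note retrieves that note first.
print(retrieve([0.8, 0.2, 0.1], k=1))
```

The retrieved texts would then be stuffed into the LLM prompt as context; in production the linear scan would be replaced by the database's own vector index.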
-
#### Repro Steps
1. Set yourself as learning German
2. Write a post in German
3. Remove German as a learning language
4. Edit your post

Observe that you cannot select German as the language …
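One plausible fix for the behavior above (hypothetical, since the app's code isn't shown): when editing an existing post, offer the union of the user's current learning languages and the post's original language, so removing a learning language never strands an old post:

```python
def selectable_languages(learning_languages, post_language):
    """Languages offered in the edit form: the user's current learning
    languages plus the language the post was originally written in."""
    options = list(learning_languages)
    if post_language not in options:
        options.append(post_language)
    return options

# After removing German, editing a German post should still offer German.
print(selectable_languages(["French", "Spanish"], "German"))
```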
Lanny updated
2 years ago
-
OpenAI often returns text that looks like Markdown but trips up Logseq. Logseq then doesn't display the response correctly and shows a warning: "Full content not displayed; Logseq doesn't support multiple …
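A hedged sketch of one possible workaround: split a multi-paragraph response on blank lines so each chunk can be inserted as its own Logseq block rather than one multi-line block (the splitting rule is an assumption; Logseq's actual block model may need more care, e.g. for fenced code):

```python
def split_into_blocks(markdown_text):
    """Split a Markdown response on blank lines so each chunk can be
    inserted as a separate Logseq block."""
    blocks = [b.strip() for b in markdown_text.split("\n\n")]
    return [b for b in blocks if b]  # drop empty chunks

response = "First point.\n\nSecond point.\n\n- a list item"
print(split_into_blocks(response))
```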
thams updated
5 months ago
-
Room Name: CodeConnect
Description:
CodeConnect is a vibrant, interactive space tailored for computer science enthusiasts and learners of all levels. Modeled after popular platforms like Reddi…
-
## Jiphyeonjeon Intermediate Study Group
- Sunday, April 10, 2022, 9:00
- Presented by Kim Eung-yeop, Im Jin-ho, Shin Jae-wook, and Choi Yu-jeong
- Paper link: https://arxiv.org/abs/1909.11942
> ### Abstract
> Increasing model size when pretraining natural language representations …
-
- **Abstract (2-3 lines):** In this talk, I will present an overview of Deep Learning, and then detail the advancements being made in Natural Language Processing using deep learni…
-
## Bug Description
I have used your template before (approx. 1 year ago) without issues. Now I re-cloned your project, installed its dependencies, and am trying to run the training job: `poetry …
-
If the skills "VS Code" and "Visual Studio Code" are both used in the list of skills, we get a separate node for each, even though VS Code is a [Wikipedia Redirect Page](https://en.wikipedia.org/wiki/Wikipedia:Redi…
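A minimal sketch of the deduplication idea: canonicalize skill names through an alias map before creating nodes. The map here is hand-written for illustration; in practice it could be built from Wikipedia redirect data (redirect title → target title):

```python
# Hypothetical alias map; in practice derived from Wikipedia redirects.
ALIASES = {"VS Code": "Visual Studio Code"}

def canonical(skill):
    """Map a skill name to its canonical form, if an alias is known."""
    return ALIASES.get(skill, skill)

def dedupe_skills(skills):
    """Canonicalize and deduplicate, preserving first-seen order."""
    seen, out = set(), []
    for s in skills:
        c = canonical(s)
        if c not in seen:
            seen.add(c)
            out.append(c)
    return out

print(dedupe_skills(["VS Code", "Visual Studio Code", "Python"]))
```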
-
## In a Nutshell
Since BERT, the trend has been to improve accuracy by increasing model size. Going against that trend, this paper proposes a new model aimed at reducing the parameter count (ALBERT stands for "A Lite BERT"). With the same model configuration, accuracy drops, but the smaller parameter count makes it possible to scale the model up; as a result, a model with roughly the same performance as BERT-large is achieved with about 1/5 the parameters…
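One of ALBERT's parameter-reduction techniques is a factorized embedding parameterization: instead of one V×H embedding matrix, it uses a V×E lookup followed by an E×H projection with E ≪ H. A quick back-of-the-envelope check with sizes in the spirit of BERT-large (V and E here match the paper's typical settings; this is an illustration, not a full parameter count of either model):

```python
V, H, E = 30_000, 1024, 128  # vocab size, hidden size, embedding size

full = V * H                 # untied embedding: one V x H matrix
factorized = V * E + E * H   # ALBERT: V x E lookup plus E x H projection

print(full, factorized, round(full / factorized, 1))
```

The embedding parameters alone shrink by roughly 7–8×; combined with cross-layer parameter sharing, this is how ALBERT reaches BERT-large-level performance at a fraction of the size.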
-
Dear Mr. Doimo,
I recently read your paper, The Representation Landscape of Few-Shot Learning and Fine-Tuning in Large Language Models, and I must say, it’s an excellent piece of work. I am also aw…