-
Thank you for your significant contributions to Vision-and-Language Navigation.
I've been using the `pretrain_src/scripts/pretrain_r2r.bash` script to pre-train on the given 9 tasks. However, I…
-
> Hugging Face Transformers is an open-source framework for deep learning created by Hugging Face. It provides APIs and tools to download state-of-the-art pre-trained models and further tune them to m…
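A minimal sketch of that workflow (the checkpoint name and task here are illustrative, not from the quote):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Download a pretrained checkpoint from the Hugging Face Hub...
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# ...and from here it can be further tuned on task-specific data
# (e.g., with transformers.Trainer).
```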
-
I am trying to apply SmoothQuant during W8A8 quantization of `meta-llama/Llama-3.2-11B-Vision-Instruct`, where I ignore all of the modules except for `language_model`. However, I find that it crashes when…
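For context, a hypothetical sketch of the setup being described, assuming the llmcompressor toolchain (the library choice, the regex ignore patterns, and the calibration arguments are my assumptions, not taken from the report):

```python
from llmcompressor import oneshot
from llmcompressor.modifiers.smoothquant import SmoothQuantModifier
from llmcompressor.modifiers.quantization import GPTQModifier

# SmoothQuant to migrate activation outliers, then W8A8 on Linear layers,
# skipping everything outside the language model.
recipe = [
    SmoothQuantModifier(smoothing_strength=0.8),
    GPTQModifier(
        targets="Linear",
        scheme="W8A8",
        # Assumed patterns: keep the LM head, projector, and vision tower in
        # full precision so only language_model is quantized.
        ignore=["re:.*lm_head", "re:multi_modal_projector.*", "re:vision_model.*"],
    ),
]

oneshot(
    model="meta-llama/Llama-3.2-11B-Vision-Instruct",
    dataset="open_platypus",        # placeholder calibration set
    recipe=recipe,
    max_seq_length=2048,
    num_calibration_samples=512,
)
```

Vision-language models may additionally need model-specific loading and a multimodal data collator for calibration.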
-
### Project Name
ProactiveOS: AI-Driven Autonomous OS Management
### Description
ProactiveOS is an advanced AI-driven operating system management solution designed to enhance system st…
-
It seems GPT-style models such as Llama 2 are more popular,
but the paper still uses T5.
Compared to GPT, does using T5 have any particular advantages?
-
---------- Forwarded message ---------
From: **Google Scholar Alerts**
Date: Sun, Mar 10,…
-
Thanks for your work in the anomaly detection domain. I am reaching out to discuss an aspect of your work that caught my attention, specifically the experiments conducted in a zero-shot setting.…
-
# 📜 [A Survey of Transformers](https://arxiv.org/pdf/2106.04554.pdf)
### ⚡ One-line summary
A survey paper on Transformer architectures, compiled as of June 2021.
### 🏷️ Abstract
> Transformers have achieved great success in …
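For quick reference, the vanilla scaled dot-product attention that the surveyed variants build on (standard formulation, not quoted from the paper):

$$
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}}\right)V
$$

where $d_k$ is the key dimension; the $\sqrt{d_k}$ scaling keeps the pre-softmax logits from growing with dimension.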
-
NusaCatalogue: https://indonlp.github.io/nusa-catalogue/card.html?id_mm_laion
| Dataset | id_mm_laion |
|-------------|---|
| Description | Indo_MultiModal_LAION is a translate…
-
# Summary
Training a VLM from scratch while keeping NLP performance at the level of an LLM is very difficult. Research has therefore moved toward investigating how to train a VLM starting from a frozen pretrained language model.
### Prior research directions
1. Shallow alignmen…
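A minimal sketch of the shallow-alignment idea (a single trainable projection bridging a frozen vision encoder and a frozen LM; all module names, shapes, and the `inputs_embeds` interface are illustrative assumptions):

```python
import torch
import torch.nn as nn

class ShallowAlignedVLM(nn.Module):
    """Frozen vision encoder + frozen LM, bridged by one trainable projection."""

    def __init__(self, vision_encoder: nn.Module, language_model: nn.Module,
                 vision_dim: int, lm_embed_dim: int):
        super().__init__()
        self.vision_encoder = vision_encoder
        self.language_model = language_model
        # Both pretrained backbones stay frozen; only the projection trains.
        for p in self.vision_encoder.parameters():
            p.requires_grad = False
        for p in self.language_model.parameters():
            p.requires_grad = False
        self.projection = nn.Linear(vision_dim, lm_embed_dim)

    def forward(self, images: torch.Tensor, text_embeds: torch.Tensor):
        # vision_encoder is assumed to return patch features (B, N, vision_dim);
        # project them into the LM embedding space and prepend as visual tokens.
        visual_tokens = self.projection(self.vision_encoder(images))
        inputs = torch.cat([visual_tokens, text_embeds], dim=1)
        # Assumes an HF-style LM that accepts inputs_embeds.
        return self.language_model(inputs_embeds=inputs)
```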