-
# 🌟 New adapter setup
## Model description
Big Bird is a new model available in Hugging Face (an efficient Transformer).
## Open source status
* [X] the model implementation is available: (gi…
-
# Proposal: Proxy/Delegate Voting
Author(s): Luke Duncan
Last updated: 11/29/2017
## Abstract
Proxy- or delegate-based voting lets voters choose either to participate actively or to delegate their …
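The delegation idea above can be illustrated with a minimal sketch: each voter either casts a direct vote or names a proxy, and tallying follows each delegation chain until it reaches a direct vote. This is purely illustrative; the function names (`resolve_vote`, `tally`) and cycle handling are assumptions, not part of the proposal.

```python
# Illustrative sketch of delegate-vote resolution; names are hypothetical.

def resolve_vote(voter, delegations, direct_votes):
    """Follow a voter's delegation chain until a direct vote is found.

    Returns None on a delegation cycle or a dangling chain.
    """
    seen = set()
    current = voter
    while current not in direct_votes:
        if current in seen or current not in delegations:
            return None  # cycle, or chain ends without a direct vote
        seen.add(current)
        current = delegations[current]
    return direct_votes[current]

def tally(voters, delegations, direct_votes):
    """Count one vote per voter, resolved through delegation."""
    counts = {}
    for v in voters:
        choice = resolve_vote(v, delegations, direct_votes)
        if choice is not None:
            counts[choice] = counts.get(choice, 0) + 1
    return counts

# Example: b delegates to a, c delegates to b, a votes "yes".
print(tally(["a", "b", "c"], {"b": "a", "c": "b"}, {"a": "yes"}))  # {'yes': 3}
```

One design question any real implementation must answer, which this sketch sidesteps by returning `None`, is what happens to votes caught in a delegation cycle or left undelegated.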
-
I hope this message finds you well. I recently read your impressive paper, [FLatten Transformer: Vision Transformer using Focused Linear Attention], and I must say I was truly amazed by your work.
…
-
Hello, we are a team researching the dependency management mechanism of Golang. During our analysis, we came across your project and noticed that it contains vulnerabilities (CVE-2022-41723, CVE-2022-…
-
The Self-Attention mechanism
So far, every input the model receives can be viewed as a single vector.
But what about more complex inputs, such as a sequence, or inputs whose length varies from one example to the next?
![image](https://user-images.githubusercontent.com/34474924/236625854-800b74f8-9ee9-4517-97b4-e3…
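The variable-length problem raised above is exactly what self-attention handles: the same weight matrices work for any sequence length, because attention scores are computed pairwise between the input vectors. Below is a toy NumPy sketch of the standard scaled dot-product formulation; the dimensions and random weights are purely illustrative.

```python
# Toy self-attention over a variable-length sequence of vectors.
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """X: (n, d_in) sequence of n input vectors; returns (n, d_v)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])          # (n, n) pairwise scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # each output mixes all inputs

rng = np.random.default_rng(0)
d_in, d_k, d_v = 4, 3, 3
Wq = rng.normal(size=(d_in, d_k))
Wk = rng.normal(size=(d_in, d_k))
Wv = rng.normal(size=(d_in, d_v))

# The same weights handle sequences of different lengths:
for n in (2, 5):
    out = self_attention(rng.normal(size=(n, d_in)), Wq, Wk, Wv)
    print(out.shape)  # (2, 3) then (5, 3)
```

Note that the output length always matches the input length, so the layer can be stacked without knowing the sequence length in advance.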
-
Hi,
I was wondering: why use WordConv (separable convolution) in the NL encoder instead of the usual feed-forward NN (as in the original Transformer)? Is it mainly because separable convolution is easier to train? Did…
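One concrete angle on this trade-off is parameter count. The back-of-the-envelope comparison below uses typical Transformer dimensions (d_model = 512, d_ff = 2048, kernel size 3), which are assumptions and not necessarily the paper's configuration; biases are omitted.

```python
# Back-of-the-envelope parameter comparison (illustrative, not the
# paper's exact configuration): depthwise-separable 1-D convolution
# vs. the Transformer's position-wise feed-forward layer.

def separable_conv_params(d_model, kernel_size):
    depthwise = d_model * kernel_size  # one k-tap filter per channel
    pointwise = d_model * d_model      # 1x1 conv mixing channels
    return depthwise + pointwise

def feed_forward_params(d_model, d_ff):
    return d_model * d_ff + d_ff * d_model  # two projections, biases omitted

d_model, d_ff, k = 512, 2048, 3
print(separable_conv_params(d_model, k))   # 263680
print(feed_forward_params(d_model, d_ff))  # 2097152
```

Under these assumptions the separable convolution has roughly 8x fewer parameters than the feed-forward block, which may be part of the motivation, independent of any training-stability argument.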
-
## Module
queue_job
## Describe the bug
Since https://github.com/OCA/queue/pull/347, jobs have a `cancelled` state.
However, the job runner does not understand this state.
https://github.c…
-
### Bug description
Hi Sebastian,
There is the following statement in Section 3.5.2:
> In the transformer architecture, including models like GPT, dropout in the attention
> mechanism is typically…
-
Hi,
First of all, thank you for your excellent work!
I have two questions:
1. What is your strategy for generating plain background images with false positive hints, and does picking a different backgro…
-
**Github username:** @akshaysrivastav
**Submission hash (on-chain):** 0x904a7e8b507563a67a2f337cd89822b215f05571fb53def0ab7438fbc4dfa4b7
**Severity:** high
**Description:**
As per the…