-
https://github.com/Wiredcraft/nestjs-bunyan-logger/blob/9b8e8847e53cb984da6f9c0cce7ca15d34bdc0da/src/logger.providers.ts#L111
@jiangmengyu0409 the transformer is a powerful feature of this library,…
-
I found this project being discussed in the local llama subreddit.
I read the paper but had some questions.
One of the questions that came up, and that is still gnawing at me: why Transformer++ as your basis of co…
-
https://github.com/ValSpada/parcialesFuncional/blob/2d4a239db3581a12484650315a178079b5f709b9/transformers.hs#L85-L86
This repeats code. If we want to check whether a condition holds for several values, we…
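The reviewer's point is a general one (the file under review is Haskell; the same idea is sketched here in Python, with hypothetical names): rather than writing the same condition out once per value, apply a single predicate across a collection.

```python
# Instead of repeating the same check for each value:
#   cond(x1) and cond(x2) and cond(x3)
# apply one predicate over a collection (hypothetical names):
def holds_for_all(cond, values):
    """True when `cond` holds for every value in `values`."""
    return all(cond(v) for v in values)

print(holds_for_all(lambda n: n > 0, [1, 2, 3]))   # True
print(holds_for_all(lambda n: n > 0, [1, -2, 3]))  # False
```

The same shape works for "holds for at least one value" by swapping `all` for `any`.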
-
### Eschewed features
- [X] This issue is not requesting templating, unstructured edits, build-time side-effects from args or env vars, or any other eschewed feature.
### What would you like to have …
-
Can we use videollama2 directly via the `transformers` library?
-
Hi,
Thanks for the great repo. Are the data and the model used to train the 154M-parameter model for the perplexity calculation available?
-
**Motivation**
The `basename()` transformer will make it easier for rule authors to write concise and effective rules, particularly for fields that return a full path, by extracting the base name from…
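The idea behind such a transformer can be sketched as follows (a minimal illustration in Python, not the library's actual implementation; the function name mirrors the proposed `basename()`):

```python
import ntpath


def basename(value):
    """Strip the directory part of a path, keeping only the final
    component. ntpath.basename handles both Windows ('\\') and
    POSIX ('/') separators, which matters for log fields that may
    carry paths from either platform."""
    return ntpath.basename(value)


print(basename(r"C:\Windows\System32\cmd.exe"))  # cmd.exe
print(basename("/usr/local/bin/python3"))        # python3
```

A rule could then match on the extracted base name (e.g. `cmd.exe`) instead of enumerating every possible parent directory.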
-
I'm trying to replicate the results on the KIT dataset, but I've observed that the training and validation classification accuracy of the Masked Transformer and Residual Transformer are not very high.…
-
![image](https://github.com/csguoh/LEMMA/assets/49752023/47dad5f0-9c91-4fd2-9149-70c24a1ae197)
Hi!
I am very interested in this project.
Can you tell me which project the weight pretrain_tran…