kipgparker / soft-prompt-tuning
MIT License | 334 stars | 44 forks

Issues
#13  Prompt tuning significantly slows down code & uses more memory than without? (opened by patricks-lab, 11 months ago, 0 comments)
#12  how could I apply prompt tuning to GPT-3? (opened by smith-co, 1 year ago, 0 comments)
#11  Is padding redundant with the later version of transformers? (closed by TinfoilHat0, 10 months ago, 0 comments)
#10  Has anyone implemented initializing prompts for class labels? (opened by dawei0716, 2 years ago, 0 comments)
#9   question on training loss (opened by zluw1117, 2 years ago, 1 comment)
#8   Input clarification from example (closed by JosephGatto, 2 years ago, 0 comments)
#7   How to generate text? (opened by luke-thorburn, 2 years ago, 10 comments)
#6   Some question about "multi-task mixing training" (opened by czwlines, 2 years ago, 0 comments)
#5   the length of n_tokens (closed by wofeichangaiwoai, 2 years ago, 0 comments)
#4   did it doenst need backpropagation process? (opened by qyccc, 3 years ago, 4 comments)
#3   fix embedding initialization (closed by guikunchen, 3 years ago, 0 comments)
#2   What is initialization from vocab for? (opened by EmElleE, 3 years ago, 1 comment)
#1   Some question about the "LM Adaptation" (opened by qcwthu, 3 years ago, 1 comment)