Open shubhamagarwal92 opened 3 months ago
(1) seems to be caused by changes in the latest version of transformers interacting badly with how we set up generation. For now I am going to pin transformers to v4.36.0 to avoid the issue.

I am not sure what happened to (2) and (3); they worked for me. If those issues persist, can you print out the contents of the batch and show me?
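For reference, pinning the version as described above can be done with something like the following (assuming a pip-based environment; the version spec is taken from the comment above):

```shell
# Pin transformers to v4.36.0 to avoid the generation incompatibility
pip install "transformers==4.36.0"
```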
Hi! Great work! Congratulations! Thanks for releasing the code!

However, I am not able to reproduce the results for taskrunners using any of the `allenai/uio2-large`, `allenai/uio2-xl`, or `allenai/uio2-xxl` models. I am using `python 3.8` with `transformers 4.38.2`. I also downloaded the `tokenizer.model` from the official llama repo. I am getting the error as:
Could you please let me know how to provide the `generation_config`?

I am getting the `IndexError` while calculating `out = model(batch)` as:

I am getting the error (even though I provided modality as `text`) as:
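For what it's worth, with recent `transformers` versions a generation config can usually be constructed and passed explicitly. A minimal sketch of that pattern (assuming the model follows the standard Hugging Face `generate` API, which may not hold for this repo's custom model class; the parameter values are placeholders, not recommended settings):

```python
from transformers import GenerationConfig

# Build an explicit generation config instead of relying on the model's default.
# The values here are illustrative placeholders only.
gen_config = GenerationConfig(max_new_tokens=64, do_sample=False)

# It would then be passed at generation time, e.g.:
# out = model.generate(**batch, generation_config=gen_config)
print(gen_config.max_new_tokens)
```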