Open anttitapsa opened 2 months ago
Hi, I have the same issue here. Did you find a workaround or another version of Triton that works?
@CorvusVaine I have not tried any workarounds yet, but based on the error messages, the problem with newer Triton versions is that the `trans_b` argument of the `dot()` function used by flash-attention is not defined in those versions. I found this discussion about the same problem, and the workaround suggested there is to build the package from source. The same approach is also suggested in the GitHub repo of DNABERT-2.
Has this problem been resolved? I am also looking for a solution.
Hi,
I use a deep learning model that is written specifically against Triton version 2.0.0.dev20221202. I recently tried to set up the environment in a new location, but the 2.0.0.dev20221202 binaries can no longer be found by pip with Python 3.8.19. Have you removed or merged this version somewhere during the last month? A month ago I was able to install it via pip. I tried other Triton versions to run the model, but they did not work. The model I am using is DNABERT-2.