Closed — changliu98 closed this issue 5 months ago
Hi Qi Luo, thanks for updating the requirements.txt, I do appreciate it! The setup for flash-attention with the text-generation CLI is a little tricky (I guess that's a common scenario when dealing with GPUs); I hope we can have better docs on this in the future 😊
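For anyone who runs into the same thing, here is a minimal sketch of the usual flash-attention install — this is my own assumption, not the authors' documented procedure, and it presumes a CUDA toolkit matching your PyTorch build is already installed:

```bash
# flash-attn builds from source, so installing ninja speeds up compilation considerably
pip install ninja

# --no-build-isolation lets the build see the already-installed torch/CUDA headers;
# capping parallel jobs helps on machines with limited RAM
MAX_JOBS=4 pip install flash-attn --no-build-isolation
```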
Hi, thank you all for the hard work. I am trying to replicate the results, but the provided requirements.txt doesn't specify version numbers. Would it be possible to publish the package and Python versions, or the environment YAML file if you were using conda?
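For reference, the standard way to capture that information would be something like the following (a minimal sketch, assuming a pip- or conda-based setup; `--no-builds` just drops platform-specific build strings):

```bash
# Python interpreter version used for the experiments
python --version

# Exact package versions for a pip-based environment
pip freeze > requirements.txt

# Or, if the environment was created with conda
conda env export --no-builds > environment.yml
```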