Open rabeeqasem opened 2 months ago
Hi! We updated the package a few weeks ago to support Mamba, and we renamed the import to
from mistral_inference.transformer import Transformer
Is it possible to know where you learned about from mistral_inference.model import Transformer
so we can update the docs? Thanks!
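For scripts that need to run against both old and new releases, a small fallback import can bridge the rename — a minimal sketch, assuming only the module path changed and the Transformer class name stayed the same:

```python
# Try the new module path first (mistral-inference after the Mamba rename),
# then fall back to the legacy path; leave Transformer as None if the
# package is not installed at all.
try:
    from mistral_inference.transformer import Transformer  # newer releases
except ImportError:
    try:
        from mistral_inference.model import Transformer  # older releases
    except ImportError:
        Transformer = None  # mistral-inference not installed
```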
Following these 3 instructions:
!git clone https://github.com/mistralai/mistral-inference.git
%cd /root/mistral_inference/src/
from mistral_inference.transformer import Transformer
It's also on the page here https://huggingface.co/mistralai/Codestral-22B-v0.1
Thank you, just fixed it!
I can't 'from mistral_inference.transformer import Transformer' either;
Could you share your error? The previous errors were about from mistral_inference.model import Transformer
not working, since it was replaced with from mistral_inference.transformer import Transformer
in recent versions.
I solved the problem by building a new Python environment with Python 3.11. FYI, the error will still occur if you ignore the versions of these dependencies: pip installation succeeds, but the package doesn't work at all. I hope I made it clear.
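An explicit interpreter-version guard can surface this mismatch early — a hypothetical helper (not part of mistral-inference), assuming Python 3.11 as the known-good version reported in this thread:

```python
import sys

def check_python(required=(3, 11)):
    """Fail fast if the interpreter is older than the known-good version.

    On older interpreters, pip install of mistral-inference can succeed
    while the package fails at import time, so an explicit check gives
    a clearer error message than a broken import."""
    if sys.version_info[:2] < required:
        raise RuntimeError(
            f"Python {required[0]}.{required[1]}+ recommended for "
            f"mistral-inference; running {sys.version.split()[0]}"
        )
    return True
```

Calling check_python() at the top of the inference script, before any mistral_inference import, makes the version requirement explicit instead of relying on a confusing import failure.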
Hi! We updated the package a few weeks ago to support Mamba, and we renamed the import to
from mistral_inference.transformer import Transformer
Is it possible to know where you learned about from mistral_inference.model import Transformer
so we can update the docs? Thanks!
It's also being used here: https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1
Thank you! Have fixed it! ❤️
Python -VV
Pip Freeze
Reproduction Steps
Expected Behavior
I have a problem importing mistral_inference. I fine-tuned the Mistral model based on this official repo, but when I try to do inference using mistral_inference it gives me this error.
Additional Context
No response
Suggested Solutions
No response