redotvideo / mamba-chat

Mamba-Chat: A chat LLM based on the state-space model architecture 🐍
Apache License 2.0
911 stars · 69 forks

How could I run this on Windows 10? #10

Open · KevinRyu opened this issue 11 months ago

KevinRyu commented 11 months ago

Hello,

When I tried to install the packages from requirements.txt, I got the following error:

ERROR: Could not find a version that satisfies the requirement triton (from mamba-ssm) (from versions: none)
ERROR: No matching distribution found for triton

As far as I know, the triton package only supports Linux. What should I do?
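
For context, the failure above is pip refusing to resolve Triton, which mamba-ssm depends on: at the time of this thread, Triton published wheels for Linux only, and its fused kernels need a CUDA GPU. A minimal preflight sketch (assuming torch is already installed) to check both constraints:

```python
# Minimal environment check (a sketch, assuming torch is installed):
# mamba-ssm pulls in Triton, whose wheels target Linux, and its fused
# kernels require a CUDA-capable GPU.
import platform

import torch

print("OS:", platform.system())                      # needs "Linux"
print("CUDA available:", torch.cuda.is_available())  # needs True
```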

justusmattern27 commented 11 months ago

Hey, it probably doesn't make sense to run this on Windows, as you'll need a GPU (which I assume you don't have locally). It's probably best to use a cloud GPU service or run it on Google Colab. You can find a Mamba-Chat demo on Google Colab here
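
For reference, a minimal sketch of what the Colab demo boils down to. The model id havenhq/mamba-chat and the MambaLMHeadModel API are taken from the repo's chat script; the real script also applies the tokenizer's chat template, which the plain prompt below skips. Verify both against the current README before relying on this:

```python
# Hedged sketch: load Mamba-Chat on a CUDA runtime (e.g. a Colab GPU).
# Assumes the dependencies are installed, roughly:
#   pip install torch transformers mamba-ssm causal-conv1d
import torch
from transformers import AutoTokenizer
from mamba_ssm.models.mixer_seq_simple import MambaLMHeadModel

device = "cuda"  # mamba-ssm's fused Triton kernels require a GPU
tokenizer = AutoTokenizer.from_pretrained("havenhq/mamba-chat")
model = MambaLMHeadModel.from_pretrained(
    "havenhq/mamba-chat", device=device, dtype=torch.float16
)

prompt = "What is a state-space model?"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(device)
out = model.generate(input_ids=input_ids, max_length=200)  # returns token ids
print(tokenizer.decode(out[0], skip_special_tokens=True))
```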

SzaremehrjardiMT commented 11 months ago

Does Mamba-Chat only run on a GPU?

justusmattern27 commented 11 months ago

@SzaremehrjardiMT Currently yes. There's an open issue in llama.cpp to support the mamba architecture, though, which would make it possible to run without a GPU: https://github.com/ggerganov/llama.cpp/issues/4353

richardburleigh commented 11 months ago

@KevinRyu It won't be optimized, but you can try mamba-minimal.
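
For anyone who wants to try it, here's a rough sketch of CPU inference with mamba-minimal. Class and method names are assumed from the johnma2006/mamba-minimal repo (clone it first so model.py is importable), and the checkpoint name follows its demo notebook:

```python
# Hedged sketch: greedy decoding with mamba-minimal, a pure-PyTorch
# reimplementation with no fused CUDA kernels, so it runs on CPU (slowly).
import torch
from transformers import AutoTokenizer
from model import Mamba  # from the cloned mamba-minimal repo

# mamba-minimal loads the HuggingFace state-spaces checkpoints directly.
model = Mamba.from_pretrained("state-spaces/mamba-130m")
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

input_ids = tokenizer("Mamba is", return_tensors="pt").input_ids
with torch.no_grad():
    for _ in range(20):  # simple greedy decoding loop
        logits = model(input_ids)                   # (batch, seq_len, vocab)
        next_id = logits[:, -1].argmax(dim=-1, keepdim=True)
        input_ids = torch.cat([input_ids, next_id], dim=1)
print(tokenizer.decode(input_ids[0]))
```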