pytorch-labs / gpt-fast

Simple and efficient pytorch-native transformer text generation in <1000 LOC of python.

Can we run gpt-fast from Windows Command Prompt or PowerShell? #45

Closed · maxloosmu closed this issue 6 months ago

maxloosmu commented 6 months ago

For this code:

export MODEL_REPO=meta-llama/Llama-2-7b-chat-hf
./scripts/prepare.sh $MODEL_REPO

I could change the first line to:

set MODEL_REPO=meta-llama/Llama-2-7b-chat-hf

But I'm not sure how to proceed from there. Is there a way to run gpt-fast from Windows Command Prompt or PowerShell?

[Screenshot attached: Screenshot 2023-12-13 015631]
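
A rough PowerShell equivalent of those two lines, assuming prepare.sh only chains scripts/download.py and scripts/convert_hf_checkpoint.py (worth checking against the script itself before relying on this), would look something like:

# Hypothetical PowerShell translation of the export + prepare.sh invocation
$env:MODEL_REPO = "meta-llama/Llama-2-7b-chat-hf"
# A Hugging Face login (e.g. huggingface-cli login) may be needed first, since the Llama-2 weights are gated
python scripts/download.py --repo_id $env:MODEL_REPO
python scripts/convert_hf_checkpoint.py --checkpoint_dir "checkpoints/$($env:MODEL_REPO)"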

Chillee commented 6 months ago

I don't think this repo supports PowerShell - I would suggest using WSL if you want to run this repo on Windows.
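
A minimal sketch of that route (assuming an NVIDIA GPU and a recent Windows driver with WSL CUDA support; the exact install steps in the README take precedence):

# From an elevated PowerShell prompt, install WSL with the default Ubuntu distro
wsl --install -d Ubuntu

# Then, inside the Ubuntu shell:
git clone https://github.com/pytorch-labs/gpt-fast.git
cd gpt-fast
pip install -r requirements.txt   # or install a CUDA-enabled PyTorch nightly per the README
export MODEL_REPO=meta-llama/Llama-2-7b-chat-hf
./scripts/prepare.sh $MODEL_REPO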

maxloosmu commented 6 months ago

@Chillee thanks, but I'm having problems trying to run WSL2 with GPU support:

https://github.com/microsoft/WSL/issues/10869
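
Before debugging gpt-fast itself, GPU passthrough inside WSL2 can be sanity-checked with a couple of standard commands, for example:

# Inside the WSL2 shell: confirm the Windows NVIDIA driver is visible to Linux
nvidia-smi

# Confirm PyTorch can see the GPU
python -c "import torch; print(torch.cuda.is_available(), torch.cuda.device_count())"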

Chillee commented 6 months ago

Unfortunately, I don't have any plans to support PowerShell in this repo.