modular-ml / wrapyfi-examples_llama

Inference code for Facebook's LLaMA models with Wrapyfi support
GNU General Public License v3.0

where is zeromq_proxy_broker.py and dir standalone? #4

Closed ChuangLee closed 1 year ago

ChuangLee commented 1 year ago

The examples in the README confused me: where are the zeromq_proxy_broker.py file and the standalone dir?

Replace all occurrences of … and … before running the scripts.

Start the Wrapyfi ZeroMQ broker from within the Wrapyfi repo:

    cd wrapyfi/standalone
    python zeromq_proxy_broker.py --comm_type pubsubpoll

Start the first instance of the Wrapyfi-wrapped LLaMA from within this repo and env (order is important: don't start wrapyfi_device_idx=0 before wrapyfi_device_idx=1):

    CUDA_VISIBLE_DEVICES="0" OMP_NUM_THREADS=1 torchrun --nproc_per_node 1 example.py --ckpt_dir /checkpoints/7B --tokenizer_path /checkpoints/tokenizer.model --wrapyfi_device_idx 1 --wrapyfi_total_devices 2
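Only the device_idx 1 instance is shown above. Since --wrapyfi_total_devices is 2, a second instance with --wrapyfi_device_idx 0 presumably has to be launched afterwards. The exact command below is an assumption mirroring the first instance (the GPU index in CUDA_VISIBLE_DEVICES in particular is a guess); check the repo's README for the authoritative version:

```shell
# Assumed second instance — start this only AFTER the device_idx 1 instance is up,
# since the order matters. Runs on a second GPU (assumption) with device_idx 0.
CUDA_VISIBLE_DEVICES="1" OMP_NUM_THREADS=1 torchrun --nproc_per_node 1 example.py \
    --ckpt_dir /checkpoints/7B \
    --tokenizer_path /checkpoints/tokenizer.model \
    --wrapyfi_device_idx 0 \
    --wrapyfi_total_devices 2
```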

ChuangLee commented 1 year ago

Sorry, the 'wrapyfi/standalone' directory is a subdirectory of the Wrapyfi project, not of this wrapyfi-examples_llama repo.