Closed psinha30 closed 2 years ago
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from parallelformers import parallelize
import torch

model = AutoModelForCausalLM.from_pretrained("./2.7B")
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-2.7B")
print("Model Loaded")

parallelize(model, num_gpus=2, fp16=True, verbose='detail')
```
Error
How can I fix this?
How to reproduce
Environment