joykirat18 opened this issue 3 months ago
Unable to load microsoft/Phi-3-mini-128k-instruct.
Code snippet
```python
def loadTransformerLensModel(modelPath):
    tokenizer = AutoTokenizer.from_pretrained(modelPath)
    hf_model = AutoModelForCausalLM.from_pretrained(
        "microsoft/Phi-3-mini-128k-instruct",
        device_map="cuda",
        torch_dtype="auto",
        trust_remote_code=True,
    )
    model = HookedTransformer.from_pretrained(modelPath, hf_model=hf_model, device='cpu')
    return model, tokenizer

model, tokenizer = loadTransformerLensModel(model_name)
```
What changes should be made to run Phi-3-mini-128k-instruct?