Closed: yaswanth-iitkgp closed this issue 2 months ago
Hi @yaswanth-iitkgp, can you make your _single_call function return a list of strings (e.g., [generated_text])?
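The suggested fix could look roughly like the sketch below. This is a minimal illustration only: the class internals and the generate_fn helper are assumptions for demonstration, not RULER's actual VLLMClient code. The point is simply that _single_call wraps its single completion in a list before returning it.

```python
# Hypothetical sketch, not the actual RULER implementation.
class VLLMClient:
    def __init__(self, generate_fn):
        # generate_fn: any callable that takes a prompt and returns
        # one generated string (placeholder for the real request logic).
        self.generate_fn = generate_fn

    def _single_call(self, prompt: str) -> list:
        generated_text = self.generate_fn(prompt)
        # Wrap the single completion in a list, since downstream code
        # expects a list of generations per prompt.
        return [generated_text]
```

For example, `VLLMClient(lambda p: "ok")._single_call("hi")` returns `["ok"]` rather than the bare string `"ok"`.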
Hi @hsiehjackson, thanks for the help, but I was able to get things working by just using the HF model for now.
Description:
I was unable to run the vllm server on my server, so I modified the VLLMClient class in the RULER/scripts/pred/client_wrappers.py file. Below is the modified code for the VLLMClient class:

When I run the command bash run.sh llama3 synthetic, I do not see any generated text and encounter the following error:
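The poster's modified code is not shown here, but for context, one common way a client wrapper talks to a running vllm server is through its OpenAI-compatible /v1/completions HTTP endpoint. The sketch below is an assumption-laden illustration of that pattern: the URL, port, model name, and parameter choices are placeholders, not the poster's actual VLLMClient changes.

```python
import json
import urllib.request

# Assumed default address for a locally running vLLM server.
VLLM_URL = "http://localhost:8000/v1/completions"

def build_payload(prompt: str, model: str, max_tokens: int = 128) -> dict:
    # Assemble the JSON body the completions endpoint expects.
    return {
        "model": model,
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": 0.0,
    }

def query_vllm(prompt: str, model: str) -> list:
    # POST the payload and return the generated texts as a list of
    # strings, matching what RULER's callers expect from _single_call.
    req = urllib.request.Request(
        VLLM_URL,
        data=json.dumps(build_payload(prompt, model)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return [choice["text"] for choice in body["choices"]]
```

Calling query_vllm requires a live server, but build_payload can be inspected on its own to check the request shape.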
I am trying to get the scores for llama3-8B first, and then I want to test on the llama3-8B 1M-context model. Could you please help me resolve this issue?