Closed: starplatinum3 closed this issue 4 months ago
I modified the code and it works now. For example:
```python
context_score = few_shot_context_relevance_scoring_vllm(
    context_relevance_system_prompt, query, document, model_choice, query_id,
    debug_mode, host_url, request_delay, failed_extraction_count,
    in_domain_prompts_dataset, openai_api_key=openai_api_key
)
```
The change itself: I added an `openai_api_key` parameter:
```python
from typing import Dict

from openai import OpenAI

if VLLM_AVAILABLE:
    def few_shot_context_relevance_scoring_vllm(
        system_prompt: str, query: str, document: str, model_choice: str,
        query_id: str, debug_mode: bool, host_url: str, request_delay: int,
        failed_extraction_count: Dict[str, int] = {'failed': 0},
        few_shot_examples=None, openai_api_key: str = "EMPTY"
    ) -> int:
        # New parameter: forward the caller's key to the OpenAI-compatible
        # client instead of the hard-coded default "EMPTY".
        client = OpenAI(
            api_key=openai_api_key,
            base_url=host_url
        )
```
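For context, a minimal sketch of how the rest of the function body might use that client; the prompt assembly and the `[[Yes]]` answer marker below are assumptions for illustration, not the project's exact logic:

```python
        # Sketch only (assumed shape, not the project's exact code):
        # ask the local endpoint and map its yes/no answer to a 0/1 score.
        response = client.chat.completions.create(
            model=model_choice,
            messages=[
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": f"Query: {query}\nDocument: {document}"},
            ],
        )
        answer = response.choices[0].message.content
        return 1 if "[[Yes]]" in answer else 0  # assumed answer format
```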
My local model server requires a token with each request. May I ask if there is a parameter for passing it?
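For anyone hitting the same thing: if the local endpoint is vLLM's OpenAI-compatible server, the server can be launched so that it requires a key (I believe via its `--api-key` option), and the token passed through the new `openai_api_key` parameter must match it. A minimal standalone check, with a placeholder URL and token (both are assumptions, not values from this repo):

```python
from openai import OpenAI

# Placeholder URL/token: use whatever your local server was launched with.
client = OpenAI(api_key="token-abc123", base_url="http://localhost:8000/v1")

# Listing models is a cheap way to confirm the token is accepted;
# a wrong key raises openai.AuthenticationError (HTTP 401).
print([m.id for m in client.models.list().data])
```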