jackqdldd opened 1 week ago
```python
runconfig = RunConfig(
    timeout=120,
    max_retries=1,
    max_wait=120,
    max_workers=1,
    log_tenacity=True,
)
testset = generator.generate(
    testset_size=args.test_size,
    query_distribution=distributions,
    run_config=runconfig,
    with_debugging_logs=True,
    raise_exceptions=True,
)
```
The max_workers setting doesn't seem to work! Frequent calls to the LLM cause it to freeze.
Hey @jackqdldd, which LLM and LLM provider are you using?
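For reference, `max_workers` in `RunConfig` caps how many tasks run concurrently; it does not throttle how many requests per second the underlying LLM client sends, so a provider-side rate limit can still be hit. One possible workaround, if the generator LLM is a LangChain chat model, is to rate-limit the model itself before wrapping it for ragas. This is a minimal sketch only; the model name and limiter values are illustrative, not taken from the issue:

```python
from langchain_core.rate_limiters import InMemoryRateLimiter
from langchain_openai import ChatOpenAI
from ragas.llms import LangchainLLMWrapper

# Illustrative throttle: roughly 1 request per second, small burst allowance.
rate_limiter = InMemoryRateLimiter(
    requests_per_second=1.0,
    check_every_n_seconds=0.1,
    max_bucket_size=5,
)

# Hypothetical model choice; substitute whatever chat model the generator uses.
llm = ChatOpenAI(model="gpt-4o-mini", rate_limiter=rate_limiter)
generator_llm = LangchainLLMWrapper(llm)
```

With the request rate capped at the client level, `max_workers=1` then only needs to limit concurrency, which is what it is designed to do.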