I am trying to parallelize the calls to the OpenAI API using a ThreadPoolExecutor; the model call runs inside the function that each thread executes. This runs fine in parallel with no issues. However, adding model = CoT(signature).activate_assertions() makes the execution sequential.
Is there any workaround? Am I doing something wrong? Thanks in advance!
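For reference, the parallel setup I mean looks roughly like this; call_model is a hypothetical stand-in for building and invoking the DSPy module inside each thread:

```python
from concurrent.futures import ThreadPoolExecutor

def call_model(question: str) -> str:
    # Hypothetical placeholder: the real code would build the module here,
    # e.g. model = CoT(signature), and return model(question=question).answer.
    return f"answer to: {question}"

questions = ["q1", "q2", "q3"]

# Each thread runs call_model independently; this pattern stays parallel
# until activate_assertions() is added to the module construction.
with ThreadPoolExecutor(max_workers=3) as pool:
    answers = list(pool.map(call_model, questions))

print(answers)  # ['answer to: q1', 'answer to: q2', 'answer to: q3']
```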
PS:
Here is the CoT class in case it's useful:
import re

import dspy

class CoT(dspy.Module):
    def __init__(self, signature: dspy.SignatureMeta):
        super().__init__()
        self.generate_answer = dspy.ChainOfThought(signature)

    def forward(self, **kwargs):
        pred = self.generate_answer(**kwargs)
        # Extract the first {...} block so pred.answer can be fed to json.loads.
        json_answer = re.findall(r"(\{[^}]*\})", pred.answer)
        if json_answer:
            pred.answer = json_answer[0]
        print(pred.answer)
        dspy.Suggest(
            bool(json_answer),
            "answer must be a string that can be passed directly to json.loads",
        )
        return pred