Closed: igormoreira-2 closed this issue 1 year ago
Hi @igormoreira-2
I don't see any problems with the example you provided. Whenever you call the _get_prompt() method (which, in most cases, is not meant to be called by users directly), only the prompt itself (the text that is later passed to the model) is returned. The line "Your JSON response" is part of the prompt, as it instructs the model to immediately start producing the JSON. However, the JSON object itself obviously cannot be in the prompt, since it has to be generated by the model.
What exactly are you trying to achieve? If you need the predictions, you can just call .predict([query]), but if you want to see the raw response of the model, this is not really possible.
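For reference, a minimal end-to-end sketch of that workflow (the import paths may differ between scikit-llm versions, and the training data below is purely illustrative, not taken from this thread):

```python
# Minimal sketch: fit a DynamicFewShotGPTClassifier and obtain labels via .predict()
# (import paths may vary between scikit-llm versions; the data here is made up)
from skllm import DynamicFewShotGPTClassifier
from skllm.config import SKLLMConfig

SKLLMConfig.set_openai_key("<YOUR_OPENAI_KEY>")  # placeholder, use your own key

X = [
    "The movie was fantastic, I loved every minute.",
    "Terrible plot and even worse acting.",
    "An average film, nothing special.",
]
y = ["positive", "negative", "neutral"]

clf = DynamicFewShotGPTClassifier(n_examples=1)
clf.fit(X, y)

# .predict() returns the parsed labels; the raw JSON produced by the model is not exposed
print(clf.predict(["I really enjoyed this one!"]))
```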
Hi @OKUA1, thanks for the swift response. You are right, there is no issue at all. It's just that, reading the get_prompt() response, I assumed the JSON output would be there. But yeah, when I called the .predict() method, it worked just fine. As I said, I only started experimenting with skllm 2 days ago, so I'm still learning ;).
Hi,
I just started using skllm and I have tried to build a simple DynamicFewShotGPTClassifier with the following code:

Everything seems to work just fine, but the "Your JSON response" is blank. Any idea what could be happening? Thanks

Prompt output: