Closed · mbofb closed this issue 5 months ago
I had the same issue. In my case it was caused by adding "inferenceParameters.setAntiPrompt("\n");". When I use a stop string other than a newline, or don't set the anti-prompt at all, I get answers.
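To make the workaround concrete, here is a minimal sketch of what the two variants look like. This is a hypothetical fragment assuming the java-llama.cpp `InferenceParameters` setter API mentioned above; the stop string "User:" is just an illustrative value, not from the original report.

```java
InferenceParameters inferenceParameters = new InferenceParameters();

// Reported to produce empty responses with some models:
// inferenceParameters.setAntiPrompt("\n");

// Workarounds that were reported to work:
// 1. Use a stop string other than a bare newline
//    ("User:" is only an illustrative example):
inferenceParameters.setAntiPrompt("User:");
// 2. Or simply don't call setAntiPrompt at all.
```

With a newline as the anti-prompt, generation can stop at the very first line break the model emits, which would explain the empty output.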
Changing setAntiPrompt did not get me any text with the model I was using, but I tried another model (nous-hermes-llama-2-7b/nous-hermes-llama-2-7b.Q4_0.gguf) and it works with that one.
I just released version 3.0 and the problems should hopefully no longer occur. Otherwise, feel free to re-open.
response from Llama is empty