ollama / ollama-python

Ollama Python library
https://ollama.com
MIT License

fix: Ensure JSON response is properly parsed in generate function when format is set to 'json' #199

Open wyy511511 opened 4 days ago

wyy511511 commented 4 days ago

When I set format to "json", my response looks like this:

{'model': 'llama3', 'created_at': '2024-06-28T05:44:32.12715Z', 'response': '{"results": [\n{\n"entity_type": "IMEI",\n"text": "06-184755-866851-3"\n}\n]}\n\n \n\n ', 'done': True, 'done_reason': 'stop', 'context': [], 'total_duration': 4528553708, 'load_duration': 2316500, 'prompt_eval_count': 437, 'prompt_eval_duration': 2743245000, 'eval_count': 32, 'eval_duration': 1781380000}

As you can see, although the model does return a JSON object, the existing code leaves the 'response' field as a raw string (hence the embedded \n escapes, and the fact that type checking on it reports str).
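
For context, this is what the current behavior looks like from the client side, along with the manual workaround; the prompt here is just a placeholder:

```python
import json

import ollama

# format='json' asks the model for JSON output, but the client still
# returns the 'response' field as a plain string
resp = ollama.generate(model='llama3', prompt='Extract the IMEI as JSON.', format='json')
print(type(resp['response']))  # <class 'str'>

# manual workaround today: decode the string yourself
parsed = json.loads(resp['response'])
```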

After adding code to parse the 'response' field as JSON when format is 'json', the response looks like this:

{'model': 'llama3', 'created_at': '2024-06-28T09:16:56.417739Z', 'response': {'results': [{'entity_type': 'IMEI', 'text': '06-184755-866851-3'}]}, 'done': True, 'done_reason': 'stop', 'context': ...

The result is cleaner and matches the expected structured format.
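
For illustration, here is a minimal sketch of the kind of change this describes; the helper name and its placement are hypothetical, not the actual diff:

```python
import json
from typing import Any


def _parse_json_response(response: dict[str, Any], fmt: str) -> dict[str, Any]:
    # Hypothetical post-processing step: when the caller requested
    # format='json', decode the 'response' field into a Python object.
    if fmt == 'json' and isinstance(response.get('response'), str):
        try:
            response['response'] = json.loads(response['response'])
        except json.JSONDecodeError:
            # leave the raw string untouched if the model emitted invalid JSON
            pass
    return response
```

This keeps behavior unchanged for other formats and degrades gracefully when the model output is not valid JSON.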