ollama / ollama-python


json format does not return all the results #94

Open jojogh opened 3 months ago

jojogh commented 3 months ago

Here is my code, and here is the problem I ran into:

code = """

package org.demo.codesmell.config;

public class AppConfig {

public static final String APP_PASSWORD = "appPassword";

public static final String Password = "password";
}

""" prompt = f""" Please review the following code, and you must find all the issues in the code:{code}"""

import ollama #ollama run codellama:34b-instruct-q5_K_M from ollama import Options

response = ollama.chat(model='codellama:34b-instruct-q5_K_M', messages=[ { 'role': 'user', 'content': prompt, },

],

format="json",

        #stream=True,
        options=Options(
            temperature=0,
            #top_p=0.9,
            #max_tokens=1024,
            num_ctx=8192,
            num_predict=-1)
        )

print(response['message']["content"])

Without format="json", it returns the following output, listing 5 issues: The code you provided is a Java class that defines some constants for an application configuration. Here are the issues I found:

  1. The naming convention for constants is not consistent. Some constants are in all caps, while others are camelCase. It's best to use a consistent naming convention throughout the code.
  2. The Password constant is misspelled. It should be APP_PASSWORD.
  3. The APP_PASSWORD constant is not used anywhere in the code. If it's not being used, it can be removed.
  4. There are no getters or setters for the constants. It's best to provide getters and setters for each constant so that they can be accessed and modified from outside the class.
  5. The class is not final, which means it can be extended. However, there is no need to extend this class as it only contains constants. It would be better to make the class final to prevent unintended inheritance.

But as soon as I use the option format="json", it returns only 2 issues in JSON format. Is that a bug?

gautam-fairpe commented 2 months ago

No, it is not a bug. When you force the model to return output in a specified JSON format, the result will not be as good as normal generation.
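
One common mitigation, shown as a minimal sketch below (the "issues" schema and the prompt wording are my own illustration, not from this thread): Ollama's docs recommend that when you pass format="json" you also instruct the model in the prompt to respond with JSON, ideally describing the exact structure you expect.

import ollama

java_code = "..."  # the Java snippet from the original report

# Hypothetical schema: spelling out the expected shape in the prompt
# gives the constrained decoder a clearer target.
prompt = f"""Please review the following code and report ALL issues you find.
Respond only with JSON of the form:
{{"issues": [{{"description": "<issue>", "suggestion": "<fix>"}}]}}

Code:
{java_code}
"""

response = ollama.chat(
    model='codellama:34b-instruct-q5_K_M',
    messages=[{'role': 'user', 'content': prompt}],
    format='json',
    options={'temperature': 0, 'num_ctx': 8192},
)
print(response['message']['content'])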

gautam-fairpe commented 2 months ago

I faced a similar issue while using llama-cpp-python with lm-format-enforcer. Its README covers this under "Diagnostics - Will I always get good results?":

https://github.com/noamgat/lm-format-enforcer#:~:text=Diagnostics%20%2D%20Will%20I%20always%20get%20good%20results%3F
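
For reference, here is a rough sketch of that llama-cpp-python + lm-format-enforcer setup. It follows the integration described in the README linked above; the model path and schema are hypothetical, and the exact API should be checked against that README.

from llama_cpp import Llama, LogitsProcessorList
from lmformatenforcer import JsonSchemaParser
from lmformatenforcer.integrations.llamacpp import build_llamacpp_logits_processor

# Hypothetical local GGUF model path.
llm = Llama(model_path="codellama-34b-instruct.Q5_K_M.gguf", n_ctx=8192)

# Hypothetical schema: constrain the output to {"issues": [ ... ]}.
schema = {
    "type": "object",
    "properties": {"issues": {"type": "array", "items": {"type": "string"}}},
    "required": ["issues"],
}

# The logits processor masks out tokens that would violate the schema,
# so decoding can only produce parseable JSON.
processors = LogitsProcessorList(
    [build_llamacpp_logits_processor(llm, JsonSchemaParser(schema))]
)

output = llm(
    "Please review the following code and list all issues as JSON: ...",
    logits_processor=processors,
    max_tokens=512,
)
print(output["choices"][0]["text"])

Even with enforcement, the README's diagnostics section notes that constrained output can be weaker than free-form generation, which matches what was observed here.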