TheoKanning / openai-java

OpenAI Api Client in Java
MIT License

retrofit2 HTTP 400 Exception #32

Closed: PhoenixOrigin closed this issue 1 year ago

PhoenixOrigin commented 2 years ago
package net.Amogh;

import com.theokanning.openai.OpenAiService;
import com.theokanning.openai.completion.CompletionRequest;

public class Main {

    public static void main(String[] args){
        OpenAiService service = new OpenAiService("token here");
        CompletionRequest completionRequest = CompletionRequest.builder()
                .prompt("Hi")
                .echo(true)
                .build();
        service.createCompletion(completionRequest).getChoices().forEach(System.out::println);
    }
}

I have removed the token here, but I am sure it is correct. It produces

Exception in thread "main" retrofit2.adapter.rxjava2.HttpException: HTTP 400 
    at retrofit2.adapter.rxjava2.BodyObservable$BodyObserver.onNext(BodyObservable.java:57)
    at retrofit2.adapter.rxjava2.BodyObservable$BodyObserver.onNext(BodyObservable.java:38)
    at retrofit2.adapter.rxjava2.CallExecuteObservable.subscribeActual(CallExecuteObservable.java:48)
    at io.reactivex.Observable.subscribe(Observable.java:10151)
    at retrofit2.adapter.rxjava2.BodyObservable.subscribeActual(BodyObservable.java:35)
    at io.reactivex.Observable.subscribe(Observable.java:10151)
    at io.reactivex.internal.operators.observable.ObservableSingleSingle.subscribeActual(ObservableSingleSingle.java:35)
    at io.reactivex.Single.subscribe(Single.java:2517)
    at io.reactivex.Single.blockingGet(Single.java:2001)
    at com.theokanning.openai.OpenAiService.createCompletion(OpenAiService.java:91)
    at net.Amogh.Main.main(Main.java:14)

every single time it's run.

cryptoapebot commented 2 years ago

I (might) still be on 0.6, but this code works for me with these jars: retrofit-1.9.0.jar, okhttp-4.10.0.jar, retrofit2-rxjava2-adapter-1.0.0.jar, rxjava-2.2.21.jar.

I had very similar problems, and the workaround was pinning these exact jar versions.

    public static String generate3(String prompt) {
        String response = "Your comment makes me ";
        String original = response;

        if (Blacklist.censor(prompt) || prompt.contains(Quotes.disclaimer)) {
            System.out.println("Found blacklisted word.");
            return response;
        } else {
            System.out.println("prompting bot with: " + prompt);
            CompletionRequest completionRequest = CompletionRequest.builder()
                    .prompt(mood.get(feeling) + prompt)
                    .maxTokens(248)
                    .temperature(0.80)
                    .topP(1.0)
                    .frequencyPenalty(0.55)
                    .presencePenalty(0.49)
                    .echo(false)
                    .build();
            List<String> responses = new ArrayList<String>();

            try {
                responses = service.createCompletion(engine, completionRequest).getChoices().stream().map(CompletionChoice::getText).collect(Collectors.toList());
                responses.add(response);

                if (responses != null && responses.size() > 0) {
                    responses.stream().forEach(System.out::println);
                    response = responses.get(0);
                } else {
                    System.out.println("Response is null or size=0");
                }
            } catch (Exception e) {
                System.out.println(e.getMessage());
            }

            return (original.equals(response)) ? response : Quotes.disclaimer + response + "...\n";
        }
    }
cryptoapebot commented 2 years ago

Also, FYI.

OpenAI has deprecated engine-based endpoints in favor of model-based endpoints. For example, instead of using v1/engines/{engine_id}/completions, switch to v1/completions and specify the model in the CompletionRequest. The code includes upgrade instructions for all deprecated endpoints.

I think you use the model via:

.model("curie")

public static final String engine = "davinci";
public static final String full_engine = "text-davinci-002";
public static final String full_insert_engine = "text-davinci-insert-002";
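
For illustration, a minimal model-based request with this client might look like the sketch below (this assumes a 0.8.x-era CompletionRequest builder that exposes .model(); "text-davinci-002" is only an example model name):

import com.theokanning.openai.OpenAiService;
import com.theokanning.openai.completion.CompletionChoice;
import com.theokanning.openai.completion.CompletionRequest;

public class ModelBasedExample {
    public static void main(String[] args) {
        OpenAiService service = new OpenAiService("token here");

        // Model-based call: the model goes in the request body instead of
        // an engine id in the URL (v1/completions rather than v1/engines/{id}/completions).
        CompletionRequest request = CompletionRequest.builder()
                .model("text-davinci-002") // example model name
                .prompt("Hi")
                .maxTokens(16)
                .echo(true)
                .build();

        service.createCompletion(request)
                .getChoices()
                .stream()
                .map(CompletionChoice::getText)
                .forEach(System.out::println);
    }
}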
PhoenixOrigin commented 2 years ago

OK, thanks. I tried it in Node.js and it worked, but anyway, can you send me your build.gradle?

PhoenixOrigin commented 2 years ago

Nope, I'm using 0.8.0 and it doesn't work.

plugins {
    id 'java'
}
group 'net.Amogh'
version '1.0-SNAPSHOT'

repositories {
    mavenCentral()
}

dependencies {
    testImplementation 'org.junit.jupiter:junit-jupiter-api:5.7.0'
    testRuntimeOnly 'org.junit.jupiter:junit-jupiter-engine:5.7.0'

    implementation 'com.theokanning.openai-gpt3-java:client:0.8.0'
    // https://mvnrepository.com/artifact/com.squareup.retrofit/retrofit
    implementation group: 'com.squareup.retrofit', name: 'retrofit', version: '1.9.0'
    implementation group: 'com.squareup.okhttp3', name: 'okhttp', version: '4.10.0-RC1'
    implementation group: 'com.jakewharton.retrofit', name: 'retrofit2-rxjava2-adapter', version: '1.0.0-RC1'
    implementation group: 'io.reactivex.rxjava2', name: 'rxjava', version: '2.2.21'

}

test {
    useJUnitPlatform()
}

build.gradle ^

cryptoapebot commented 2 years ago

Curious.

buildscript {
    repositories {
        mavenCentral()
    }

    dependencies {
        classpath 'com.vanniktech:gradle-maven-publish-plugin:0.19.0'
    }
}

allprojects {
    repositories {
        mavenCentral()
    }

    plugins.withId("com.vanniktech.maven.publish") {
        mavenPublish {
            sonatypeHost = "S01"
        }
    }
}
cryptoapebot commented 2 years ago

And the API one.

apply plugin: 'java-library'
apply plugin: "com.vanniktech.maven.publish"

dependencies {
    compileOnly 'org.projectlombok:lombok:1.18.22'
    annotationProcessor 'org.projectlombok:lombok:1.18.22'
}

compileJava {
    sourceCompatibility = '1.8'
    targetCompatibility = '1.8'
}
cryptoapebot commented 2 years ago

One other check.

------------------------------------------------------------
Gradle 7.4.2
------------------------------------------------------------

Build time:   2022-03-31 15:25:29 UTC
Revision:     540473b8118064efcc264694cbcaa4b677f61041

Kotlin:       1.5.31
Groovy:       3.0.9
Ant:          Apache Ant(TM) version 1.10.11 compiled on July 10 2021
JVM:          18.0.2-ea (Private Build 18.0.2-ea+9-Ubuntu-222.04)
OS:           Linux 5.15.0-47-generic amd64
PhoenixOrigin commented 2 years ago

Yeah, I tried it manually using the app and it's working fine.

PhoenixOrigin commented 2 years ago

Can you send me a full build.gradle I can try?

cch0 commented 1 year ago

Does the issue still exist? Version 0.9.0 works for me with the provided OpenAiService example.

'com.theokanning.openai-gpt3-java:client:0.9.0'

Here is the build.gradle

plugins {
    id 'java'
}

targetCompatibility = JavaVersion.VERSION_1_8
group = 'com.example'
version = '0.0.1-SNAPSHOT'
sourceCompatibility = "8"

repositories {
    mavenCentral()
}

dependencies {
    implementation 'com.theokanning.openai-gpt3-java:client:0.9.0'
}

tasks.named('test') {
    useJUnitPlatform()
}
TheoKanning commented 1 year ago

Try upgrading to 0.10.0 and using the OpenAiService from the new service library. It will read the server error message and display it in the stack trace for you.
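
A minimal sketch of that setup (assuming the new module is published as com.theokanning.openai-gpt3-java:service:0.10.0 and that OpenAiService now lives in the com.theokanning.openai.service package; check the project README for the exact coordinates):

// build.gradle
// implementation 'com.theokanning.openai-gpt3-java:service:0.10.0'

import com.theokanning.openai.completion.CompletionRequest;
import com.theokanning.openai.service.OpenAiService;

public class ServiceExample {
    public static void main(String[] args) {
        OpenAiService service = new OpenAiService("token here");

        CompletionRequest request = CompletionRequest.builder()
                .model("text-davinci-003") // example model name
                .prompt("Hi")
                .echo(true)
                .build();

        // If the server rejects the request, the service module is expected to surface
        // the server's error message rather than a bare "HTTP 400".
        service.createCompletion(request)
                .getChoices()
                .forEach(choice -> System.out.println(choice.getText()));
    }
}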