Azure / azure-functions-openai-extension

An extension that adds support for Azure OpenAI / OpenAI bindings in Azure Functions for LLMs (GPT-3.5-Turbo, GPT-4, etc.)
MIT License

[Dotnet-ooproc] TextCompletion never returns a response #95

Closed Justrebl closed 2 months ago

Justrebl commented 3 months ago

What language are you using?

Dotnet (OOP): 8.0.300
Extension version: 0.16.0-alpha

Expected Behavior

TextCompletionInput to return an answer to the prompt + user input

Actual Behavior

Function execution runs indefinitely until timeout.

(screenshot: function execution hanging until timeout)

Host.json

{
    "version": "2.0",
    "logging": {
        "logLevel": {
            "default": "Trace"
        },
        "applicationInsights": {
            "samplingSettings": {
                "isEnabled": true,
                "excludedTypes": "Request"
            },
            "enableLiveMetricsFilters": true
        }
    }
}

Steps to Reproduce

Set the local.settings.json values to a valid Azure OpenAI resource and deployment (tested with gpt-4-32k and gpt-35-turbo), then execute the repro sample here: innovation-toolbox/AOAI-AzFunc-Net-OOProc-BindingExample
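
For context, the local.settings.json for this kind of setup should look roughly like the sketch below. AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_KEY follow the Azure OpenAI settings documented in the extension README, and MODEL_NAME, TEMPERATURE and MAX_TOKENS are the app settings referenced by the %...% binding expressions in the function; the exact keys and values in the repro repo may differ, so treat this as illustrative only.

{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
        "AZURE_OPENAI_ENDPOINT": "https://<your-resource>.openai.azure.com/",
        "AZURE_OPENAI_KEY": "<your-key>",
        "MODEL_NAME": "gpt-35-turbo",
        "TEMPERATURE": "0.5",
        "MAX_TOKENS": "200"
    }
}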

Relevant code being tried

[Function(nameof(WhatIs))]
public static HttpResponseData WhatIs(
    [HttpTrigger(AuthorizationLevel.Function, Route = "whatis/{technology}")] HttpRequestData req,
    // [TextCompletionInput(prompt: "What is {technology}?", Model = "%MODEL_NAME%", Temperature = "%TEMPERATURE%", MaxTokens = "%MAX_TOKENS%")] TextCompletionResponse response)
    [TextCompletionInput(prompt: "What is {technology}?", Model = "%MODEL_NAME%")] TextCompletionResponse response)
{
    HttpResponseData responseData = req.CreateResponse(HttpStatusCode.OK);
    responseData.WriteString(response.Content);
    return responseData;
}

I left the temperature and max_tokens values commented out because I initially thought they were the problem, but the scenario from the demo documented here also faces the same issue.

Relevant log output

No logs until timeout

Where are you facing this problem?

Local - Core Tools

Additional Information

Will test another language to see if the same behaviour occurs.

Justrebl commented 3 months ago

Might be related to #34; not sure about that issue's status.

manvkaur commented 3 months ago

@aishwaryabh, can you please validate this issue and also check whether it's related to issue #34?

manvkaur commented 3 months ago

@Justrebl, to unblock yourself, you can follow the dotnet-ooproc sample for text completion in this repo. Update Program.cs and host.json accordingly.

The sample uses the requirements specified in the README.
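
For reference, a minimal Program.cs for the dotnet isolated worker typically looks like the sketch below; this is an illustrative default rather than the exact contents of the sample (which may register additional services, or use ConfigureFunctionsWebApplication if the ASP.NET Core integration is enabled), so follow the sample's version.

using Microsoft.Extensions.Hosting;

// Minimal dotnet-isolated worker bootstrap; the repo sample may configure more services.
var host = new HostBuilder()
    .ConfigureFunctionsWorkerDefaults()
    .Build();

host.Run();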

aishwaryabh commented 2 months ago

So after replacing the following 3 lines in the csproj:

    <PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.21.0" />
    <PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http" Version="3.1.0" />
    <PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore" Version="1.2.1" />

with the updated ones:

    <PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.22.0" />
    <PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http" Version="3.2.0" />
    <PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore" Version="1.3.2" />

and then running the code, you see the following error:

(screenshot: error shown when running with the updated packages)

So the key is to make the method async and add an await call like so:

[Function(nameof(WhatIs))]
public static async Task<HttpResponseData> WhatIs(
    [HttpTrigger(AuthorizationLevel.Function, Route = "whatis/{technology}")] HttpRequestData req,
    // [TextCompletionInput(prompt: "What is {technology}?", Model = "%MODEL_NAME%", Temperature = "%TEMPERATURE%", MaxTokens = "%MAX_TOKENS%")] TextCompletionResponse response)
    [TextCompletionInput(prompt: "What is {technology}?", Model = "chat")] TextCompletionResponse response)
{
    HttpResponseData responseData = req.CreateResponse(HttpStatusCode.OK);
    await responseData.WriteStringAsync(response.Content);
    return responseData;
}

Then you see the function running as expected.
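
With the default Core Tools port (7071), the fix can be verified by calling http://localhost:7071/api/whatis/<technology> and checking that the completion text comes back in the response body instead of the request hanging.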