Closed: Justrebl closed this issue 2 months ago.
Might be related to #34; I'm not sure of that issue's status.
@aishwaryabh, can you please validate this issue and also check whether it's related to issue #34?
@Justrebl, to unblock yourself, you can follow the dotnet-ooproc sample for text completion in this repo. Update Program.cs and host.json accordingly; the sample uses the requirements specified in the README.
So after replacing the following three PackageReference lines in the csproj:
<PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.21.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http" Version="3.1.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore" Version="1.2.1" />
with the updated ones:
<PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.22.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http" Version="3.2.0" />
<PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore" Version="1.3.2" />
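For reference, the resulting ItemGroup might look roughly like this (a sketch, not the sample's exact file; the Microsoft.Azure.Functions.Worker.Sdk version is an assumption, and the OpenAI extension version is taken from the extension version reported below):

```xml
<ItemGroup>
  <PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.22.0" />
  <PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http" Version="3.2.0" />
  <PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http.AspNetCore" Version="1.3.2" />
  <!-- Assumed: the OpenAI extension at the alpha version reported in this issue -->
  <PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.OpenAI" Version="0.16.0-alpha" />
  <!-- Assumed SDK version; use whatever your project already references -->
  <PackageReference Include="Microsoft.Azure.Functions.Worker.Sdk" Version="1.17.2" />
</ItemGroup>
```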
and run the code, you see the following error:
So the key is to make the method async and add an await call, like so:
[Function(nameof(WhatIs))]
public static async Task<HttpResponseData> WhatIs(
    [HttpTrigger(AuthorizationLevel.Function, Route = "whatis/{technology}")] HttpRequestData req,
    // [TextCompletionInput(prompt: "What is {technology}?", Model = "%MODEL_NAME%", Temperature = "%TEMPERATURE%", MaxTokens = "%MAX_TOKENS%")] TextCompletionResponse response)
    [TextCompletionInput(prompt: "What is {technology}?", Model = "chat")] TextCompletionResponse response)
{
    HttpResponseData responseData = req.CreateResponse(HttpStatusCode.OK);
    await responseData.WriteStringAsync(response.Content);
    return responseData;
}
Then you see the function running as expected.
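For contrast, the variant that hangs presumably looked roughly like this (a sketch reconstructed from the fix described above, not the reporter's exact code; that the blocking synchronous write is the trigger for the hang is an inference, not a confirmed root cause):

```
// Synchronous variant that reproduces the indefinite execution (sketch):
[Function(nameof(WhatIs))]
public static HttpResponseData WhatIs(
    [HttpTrigger(AuthorizationLevel.Function, Route = "whatis/{technology}")] HttpRequestData req,
    [TextCompletionInput(prompt: "What is {technology}?", Model = "chat")] TextCompletionResponse response)
{
    HttpResponseData responseData = req.CreateResponse(HttpStatusCode.OK);
    responseData.WriteString(response.Content); // blocking write; function never completes
    return responseData;
}
```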
What language are you using?
Dotnet (OOP) : 8.0.300 Extension Version : 0.16.0-alpha
Expected Behavior
TextCompletionInput returns an answer to the prompt + user input.
Actual Behavior
Function execution runs indefinitely until timeout.
Host.json
Steps to Reproduce
Set the local.settings.json values to a valid Azure OpenAI resource and deployment (tested with gpt-4-32k and gpt-35-turbo), then execute the repro sample here: innovation-toolbox/AOAI-AzFunc-Net-OOProc-BindingExample
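A minimal local.settings.json for the repro might look like this (a sketch; the setting names AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_KEY are assumptions based on the extension's conventions, and MODEL_NAME / TEMPERATURE / MAX_TOKENS mirror the %...% placeholders in the commented-out attribute in the code below):

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
    "AZURE_OPENAI_ENDPOINT": "https://<your-resource>.openai.azure.com/",
    "AZURE_OPENAI_KEY": "<your-key>",
    "MODEL_NAME": "gpt-35-turbo",
    "TEMPERATURE": "0.5",
    "MAX_TOKENS": "100"
  }
}
```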
Relevant code being tried
I left the temperature and max_tokens values commented out because I initially thought they were the problem, but the scenario of the demo documented here faces the same issue as well.
Relevant log output
No logs until timeout
Where are you facing this problem?
Local - Core Tools
Additional Information
Will test another language to see if the same behaviour occurs.