This is a cookbook for getting started with Phi-3, a family of open-source AI models developed by Microsoft. Phi-3 models are the most capable and cost-effective small language models (SLMs) available, outperforming models of the same size and the next size up across a variety of language, reasoning, coding, and math benchmarks.
Unhandled exception. System.ClientModel.ClientResultException: HTTP 400 (api_error: ) phi3.5 does not support tools #196
This issue is for a: (mark with an `x`)

- [x] bug report -> please search issues before submitting
- [ ] feature request
- [ ] documentation issue or request
- [ ] regression (a behavior that used to work and stopped in a new release)
Minimal steps to reproduce

1. Go to `Phi-3CookBook\md\07.Labs\CsharpOllamaCodeSpaces\src\Sample03`.
2. Run `dotnet run`.
```text
phi3.5 does not support tools
   at OpenAI.ClientPipelineExtensions.ProcessMessageAsync(ClientPipeline pipeline, PipelineMessage message, RequestOptions options)
   at OpenAI.Chat.ChatClient.CompleteChatAsync(BinaryContent content, RequestOptions options)
   at OpenAI.Chat.ChatClient.<>c__DisplayClass8_0.<<CompleteChatAsync>g__getResultAsync|0>d.MoveNext()
--- End of stack trace from previous location ---
   at OpenAI.Chat.AsyncStreamingChatCompletionUpdateCollection.AsyncStreamingChatUpdateEnumerator.CreateEventEnumeratorAsync()
   at OpenAI.Chat.AsyncStreamingChatCompletionUpdateCollection.AsyncStreamingChatUpdateEnumerator.System.Collections.Generic.IAsyncEnumerator.MoveNextAsync()
   at Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.GetStreamingChatMessageContentsAsync(String targetModel, ChatHistory chat, PromptExecutionSettings executionSettings, Kernel kernel, CancellationToken cancellationToken)+MoveNext()
   at Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.GetStreamingChatMessageContentsAsync(String targetModel, ChatHistory chat, PromptExecutionSettings executionSettings, Kernel kernel, CancellationToken cancellationToken)+MoveNext()
   at Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.GetStreamingChatMessageContentsAsync(String targetModel, ChatHistory chat, PromptExecutionSettings executionSettings, Kernel kernel, CancellationToken cancellationToken)+System.Threading.Tasks.Sources.IValueTaskSource.GetResult()
   at Microsoft.SemanticKernel.KernelFunctionFromPrompt.InvokeStreamingCoreAsync[TResult](Kernel kernel, KernelArguments arguments, CancellationToken cancellationToken)+MoveNext()
   at Microsoft.SemanticKernel.KernelFunctionFromPrompt.InvokeStreamingCoreAsync[TResult](Kernel kernel, KernelArguments arguments, CancellationToken cancellationToken)+MoveNext()
   at Microsoft.SemanticKernel.KernelFunctionFromPrompt.InvokeStreamingCoreAsync[TResult](Kernel kernel, KernelArguments arguments, CancellationToken cancellationToken)+System.Threading.Tasks.Sources.IValueTaskSource.GetResult()
   at Microsoft.SemanticKernel.KernelFunction.InvokeStreamingAsync[TResult](Kernel kernel, KernelArguments arguments, CancellationToken cancellationToken)+MoveNext()
   at Microsoft.SemanticKernel.KernelFunction.InvokeStreamingAsync[TResult](Kernel kernel, KernelArguments arguments, CancellationToken cancellationToken)+MoveNext()
   at Microsoft.SemanticKernel.KernelFunction.InvokeStreamingAsync[TResult](Kernel kernel, KernelArguments arguments, CancellationToken cancellationToken)+System.Threading.Tasks.Sources.IValueTaskSource.GetResult()
   at Program.<Main>$(String[] args) in D:\code\github\topce\Phi-3CookBook\md\07.Labs\CsharpOllamaCodeSpaces\src\Sample03\Program.cs:line 57
   at Program.<Main>$(String[] args) in D:\code\github\topce\Phi-3CookBook\md\07.Labs\CsharpOllamaCodeSpaces\src\Sample03\Program.cs:line 57
   at Program.<Main>(String[] args)
```
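For context, the failure happens while streaming a chat completion through the Semantic Kernel OpenAI connector pointed at Ollama. The sketch below is an assumption of roughly what Sample03 sets up (model tag, endpoint, plugin, and prompt are illustrative, not copied from the sample): once automatic tool calling is enabled and at least one kernel function is registered, the connector attaches a `tools` array to the request, which Ollama rejects for `phi3.5` with the HTTP 400 above.

```csharp
// Hedged sketch of the kind of setup that produces the failing request.
// Assumptions: Ollama's OpenAI-compatible endpoint on localhost:11434 and the
// "phi3.5" model tag; the real Sample03 may differ in the details.
#pragma warning disable SKEXP0010 // the custom-endpoint overload may be marked experimental in your SK version
using System;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(
        modelId: "phi3.5",                               // Ollama model tag
        endpoint: new Uri("http://localhost:11434/v1"),  // Ollama's OpenAI-compatible API
        apiKey: "ollama")                                // placeholder; Ollama ignores it
    .Build();

// Registering a function gives the connector something to advertise as a tool.
kernel.ImportPluginFromFunctions("WeatherPlugin", new[]
{
    KernelFunctionFactory.CreateFromMethod(
        (string city) => $"The weather in {city} is sunny.",
        functionName: "GetWeather",
        description: "Gets the current weather for a city.")
});

// Auto tool calling makes the connector include a "tools" array in the chat
// completion request; that request is what the server answers with HTTP 400.
var settings = new OpenAIPromptExecutionSettings
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};

await foreach (var chunk in kernel.InvokePromptStreamingAsync(
    "What is the weather in Seattle?", new KernelArguments(settings)))
{
    Console.Write(chunk);
}
```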
Expected/desired behavior
OS and Version?
Windows 7, 8 or 10. Linux (which distribution). macOS (Yosemite? El Capitan? Sierra?)
azd version?
Versions
Mention any other details that might be useful
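Possibly useful detail: the error text points at the `tools` array the OpenAI connector attaches when tool calling is enabled, so the sample may work either with an Ollama model tag that supports tools or with tool calling turned off for this request. A minimal, untested sketch of the second option, continuing the sketch above:

```csharp
// Untested workaround sketch: run the same streaming prompt without tool
// calling, so no "tools" array is sent and phi3.5 answers from the prompt
// alone (the registered plugin is simply not offered to the model).
var noToolSettings = new OpenAIPromptExecutionSettings
{
    ToolCallBehavior = null // default: the connector sends no tools
};

await foreach (var chunk in kernel.InvokePromptStreamingAsync(
    "What is the weather in Seattle?", new KernelArguments(noToolSettings)))
{
    Console.Write(chunk);
}
```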