Closed andrewconnell closed 2 weeks ago
Thanks for reporting, @andrewconnell, and sorry for the trouble. We will take a look and get back to you soon!
NP... looks like it's a new feature & maybe I'm missing something. Frankly, it's also the first time I've used Ollama & a local lang. model, so I could have messed something up there too.
I can repro and most likely it's a bug on our end. Sorry for the trouble. Will fix asap
Confirmed issue in our code. PR with fix open and will merge asap. Sorry for the trouble and thanks for letting us know
Bug fixed in v0.19.1, available on Homebrew and shortly on winget
Description
Following the docs:
When recording & saving a session, I received the error: OllamaLanguageModelClient: Language model availability is not checked. Call IsEnabled first.
Expected behaviour
It creates the OpenAPI spec file using the local language model support shown in this post: https://devblogs.microsoft.com/microsoft365dev/dev-proxy-v0-19-with-simulating-llm-apis-and-new-azure-api-center-integrations/?ocid=microsoft365dev_eml_tnp_autoid134_title
Actual behaviour
Error when saving the OpenAPI spec from the recorded requests:
Steps to reproduce
install Dev Proxy & do the initial run to trust the cert
install Ollama & start service
verify Ollama is running and listening on the default port by trying to start it again
update the Dev Proxy config file to add the OpenApiSpecGeneratorPlugin plugin, update the `urlsToWatch`, & enable the local language model
start Dev Proxy, start recording
navigate to the following URL: https://api.nasa.gov/mars-photos/api/v1/rovers/spirit/photos?api_key=DEMO_KEY&sol=1&page=1
stop recording
observe the error in the console... but the OpenAPI spec file is successfully created
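For context, a config along the lines of the steps above might look like the sketch below. This is an assumption based on the v0.19 announcement, not the reporter's actual file; the `pluginPath` and the exact shape of the `languageModel` section may differ in your installed version:

```json
{
  "plugins": [
    {
      "name": "OpenApiSpecGeneratorPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/dev-proxy-plugins.dll"
    }
  ],
  "urlsToWatch": [
    "https://api.nasa.gov/*"
  ],
  "languageModel": {
    "enabled": true
  }
}
```

With `languageModel.enabled` set to `true`, the plugin attempts to use the local model (Ollama by default) when generating the spec, which is the code path that raised the error above.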
Dev Proxy Version
0.19.0
Operating system (environment)
macOS
Shell
zsh
Configuration file
Additional Info
No response