microsoft / dev-proxy

Dev Proxy is an API simulator that helps you effortlessly test your app beyond the happy path.
https://aka.ms/devproxy
MIT License

[BUG]: Error when using local lang. model support for OpenAPI spec generation #810

Closed: andrewconnell closed this issue 2 weeks ago

andrewconnell commented 2 weeks ago

Description

Following docs:

When recording & saving a session, I received the error: OllamaLanguageModelClient: Language model availability is not checked. Call IsEnabled first.

◉ Recording...

 req   ╭ GET https://api.nasa.gov/mars-photos/api/v1/rovers/curiosity/photos?api_key=KzzpfOrja1LWM5QEoExF1d3CQNfGRjXv6WL8I7iw&sol=1000&camera=MAST
 api   ╰ Passed through
○ Stopped recording
 info    Creating OpenAPI spec from recorded requests...
 fail    OllamaLanguageModelClient: Language model availability is not checked. Call IsEnabled first.
 fail    OllamaLanguageModelClient: Language model availability is not checked. Call IsEnabled first.
 info    Created OpenAPI spec file api.nasa.gov-20240627153754.json
 info    DONE
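The wording of the failure suggests the spec-generation path calls the language model client before its availability check has run. A minimal Python sketch of that guard pattern (all names are hypothetical, chosen only to mirror the error message; Dev Proxy itself is .NET):

```python
class LanguageModelClient:
    """Illustrative client that requires an explicit availability check."""

    def __init__(self):
        self._availability_checked = False
        self._available = False

    def is_enabled(self):
        # In Dev Proxy this would probe the local Ollama endpoint;
        # here we simply record that the check has happened.
        self._availability_checked = True
        self._available = True
        return self._available

    def generate(self, prompt):
        if not self._availability_checked:
            # This is the state the log shows: generation is attempted
            # before the check, so the guard raises instead of answering.
            raise RuntimeError(
                "Language model availability is not checked. "
                "Call IsEnabled first.")
        return f"description for: {prompt}"


client = LanguageModelClient()
try:
    client.generate("GET /rovers/curiosity/photos")  # guard fires
except RuntimeError as e:
    print(e)

client.is_enabled()  # run the availability check first
print(client.generate("GET /rovers/curiosity/photos"))
```

With the check in place the second call succeeds, which matches the fix that landed in v0.19.1.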

Expected behaviour

It creates the OpenAPI spec file using the local language model support, as shown in this post: https://devblogs.microsoft.com/microsoft365dev/dev-proxy-v0-19-with-simulating-llm-apis-and-new-azure-api-center-integrations/?ocid=microsoft365dev_eml_tnp_autoid134_title

Actual behaviour

Error when saving the OpenAPI spec from the recorded requests:

◉ Recording...

 req   ╭ GET https://api.nasa.gov/mars-photos/api/v1/rovers/curiosity/photos?api_key=KzzpfOrja1LWM5QEoExF1d3CQNfGRjXv6WL8I7iw&sol=1000&camera=MAST
 api   ╰ Passed through
○ Stopped recording
 info    Creating OpenAPI spec from recorded requests...
 fail    OllamaLanguageModelClient: Language model availability is not checked. Call IsEnabled first.
 fail    OllamaLanguageModelClient: Language model availability is not checked. Call IsEnabled first.
 info    Created OpenAPI spec file api.nasa.gov-20240627153754.json
 info    DONE

Steps to reproduce

  1. install Dev Proxy & do the initial run to trust the certificate

    brew tap microsoft/dev-proxy
    brew install dev-proxy
  2. install Ollama & start service

    brew install ollama
    brew services start ollama
  3. verify Ollama is running and listening on the default port by trying to start it again

    ollama serve
    Error: listen tcp 127.0.0.1:11434: bind: address already in use

     NOTE - After I got the Actual behaviour error above, I repeated the process, but at this point I had Ollama download & run the Phi3 model (ollama run phi3). This had no effect - I still got the same error.

  4. update Dev Proxy config file to add the OpenApiSpecGeneratorPlugin plugin, update the urlsToWatch, & enable the local language model

  5. start Dev Proxy, start recording

    devproxy --failure-rate 0
    r
  6. navigate to the following URL: https://api.nasa.gov/mars-photos/api/v1/rovers/spirit/photos?api_key=DEMO_KEY&sol=1&page=1

  7. stop recording

    s
  8. observe the error in the console... but the OpenAPI spec file is successfully created
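As a lighter-weight alternative to step 3 (which relies on a second `ollama serve` failing with "address already in use"), you can check whether anything accepts connections on Ollama's default port. A small sketch:

```python
import socket


def ollama_listening(host="127.0.0.1", port=11434, timeout=1.0):
    """Return True if something accepts TCP connections on the given port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    if ollama_listening():
        print("Ollama appears to be listening on 11434")
    else:
        print("nothing listening on 11434")
```

This only confirms a listener exists on the port; it does not verify that the service is Ollama or that a model such as phi3 has been pulled.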

Dev Proxy Version

0.19.0

Operating system (environment)

macOS

Shell

zsh

Configuration file

{
  "$schema": "https://raw.githubusercontent.com/microsoft/dev-proxy/main/schemas/v0.19.0/rc.schema.json",
  "plugins": [
    {
      "name": "RetryAfterPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/dev-proxy-plugins.dll"
    },
    {
      "name": "GenericRandomErrorPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/dev-proxy-plugins.dll",
      "configSection": "genericRandomErrorPlugin"
    },
    {
      "name": "OpenApiSpecGeneratorPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/dev-proxy-plugins.dll"
    }
  ],
  "urlsToWatch": [
    "https://jsonplaceholder.typicode.com/*",
    "https://api.nasa.gov/*"
  ],
  "genericRandomErrorPlugin": {
    "errorsFile": "devproxy-errors.json"
  },
  "rate": 50,
  "logLevel": "information",
  "newVersionNotification": "stable",
  "languageModel": { "enabled": true }
}
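For comparison, the Dev Proxy docs describe the languageModel section as also accepting a url for the local model endpoint (Ollama's default shown below; treat the exact keys as an assumption if your schema version differs):

```json
"languageModel": {
  "enabled": true,
  "url": "http://localhost:11434"
}
```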

Additional Info

No response

garrytrinder commented 2 weeks ago

Thanks for reporting, @andrewconnell - sorry for the trouble. We will take a look and get back to you soon!

andrewconnell commented 2 weeks ago

NP... looks like it's a new feature & maybe I'm missing something. Frankly, it's also the first time I've used Ollama & a local lang. model, so I could have messed something up there too.

waldekmastykarz commented 2 weeks ago

I can repro and most likely it's a bug on our end. Sorry for the trouble. Will fix asap

waldekmastykarz commented 2 weeks ago

Confirmed issue in our code. PR with fix open and will merge asap. Sorry for the trouble and thanks for letting us know

waldekmastykarz commented 2 weeks ago

Bug fixed in v0.19.1, available on Homebrew and shortly on winget.