microsoft / semantic-kernel

Integrate cutting-edge LLM technology quickly and easily into your apps
https://aka.ms/semantic-kernel
MIT License

Python: Bug: Semantic Kernel breaks FastAPI Debugging Launch config #8420

Closed sminath33 closed 1 week ago

sminath33 commented 2 weeks ago

Describe the bug After adding the Semantic Kernel import to a Python project running FastAPI under the VSCode debugger, pymeta (a required dependency) breaks. This prevents all debugging for our project. There seems to be a conflict between the VSCode debugger and Semantic Kernel.

To Reproduce Steps to reproduce the behavior:

  1. Add Semantic Kernel as a python dependency
  2. Add 'import semantic_kernel' to a file
  3. Launch FastAPI using uvicorn (-m uvicorn api-hosting-lambda.src.app:app --reload --port 8002) from a launch config (see below)
  4. A pymeta ParseError is raised (see the traceback below)

Exception has occurred: ParseError (note: full exception trace is shown but execution is paused at: _G_many_2)
(7, [('expected', 'letter or digit', None)])
  File "/pymeta_generated_code/pymeta_grammar__Grammar.py", line 259, in _G_many_2 (Current frame)
    _G_apply_1, lastError = self._apply(self.rule_letterOrDigit, "letterOrDigit", [])
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/pymeta_generated_code/pymeta_grammar__Grammar.py", line 262, in rule_name
    _G_many_3, lastError = self.many(_G_many_2)
  File "/pymeta_generated_code/pymeta_grammar__Grammar.py", line 604, in _G_lookahead_2
    _G_apply_1, lastError = self._apply(self.rule_name, "name", [])
  File "/pymeta_generated_code/pymeta_grammar__Grammar.py", line 608, in rule_rule
    _G_lookahead_3, lastError = self.lookahead(_G_lookahead_2)
  File "/pymeta_generated_code/pymeta_grammar__Grammar.py", line 641, in _G_many_1
    _G_apply_1, lastError = self._apply(self.rule_rule, "rule", [])
  File "/pymeta_generated_code/pymeta_grammar__Grammar.py", line 644, in rule_grammar
    _G_many_2, lastError = self.many(_G_many_1)
  File "/home/ubuntu/GenerativeAIGit/koch-gpt/api-hosting-lambda/src/koch_ai.py", line 8, in <module>
    import semantic_kernel
  File "/home/ubuntu/GenerativeAIGit/koch-gpt/api-hosting-lambda/src/conversations_handler.py", line 26, in <module>
    from koch_ai import KochAIChatBot
  File "/home/ubuntu/GenerativeAIGit/koch-gpt/api-hosting-lambda/src/app.py", line 6, in <module>
    from conversations_handler import (
  File "<string>", line 1, in <module>
pymeta.runtime.ParseError: (7, [('expected', 'letter or digit', None)])

Expected behavior FastAPI should start up in the debugging session within VSCode

Screenshots

VSCode debugging launch config:

{
    "name": "API Hosting on FastAPI",
    "type": "python",
    "request": "launch",
    "module": "uvicorn",
    "args": [
        "api-hosting-lambda.src.app:app",
        "--reload",
        "--port",
        "8002"
    ],
    "cwd": "${workspaceFolder}/api-hosting-lambda/src",
    "env": {
        "JSON_LOGS": "0",
        "LOG_LEVEL": "debug"
    },
    "jinja": true,
    "justMyCode": true
}

Platform

Additional context Starting the app with the same command 'poetry run python -m uvicorn app:app --reload --port 8002' outside of a debugger works fine.

eavanvalkenburg commented 1 week ago

Hi @sminath33

Not sure what is going on here. I just tried this and did not encounter any issues. I have the following script, using fastapi[standard] or fastapi[all] and SK (both just taking the latest from PyPI):

from typing import Union

import semantic_kernel
from fastapi import FastAPI

app = FastAPI()

kernel = semantic_kernel.Kernel()

@app.get("/")
def read_root():
    return {"Hello": "World"}

@app.get("/items/{item_id}")
def read_item(item_id: int, q: Union[str, None] = None):
    return {"item_id": item_id, "q": q}

and this starts, both with the same launch setup as you have and with the newer debugpy launch configuration:

{
    "name": "Python Debugger: FastAPI",
    "type": "debugpy",
    "request": "launch",
    "module": "uvicorn",
    "args": [
        "main:app",
        "--reload"
    ],
    "jinja": true
},
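
For reference, the original launch config from this issue, adapted to the newer debugpy type in the same way, might look like the sketch below. This is only a sketch combining the two configs shown above, not a confirmed fix; the module path, args, cwd, and env values are carried over unchanged from the reporter's config.

{
    "name": "API Hosting on FastAPI",
    // switched from "python" to the newer "debugpy" debugger type
    "type": "debugpy",
    "request": "launch",
    "module": "uvicorn",
    "args": [
        "api-hosting-lambda.src.app:app",
        "--reload",
        "--port",
        "8002"
    ],
    "cwd": "${workspaceFolder}/api-hosting-lambda/src",
    "env": {
        "JSON_LOGS": "0",
        "LOG_LEVEL": "debug"
    },
    "jinja": true,
    "justMyCode": true
}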
eavanvalkenburg commented 1 week ago

Please reopen if you are still having issues!