microsoft / vscode-copilot-release

Feedback on GitHub Copilot Chat UX in Visual Studio Code.
https://marketplace.visualstudio.com/items?itemName=GitHub.copilot-chat
Creative Commons Attribution 4.0 International

Prompt response failure. #2732

Open · Phantomn opened this issue 2 days ago

Phantomn commented 2 days ago

Type: Bug

For questions related to LLM implementation, Copilot Chat filters its own response and does not output any result.

Extension version: 0.22.4
VS Code version: Code 1.95.3 (f1a4fb101478ce6ec82fe9627c43efbf9e98c813, 2024-11-13T14:50:04.152Z)
OS version: Windows_NT x64 10.0.26100
Modes: Remote
Remote OS version: Linux x64 5.15.153.1-microsoft-standard-WSL2

System Info

|Item|Value|
|---|---|
|CPUs|11th Gen Intel(R) Core(TM) i7-11700 @ 2.50GHz (16 x 2496)|
|GPU Status|2d_canvas: enabled<br>canvas_oop_rasterization: enabled_on<br>direct_rendering_display_compositor: disabled_off_ok<br>gpu_compositing: enabled<br>multiple_raster_threads: enabled_on<br>opengl: enabled_on<br>rasterization: enabled<br>raw_draw: disabled_off_ok<br>skia_graphite: disabled_off<br>video_decode: enabled<br>video_encode: enabled<br>vulkan: disabled_off<br>webgl: enabled<br>webgl2: enabled<br>webgpu: enabled<br>webnn: disabled_off|
|Load (avg)|undefined|
|Memory (System)|31.77GB (7.71GB free)|
|Process Argv|--folder-uri=vscode-remote://wsl+Ubuntu-22.04/home/phantom/ocr --remote=wsl+Ubuntu-22.04 --crash-reporter-id 6b812d20-c719-4bf5-ac6a-aeb1b4ebb2ef|
|Screen Reader|no|
|VM|50%|

Remote (WSL: Ubuntu-22.04)

|Item|Value|
|---|---|
|OS|Linux x64 5.15.153.1-microsoft-standard-WSL2|
|CPUs|11th Gen Intel(R) Core(TM) i7-11700 @ 2.50GHz (16 x 0)|
|Memory (System)|15.51GB (12.82GB free)|
|VM|0%|
A/B Experiments

```
vsliv368:30146709
vspor879:30202332
vspor708:30202333
vspor363:30204092
vswsl492cf:30256860
vscod805cf:30301675
binariesv615:30325510
vsaa593cf:30376535
py29gd2263:31024239
c4g48928:30535728
azure-dev_surveyone:30548225
962ge761:30959799
pythonnoceb:30805159
asynctok:30898717
pythonmypyd1:30879173
h48ei257:31000450
pythontbext0:30879054
cppperfnew:31000557
dsvsc020:30976470
pythonait:31006305
dsvsc021:30996838
bdiig495:31013172
dvdeprecation:31068756
dwnewjupytercf:31046870
nativerepl1:31139838
pythonrstrctxt:31112756
cf971741:31144450
iacca1:31171482
notype1:31157159
5fd0e150:31155592
dwcopilot:31170013
stablechunks:31184530
```
TylerLeonhardt commented 1 day ago

English summary: When asked questions related to LLM implementation, Copilot Chat filters its own response and does not output any result.