ggerganov / llama.cpp

LLM inference in C/C++
MIT License

Bug: The "server" provided web-ui chat seems to sometimes not properly quote "<" ">" characters in its HTML output. #7905

Closed: ghchris2021 closed this issue 3 weeks ago

ghchris2021 commented 2 months ago

What happened?

llama.cpp was built yesterday from the main git branch. IIRC similar output anomalies were seen in previous versions, and they can be reproduced at least with Firefox on Linux as the browser client for the "server"'s web interface. IIRC similar HTML output issues appeared whether using llama3, codestral, et al., so I presume it is a simple / ubiquitous HTML generation / quoting problem rather than anything specific to the model or server arguments.

In the output below I expected to see the usual system include files in angle-bracket style; instead, the "<" apparently affects the HTML markup parsing in the browser, likely because it is not always escaped / quoted.

Example output / test case:

```
User: Show an example of C code including several system header files.

Llama: Sure, here's an example of a simple C program that includes several system header files:

c

include

include

include

include

include
```

Name and Version

version: 52 (73bac2b) built with cc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0 for x86_64-linux-gnu

What operating system are you seeing the problem on?

Linux

Relevant log output

No response

ghchris2021 commented 2 months ago

Apparently this is related to #3723, and similar behavior was mentioned as one of many error cases in https://github.com/ggerganov/llama.cpp/issues/3723#issuecomment-2166814000

github-actions[bot] commented 3 weeks ago

This issue was closed because it has been inactive for 14 days since being marked as stale.