HelgeSverre / ollama-gui

A Web Interface for chatting with your local LLMs via the ollama API
https://ollama-gui.vercel.app/
MIT License

CSS Bug in AI Response Prose (Dark Mode) #20

Open · MagicPlants opened this issue 7 months ago

MagicPlants commented 7 months ago

You can see here that, in dark mode, the STRONG tag in these lists produces a color that isn't legible against the background color.

[Screenshot: bug in css ollama-gui] This is how it looks to me, and where I found the problem in the inspector.

[Screenshot: FIXED-bug in css ollama-gui] This is what I propose the fix could look like.

The offending code is below:

.prose :where(strong):not(:where([class~="not-prose"],[class~="not-prose"] *)) {
    color: var(--tw-prose-bold);
}
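
For context, the Typography plugin sets that variable to a dark value by default and only remaps it to a light one when the invert variant is applied. A simplified sketch of the generated CSS (the exact values depend on the configured theme and plugin version):

/* Simplified sketch of what @tailwindcss/typography generates; values are approximate */
.prose {
    --tw-prose-bold: #111827;      /* near-black by default, illegible on a dark background */
    --tw-prose-invert-bold: #fff;  /* the light counterpart */
}
.prose-invert {
    --tw-prose-bold: var(--tw-prose-invert-bold);  /* remapped only when the invert variant is applied */
}

So bold text inside .prose stays near-black on the dark background unless the variable is overridden or the invert variant is applied.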

I suggest the fix could look like this (as seen in the screenshots):

.prose :where(strong):not(:where([class~="not-prose"],[class~="not-prose"] *)) {
    color: #d2d2d2;
    font-weight: 600;
}

A good color for a dark mode theme, and it adds a bit of contrast. #d2d2d2 for the win.

I'll open a pull request and fix it myself as well; just give me a little bit. I think the relevant markup is in /src/components/Messages/UserMessage.vue and /src/components/Messages/AiMessage.vue, but I will confirm when I open the pull request.

Unrelated: I will also look into adding OpenAI support (the bounty).

MagicPlants commented 7 months ago

Fixed it by adding

/* Override the Typography bold color so <strong> text stays legible on the dark background */
.prose :where(strong):not(:where([class~="not-prose"],[class~="not-prose"] *)) {
  color: #d2d2d2;
  font-weight: 600;
}

to the file

src/style.css

See the attached pull request. Feel free to merge; it fixes the STRONG tag wherever it occurs in dark mode.

HelgeSverre commented 7 months ago

Better to add dark:prose-invert, which will do the same thing but applies neatly to everything.
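
For reference, a minimal sketch of that approach, assuming the rendered markdown sits inside a prose wrapper in the message components (the exact element in AiMessage.vue may differ):

<!-- apply the Typography invert variant in dark mode -->
<div class="prose dark:prose-invert">
    <!-- rendered AI response markdown -->
</div>

With the dark: variant, the Typography plugin swaps the --tw-prose-* variables (including --tw-prose-bold) to their invert counterparts, so no manual color override in style.css is needed.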