huggingface / llm-ls

LSP server leveraging LLMs for code completion (and more?)
Apache License 2.0

Fix off-by-1 in prompt creation #64

Closed HennerM closed 8 months ago

HennerM commented 8 months ago

For both fill-in-the-middle and normal prompts, there is a bug where we cut off one character from the input.

This happens when the cursor is at the end of a line, e.g.

def hello_<CURSOR>

We were clamping the line column to the range 0 to len_chars() - 1. However, it is valid for the cursor to sit "behind" all the characters in a line (i.e. at column len_chars()), so the -1 lands us one position short and drops the last character from the prompt.
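The effect can be sketched with two small helpers (hypothetical names for illustration; in llm-ls the length comes from ropey's `len_chars()` on a line slice):

```rust
/// Buggy clamp: caps the column at line_len - 1, so a cursor sitting
/// *after* the last character is pulled back by one position.
fn clamp_col_buggy(col: usize, line_len: usize) -> usize {
    col.min(line_len.saturating_sub(1))
}

/// Fixed clamp: a column equal to line_len is valid -- it means the
/// cursor is at the end of the line, behind every character.
fn clamp_col_fixed(col: usize, line_len: usize) -> usize {
    col.min(line_len)
}

fn main() {
    let line = "def hello_";
    let line_len = line.chars().count();
    let cursor_col = line_len; // cursor at end of line, as in the example above

    // The prompt prefix is the line up to the (clamped) cursor column.
    let buggy = &line[..clamp_col_buggy(cursor_col, line_len)];
    let fixed = &line[..clamp_col_fixed(cursor_col, line_len)];

    assert_eq!(buggy, "def hello"); // trailing '_' lost
    assert_eq!(fixed, "def hello_"); // full prefix kept
    println!("buggy = {buggy:?}, fixed = {fixed:?}");
}
```

With the fix, a cursor at column `len_chars()` keeps the whole line in the prompt instead of silently dropping the character just before the cursor.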