simonw / llm-palm

Plugin for LLM adding support for Google's PaLM 2 model
Apache License 2.0

Add quickfix when genai chat is not responding because of filtering #7

Open Spstolar opened 11 months ago

Spstolar commented 11 months ago

This is a quick fix for https://github.com/simonw/llm-palm/issues/6

When submitting a request like:

"""I am trying to improve the documentation of my Python module. Please add type hints to the function below:

```python
def replace(text, replacement_lookup):
    words = text.split()
    replaced_words = [replacement_lookup.get(word, word) for word in words]
    return " ".join(replace_words)

"""



`llm` would not respond. I believe a filter is being triggered because PaLM decides the prompt is not English text (see [Known Issues](https://developers.generativeai.google/guide/troubleshooting)). Removing the markdown fences seems to help, as does submitting only the function signature, but it would be nice to get a response for the original prompt. My proposed fix is to submit the prompt to `generate_text` rather than `chat`, which still returns a response. Doing this corrects the issue for me: I get a response when I submit the above prompt. A rough sketch of the idea is below.
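
For illustration only (this is not the exact diff in the PR), here is a minimal sketch of that fallback using the `google.generativeai` package directly. The helper name `prompt_with_fallback` is made up, and it assumes `palm.chat()` returns a response whose `.last` is `None` when the reply is filtered, while `palm.generate_text()` exposes the completion on `.result`:

```python
from typing import Optional

import google.generativeai as palm

palm.configure(api_key="YOUR_API_KEY")  # placeholder


def prompt_with_fallback(prompt: str) -> Optional[str]:
    """Try the chat endpoint first, falling back to plain text generation."""
    chat_response = palm.chat(messages=prompt)
    if chat_response.last is not None:
        return chat_response.last

    # The chat model returned nothing (e.g. the "not English" filter fired),
    # so resubmit the same prompt to generate_text instead.
    text_response = palm.generate_text(prompt=prompt)
    return text_response.result  # may still be None if this call is filtered too
```

In the plugin the same check would wrap the existing `chat` call, so normal chat behaviour stays unchanged and `generate_text` is only used when the chat response comes back empty.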