Open chansonzhang opened 3 months ago
The feature request for adding a language constraint in the `response_synthesizer` to ensure responses are always in a specified language, such as Chinese (zh-CN), is a valuable suggestion. Introducing a language parameter in the settings (e.g., `Settings.language = "zh-CN"`) could effectively guide the language model to generate responses in the desired language. This could be implemented by modifying the `Settings` class to include a language setting and adjusting the prompt or the processing of the response in each `response_synthesizer` class to adhere to the specified language setting. In practice, this might involve modifying prompt templates to include language-specific instructions, or filtering the model's output to match the desired language.
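As a minimal sketch of the prompt-template approach described above: a global language setting plus a helper that prepends a language-specific instruction to whatever template the synthesizer would otherwise use. The names `Settings.language`, `apply_language_constraint`, and the instruction table are assumptions for illustration, not existing LlamaIndex APIs.

```python
from typing import Optional

# Hypothetical per-language instructions; these strings are illustrative,
# not part of any LlamaIndex release.
LANGUAGE_INSTRUCTIONS = {
    "zh-CN": "请始终使用简体中文回答。",
    "de": "Antworte immer auf Deutsch.",
}


class Settings:
    """Stand-in for the global settings object; `language` is the proposed field."""
    language: Optional[str] = None


def apply_language_constraint(prompt_template: str) -> str:
    """Prepend a language-specific instruction to a synthesizer prompt template."""
    if Settings.language is None:
        return prompt_template
    instruction = LANGUAGE_INSTRUCTIONS.get(Settings.language)
    if instruction is None:
        # Fall back to an English instruction naming the language tag.
        instruction = f"Always respond in the language with tag {Settings.language}."
    return f"{instruction}\n\n{prompt_template}"


Settings.language = "zh-CN"
prompt = apply_language_constraint("Answer the query: {query_str}")
```

A real implementation would apply this transformation inside each `response_synthesizer` when building its prompts, so the constraint is enforced everywhere without the user touching any template.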
For implementing this feature, you can refer to the source code of the `Settings` class and the `response_synthesizer` classes in the LlamaIndex repository.
Additionally, there was a closed issue, "(German) Language Option for Default Prompts", which discussed setting custom prompts for different languages. While that issue was specific to German, the solution could be adapted for Chinese or any other language by setting custom prompts as described in the documentation at https://gpt-index.readthedocs.io/en/latest/how_to/custom_prompts.html.
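Adapting that custom-prompt workaround to Chinese could look like the following. The `{context_str}` and `{query_str}` placeholders match the variable names used by LlamaIndex's default text QA template; the Chinese wording itself is illustrative. In LlamaIndex, this string would then be wrapped in the framework's prompt template class and passed to the query engine, as described in the linked custom-prompts documentation.

```python
# Illustrative Chinese QA template, mirroring the structure of the default
# English text QA prompt; the exact wording is an example, not official.
CHINESE_QA_TEMPLATE = (
    "以下是上下文信息。\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "请仅根据上述上下文信息，用简体中文回答以下问题。\n"
    "问题：{query_str}\n"
    "回答："
)

# Plain string formatting, just to show the filled-in prompt shape.
filled = CHINESE_QA_TEMPLATE.format(
    context_str="LlamaIndex 是一个用于构建 LLM 应用的数据框架。",
    query_str="LlamaIndex 是什么？",
)
```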
This concept is feasible with appropriate adjustments to the framework's settings and response generation logic. If you need more specific guidance or have further questions, feel free to ask.
Feature Description
Add a param or setting entry for language, e.g. `Settings.language = "zh-CN"`.
Reason
In some situations, we just want the bot to speak a certain language, e.g. Chinese. However, the LLM will sometimes respond in another language, e.g. English.
Here is a use case from a Chinese bot (we expect the bot to always speak Chinese here):

> User: 123
> Bot: I'm sorry, but your query "123" is not clear. Could you please provide more details or context so I can assist you better?
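One way to catch the fallback case above is a post-check on the synthesized answer, as in the output-filtering idea mentioned earlier in this thread. The heuristic and function names below are hypothetical, not existing LlamaIndex APIs: a response that contains too few CJK characters is flagged so the synthesizer could retry with a stronger language instruction.

```python
def looks_chinese(text: str, threshold: float = 0.3) -> bool:
    """Heuristic: treat text as Chinese if enough non-space chars are CJK ideographs."""
    chars = [ch for ch in text if not ch.isspace()]
    if not chars:
        return False
    cjk = sum(1 for ch in chars if "\u4e00" <= ch <= "\u9fff")
    return cjk / len(chars) >= threshold


def enforce_language(answer: str) -> str:
    """Pass Chinese answers through; flag others for a retry."""
    if looks_chinese(answer):
        return answer
    # A real synthesizer would re-ask the LLM with a stronger language
    # instruction here; this sketch just tags the problem.
    return "[retry-needed] " + answer
```

For example, `enforce_language("抱歉，您的问题不够清楚。")` returns the answer unchanged, while the English apology in the use case above would be flagged.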
Value of Feature
Ease of controlling the behavior of the `response_synthesizer`. With this param or setting entry, the user of llama-index doesn't have to hack the prompt.