Raghavan1988 opened 1 year ago

torvalds.dev analysis of the ticket:
The issue "[Feature Request] How to allow multiple Langchain prompt template keys" is about allowing multiple keys to be used in Langchain prompt templates. Currently, the `get_prompt_input_key` function in `memory_wrapper.py` expects exactly one input key and raises a `ValueError` if more than one is found. This function is used to determine the key for the input prompt.
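For context, the single-key check works roughly like this (a sketch of the logic, not a verbatim copy of `memory_wrapper.py`):

```python
from typing import Any, Dict, List

def get_prompt_input_key(inputs: Dict[str, Any], memory_variables: List[str]) -> str:
    # Keys in `inputs` that are not memory variables (and not the special
    # "stop" key) are treated as candidate prompt input keys.
    prompt_input_keys = list(set(inputs).difference(memory_variables + ["stop"]))
    # The current behavior insists on exactly one such key.
    if len(prompt_input_keys) != 1:
        raise ValueError(f"One input key expected, got {prompt_input_keys}")
    return prompt_input_keys[0]
```

So a call with two non-memory keys, e.g. `question` and `context`, fails even though both are legitimate template variables.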
The `prompt_templates.py` file contains the templates for the different prompt types. Each template has a specific format and expects certain keys to be present in the input.
To allow multiple keys, the following actions could be taken:

1. Modify the `get_prompt_input_key` function to handle multiple keys. This could involve returning a list of keys instead of a single key, and handling that list in the functions that call `get_prompt_input_key`.
2. Update the prompt templates in `prompt_templates.py` to handle multiple keys. This could involve changing the string formatting to include multiple keys, and updating the `PromptTemplate` class accordingly.
3. Update the `GPTIndexMemory` class and other related classes in `memory_wrapper.py` to handle multiple keys. This could involve changing how the `input_key` and `output_key` attributes are used, and updating methods such as `_get_prompt_input_key` and `load_memory_variables` to handle multiple keys.
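A minimal sketch of step 1, assuming a hypothetical `get_prompt_input_keys` variant (the plural name and the `combine_prompt_inputs` helper are illustrative, not part of the existing codebase):

```python
from typing import Any, Dict, List

def get_prompt_input_keys(inputs: Dict[str, Any], memory_variables: List[str]) -> List[str]:
    # Hypothetical multi-key variant: collect every key that is not a
    # memory variable, sorted for deterministic ordering.
    keys = sorted(set(inputs).difference(memory_variables + ["stop"]))
    if not keys:
        raise ValueError("At least one input key expected")
    return keys

def combine_prompt_inputs(inputs: Dict[str, Any], memory_variables: List[str]) -> str:
    # Callers that previously consumed the single input value could instead
    # join all input values, e.g. when building the text to index as memory.
    keys = get_prompt_input_keys(inputs, memory_variables)
    return "\n".join(str(inputs[k]) for k in keys)
```

Callers that still assume a single key would need a migration path, for example keeping the old function as a thin wrapper that errors only when zero keys are found.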
Relevant files for this issue:

- `llama_index/langchain_helpers/memory_wrapper.py`
- `llama_index/langchain_helpers/prompt_templates.py`
### Feature Description
As I understand it, Langchain supports multiple prompt template keys.
When GPT-Index is used for memory, this is reduced to one:
https://github.com/jerryjliu/gpt_index/blob/f6a985241f388a9b7513ef5405025e6e2324e01e/gpt_index/langchain_helpers/memory_wrapper.py#L22
It would be useful if multiple prompt template keys and GPT-Index could be used together.
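To illustrate what multiple template keys look like, here is a plain-Python stand-in for a template with several input variables; LangChain's `PromptTemplate` accepts the same set of keys via its `input_variables` list (the template text below is invented for illustration):

```python
# A template with three input keys; in LangChain this would correspond to
# PromptTemplate(input_variables=["context", "history", "question"], ...).
TEMPLATE = (
    "Context: {context}\n"
    "Chat history: {history}\n"
    "Question: {question}\n"
    "Answer:"
)

def format_prompt(context: str, history: str, question: str) -> str:
    # Every key must be supplied; this mirrors how PromptTemplate.format
    # fills in all of its input_variables at once.
    return TEMPLATE.format(context=context, history=history, question=question)

prompt = format_prompt(
    context="GPT-Index used as memory",
    history="(empty)",
    question="Can multiple keys be used?",
)
```

With GPT-Index as the memory backend, only one of these keys can currently be routed through `get_prompt_input_key`, which is the limitation this request targets.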
### Reason

_No response_

### Value of Feature

_No response_