langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

Inconsistent prompt when calling LLMGraphTransformer with ChatOllama imported from langchain_ollama #26614

Open csaiedu opened 1 month ago

csaiedu commented 1 month ago

Checked other resources

Example Code

from langchain_experimental.graph_transformers import LLMGraphTransformer
from langchain_community.chat_models import ChatOllama
from langchain_core.documents import Document
from langchain_community.graphs.graph_document import GraphDocument

# Case 1: ChatOllama imported from langchain_community
llm = ChatOllama(model="local_model")  # placeholder for a locally pulled Ollama model
llm_transformer = LLMGraphTransformer(
    llm=llm,
)
doc = Document(page_content="The dog has a ball which is red")
graph = llm_transformer.convert_to_graph_documents([doc])
print(graph)

# Case 2: same code, but ChatOllama imported from langchain_ollama
from langchain_ollama import ChatOllama

llm = ChatOllama(model="local_model")  # placeholder for a locally pulled Ollama model
llm_transformer = LLMGraphTransformer(
    llm=llm,
)
doc = Document(page_content="The dog has a ball which is red")
graph = llm_transformer.convert_to_graph_documents([doc])
print(graph)

Error Message and Stack Trace (if applicable)

No response

Description

In this code example, the outcome of processing a simple text through LLMGraphTransformer changes if I replace the ChatOllama import from

from langchain_community.chat_models import ChatOllama

to

from langchain_ollama import ChatOllama

Output with from langchain_community.chat_models import ChatOllama:

page_content='The dog has a ball which is red'
[GraphDocument(nodes=[Node(id='dog', type='Animal', properties={}), Node(id='ball', type='Object', properties={})], relationships=[Relationship(source=Node(id='dog', type='Animal', properties={}), target=Node(id='ball', type='Object', properties={}), type='HAS_OBJECT', properties={})], source=Document(metadata={}, page_content='The dog has a ball which is red'))]

Output with from langchain_ollama import ChatOllama:

page_content='The dog has a ball which is red'
[GraphDocument(nodes=[Node(id='John Doe', properties={}), Node(id='Red Ball', properties={})], relationships=[], source=Document(metadata={}, page_content='The dog has a ball which is red'))]
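The divergence seems consistent with the two ChatOllama classes differing in tool-calling support. Below is a minimal, illustrative probe (not part of the original report): supports_tool_binding is a hypothetical helper, and the assumption is that LLMGraphTransformer uses the tool-calling path when bind_tools is implemented and otherwise falls back to its prompt-based chain.

from langchain_community.chat_models import ChatOllama as CommunityChatOllama
from langchain_ollama import ChatOllama as NewChatOllama


def supports_tool_binding(llm) -> bool:
    # bind_tools raises NotImplementedError on chat models that do not
    # implement tool calling; the assumption here is that this is the
    # difference LLMGraphTransformer keys off.
    dummy_tool = {
        "type": "function",
        "function": {
            "name": "noop",
            "description": "placeholder tool",
            "parameters": {"type": "object", "properties": {}},
        },
    }
    try:
        llm.bind_tools([dummy_tool])
        return True
    except NotImplementedError:
        return False


# "local_model" is a placeholder for whatever model is pulled locally.
print(supports_tool_binding(CommunityChatOllama(model="local_model")))  # expected: False
print(supports_tool_binding(NewChatOllama(model="local_model")))        # expected: True

If the probe behaves as expected, the community import silently takes the prompt-based extraction path while the langchain_ollama import takes the tool-calling path, which would explain the different GraphDocument outputs.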

System Info

langchain_core: 0.3.1
langchain_community: 0.3.0
langchain_experimental: 0.3.0
langchain_ollama: 0.2.0

csaiedu commented 1 month ago

Looking at the Ollama debug logs, it seems to be related to incorrect prompt syntax: "[INST] You are a top-tier algorithm designed for extracting ..." becomes prompt="[AVAILABLE_TOOLS] [{\"type\":\"function\",\"function\":{\"name\":\"DynamicGraph\",\"description\":\"Represents a graph document consisting of nodes and relationships.\",\"parameters\":{\"type\":\"object\",\"required\":[\"nodes\",\"relationships\"],\"properties\":{\"nodes\":{\"type\":\"\",\"description\":\"List of nodes\"},\"relationships\":{\"type\":\"\",\"description\":\"List of relationships\"}}}}}][/AVAILABLE_TOOLS][INST] # Knowledge Graph Instructions for GPT-4...

There might have been changes in ChatOllama to accommodate function calling ("with structured output") that are not working correctly and are still referencing instructions written for GPT-4.
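As a possible workaround, the sketch below forces the prompt-based extraction path even with the new ChatOllama. This is untested and assumes the installed langchain_experimental exposes an ignore_tool_usage flag on LLMGraphTransformer; if that flag is not available in this version, the same idea would require another way to disable the tool-calling path.

from langchain_core.documents import Document
from langchain_experimental.graph_transformers import LLMGraphTransformer
from langchain_ollama import ChatOllama

llm = ChatOllama(model="local_model")  # placeholder for a locally pulled Ollama model
llm_transformer = LLMGraphTransformer(
    llm=llm,
    ignore_tool_usage=True,  # assumption: skips bind_tools/with_structured_output and uses the text prompt
)
doc = Document(page_content="The dog has a ball which is red")
print(llm_transformer.convert_to_graph_documents([doc]))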