Open: except-pass opened this issue 1 month ago
I think it is not an error, just a warning. Did you try checking whether the nodes are getting created or not? @except-pass
The nodes were not created.
Let me take a look. We haven't faced this problem before.
Hello, just to mention that I have the same issue. I can see the graph/nodes in my Colab notebook, and I can query them in the Neo4j workspace 'Query' section, but I can't see the graph in the 'Explore' section.
@except-pass if you are using Neo4j locally, you need to install the APOC plugin. Please check the docs: https://docs.mem0.ai/open-source/graph_memory/overview
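If you want to confirm that APOC is actually loaded after installing the plugin, a minimal check with the neo4j Python driver looks roughly like this (the NEO4J_* environment variable names are only placeholders for your own connection details):

import os
from neo4j import GraphDatabase  # pip install neo4j

driver = GraphDatabase.driver(
    os.environ["NEO4J_URL"],
    auth=(os.environ["NEO4J_USERNAME"], os.environ["NEO4J_PASSWORD"]),
)
with driver.session() as session:
    # apoc.version() only resolves when the APOC plugin is installed and loaded.
    print(session.run("RETURN apoc.version() AS version").single()["version"])
driver.close()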
@Ashoka74 you can check out the Neo4j documentation about Explore: https://neo4j.com/docs/aura/preview/explore/introduction/
Hi, I have the same problem when trying to call:
result = m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
WARNING:neo4j.notifications:Received notification from DBMS server: {severity: WARNING} {code: Neo.ClientNotification.Statement.UnknownPropertyKeyWarning} {category: UNRECOGNIZED} {title: The provided property key is not in the database} {description: One of the property names in your query is not available in the database, make sure you didn't misspell it or that the label is available when you run this statement in your application (the missing property name is: name)} {position: line: 20, column: 22, offset: 1336} for query:

MATCH (n)
WHERE n.embedding IS NOT NULL AND n.user_id = $user_id
WITH n,
     round(reduce(dot = 0.0, i IN range(0, size(n.embedding)-1) | dot + n.embedding[i] * $n_embedding[i]) /
     (sqrt(reduce(l2 = 0.0, i IN range(0, size(n.embedding)-1) | l2 + n.embedding[i] * n.embedding[i])) *
     sqrt(reduce(l2 = 0.0, i IN range(0, size($n_embedding)-1) | l2 + $n_embedding[i] * $n_embedding[i]))), 4) AS similarity
WHERE similarity >= $threshold
MATCH (n)-[r]->(m)
RETURN n.name AS source, elementId(n) AS source_id, type(r) AS relation, elementId(r) AS relation_id, m.name AS destination, elementId(m) AS destination_id, similarity
UNION
MATCH (n)
WHERE n.embedding IS NOT NULL AND n.user_id = $user_id
WITH n,
     round(reduce(dot = 0.0, i IN range(0, size(n.embedding)-1) | dot + n.embedding[i] * $n_embedding[i]) /
     (sqrt(reduce(l2 = 0.0, i IN range(0, size(n.embedding)-1) | l2 + n.embedding[i] * n.embedding[i])) *
     sqrt(reduce(l2 = 0.0, i IN range(0, size($n_embedding)-1) | l2 + $n_embedding[i] * $n_embedding[i]))), 4) AS similarity
WHERE similarity >= $threshold
MATCH (m)-[r]->(n)
RETURN m.name AS source, elementId(m) AS source_id, type(r) AS relation, elementId(r) AS relation_id, n.name AS destination, elementId(n) AS destination_id, similarity
ORDER BY similarity DESC
I am using Neo4j Aura.
Any help is welcome!
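For reference, the warning itself only says that the property key name did not yet exist in the database when the search query ran, which can simply mean the graph is still empty. To check whether any nodes were actually written, one can query them directly; a rough sketch with the neo4j Python driver (the env var names are placeholders for the Aura credentials):

import os
from neo4j import GraphDatabase  # pip install neo4j

driver = GraphDatabase.driver(
    os.environ["NEO4J_URL"],
    auth=(os.environ["NEO4J_USERNAME"], os.environ["NEO4J_PASSWORD"]),
)
with driver.session() as session:
    # Count nodes mem0 tagged with this user_id, and how many carry a name property.
    record = session.run(
        "MATCH (n {user_id: $user_id}) "
        "RETURN count(n) AS total, count(n.name) AS named",
        user_id="alice",
    ).single()
    print(f"nodes: {record['total']}, with name property: {record['named']}")
driver.close()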
Same error here. Here's my minimal repro script for reference:
import os

from dotenv import load_dotenv

load_dotenv()

from mem0 import Memory

config = {
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": os.environ["NEO4J_URL"],
            "username": os.environ["NEO4J_USERNAME"],
            "password": os.environ["NEO4J_PASSWORD"]
        }
    },
    "version": "v1.1"
}

m = Memory.from_config(config)
m.add("I like pizza", user_id="alice2")
results = m.search("What food do I like?", user_id="alice2")
print(results)
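For what it's worth, a quick way to check whether mem0 stored anything at all for the user is to dump everything back (assuming the same config as above; the exact shape of the returned dict may differ between mem0 versions):

# Inspect what mem0 has stored for this user; with the graph store configured,
# the result should also list any relations that were created.
stored = m.get_all(user_id="alice2")
print(stored)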
I did some debugging and found that in graph_memory.py:102 the LLM chooses the NOOP tool despite a clear indication that a memory should be added:
You are an AI expert specializing in graph memory management and optimization. Your task is to analyze existing graph memories alongside new information, and update the relationships in the memory list to ensure the most accurate, current, and coherent representation of knowledge.
Input:
1. Existing Graph Memories: A list of current graph memories, each containing source, target, and relationship information.
2. New Graph Memory: Fresh information to be integrated into the existing graph structure.
Guidelines:
1. Identification: Use the source and target as primary identifiers when matching existing memories with new information.
2. Conflict Resolution:
- If new information contradicts an existing memory:
a) For matching source and target but differing content, update the relationship of the existing memory.
b) If the new memory provides more recent or accurate information, update the existing memory accordingly.
3. Comprehensive Review: Thoroughly examine each existing graph memory against the new information, updating relationships as necessary. Multiple updates may be required.
4. Consistency: Maintain a uniform and clear style across all memories. Each entry should be concise yet comprehensive.
5. Semantic Coherence: Ensure that updates maintain or improve the overall semantic structure of the graph.
6. Temporal Awareness: If timestamps are available, consider the recency of information when making updates.
7. Relationship Refinement: Look for opportunities to refine relationship descriptions for greater precision or clarity.
8. Redundancy Elimination: Identify and merge any redundant or highly similar relationships that may result from the update.
Task Details:
- Existing Graph Memories:
[]
- New Graph Memory: [{'source_node': 'alice2', 'source_type': 'person', 'relation': 'LIKES', 'destination_node': 'pizza', 'destination_type': 'food'}]
Output:
Provide a list of update instructions, each specifying the source, target, and the new relationship to be set. Only include memories that require updates.
Removing the NOOP tool from the tools list fixes the problem, but that's probably too brute-force: when the relationship already exists, update_graph_memory still gets called.
A better solution, suggested in https://github.com/mem0ai/mem0/issues/1879#issuecomment-2395315212, is to change the empty array [] to the string 'None' in mem0/graphs/utils.py:
def get_update_memory_prompt(existing_memories, memory, template):
    return template.format(existing_memories=existing_memories or "None", memory=memory)
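This works because an empty list is falsy in Python, so the template receives the literal string "None" instead of "[]", while a non-empty list passes through unchanged:

# Falsy empty list: the prompt sees "None" rather than "[]".
print([] or "None")                           # None
print([{"source_node": "alice2"}] or "None")  # [{'source_node': 'alice2'}]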
I'll submit a PR.
🐛 Describe the bug
What happened: In the Graph Memory example notebook (https://colab.research.google.com/drive/1PfIGVHnliIlG2v8cx0g45TF0US-jRPZ1?usp=sharing), attempting to add a memory throws this error:
m.add("I like painting", user_id=user_id), display_graph()
How to recreate: I set my OPENAI_API_KEY, NEO4J_URI, NEO4J_USER, and NEO4J_PASSWORD environment variables. I can confirm I can log into Neo4j.
I am running Mem0 locally with a Neo4j Docker container. My Neo4j container is
My pip freeze is:
aiohappyeyeballs==2.4.0 aiohttp==3.10.6 aiosignal==1.3.1 annotated-types==0.7.0 anyio==4.6.0 asttokens==2.4.1 async-timeout==4.0.3 attrs==24.2.0 backoff==2.2.1 certifi==2024.8.30 charset-normalizer==3.3.2 comm==0.2.2 contourpy==1.3.0 cycler==0.12.1 dataclasses-json==0.6.7 debugpy==1.8.6 decorator==5.1.1 distro==1.9.0 exceptiongroup==1.2.2 executing==2.1.0 fonttools==4.54.1 frozenlist==1.4.1 grandalf==0.8 greenlet==3.1.1 grpcio==1.66.1 grpcio-tools==1.66.1 h11==0.14.0 h2==4.1.0 hpack==4.0.0 httpcore==1.0.5 httpx==0.27.2 hyperframe==6.0.1 idna==3.10 interchange==2021.0.4 ipykernel==6.29.5 ipython==8.27.0 jedi==0.19.1 jiter==0.5.0 jsonpatch==1.33 jsonpointer==3.0.0 jupyter_client==8.6.3 jupyter_core==5.7.2 kiwisolver==1.4.7 langchain==0.2.16 langchain-community==0.2.17 langchain-core==0.2.41 langchain-text-splitters==0.2.4 langsmith==0.1.128 marshmallow==3.22.0 matplotlib==3.9.2 matplotlib-inline==0.1.7 mem0ai==0.1.16 monotonic==1.6 multidict==6.1.0 mypy-extensions==1.0.0 neo4j==5.24.0 nest-asyncio==1.6.0 netgraph==4.13.2 networkx==3.3 numpy==1.26.4 openai==1.48.0 orjson==3.10.7 packaging==24.1 pansi==2020.7.3 parso==0.8.4 pexpect==4.9.0 pillow==10.4.0 platformdirs==4.3.6 portalocker==2.10.1 posthog==3.6.6 prompt_toolkit==3.0.48 protobuf==5.28.2 psutil==6.0.0 ptyprocess==0.7.0 pure_eval==0.2.3 py2neo==2021.2.4 pydantic==2.9.2 pydantic_core==2.23.4 Pygments==2.18.0 pyparsing==3.1.4 python-dateutil==2.9.0.post0 python-dotenv==1.0.1 pytz==2024.2 PyYAML==6.0.2 pyzmq==26.2.0 qdrant-client==1.11.3 rank-bm25==0.2.2 rectangle-packer==2.0.2 requests==2.32.3 scipy==1.14.1 six==1.16.0 sniffio==1.3.1 SQLAlchemy==2.0.35 stack-data==0.6.3 tenacity==8.5.0 tornado==6.4.1 tqdm==4.66.5 traitlets==5.14.3 typing-inspect==0.9.0 typing_extensions==4.12.2 urllib3==2.2.3 wcwidth==0.2.13 yarl==1.12.1