GaiaNet-AI / docs


Add tools support #32

Closed juntao closed 4 months ago

juntao commented 4 months ago

Hello, I am a PR summary agent on flows.network. Here are my reviews of code commits in this PR.


The patch introduces significant changes and updates for two new tools: Continue and FlowiseAI.

Potential Issues/Errors:

  1. A typo was found in the documentation for "gpt-planner," which could lead to confusion for users trying to follow instructions.
  2. Missing image references were identified in the FlowiseAI tool's documentation guide, potentially causing misunderstandings or difficulties for users.
  3. Outdated screenshot references were also found in this same guide, which could mislead users if they are no longer accurate.
  4. The Obsidian-local-gpt Plugin Setup Guide used to contain instructions on setting up one's own GaiaNet node, but now only directs users to the detailed instructions in the GaiaNet Node Setup Guide. This could confuse or frustrate users who expect more comprehensive guidance within this document.

Key Findings:

  1. The patch adds support for using different models for chat, code autocompletion, and embeddings in the Continue plugin, which increases its flexibility and versatility.
  2. The FlowiseAI tool now supports building a chatbot for real-time IP lookup using custom tools and functions, enhancing its capabilities and functionalities.
  3. The documentation has been significantly updated to provide clearer instructions, examples, and explanations for both the Continue and FlowiseAI tools, including prerequisites, installation instructions, and usage examples.
  4. The patch simplifies the Prerequisites section by clarifying that a Gaia node ready for LLM services through a public URL is necessary, and provides an example of a public GaiaNet node URL (https://llama-3-8b.us.gaianet.network/) and the model name it uses (Meta-Llama-3-8B-Instruct-Q5_K_M).
  5. The patch adds support for LLM tool calling in FlowiseAI, which allows users to configure the FlowiseAI tool to use a Gaia node that supports this feature.
  6. The documentation now includes detailed information about how to utilize LLM tool calling in the FlowiseAI environment, clarifying its role and usage in building a chatbot for real-time IP lookup.
  7. The patch updates the config.json file to include new models for chat, code autocompletion, and embeddings, and removes outdated references to public Gaia nodes.

Details

Commit 804b6f481b80b562ba30589158d90348b38fa9a9

Commit 56062831138f668fceb4cdc78f7ae5bf2b727b10

The patch introduces the following key changes:

  1. Added a new section "Prerequisites" to specify that a Gaia node ready for LLM services through a public URL is necessary.
  2. Provided instructions on how to use either your own node or a public node, with an example using a public node for the Continue plugin (a quick way to verify such a node is sketched after this list).
  3. Updated the API base URL in the Docker run command from "https://llama3.gaianet.network/v1" to "https://llama-3-8b.us.gaianet.network/v1".
  4. In step 2 of using GaiaNet node as the embedding API, corrected a typo in the sentence: "Click on + to upload your documentations." (originally "uopload").
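
As a quick sanity check for that Prerequisites requirement, you can send an OpenAI-style chat request to the node's /v1 API base before wiring it into a tool. The snippet below is a minimal sketch, not part of the patch: it reuses the public node URL and model name quoted in this PR and assumes the standard OpenAI-compatible /chat/completions path that Gaia nodes expose.

```javascript
// Minimal sketch: confirm a public Gaia node answers OpenAI-compatible chat requests.
// The URL and model name are the examples quoted in this PR; swap in your own node's values.
const apiBase = "https://llama-3-8b.us.gaianet.network/v1";

async function checkNode() {
  const res = await fetch(`${apiBase}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "Meta-Llama-3-8B-Instruct-Q5_K_M",
      messages: [{ role: "user", content: "Say hello in one short sentence." }],
    }),
  });
  const data = await res.json();
  console.log(data.choices?.[0]?.message?.content);
}

checkNode().catch(console.error);
```

This requires a runtime with a global fetch (Node 18+ or a browser).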

Commit f1ee4b0a22a887cb3a804260b075222f4ad6ae44

  1. The Obsidian-local-gpt Plugin Setup Guide has been updated to simplify the Prerequisites section, providing clearer options for accessing a GaiaNet node: either using a public node or setting up one's own node.
  2. The guide now provides an example of a public GaiaNet node URL (https://llama-3-8b.us.gaianet.network/) and the model name it uses (Meta-Llama-3-8B-Instruct-Q5_K_M).
  3. The section on setting up one's own GaiaNet node has been removed, as users are now directed to follow the detailed instructions in the GaiaNet Node Setup Guide.
  4. The Prerequisites section now includes a table summarizing the necessary attributes for configuring the Obsidian plugin, including the API endpoint URL and model name (a sketch of such a table follows this list).
  5. The guide has been updated to clarify that it uses a public node in this tutorial.
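
A plausible shape for that table, using only the two attributes and the public-node values mentioned above, would be:

```
| Attribute        | Value                                  |
|------------------|----------------------------------------|
| API endpoint URL | https://llama-3-8b.us.gaianet.network/ |
| Model Name       | Meta-Llama-3-8B-Instruct-Q5_K_M        |
```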

Commit 7487b0948f151590fb28bd6439f091e6d9195eb1

  1. This patch corrects a typo in the documentation for the "gpt-planner" application.
  2. The word "OpenAIT" was changed to "OpenAI".
  3. This change does not affect the functionality of the code, only the documentation.
  4. The correction ensures that the documentation accurately reflects the intended usage and setup instructions.
  5. The patch was signed off by Michael Yuan.

Commit fa8e7d7d47ba3fb627f5681f486452dda30e6eb0

Key Changes:

  1. The patch updates the "Flowise Docs QnA" example to create a chatbot for real-time IP lookup.
  2. It replaces the ChatOpenAI component with the ChatLocalAI component, using an open-source LLM on a Gaia node.
  3. A new function called get_ip_address_geo_location is added that takes an 'ip' as input and returns its geographical location.
  4. The patch adds a Custom Tool node to the FlowiseAI canvas, which can execute JavaScript code for this function when needed (one possible shape of that snippet is sketched after this list).
  5. A Buffer Memory node is also added to store the conversation history between the user and the chatbot.
  6. A Tool Agent node is introduced to manage tool calling and coordinate interactions with other nodes in the flow.
  7. The updated example demonstrates how to use FlowiseAI to build a chatbot that can call custom tools, such as the get_ip_address_geo_location function, to provide accurate and context-specific responses.
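
For context, a Flowise Custom Tool combines an input schema with a JavaScript snippet, and schema properties are exposed to that snippet as $-prefixed variables. The sketch below is one plausible body for get_ip_address_geo_location, assuming a string property named "ip" in the schema and using ip-api.com as a stand-in geolocation service; the actual code in the updated docs may differ.

```javascript
// Custom Tool: get_ip_address_geo_location
// Assumes the tool's input schema defines a string property "ip",
// which Flowise exposes to this snippet as $ip.
// Recent Flowise releases provide a global fetch; older ones may need require('node-fetch').
const url = `http://ip-api.com/json/${$ip}`; // example geolocation API

const response = await fetch(url);
const data = await response.json();

// Return a compact string so the Tool Agent can feed the result back to the LLM.
return JSON.stringify({
  ip: $ip,
  country: data.country,
  region: data.regionName,
  city: data.city,
  lat: data.lat,
  lon: data.lon,
});
```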

Commit b5386d11906870bd7e8d026734c8dc21a7a49994

  1. This patch adds a link to the documentation for LLM tool calling in FlowiseAI, which allows users to configure the FlowiseAI tool to use a Gaia node that supports this feature.
  2. The link directs to the LlamaEdge GitHub repository's API server documentation on Tool Use.
  3. This change is important as it provides more detailed information about how to utilize LLM tool calling in the FlowiseAI environment, which can enhance its capabilities and functionalities.

Commit fbb7bf98ef6c50454a8f90753fb7f177b2cc9a29

  1. The patch updates the documentation for using the FlowiseAI tool, specifically the section that guides users to build a chatbot for real-time IP lookup.
  2. It corrects the format of the localhost URL in the instructions.
  3. The patch adds missing image references for two steps in the guide.
  4. It updates an outdated screenshot reference in one of the steps.
  5. The documentation now includes a clearer explanation about the role of the Buffer Memory node in the chatbot's functionality.
  6. The patch also clarifies that the tool call's result is sent back to the LLM together with the original query, which justifies the use of the Buffer Memory node (see the message-sequence sketch after this list).
  7. Overall, this patch improves the clarity and accuracy of the documentation for using FlowiseAI to build a real-time IP lookup chatbot.
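
To make the Buffer Memory point concrete: after the custom tool runs, the follow-up request to the LLM carries the original user query, the assistant's tool call, and the tool's result together, in the usual OpenAI-compatible message format. The JSON below is an illustrative sketch with invented values, not a payload taken from the docs.

```json
{
  "model": "Meta-Llama-3-8B-Instruct-Q5_K_M",
  "messages": [
    { "role": "user", "content": "Where is the IP address 8.8.8.8 located?" },
    {
      "role": "assistant",
      "tool_calls": [
        {
          "id": "call_1",
          "type": "function",
          "function": {
            "name": "get_ip_address_geo_location",
            "arguments": "{\"ip\": \"8.8.8.8\"}"
          }
        }
      ]
    },
    {
      "role": "tool",
      "tool_call_id": "call_1",
      "content": "{\"country\": \"United States\", \"city\": \"Ashburn\"}"
    }
  ]
}
```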

Commit 17ca05022ab9b653ee46ab818340c44ecab3904c

  1. The Continue plugin now supports different models for chat, code autocomplete, and embeddings. Previously, it only used public Gaia nodes.
  2. A new model configuration "tabAutocompleteModel" has been added to the config.json file. This allows users to specify a custom model for code autocompletion (an illustrative config.json layout is sketched after this list).
  3. The "embeddingsProvider" section in the config.json file has been updated with specific details about the model and API base used for embeddings.
  4. The description of the changes made to the config.json file now includes information about the new models for chat, code autocompletion, and embeddings.
  5. Some lines related to the previous use of public Gaia nodes have been removed from the documentation in the config.json file section.
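
As a rough illustration of that layout, Continue's config.json (in the JSON format it used at the time of this PR) keeps chat models, the tab-autocomplete model, and the embeddings provider in separate entries. In the sketch below, the chat entry reuses the public node URL and model name quoted in this PR; the autocomplete and embeddings endpoints and model names are placeholders, not values from the patch.

```json
{
  "models": [
    {
      "title": "Gaia chat",
      "provider": "openai",
      "model": "Meta-Llama-3-8B-Instruct-Q5_K_M",
      "apiBase": "https://llama-3-8b.us.gaianet.network/v1",
      "apiKey": "gaia"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Gaia autocomplete",
    "provider": "openai",
    "model": "YOUR-CODER-MODEL",
    "apiBase": "https://YOUR-CODER-NODE/v1",
    "apiKey": "gaia"
  },
  "embeddingsProvider": {
    "provider": "openai",
    "model": "YOUR-EMBEDDING-MODEL",
    "apiBase": "https://YOUR-EMBEDDING-NODE/v1"
  }
}
```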

Commit df4dda71d36a9460a93a0cb1f81137b06f4521b8

  1. The documentation for the FlowiseAI tool call flow has been updated to clarify its usage, specifically focusing on the 'Custom Tool' node and function calls.
  2. The custom function has been given the more general name get_ip_address_geo_location.
  3. A description for the tool has been added to explain when the LLM should use this function, enhancing its functionality and usability.
  4. The documentation now explains that the LLM will call this custom function when it detects a user query related to finding an IP address' location.
  5. The JSON message returned by the LLM for tool calling is not shown to the user in the chatbot; it is captured by the FlowiseAI 'Custom Tool' node, which then executes the corresponding function (an example of such a message is sketched after this list).
  6. The documentation provides more details about how the custom function is called based on a matching description.
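
For readers unfamiliar with tool calling, the message the LLM returns in this situation is, in the OpenAI-compatible format, an assistant message carrying a tool_calls array rather than user-visible text; the Custom Tool node captures it and runs the matching function. The example below is illustrative, not copied from the docs.

```json
{
  "role": "assistant",
  "content": null,
  "tool_calls": [
    {
      "id": "call_1",
      "type": "function",
      "function": {
        "name": "get_ip_address_geo_location",
        "arguments": "{\"ip\": \"8.8.8.8\"}"
      }
    }
  ]
}
```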