torvalds-dev / llama_index

LlamaIndex (formerly GPT Index) is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

Add support for GCP Matching Engine as a vector store #17

Open Raghavan1988 opened 1 year ago

Raghavan1988 commented 1 year ago

Feature Description

I would like to add support for GCP Matching Engine as a vector store.

Reason

I don't see GCP listed in https://github.com/run-llama/llama_index/tree/main/llama_index/vector_stores . We would have to write a custom vector store, but I feel it could be included in the repo.

Value of Feature

Allow Google Cloud users to easily integrate with LlamaIndex.

torvalds-dev commented 1 year ago

torvalds.dev is analyzing the ticket

torvalds-dev commented 1 year ago

From the information provided, it seems that the task is to add support for Google Cloud Platform (GCP) Matching Engine as a vector store in the Llama Index project. The Llama Index project currently supports various vector stores, but GCP Matching Engine is not one of them.

The vector stores are defined in the llama_index/vector_stores/__init__.py file. Each vector store has its own Python file in the llama_index/vector_stores/ directory. These files contain the implementation of the vector store.

To add support for GCP Matching Engine, you would need to create a new Python file in the llama_index/vector_stores/ directory. This file would contain the implementation of the GCP Matching Engine vector store. You would also need to import this new vector store in the llama_index/vector_stores/__init__.py file.
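As a sketch of what such a module might contain: the file and class names below are hypothetical, the storage is an in-memory stand-in for the real Matching Engine index, and the method surface (add/delete/query) mirrors what the existing store modules expose; the TODO comments mark where the actual GCP calls would go.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical skeleton for a new file such as
# llama_index/vector_stores/gcp_matching_engine.py.
@dataclass
class GCPMatchingEngineVectorStore:
    stores_text: bool = True  # whether node text is kept alongside vectors
    # In-memory stand-in mapping node id -> embedding; the real store
    # would talk to a Matching Engine index instead.
    _data: Dict[str, List[float]] = field(default_factory=dict)

    def add(self, embeddings: Dict[str, List[float]]) -> List[str]:
        # TODO: replace with an upsert against the Matching Engine index
        self._data.update(embeddings)
        return list(embeddings)

    def delete(self, ref_doc_id: str) -> None:
        # TODO: replace with a datapoint-removal call
        self._data.pop(ref_doc_id, None)

    def query(self, query_embedding: List[float], top_k: int = 2) -> List[str]:
        # TODO: replace with a nearest-neighbor lookup on the index;
        # here we rank by squared Euclidean distance over the stand-in data
        def dist(v: List[float]) -> float:
            return sum((a - b) ** 2 for a, b in zip(query_embedding, v))

        return sorted(self._data, key=lambda k: dist(self._data[k]))[:top_k]

# llama_index/vector_stores/__init__.py would then gain a re-export, e.g.:
# from llama_index.vector_stores.gcp_matching_engine import (
#     GCPMatchingEngineVectorStore,
# )
```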

The azurecosmosmongo.py file contains the implementation of the Azure Cosmos DB MongoDB vector store. It could be a good starting point for the GCP Matching Engine store, since both are managed cloud services that are reached through a vendor SDK rather than an in-process library.

Here is a high-level plan:

  1. Research how to interact with GCP Matching Engine using Python. This will likely involve using a Google-provided Python SDK.
  2. Create a new Python file in the llama_index/vector_stores/ directory. This file will contain the implementation of the GCP Matching Engine vector store.
  3. Implement the GCP Matching Engine vector store in the new Python file. This will involve writing Python code that uses the Google-provided Python SDK to interact with GCP Matching Engine.
  4. Import the new GCP Matching Engine vector store in the llama_index/vector_stores/__init__.py file.
  5. Test the new GCP Matching Engine vector store to ensure it works correctly.

Please note that this is a high-level plan and the actual implementation might require additional steps.
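Step 3 of the plan, interacting with the SDK, might look roughly like the wrapper below. This is a sketch assuming the google-cloud-aiplatform surface (a MatchingEngineIndexEndpoint whose find_neighbors call takes a deployed index id, query vectors, and a neighbor count); the endpoint object is injected so the query logic can be exercised without GCP credentials, and the deployed_index_id value is a placeholder.

```python
from typing import Any, List, Tuple

class MatchingEngineQuerier:
    """Thin query wrapper around a Matching Engine index endpoint (sketch).

    `endpoint` is assumed to behave like
    google.cloud.aiplatform.MatchingEngineIndexEndpoint, whose
    find_neighbors(deployed_index_id=..., queries=..., num_neighbors=...)
    returns one list of neighbors per query vector.
    """

    def __init__(self, endpoint: Any, deployed_index_id: str) -> None:
        self.endpoint = endpoint
        self.deployed_index_id = deployed_index_id

    def query(self, embedding: List[float], top_k: int) -> List[Tuple[str, float]]:
        responses = self.endpoint.find_neighbors(
            deployed_index_id=self.deployed_index_id,
            queries=[embedding],   # a single query vector
            num_neighbors=top_k,
        )
        # responses[0] holds the neighbors for our one query
        return [(n.id, n.distance) for n in responses[0]]
```

Keeping the endpoint injected also makes step 5 (testing) straightforward: a fake endpoint returning canned neighbors can drive unit tests before any real index is deployed.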