torvalds-dev / llama_index

LlamaIndex (formerly GPT Index) is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Feature Request]: GCP Matching Engine Support #16

Open torvalds-dev opened 1 year ago

torvalds-dev commented 1 year ago

Feature Description

I would like to add support for GCP Matching Engine as a vector store.

Reason

I don't see GCP listed in https://github.com/run-llama/llama_index/tree/main/llama_index/vector_stores. We would have to write a custom vector store, but I feel it could be included in the repo.

Value of Feature

Allow Google Cloud users to easily integrate with LlamaIndex.

torvalds-dev commented 1 year ago

torvalds.dev is analyzing the ticket

torvalds-dev commented 1 year ago

From the issue description and the provided code, it seems that the user wants to add support for Google Cloud Platform's (GCP) Matching Engine as a vector store in the LlamaIndex project.

The LlamaIndex project currently supports a number of vector stores, as seen in the llama_index/vector_stores/__init__.py file. Each vector store has its own Python file in the llama_index/vector_stores/ directory and is imported in the __init__.py file.

The llama_index/vector_stores/opensearch.py file is an example of how a vector store is implemented. It contains a class that inherits from the VectorStore class and implements the required methods.
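
For orientation, the interface that opensearch.py and the other integrations implement looks roughly like the sketch below. This is a simplified reconstruction from llama_index.vector_stores.types; exact type names and signatures (e.g. NodeWithEmbedding) vary between LlamaIndex versions, so treat it as the general shape rather than the definitive API.

```python
# Rough shape of a vector store integration, reconstructed from
# llama_index.vector_stores.types; signatures may differ by version.
from typing import Any, List

from llama_index.vector_stores.types import (
    NodeWithEmbedding,
    VectorStore,
    VectorStoreQuery,
    VectorStoreQueryResult,
)


class ExampleVectorStore(VectorStore):
    stores_text: bool = True  # whether node text is stored alongside embeddings

    @property
    def client(self) -> Any:
        """Return the underlying client object for the backing service."""
        raise NotImplementedError

    def add(self, embedding_results: List[NodeWithEmbedding]) -> List[str]:
        """Insert nodes with embeddings and return the inserted ids."""
        raise NotImplementedError

    def delete(self, ref_doc_id: str, **delete_kwargs: Any) -> None:
        """Delete all nodes associated with a source document id."""
        raise NotImplementedError

    def query(self, query: VectorStoreQuery, **kwargs: Any) -> VectorStoreQueryResult:
        """Run a similarity search and return nodes, similarities, and ids."""
        raise NotImplementedError
```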

To add support for GCP Matching Engine, the following steps should be taken:

  1. Create a new Python file in the llama_index/vector_stores/ directory for the GCP Matching Engine vector store. This file should contain a class that inherits from the VectorStore class and implements the required methods (a hedged sketch follows this list).

  2. Import the new GCP Matching Engine vector store in the llama_index/vector_stores/__init__.py file.

  3. Test the new vector store to ensure it works correctly with the LlamaIndex project.
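
As a starting point for step 1, a sketch of what such a file might contain is below. The class name GCPMatchingEngineVectorStore, the module name, and the constructor parameters are placeholders; the google-cloud-aiplatform calls (MatchingEngineIndexEndpoint.match) should be verified against the installed SDK version, and how add/delete are implemented depends on whether the index is configured for batch or streaming updates.

```python
# Hypothetical llama_index/vector_stores/gcp_matching_engine.py
# Sketch only: names are placeholders and the aiplatform calls should be
# checked against the google-cloud-aiplatform SDK docs.
from typing import Any, List

from google.cloud import aiplatform

from llama_index.vector_stores.types import (
    NodeWithEmbedding,
    VectorStore,
    VectorStoreQuery,
    VectorStoreQueryResult,
)


class GCPMatchingEngineVectorStore(VectorStore):
    """Vector store backed by Vertex AI Matching Engine (sketch)."""

    stores_text: bool = False  # Matching Engine stores ids/vectors, not node text

    def __init__(
        self,
        project: str,
        location: str,
        index_endpoint_name: str,
        deployed_index_id: str,
    ) -> None:
        aiplatform.init(project=project, location=location)
        self._endpoint = aiplatform.MatchingEngineIndexEndpoint(
            index_endpoint_name=index_endpoint_name
        )
        self._deployed_index_id = deployed_index_id

    @property
    def client(self) -> Any:
        return self._endpoint

    def add(self, embedding_results: List[NodeWithEmbedding]) -> List[str]:
        # Upserting vectors depends on how the index is configured
        # (batch vs. streaming updates); left as a stub in this sketch.
        raise NotImplementedError

    def delete(self, ref_doc_id: str, **delete_kwargs: Any) -> None:
        raise NotImplementedError

    def query(self, query: VectorStoreQuery, **kwargs: Any) -> VectorStoreQueryResult:
        # match() issues a nearest-neighbor lookup against the deployed index.
        neighbors = self._endpoint.match(
            deployed_index_id=self._deployed_index_id,
            queries=[query.query_embedding],
            num_neighbors=query.similarity_top_k,
        )[0]
        # Matching Engine returns distances; depending on the configured
        # distance metric these may need conversion to similarity scores.
        return VectorStoreQueryResult(
            ids=[n.id for n in neighbors],
            similarities=[n.distance for n in neighbors],
        )
```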

Relevant files to this issue are:

  1. llama_index/vector_stores/__init__.py: This file imports all the vector stores. The new GCP Matching Engine vector store should be imported here (see the excerpt after this list).

  2. llama_index/vector_stores/opensearch.py: This file is an example of how a vector store is implemented. It can be used as a reference when implementing the GCP Matching Engine vector store.

  3. llama_index/indices/managed/types.py: This file contains the ManagedIndexQueryMode class. It might be relevant if the GCP Matching Engine vector store needs to support different query modes.
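
For step 2, the registration in __init__.py would follow the existing pattern, roughly as in the excerpt below (again using the placeholder names from the sketch above):

```python
# llama_index/vector_stores/__init__.py (hypothetical excerpt)
# GCPMatchingEngineVectorStore and its module name are placeholders
# from the sketch above; existing exports are elided.
from llama_index.vector_stores.gcp_matching_engine import GCPMatchingEngineVectorStore

__all__ = [
    # ... existing vector store exports ...
    "GCPMatchingEngineVectorStore",
]
```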