Agnostion opened this issue 5 months ago
Adding brief details here on how this might get done, and I'm also going to add it to the list of good first issues:

Getting started on this would basically consist of copying the GoogleContextProvider and then adding in some of the logic from the URLContextProvider; a rough sketch of that follows below.
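To make the comment above concrete, here is a minimal sketch of what a combined provider might look like. This is not the actual Continue API: the class name GoogleAndUrlContextProvider, the ContextItem shape, and the stripHtml helper are simplified stand-ins for the real provider types, assumed here only for illustration. The Google Custom Search JSON API endpoint is the one the issue itself suggests.

```typescript
// A minimal sketch, not the real Continue API: the ContextItem shape and
// provider class below are assumptions standing in for the actual types
// behind GoogleContextProvider / URLContextProvider.
interface ContextItem {
  name: string;
  description: string;
  content: string;
}

// Hypothetical combined provider: run a Google search, then visit each
// result link and pull in the page text, like @url does for a single page.
class GoogleAndUrlContextProvider {
  constructor(private apiKey: string, private searchEngineId: string) {}

  async getContextItems(query: string): Promise<ContextItem[]> {
    // Google Custom Search JSON API (requires an API key and a
    // Programmable Search Engine ID).
    const searchUrl =
      "https://www.googleapis.com/customsearch/v1" +
      `?key=${this.apiKey}&cx=${this.searchEngineId}` +
      `&q=${encodeURIComponent(query)}`;
    const searchRes = await fetch(searchUrl);
    const results: { items?: { title: string; link: string }[] } =
      await searchRes.json();

    const items: ContextItem[] = [];
    for (const result of results.items ?? []) {
      try {
        // Visit the result link and keep a readable-text version of the page.
        const page = await fetch(result.link);
        const html = await page.text();
        items.push({
          name: result.title,
          description: result.link,
          content: stripHtml(html).slice(0, 8000), // cap per-page context size
        });
      } catch {
        // Skip pages that fail to load rather than failing the whole search.
      }
    }
    return items;
  }
}

// Crude HTML-to-text helper; a real implementation would use a proper
// content extractor (e.g. Readability) instead of regexes.
function stripHtml(html: string): string {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<style[\s\S]*?<\/style>/gi, "")
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim();
}
```

The per-page character cap and the skip-on-error behavior are design choices, not requirements: crawling arbitrary search results is slow and failure-prone, so the provider should degrade gracefully rather than block the whole query on one bad page.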
Problem
Hello! Hope your day is going well.

I am trying to combine the functionality of @google and @url to create a crawler that not only fetches search results from Google but also parses the content of those results. Currently, as I understand it, @google retrieves search results from Google but does not visit the links. This limits the ability to extract detailed information from the search results.

I looked through the documentation but couldn't find a way to achieve this out of the box. It seems like a feature that would be very useful for extracting and processing information from multiple web pages, similar to the @Web feature in cursor.sh.

Solution
Enable the combination of @google and @url to create a crawler that fetches and parses the content of search results.

To address this, it would be great to have a feature that combines @google and @url such that the search results from Google can be automatically visited and parsed. This could involve using Google's Custom Search API, which requires a Google API key, to fetch the search results and then programmatically visiting each link to extract the desired information.

This could be implemented as an analogue of the @Web feature in cursor.sh, potentially using Retrieval-Augmented Generation (RAG) to extract and process the content from the search results; a sketch of what that RAG step might look like follows below.

Thank you for considering this feature request. Looking forward to your thoughts and feedback!

Best regards, your user
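On the RAG idea: one lightweight way to "extract and process" the crawled pages before handing them to the model is to chunk each page and keep only the chunks most relevant to the query. The sketch below is purely illustrative; it uses a naive term-overlap score as a self-contained stand-in for a real embedding-based similarity, and the function names (chunkText, topChunks) are hypothetical.

```typescript
// Rough sketch of the RAG step: split fetched pages into chunks and keep
// only the chunks most relevant to the search query. A real implementation
// would score chunks with an embedding model; a plain term-overlap score
// is used here so the sketch stays self-contained.

function chunkText(text: string, size = 1000): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += size) {
    chunks.push(text.slice(i, i + size));
  }
  return chunks;
}

// Naive relevance score: fraction of query terms that appear in the chunk.
function score(query: string, chunk: string): number {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  const haystack = chunk.toLowerCase();
  const hits = terms.filter((t) => haystack.includes(t)).length;
  return terms.length ? hits / terms.length : 0;
}

// Keep the top-k chunks across all crawled pages; these, rather than the
// full pages, would be attached as context for the model.
function topChunks(query: string, pages: string[], k = 5): string[] {
  return pages
    .flatMap((page) => chunkText(page))
    .map((chunk) => ({ chunk, s: score(query, chunk) }))
    .sort((a, b) => b.s - a.s)
    .slice(0, k)
    .map(({ chunk }) => chunk);
}
```

Passing only the top chunks from topChunks keeps the prompt size bounded even when many result pages are crawled, which is the main practical reason to put a retrieval step between the crawler and the model.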