A minimalist web-searching app with an AI assistant that runs directly from your browser.
Live demo: https://felladrin-minisearch.hf.space
Here are the easiest ways to get started with MiniSearch. Pick the one that suits you best.
Option 1 - Use MiniSearch's Docker Image by running this command in your terminal:
docker run -p 7860:7860 ghcr.io/felladrin/minisearch:main
Option 2 - Add MiniSearch's Docker Image to your existing Docker Compose file:
services:
  minisearch:
    image: ghcr.io/felladrin/minisearch:main
    ports:
      - "7860:7860"
Option 3 - Build from source by downloading the repository files and running:
docker compose -f docker-compose.production.yml up --build
Once the container is running, open http://localhost:7860 in your browser and start searching!
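Note that MiniSearch listens on port 7860 inside the container. If that port is already taken on your machine, a standard Docker port mapping lets you pick another local port (8080 below is just an example):
docker run -p 8080:7860 ghcr.io/felladrin/minisearch:main
MiniSearch would then be available at http://localhost:8080 instead.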
You can set MiniSearch as your browser's address-bar search engine using the pattern http://localhost:7860/?q=%s, in which your search term replaces %s.
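For example, searching for open source from the address bar would open http://localhost:7860/?q=open%20source.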
You can add this Quicklink to Raycast, so typing your query will open MiniSearch with the search results. You can also edit it to point to your own domain.
Yes! For this, open the Menu and change the "AI Processing Location" to "Remote server (API)". Then configure the Base URL, and optionally set an API Key and a Model to use.
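For example, to point it at OpenAI's hosted API you could enter https://api.openai.com/v1 as the Base URL, paste your API key, and choose a model such as gpt-4o-mini; these values are only illustrative, and any other OpenAI-compatible endpoint can be used the same way.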
Create a .env file and set a value for ACCESS_KEYS. Then restart the MiniSearch docker container.
For example, if you want to set the password to PepperoniPizza, this is what you should add to your .env:
ACCESS_KEYS="PepperoniPizza"
You can find more examples in the .env.example file.
Yes! In MiniSearch, we call this text-generation feature the "Internal OpenAI-Compatible API". To use it, set the following variables in your .env file:
INTERNAL_OPENAI_COMPATIBLE_API_BASE_URL: The base URL for your API
INTERNAL_OPENAI_COMPATIBLE_API_KEY: Your API access key
INTERNAL_OPENAI_COMPATIBLE_API_MODEL: The model to use
INTERNAL_OPENAI_COMPATIBLE_API_NAME: The name to display in the UI
Then select the new option (identified by the INTERNAL_OPENAI_COMPATIBLE_API_NAME setting) from the "AI Processing Location" dropdown.
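For example, if you run a local OpenAI-compatible server such as Ollama, your .env could contain something like the following (illustrative values; adjust the URL, key, and model to your own setup):
INTERNAL_OPENAI_COMPATIBLE_API_BASE_URL="http://localhost:11434/v1"
INTERNAL_OPENAI_COMPATIBLE_API_KEY="not-needed"
INTERNAL_OPENAI_COMPATIBLE_API_MODEL="llama3.2"
INTERNAL_OPENAI_COMPATIBLE_API_NAME="Ollama (Llama 3.2)"
As with ACCESS_KEYS, restart the MiniSearch docker container after editing the .env file so the new variables take effect.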
Fork this repository and clone it. Then, start the development server by running the following command:
docker compose up
Make your changes, push them to your fork, and open a pull request! All contributions are welcome!
There are a few reasons for this: