Tech Stack

Sensei Search is built using the following technologies:
Frontend: Next.js, Tailwind CSS
Backend: FastAPI, OpenAI client
LLMs: Mistral-7b and Command-R
Search: SearxNG
Memory: Redis
Deployment: AWS, Paka
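A note on how these pieces fit together: the backend talks to the LLMs through the OpenAI client, which works with any OpenAI-compatible endpoint, and Ollama exposes one on its default port 11434. The sketch below is illustrative only; the environment variable usage and model tag are assumptions, not the app's actual settings.

```shell
# Point an OpenAI-style client at a local Ollama server (assumption: Ollama's
# OpenAI-compatible API lives at /v1 on its default port 11434).
export OPENAI_BASE_URL="http://localhost:11434/v1"
export OPENAI_API_KEY="ollama"  # Ollama ignores the key, but OpenAI clients expect one

# An example chat-completion request body for the command-r model.
REQUEST_BODY='{"model": "command-r", "messages": [{"role": "user", "content": "Hello"}]}'

# With Ollama running, the request would be:
# curl "$OPENAI_BASE_URL/chat/completions" -H "Content-Type: application/json" -d "$REQUEST_BODY"
echo "$REQUEST_BODY"
```

Swapping the base URL like this is typically all it takes to move a client between a local Ollama instance and a hosted endpoint.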
How to Run Sensei Search
You can run Sensei Search either locally on your machine or in the cloud.
Running Locally
Follow these steps to run Sensei Search locally:
Prepare the backend environment:
cd sensei_root_folder/backend/
mv .env.development.example .env.development
Edit .env.development as needed. The example environment assumes you run the models through Ollama. Make sure you have a reasonably powerful GPU to run the command-r model.
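Since the example environment assumes Ollama, the models have to be pulled locally before the backend can use them. A minimal sketch, assuming the Ollama CLI is installed; the exact model tags are assumptions, so check `ollama list` and the Ollama model library for the names your .env.development references:

```shell
# Pull the models only if the Ollama CLI is available
# (model tags are assumptions; adjust to match your .env.development).
if command -v ollama >/dev/null 2>&1; then
  ollama pull mistral:7b
  ollama pull command-r
else
  echo "ollama not found; install it from https://ollama.com first"
fi
```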
No need to do anything for the frontend.

Run the app with the following command:

Open your browser and go to http://localhost:3000

Running in the Cloud

We deploy the app to AWS using paka. Please note that the models require GPU instances to run.

Before you start, make sure you have:

An AWS account
Requested GPU quota in your AWS account

The configuration for the cluster is located in the cluster.yaml file. You'll need to replace the HF_TOKEN value in cluster.yaml with your own Hugging Face token. This is necessary because the mistral-7b and command-r models require your account to have accepted their terms and conditions.

Follow these steps to run Sensei Search in the cloud:

Install paka:
Provision the cluster in AWS:
Deploy the backend:
Deploy the frontend:
Get the URL of the frontend:
Open the URL in your browser.

Demo

Sensei Search is an AI-powered tool designed to deliver relevant search results. A live demo runs at:

http://sensei-frontend.default.52.24.120.109.sslip.io/

Screenshots

[Screenshots]

Source: jjleng/sensei (yet another open source Perplexity).