
Chrome Extension to Summarize or Chat with Web Pages/Local Documents Using locally running LLMs. Keep all of your data and conversations private. 🔐
MIT License

DistiLlama


What is DistiLlama?

DistiLlama is a Chrome extension that leverages a locally running LLM to perform the following tasks: summarizing web pages, chatting with the LLM directly, chatting with local documents (PDF), and chatting with web pages.

Overview

One of the things I was experimenting with was how to use a locally running LLM instance for various tasks, and summarization (TL;DR) was at the top of my list. It was key that all calls to the LLM stay local and that all data remain private.

This project uses Ollama as the locally running LLM instance. Ollama is a great project that is easy to set up and use; I highly recommend checking it out.
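As a minimal sketch of what "locally running" means in practice, the snippet below points LangChain at a local Ollama server. The `@langchain/ollama` package, the default `http://localhost:11434` endpoint, and the `mistral` model name are assumptions for illustration, not a prescription of this project's exact wiring.

```typescript
// Minimal sketch: point LangChain's ChatOllama at the local Ollama server.
// Package, endpoint, and model name are assumptions; use whatever you have pulled locally.
import { ChatOllama } from "@langchain/ollama";

const model = new ChatOllama({
  baseUrl: "http://localhost:11434", // Ollama's default local endpoint
  model: "mistral",                  // any model available via `ollama pull`
});

// All inference happens on your machine; nothing leaves the local Ollama server.
const response = await model.invoke("Give me a one-sentence TL;DR of local LLMs.");
console.log(response.content);
```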

To generate the summary, I use the following approach: extract the readable text from the current page, split it into chunks, and run those chunks through a summarization chain against the locally running model.
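Here is a hedged sketch of that flow, combining Mozilla's Readability, a LangChain text splitter, and a map-reduce summarization chain over Ollama; the specific packages, chunk sizes, and model name are assumptions rather than this project's exact implementation.

```typescript
// Hedged sketch of the summarization flow: extract readable text, chunk it,
// and summarize it with a local Ollama model via LangChain. Details are illustrative.
import { Readability } from "@mozilla/readability";
import { ChatOllama } from "@langchain/ollama";
import { loadSummarizationChain } from "langchain/chains";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";

async function summarizePage(pageDom: Document): Promise<string> {
  // 1. Pull the readable article text out of the page's DOM.
  const article = new Readability(pageDom.cloneNode(true) as Document).parse();
  if (!article?.textContent) throw new Error("Could not extract page content");

  // 2. Split long pages into chunks that fit the model's context window.
  const splitter = new RecursiveCharacterTextSplitter({ chunkSize: 2000, chunkOverlap: 200 });
  const docs = await splitter.createDocuments([article.textContent]);

  // 3. Summarize chunk-by-chunk, then combine, using a map-reduce chain.
  const model = new ChatOllama({ baseUrl: "http://localhost:11434", model: "mistral" });
  const chain = loadSummarizationChain(model, { type: "map_reduce" });
  const result = await chain.invoke({ input_documents: docs });
  return result.text;
}
```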

How to use DistiLlama?

Demo

Chat with LLM


Chat with Documents (PDF)


Chat with Web Page


Summarization


TODOs

References and Inspiration