hoof-ai / hoof

"Just hoof it!" - A spotlight like interface to Ollama
https://hoof.ing
MIT License

MVP Plans #2

Open Dax911 opened 8 months ago

Dax911 commented 8 months ago

Putting this here for communication and because if I don't take notes I will lose my train of thought. Thanks ChatGPT for helping me organize this.

Creating a macOS application that integrates with a local Ollama model and is triggered by a hotkey involves several steps. Here's a high-level overview of the tasks you'd need to accomplish to create a minimum viable product (MVP):

  1. Set Up a Local Server for Ollama Model: We will need a local server that can act as a "parser" for the Ollama service (we would just start Ollama as a brew service) and handle requests. This server would receive text and return the model's response. (It can eventually support plugins for prompt-engineering features.)

  2. Develop a Tauri Application: Tauri is a framework for building desktop applications using web technologies. You can use it to create a lightweight and secure window for your chat interface.

  3. Implement Hotkey Functionality: We will need to use a library that can register global hotkeys on macOS. This library would listen for your specific hotkey combination and trigger the Tauri window to open.

  4. Clipboard and Selection Integration: The application should be able to grab the current selection or clipboard content when the hotkey is pressed.

  5. Create a User Interface: The Tauri window should have a user-friendly interface for chatting with the Ollama model, including a model selector and a chat display.

  6. Communication Between Tauri and the Local Server: Implement the logic to pass messages back and forth between the Tauri application and the local Ollama server.

  7. Packaging and Distribution: Once your application is ready, you'll need to package it for distribution so that others can easily install and use it on their macOS systems.
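For step 6, the message passing can be sketched against Ollama's `/api/generate` REST endpoint (it listens on port 11434 by default). The `buildGenerateRequest` and `askOllama` names are illustrative, not part of any existing hoof code:

```typescript
// Sketch of passing a prompt from the Tauri frontend to the local Ollama
// server. Endpoint and payload shape follow Ollama's /api/generate REST API.
const OLLAMA_URL = "http://localhost:11434";

interface GenerateRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

// Build the JSON body for Ollama's /api/generate endpoint.
function buildGenerateRequest(model: string, prompt: string): GenerateRequest {
  return { model, prompt, stream: false }; // stream: false -> one JSON reply
}

// Send the prompt and return the model's text (the "response" field).
async function askOllama(model: string, prompt: string): Promise<string> {
  const res = await fetch(`${OLLAMA_URL}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest(model, prompt)),
  });
  const data = await res.json();
  return data.response;
}
```

Setting `stream: false` keeps the MVP simple; switching to streaming later would mean reading newline-delimited JSON chunks instead of a single body.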

Now, let's create a task list for a GitHub issue to organize the development of this MVP:


Title: Develop a macOS Application for Local Ollama Model Interaction with Global Hotkey

Body:

Objective

Create a macOS application that allows users to interact with a local Ollama model using a global hotkey. The application will present a chat interface where users can send and receive messages from the Ollama model.
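One small, testable piece of the global-hotkey work is normalizing a user-typed combo into the accelerator string format Tauri's global shortcut API expects (e.g. "CmdOrCtrl+Shift+Space"). This helper and its alias table are hypothetical, a sketch of one possible mapping:

```typescript
// Map common user spellings of modifier keys to Tauri accelerator tokens.
const KEY_ALIASES: Record<string, string> = {
  cmd: "CmdOrCtrl",
  command: "CmdOrCtrl",
  ctrl: "CmdOrCtrl",
  shift: "Shift",
  alt: "Alt",
  option: "Alt",
};

// Normalize e.g. "cmd+shift+space" into "CmdOrCtrl+Shift+Space".
function normalizeAccelerator(combo: string): string {
  return combo
    .split("+")
    .map((part) => {
      const key = part.trim().toLowerCase();
      // Non-modifier keys just get their first letter capitalized.
      return KEY_ALIASES[key] ?? key.charAt(0).toUpperCase() + key.slice(1);
    })
    .join("+");
}
```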

MVP Features

Tasks

Potential Libraries/Tools

Testing & Validation


This issue outlines the basic requirements and tasks for the project. You can add more details or break down the tasks further as needed. Once you have this issue created, you can start organizing the work into milestones, assigning tasks to contributors, and tracking progress.

Dax911 commented 8 months ago

Note to self: Ollama doesn't support conversation storage. We will have to write our own memory and context provider to pass back to the API.
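Since Ollama itself doesn't persist conversations, a minimal in-memory context provider could accumulate turns and replay them as a prompt prefix with each request. The `ChatMemory` class below is a hypothetical sketch, not existing hoof code:

```typescript
// One turn of a conversation, either side.
interface Turn {
  role: "user" | "assistant";
  content: string;
}

// Minimal conversation store: append turns, then flatten them back into
// a text context to prepend to the next prompt sent to the API.
class ChatMemory {
  private turns: Turn[] = [];

  add(role: Turn["role"], content: string): void {
    this.turns.push({ role, content });
  }

  // Render stored turns as "role: content" lines, one per turn.
  asContext(): string {
    return this.turns.map((t) => `${t.role}: ${t.content}`).join("\n");
  }
}
```

A real version would also need truncation or summarization so the replayed context stays within the model's context window.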

simoncollins commented 8 months ago

Yes, we'll probably need something like the OpenAI Assistants API that provides:

Later on there are lots of possibilities for extracting knowledge out of threads for long term memory etc.

sammcj commented 8 months ago

Re: Assistants API - https://github.com/transitive-bullshit/OpenOpenAI