ipvm-wg / homestar

Homestar is the individual node that makes up the Everywhere Computer network (similar to how IPFS Kubo, Iroh, Nabu, or other implementation nodes make up the IPFS network). It's written in Rust and is designed for performance and composability.
https://docs.everywhere.computer/homestar/what-is-homestar/
Apache License 2.0

Extension: homestar w/ llama2 build & model + prompt chain component/effect + example #628

Open zeeshanlakhani opened 6 months ago

zeeshanlakhani commented 6 months ago

Summary

Add functionality to Homestar that allows user-driven execution of an LLM chain within a sandboxed environment (Wasm), expressed as a workflow composed of a series of prompt steps (akin to a series of step functions). The outcome of this feature is that inference runs locally against a trained model (e.g. Llama 2, Mistral) privately provided by the host platform it executes on.
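To make the shape of the workflow concrete, here is a minimal, hypothetical Rust sketch of the prompt-chain idea: each step is a prompt template whose placeholder is filled with the previous step's output. The `infer` and `run_chain` names and the `{prev}` placeholder are illustrative assumptions, not Homestar's actual API; in the real feature each step would execute as a Wasm task calling into a host-loaded local model.

```rust
/// Stand-in for local inference against a model provided by the host.
/// In the envisioned feature this would be a host function exposed to the
/// sandboxed (Wasm) guest; it is stubbed here so the chaining logic runs
/// on its own.
fn infer(prompt: &str) -> String {
    // Stub: echo the prompt so the chaining behavior is observable.
    format!("[model output for: {prompt}]")
}

/// Run a series of prompt templates, threading each step's output into
/// the next template via the `{prev}` placeholder (akin to step functions).
fn run_chain(steps: &[&str], initial_input: &str) -> String {
    steps.iter().fold(initial_input.to_string(), |prev, template| {
        let prompt = template.replace("{prev}", &prev);
        infer(&prompt)
    })
}

fn main() {
    let steps = [
        "Summarize the following notes: {prev}",
        "List three follow-up questions based on: {prev}",
        "Draft a short reply that answers: {prev}",
    ];
    let result = run_chain(&steps, "raw meeting notes go here");
    println!("{result}");
}
```

The point of the sketch is only the data flow: each step sees nothing but the prior step's derived output, which is what lets the model and its training data stay private to the host.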

The learning goal of this feature is to experiment with running LLMs locally on hosts where the training data remains private and only computationally derived information is shared with other users/peers, so that AI computation isn't tied to any specific vendor or large cloud provider. Frankly, this work is a direct counterpoint to the concerns raised in IEEE Spectrum's Open-Source AI Is Uniquely Dangerous article. Letting users chain LLM steps together, while controlling what inference gets exposed and without the infrastructure concerns or data risks typically associated with external cloud services, presents a unique opportunity to democratize AI capabilities. By making complex AI workflows easy to compose and execute, this feature aims to bridge the gap between advanced AI technologies and non-technical end users.

Components