
aiOS CLI

Overview

aios-cli is a command-line interface that provides access to similar functionality to the aiOS desktop app. The CLI is a nicer experience if you're a developer who tends to stay in the terminal, or if you'd like to run a node on servers, since it does not require a desktop environment to run.

The CLI has a lot of commands, but the basic idea is that it lets you run your local models (for personal inference), host your downloaded models to provide inference to the network and earn points, and use models other people are hosting for your own inference.

Installation

To install on all platforms you can use our install script, located here in the repo or hosted on our download endpoint. The script downloads the latest release, installs any required GPU drivers, and moves the binary to somewhere in your PATH so that you can access aios-cli globally. You can learn more about how the script works here.

While the script is the recommended way to install, you can also download the binaries directly from the releases section of this repository.

Linux

curl https://download.hyper.space/api/install | bash

Mac

curl https://download.hyper.space/api/install | sh

Windows

You must be running in an Administrator PowerShell for both the installation and uninstallation scripts to work on Windows.

# If you have a real version of `curl` (i.e. something that returns a valid version when you do `curl --version`)
curl https://download.hyper.space/api/install?platform=windows | powershell -
# Otherwise
(Invoke-WebRequest "https://download.hyper.space/api/install?platform=windows").Content | powershell -
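
Whichever platform you are on, a quick sanity check (assuming the script placed the binary on your PATH as described above) is:

# Confirm the binary is reachable (use `where aios-cli` in PowerShell on Windows)
which aios-cli
# Print the installed version
aios-cli version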

Uninstallation

Uninstallation is similar; just change the endpoint to /uninstall.

Linux

curl https://download.hyper.space/api/uninstall | bash

Mac

curl https://download.hyper.space/api/uninstall | sh

Windows

(Invoke-WebRequest "https://download.hyper.space/api/uninstall?platform=windows").Content | powershell -

Docker

There are two pre-built Docker images available: a CPU image that installs and serves Mistral 7B, and one that requires an Nvidia GPU and installs and serves Llama3.

Make sure that the environment you run the Nvidia image in has nvidia-container-toolkit installed and selected as the default runtime.
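
As a rough sketch of how you might run the GPU image (the image name below is a placeholder, not the published tag, so check the repository for the actual image names):

# The image name is a placeholder -- substitute the published Nvidia image.
# --gpus all exposes the host GPUs to the container; this assumes
# nvidia-container-toolkit is installed and set as the default runtime as noted above.
docker run --gpus all <nvidia-image>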

Usage

aios-cli [OPTIONS] <COMMAND>

Example

Since there are a lot of commands coming up, here is a basic example of some common use cases:

# Start the actual daemon
aios-cli start

# See what models are available
aios-cli models available
# Install one of them locally
aios-cli models add hf:TheBloke/Mistral-7B-Instruct-v0.1-GGUF:mistral-7b-instruct-v0.1.Q4_K_S.gguf
# Run a local inference using it
aios-cli infer --model hf:TheBloke/Mistral-7B-Instruct-v0.1-GGUF:mistral-7b-instruct-v0.1.Q4_K_S.gguf --prompt "Can you explain how to write an HTTP server in Rust?"

# Import your private key from a .pem or .base58 file
aios-cli hive import-keys ./my.pem
# Set those keys as the preferred keys for this session
aios-cli hive login
# Connect to the network (now providing inference for the model you installed before)
aios-cli hive connect

# Run an inference through someone else on the network (as you can see it's the exact same format as the normal `infer` just prefixed with `hive`)
aios-cli hive infer --model hf:TheBloke/Mistral-7B-Instruct-v0.1-GGUF:mistral-7b-instruct-v0.1.Q4_K_S.gguf --prompt "Can you explain how to write an HTTP server in Rust?"

# There's a shortcut to start and login/connect to immediately start hosting local models as well
aios-cli start --connect

Global Options

Commands

start

Starts the local aiOS daemon.

Usage: aios-cli start
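
For example, using the --connect shortcut shown in the example above:

# Start the daemon on its own
aios-cli start
# Or start it and immediately log in and connect to the network
aios-cli start --connect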

status

Checks the status of your local aiOS daemon and shows you whether it is still running.

Usage: aios-cli status

kill

Terminates the currently running local aiOS daemon. This can be useful if you find yourself in a broken state and need a clean way to restart.

Usage: aios-cli kill
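
A typical clean restart looks like this:

# Stop the (possibly broken) daemon...
aios-cli kill
# ...then bring it back up
aios-cli start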

models

Commands to manage your local models.

Usage: aios-cli models [OPTIONS] <COMMAND>

Subcommands:
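
As a quick reference, the models subcommands used in the example earlier in this README:

# List the models available to download
aios-cli models available
# Download one of them locally
aios-cli models add hf:TheBloke/Mistral-7B-Instruct-v0.1-GGUF:mistral-7b-instruct-v0.1.Q4_K_S.gguf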

system-info

Shows you your system specifications that are relevant for model inference.

Usage: aios-cli system-info

infer

Uses local models to perform inference.

Usage: aios-cli infer [OPTIONS]

(Additional options and parameters for inference would be listed here)
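
For reference, a typical invocation uses the --model and --prompt options shown in the example above:

aios-cli infer --model hf:TheBloke/Mistral-7B-Instruct-v0.1-GGUF:mistral-7b-instruct-v0.1.Q4_K_S.gguf --prompt "Can you explain how to write an HTTP server in Rust?"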

hive

Runs commands that use the Hive servers. For context, Hive is what the Hyperspace-hosted servers are referred to as.

Usage: aios-cli hive [OPTIONS] <COMMAND>

Subcommands:
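
As a quick reference, the hive subcommands used throughout this README:

# Import your private key from a .pem or .base58 file
aios-cli hive import-keys ./my.pem
# Set those keys as the preferred keys for this session
aios-cli hive login
# Connect to the network and start providing inference
aios-cli hive connect
# See what models a points tier requires
aios-cli hive select-tier 5
# Check your current multiplier and points
aios-cli hive points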

Options:

version

Prints the current version of the aiOS CLI tool.

Usage: aios-cli version

help

Prints the help message or the help of the given subcommand(s).

Usage:

Points

To get points you need to be placed into a tier (currently ranging from 1 to 5, best to worst).

Each tier has some required models that you need to download and register to the network, as well as a certain amount of GPU memory:

You can see which models you need by attempting `hive select-tier` or by running `hive allocate` with the amount of VRAM you want to provide.

Here's a full workflow of using the CLI to start receiving points:

# Run this to see what models are required
aios-cli hive select-tier 5
# Download a required model
aios-cli models add hf:TheBloke/phi-2-GGUF:phi-2.Q4_K_M.gguf
# Make sure it's registered
aios-cli hive connect
aios-cli hive select-tier 5
# To check your current multiplier and points
aios-cli hive points

Updates

When you run `start`, the CLI constantly polls for updates, since this software is at an early stage and breaking changes at the network level can make your node obsolete. These update checks, and whether they succeeded, show up in your logs so you can troubleshoot if you think something has gone wrong in the process.

To ensure that you are on the latest version, or to update while the daemon is not started, just run the `version` command while connected to the internet and the CLI will automatically check for updates and update itself. If for some reason that isn't working, you can re-run the installation steps and the script will install the latest version.
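
For example, with an internet connection available:

# Checks for a newer release and updates the CLI in place if one is found
aios-cli version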

Troubleshooting

For help with issues please make sure to attach the most recent few log files. These can be found at:

Support

Feel free to open an issue here on GitHub if you run into any problems or think the documentation can be improved.