nomic-ai / gpt4all

GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.
https://nomic.ai/gpt4all
MIT License

[Feature] Tool plugins #2739

Open niansa opened 4 months ago

niansa commented 4 months ago

Tool plugins

What if gpt4all were able to load Llama 3.1 tools as plugins in WASM format? Imagine being able to download a Llama 3.1 tool off the internet and have it run sandboxed, acting as a proxy between the model and the internet.

The host side could literally just be an interpreter, since it wouldn't have to do a lot of work, and it's really easy to put together a simple CMake build environment for WASM.

Also, WASM can be targeted from a wide range of programming languages (C++, Rust, TypeScript, ...).
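To make the idea concrete, a tool module's source could be as small as a single exported function. This is only a sketch; the module name (`selfhealth`), the export name (`tool_invoke`), and its signature are made up here, not an existing gpt4all interface:

```cpp
// main.cpp -- hypothetical "selfhealth" tool module (names are made up).
// With -fvisibility=hidden, only symbols given default visibility are
// exported from the resulting .wasm file and become callable by the host.
#define TOOL_EXPORT extern "C" __attribute__((visibility("default")))

// Hypothetical entry point the host would call for a tool invocation.
// A real module would exchange strings (e.g. JSON) through exported
// linear memory; an int status keeps this sketch freestanding (no libc).
TOOL_EXPORT int tool_invoke(int num_args) {
    return num_args > 0 ? 0 : 1;  // 0 = ok, 1 = missing arguments
}
```

Because the module is freestanding, the same source also compiles natively, which makes the tool logic easy to unit-test outside the sandbox.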

niansa commented 4 months ago

For reference, here are two CMakeLists.txt files for compiling C++ into WASM with Clang:

# CMakeLists.txt

cmake_minimum_required(VERSION 3.15)

project(gw2modules LANGUAGES CXX)

set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_STANDARD_REQUIRED ON)
set(CMAKE_INTERPROCEDURAL_OPTIMIZATION ON)
# Freestanding wasm32 build: no libc, no entry point; let the linker
# export everything it sees and strip symbols afterwards.
set(CMAKE_EXE_LINKER_FLAGS "-Wl,--no-entry -Wl,--allow-undefined -Wl,--export-all -Wl,--strip-all -nostdlib")
set(CMAKE_CXX_FLAGS "-nostdlib -fvisibility=hidden --target=wasm32")
set(CMAKE_C_FLAGS "-nostdlib -fvisibility=hidden --target=wasm32")
set(CMAKE_EXECUTABLE_SUFFIX ".wasm")

# An INTERFACE library can't list sources directly before CMake 3.19,
# so attach the shared header via target_sources() instead.
add_library(gw2module INTERFACE)
target_sources(gw2module INTERFACE global.hpp)

add_subdirectory(selfhealth)

# selfhealth/CMakeLists.txt
# (no cmake_minimum_required needed here; add_subdirectory() inherits
# the top-level policy settings)

add_executable(selfhealth main.cpp)
target_link_libraries(selfhealth PRIVATE gw2module)

Also, wasm3 is a very straightforward runtime to use.
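The wasm3 embedding API is indeed small. A host-side sketch for loading a tool module and calling a hypothetical `tool_invoke(i32) -> i32` export could look like this (assumes the wasm3 library is available; nothing here is existing gpt4all code):

```c
/* host.c -- sketch of running a tool module with wasm3.
 * `tool_invoke` is a hypothetical export, not a real gpt4all interface. */
#include <stdint.h>
#include <stddef.h>
#include "wasm3.h"
#include "m3_env.h"

/* Returns the tool's i32 status, or -1 on any runtime error. */
int run_tool(const uint8_t *wasm_bytes, uint32_t wasm_len) {
    IM3Environment env = m3_NewEnvironment();
    IM3Runtime runtime = m3_NewRuntime(env, 8192 /* stack bytes */, NULL);

    IM3Module module;
    if (m3_ParseModule(env, &module, wasm_bytes, wasm_len)) return -1;
    if (m3_LoadModule(runtime, module)) return -1;

    IM3Function fn;
    if (m3_FindFunction(&fn, runtime, "tool_invoke")) return -1;

    /* Call with one i32 argument and read back the i32 status. */
    if (m3_CallV(fn, 1)) return -1;
    int32_t status = 0;
    m3_GetResultsV(fn, &status);

    m3_FreeRuntime(runtime);
    m3_FreeEnvironment(env);
    return status;
}
```

Since wasm3 modules can only touch memory and imports the host hands them, this is where the sandboxing would come from: the host decides which network/file functions (if any) to expose as imports.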