Open dit7ya opened 1 year ago
Greetings! @dit7ya, can you please update the title of the PR? Thanks :-)
@kirillrdy Thanks! Totally missed it :sweat_smile: .
This would be wonderful. I'm currently working on setting this up with SuperAGI (already installed), and I don't think this is going to be a simple installation!
@meditans You pasted a debug log with a `[nix-shell]` in the prompt. Do you have a dev environment we can maybe start from?
There's also this: https://github.com/LucianU/nix-text-generation-webui by @LucianU
@dmadisetti, yeah, I got stuck at some point, because I don't have experience with creating derivations for Python packages. But it's something I find valuable, so I'll get around to it at some point.
Looks like this is already packaged as a test for poetry2nix by @Atry in https://github.com/nix-community/poetry2nix/commit/eb2a3bfcaaac221352aabefb6c65a7a1db248614
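For reference, a rough sketch of what the poetry2nix route looks like. This is a hypothetical assembly, not the committed test: it assumes the source ships `pyproject.toml` and `poetry.lock` (as in the linked poetry2nix test), and uses poetry2nix's `mkPoetryApplication` entry point; the `rev`/`hash` placeholders would need to be pinned.

```nix
# Hypothetical sketch, assuming the upstream repo carries Poetry metadata
# (pyproject.toml + poetry.lock), as in the poetry2nix test linked above.
let
  pkgs = import <nixpkgs> { };
  poetry2nix = import (builtins.fetchTarball
    "https://github.com/nix-community/poetry2nix/archive/master.tar.gz") {
    inherit pkgs;
  };
in
poetry2nix.mkPoetryApplication {
  projectDir = pkgs.fetchFromGitHub {
    owner = "oobabooga";
    repo = "text-generation-webui";
    rev = "...";  # pin a commit here
    hash = "..."; # and its source hash
  };
}
```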
Thoughts on pushing it to nixpkgs?
I can't, because poetry2nix packages are different from nixpkgs's Python packages.
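For contrast, nixpkgs builds its Python packages with `buildPythonPackage`/`buildPythonApplication`, where every dependency must itself already exist in nixpkgs. A hypothetical skeleton (the dependency list and version are illustrative only; the real closure of text-generation-webui is much larger and partly unpackaged):

```nix
# Hypothetical skeleton only, to show the nixpkgs-style shape.
{ lib, python3Packages, fetchFromGitHub }:

python3Packages.buildPythonApplication {
  pname = "text-generation-webui";
  version = "unstable"; # illustrative
  src = fetchFromGitHub {
    owner = "oobabooga";
    repo = "text-generation-webui";
    rev = "...";  # pin a commit
    hash = "...";
  };
  # Illustrative subset; the real requirements list is far longer.
  propagatedBuildInputs = with python3Packages; [ gradio torch ];
  meta = with lib; {
    description = "A gradio web UI for running Large Language Models";
    homepage = "https://github.com/oobabooga/text-generation-webui";
    license = licenses.agpl3Only; # assumption; check the repo
  };
}
```

This is why a poetry2nix build can't simply be dropped into nixpkgs: poetry2nix resolves dependencies from the lock file at build time, while nixpkgs expects each dependency as a separate, reviewed derivation.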
Hey @dmadisetti, I'm not sure which debug log you're mentioning, but I can share how I run text-generation-webui (and every other Python-based ML repo) under Nix, although that's not particularly useful for nixpkgs. I just use a standard `shell.nix` like:
```nix
let
  pkgs = import <nixpkgs> { };
in
(pkgs.mkShell.override { stdenv = pkgs.gcc11Stdenv; }) {
  buildInputs = [
    pkgs.cudatoolkit
    (pkgs.python3.withPackages (ps: [ ps.numpy ps.torch-bin ]))
    pkgs.virtualenv
  ];
  shellHook = ''
    DIR="${builtins.toPath ./.}"
    VENV_DIR="$DIR/venv"
    SOURCE_DATE_EPOCH=$(date +%s) # required for building python wheels
    virtualenv --no-setuptools "$VENV_DIR"
    export PYTHONPATH="$VENV_DIR/${pkgs.python3.sitePackages}:$PYTHONPATH"
    export PATH="$VENV_DIR/bin:$PATH"
    source "$VENV_DIR/bin/activate"
  '';
}
```
This creates a non-reproducible environment in which I have CUDA and the versions of PyTorch and GCC that go with it, and after that I just install everything I need with pip. I wish I had a cleaner way, though.
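The workflow on top of that `shell.nix` would then look something like the following transcript (a sketch; `requirements.txt` and `server.py` are the upstream repo's own files, and the exact pip invocation is an assumption):

```
$ cd text-generation-webui          # checkout containing the shell.nix above
$ nix-shell                         # builds the env, creates ./venv via shellHook
$ pip install -r requirements.txt   # impure: installs into the venv
$ python server.py                  # launch the web UI
```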
Project description
https://github.com/oobabooga/text-generation-webui
A gradio web UI for running Large Language Models like LLaMA, llama.cpp, GPT-J, Pythia, OPT, and GALACTICA.