ausboss / Local-LLM-Langchain

Load local LLMs effortlessly in a Jupyter notebook for testing purposes alongside Langchain or other agents. Contains Oobabooga and KoboldAI versions of the langchain notebooks with examples.

modules for Local-LLM notebook #2

Open josemlopez opened 1 year ago

josemlopez commented 1 year ago

Hi, in the Local-LLM notebook we need to import:

```python
import sys
sys.argv = [sys.argv[0]]

import importlib
import json
import math
import os
import re
import time
import traceback
from functools import partial
from pathlib import Path
from threading import Lock

sys.path.append(str(Path().resolve().parent / "modules"))

import modules.extensions as extensions_module
from modules import chat, presets, shared, training, ui, utils
from modules.extensions import apply_extensions
from modules.github import clone_or_pull_repository
from modules.html_generator import chat_html_wrapper
from modules.LoRA import add_lora_to_model
from modules.models import load_model, unload_model
from modules.text_generation import (generate_reply_wrapper,
                                     get_encoded_length,
                                     stop_everything_event)

import torch
torch.cuda.set_device(0)
```

What is "modules"? Would it be possible to get access to that code, please? Thanks for sharing and all the hard work. Cheers!

Proper231 commented 1 year ago

Were you ever able to find out what the error was?

ausboss commented 1 year ago

modules is from https://github.com/oobabooga/text-generation-webui

I recommend using one of these installers https://github.com/oobabooga/text-generation-webui/releases/tag/installers

The `modules` folder will be inside the `text-generation-webui` folder.
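To make `import modules ...` resolve from a notebook, the repo root (the folder that *contains* `modules`, not `modules` itself) needs to be on `sys.path`. A minimal sketch, assuming the repo was cloned to your home directory (`webui_dir` is a hypothetical path; adjust it to wherever you cloned text-generation-webui):

```python
import sys
from pathlib import Path

# Hypothetical clone location; change this to your actual path.
webui_dir = Path.home() / "text-generation-webui"

# Add the repo root to sys.path so `import modules.xyz` can resolve.
if str(webui_dir) not in sys.path:
    sys.path.insert(0, str(webui_dir))

# These imports assume text-generation-webui's layout at the time of
# this thread; module names may differ in newer versions.
# from modules import shared
# from modules.models import load_model
```

Note that the notebook snippet above appends `parent / "modules"` itself, which only works if the package's *parent* also ends up importable; pointing `sys.path` at the repo root as shown here is the more common arrangement.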
