ProjectUnifree / unifree


Support for Huggingface Models #50

Closed · bshikin closed this 11 months ago

bshikin commented 11 months ago

This branch adds support for AutoModelForCausalLM from ctransformers, and a sample godot-codellama configuration with the following model settings:

llm:
  class: HuggingfaceLLM
  config:
    checkpoint: TheBloke/CodeLlama-34B-Instruct-GGUF
    context_length: 4096
    model_type: llama
    gpu_layers: 50
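
For reference, a minimal standalone sketch of what this configuration resolves to when loaded through ctransformers (only the from_pretrained call and its keyword arguments are from ctransformers' public API; the surrounding script is illustrative, not the actual unifree wrapper code):

from ctransformers import AutoModelForCausalLM

# Mirrors the godot-codellama sample config above: a GGUF CodeLlama checkpoint
# with the llama model type, a 4096-token context, and 50 layers offloaded to GPU.
llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/CodeLlama-34B-Instruct-GGUF",
    model_type="llama",
    context_length=4096,
    gpu_layers=50,
)

# ctransformers model objects are callable with a prompt string
print(llm("# Write a GDScript function that prints the numbers 1 to 10\n"))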
harryarakkal commented 11 months ago

It looks like the package ctransformers isn't getting installed.

ImportError: Failed to import test module: test_code_extrators
Traceback (most recent call last):
  File "/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/unittest/loader.py", line 154, in loadTestsFromName
    module = __import__(module_name)
  File "/home/runner/work/unifree/unifree/tests/test_code_extrators.py", line 7, in <module>
    from unifree.llms.code_extrators import extract_first_source_code, extract_header_implementation
  File "/home/runner/work/unifree/unifree/unifree/llms/__init__.py", line 8, in <module>
    from .huggingface_llm import HuggingfaceLLM
  File "/home/runner/work/unifree/unifree/unifree/llms/huggingface_llm.py", line 4, in <module>
    import ctransformers
ModuleNotFoundError: No module named 'ctransformers'

Should we add it to the requirements.txt?

bshikin commented 11 months ago

> @harryarakkal: Should we add ctransformers to the requirements.txt?

I don't think it is a good idea -- from what I understand, ctransformers has custom build instructions per target platform. Let me modify huggingface_llm.py to fix the problem, though: this library should be imported lazily.
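
A minimal sketch of what a lazy import in huggingface_llm.py could look like (the class layout, method names, and config keys below are assumptions based on the sample configuration in this thread, not the actual unifree implementation):

from typing import Any, Dict, Optional


class HuggingfaceLLM:
    def __init__(self, config: Dict[str, Any]) -> None:
        self._config = config
        self._model: Optional[Any] = None

    def _load_model(self) -> Any:
        # Import ctransformers only when a model is actually requested, so
        # 'from unifree.llms import ...' keeps working on machines where the
        # library (with its per-platform builds) is not installed.
        try:
            from ctransformers import AutoModelForCausalLM
        except ImportError as e:
            raise ImportError(
                "HuggingfaceLLM requires the 'ctransformers' package; "
                "install it manually (e.g. 'pip install ctransformers')."
            ) from e
        return AutoModelForCausalLM.from_pretrained(
            self._config["checkpoint"],
            model_type=self._config.get("model_type"),
            context_length=self._config.get("context_length", 2048),
            gpu_layers=self._config.get("gpu_layers", 0),
        )

    def query(self, prompt: str) -> str:  # hypothetical method name
        if self._model is None:
            self._model = self._load_model()
        return self._model(prompt)

With this structure, the module-level 'import ctransformers' that triggers the ModuleNotFoundError in the test run above goes away, and the dependency is only needed at inference time.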