turboderp / exllamav2

A fast inference library for running LLMs locally on modern consumer-class GPUs
MIT License

2 minor changes #324

Closed: flying-x closed this 4 months ago

flying-x commented 5 months ago
  1. Added `.so` files to the ignore list.
  2. Removed two unused imports from `test_inference.py`, which also got rid of a warning I was seeing from importing pandas. (A rough sketch of both changes follows below.)
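
A minimal sketch of what the two changes look like, for reference. The exact ignore pattern and the second removed import are assumptions on my part; only the pandas import is named above.

```diff
# .gitignore: ignore compiled extension binaries (pattern assumed)
+*.so

# test_inference.py: drop unused imports
# (pandas is mentioned above; the second import shown here is hypothetical)
-import pandas
-import numpy
```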

cc: @turboderp