- [x] I reviewed the Discussions, and have a new bug or useful enhancement to share.
Expected Behavior
Importing llama_cpp and weaviate together should produce no warnings.
Current Behavior
When I import these two libraries together:

import llama_cpp as llm
import weaviate

I get the following warning:

sys:1: ResourceWarning: unclosed file <_io.TextIOWrapper name='nul' mode='w' encoding='cp932'>
Environment and Context
Please provide detailed information about your computer setup. This is important in case the issue is not reproducible except for under certain specific conditions.
Physical (or virtual) hardware you are using, e.g. for Linux:
$ python3 --version
Python 3.11.10
$ cmake --version
cmake version 3.29.5-msvc4
$ cl
Microsoft (R) C/C++ Optimizing Compiler Version 19.41.34123 for x86
$ nvcc --version
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2024 NVIDIA Corporation
Built on Wed_Apr_17_19:36:51_Pacific_Daylight_Time_2024
Cuda compilation tools, release 12.5, V12.5.40
Build cuda_12.5.r12.5/compiler.34177558_0
Failure Information (for bugs)
Importing the two libraries together triggers the warning:
import llama_cpp as llm
import weaviate
Steps to Reproduce
Please provide detailed steps for reproducing the issue. We are not sitting in front of your screen, so the more detail the better.
Simply running the two imports above reproduces the issue for me.
I'm not sure which library is at fault, so I have filed issues with both weaviate and here.
I would appreciate any help with this. I don't know how to trace which file is left with an open I/O handle.
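Since the open question is how to trace which file is left open, one stdlib approach is tracemalloc: if tracing is started before the leak happens, `tracemalloc.get_object_traceback` can show where the leaked file object was allocated. (Running the real repro as `python -X tracemalloc=10 -W always::ResourceWarning repro.py`, or under `python -X dev`, prints the allocation traceback automatically when the warning fires.) A minimal self-contained sketch of the technique — `leaky_open` is a hypothetical stand-in for whatever import-time code opens the 'nul' device, not code from either library:

```python
import gc
import os
import tracemalloc
import warnings

def leaky_open():
    # Hypothetical stand-in: open a file and drop the handle without
    # closing it, like whatever opens 'nul' during the imports.
    open(os.devnull, "w")

tracemalloc.start(10)  # keep up to 10 frames per allocation
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always", ResourceWarning)  # don't hide ResourceWarning
    leaky_open()   # the dropped file is finalized here...
    gc.collect()   # ...or at the latest after a collection

for w in caught:
    if issubclass(w.category, ResourceWarning):
        print(w.message)  # "unclosed file <_io.TextIOWrapper ...>"
        tb = tracemalloc.get_object_traceback(w.source)
        if tb is not None:  # traceback of where the file was opened
            print("\n".join(tb.format()))
tracemalloc.stop()
```

Replacing the `leaky_open()` call with the two real imports should point at the module and line that opens the file without closing it.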
Hardware and OS:
CPU: 11th Gen Intel(R) Core(TM) i7-11800H @ 2.30GHz
GPU: NVIDIA GeForce RTX 3060 Laptop GPU
OS: Microsoft Windows 10 Home
Package versions:
weaviate-client==4.9.3
llama-cpp-python==0.3.1
python==3.11.10