Open M0E-lnx opened 1 year ago
Not as such.
Unless your file is just a short text file — then you can include it as a log file:
```
./chat (other parameters) --use_log file.txt
```
I do think having local docs as input would be a really good feature to add! So thanks for this.
I think one could maybe integrate this into privateGPT and LangChain by simply replacing the "prompt" part of LangChain with an os.system() or subprocess call to this program instead, and then printing the response. Who knows, it might even be faster than pure Python.
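To make the idea concrete, here is a minimal sketch of that subprocess approach. Everything here is an assumption for illustration: the binary name `./chat`, the flags, and the function name `ask_local_model` are placeholders, not the actual CLI of this project — adjust them to whatever interface the program really exposes.

```python
import subprocess

def ask_local_model(prompt: str, binary: str = "./chat", extra_args=None) -> str:
    """Run a local chat binary, feed it the prompt on stdin, return its stdout.

    `./chat` is a hypothetical placeholder; real chat CLIs may take the
    prompt as a flag instead of stdin, so check the program's --help.
    """
    cmd = [binary] + (extra_args or [])
    result = subprocess.run(
        cmd,
        input=prompt,          # send the prompt on stdin
        capture_output=True,   # collect the model's reply
        text=True,
        check=True,            # raise if the binary exits non-zero
    )
    return result.stdout.strip()
```

A LangChain-side wrapper would then call `ask_local_model(prompt)` wherever it would otherwise hit a Python LLM backend, and print the result.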
I'm looking forward to this functionality. I'm looking to feed it more than a simple text file.
Just a heads up:
I just have no idea when it'll be ready...
For now I think you should try to use privateGPT or the gpt4all-gui.
Thanks for the support. :)
I too would like this feature, since privateGPT does not work in my environment either. No matter what I try, and no matter which model I use, they all say they're invalid. I think there is a Python library at the wrong level, but other people have gotten it to work. There are several others like me for whom it still does not work; I still have an issue open about the problem.
Yeah. I agree it would be useful to have a non-Python alternative. But it's trickier than I originally thought.
Afaik, the way PrivateGPT works, it imports LangChain to make the embeddings from the text. And LangChain is Python-only. Unless there is a C++ alternative, one would need Python for at least that part...
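For anyone unfamiliar with what that embedding step actually buys you, here is a toy pure-Python sketch of the embed-and-retrieve idea. To be clear, this is not what PrivateGPT or LangChain do — they use neural embedding models — this just illustrates the shape of the pipeline: turn each document into a vector, then return the document whose vector is closest to the query's.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": bag-of-words term counts.
    # Real systems (e.g. LangChain) use neural embedding models instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def most_relevant(query: str, docs: list) -> str:
    # Embed every document, return the one most similar to the query.
    q = embed(query)
    return max(docs, key=lambda d: cosine(q, embed(d)))
```

The retrieved document is then what gets stuffed into the model's prompt — which is why the embedding step is the hard part to replace if you want to drop Python entirely.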
According to the AI search bot on the LangChain website, when I asked about a C++ implementation, one of its several responses was this:
While there is no official C++ port of LangChain yet, there are a few possibilities for using LangChain-like functionality in C++:
Call the Python LangChain library from C++ code using bindings like pybind11 or Boost.Python. This would allow invoking LangChain chains and functionality from C++.
Is it possible to feed local documents/files and chat about them with this, as with imartinez/privateGPT? If so, how?