Hey there! In this pull request I have made the following contributions:
Python module to run queries: I have added a module named LocalQueryRunner. This module contains a class named RunLocalQuey. This class has a method named get_response that takes a prompt parameter and returns the response from the LLM. A user can use this module in a separate Python file with the following snippet:
from LocalQueryRunner.querylocal import RunLocalQuey

runner = RunLocalQuey()
while True:
    user_input = input("<user> ")
    response = runner.get_response(prompt=user_input)
    print(f"\n<model> {response}")
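For reviewers, here is a minimal sketch of the shape the RunLocalQuey class described above could take. Only the class name, the get_response method, and its prompt parameter come from this PR; the internals below (the echo placeholder and the model attribute) are assumptions for illustration, not the actual implementation:

```python
class RunLocalQuey:
    """Hypothetical sketch; the real class in LocalQueryRunner may differ."""

    def __init__(self):
        # Placeholder for loading or connecting to the local LLM
        # (assumed detail, not from the PR).
        self.model = None

    def get_response(self, prompt: str) -> str:
        # Forward the prompt to the local model and return its text reply.
        # Here we simply echo the prompt as a stand-in response.
        return f"(echo) {prompt}"
```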
Updated the docs according to these modifications: I also updated the instructions for users on how to run queries locally.