PromtEngineer / localGPT

Chat with your documents on your local device using GPT models. No data leaves your device, and it is 100% private.
Apache License 2.0
19.79k stars · 2.2k forks

It is not restricted to the given PDF #341

Open tunisha-factacy opened 1 year ago

tunisha-factacy commented 1 year ago

I asked a general query, "what is sun", which is not in my given PDF, and it still answered. Console output (truncated):

...'XLMWithLMHeadModel', 'XLMProphetNetForCausalLM', 'XLMRobertaForCausalLM', 'XLMRobertaXLForCausalLM', 'XLNetLMHeadModel', 'XmodForCausalLM']. 2023-08-05 16:57:28,198 - INFO - run_localGPT.py:127 - Local LLM Loaded

Enter a query:

Question:

Answer: What are some applications of machine learning(ML) across industries? Please provide specific examples from different sectors like healthcare, finance, education, etc.

Enter a query: what is sun

Question: what is sun

Answer: Sun is a star.

Enter a query:

Do I need to make some changes in the code?

tunisha-factacy commented 1 year ago

I need help regarding this.

aabalke33 commented 1 year ago

You will have to alter the prompt template in run_localGPT.py. I use:

template = """You are an AI assistant for answering questions about {subject}. Provide a very detailed comprehensive academic answer. If you don't know the answer, just say "I'm not sure." Don't try to make up an answer. If the question is not about {subject} and not directly in the given context, politely inform them that you are tuned to only answer questions about {subject}.
Question: {question}
=========
{context}
=========
Answer:"""

For {subject}, just enter the topic you want it to answer questions about.
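The suggestion above can be sketched with plain string substitution: fill in {subject} up front and leave {question} and {context} for LangChain to substitute at query time. This is a minimal, stdlib-only sketch of the idea, not the actual run_localGPT.py code; the helper name and the example subject are illustrative assumptions.

```python
# Minimal sketch: fill {subject} first, leaving {question} and {context}
# as placeholders for LangChain's PromptTemplate to fill at query time.

TEMPLATE = (
    "You are an AI assistant for answering questions about {subject}. "
    "Provide a very detailed comprehensive academic answer. "
    'If you don\'t know the answer, just say "I\'m not sure." '
    "Don't try to make up an answer. If the question is not about "
    "{subject} and not directly in the given context, politely inform "
    "them that you are tuned to only answer questions about {subject}.\n"
    "Question: {question}\n"
    "=========\n"
    "{context}\n"
    "=========\n"
    "Answer:"
)

def fill_subject(template: str, subject: str) -> str:
    """Replace every {subject} placeholder, keeping the other fields."""
    return template.replace("{subject}", subject)

prompt_text = fill_subject(TEMPLATE, "machine learning")
```

The resulting string can then be passed as the template argument to PromptTemplate with input_variables=["context", "question"].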

tunisha-factacy commented 1 year ago

Thanks, I will apply your suggestion. I hope it works.




tunisha-factacy commented 1 year ago

hello,

I applied your suggestion and replaced the subject with the topic I want it to answer about, but the model is still not restricted; it gives general answers too.

template = """ You are an AI assistant for answering questions about how can AI company help to various industry. Provide complete answer to each question. If you don't know the answer, just say "I'm not sure." Don't try to make up an answer. If the question is not about how can AI company help to various industry and not directly in the given context, politely inform them that you are tuned to only answer questions about how can AI company help to various industry. Don't answer the politics question or any legal matter question. Lastly, dont give any answer like my knowledge is till 2021. {context} {history} Question={question}

Helpful Answer:"""

prompt = PromptTemplate(input_variables=["history","context", "question"], template=template)
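One quick sanity check worth running on a template like the one above (a stdlib-only sketch, not localGPT code) is to confirm that the placeholders in the template string exactly match input_variables. PromptTemplate parses f-string-style fields by default, and a mismatch between the two tends to surface only at query time.

```python
import string

def placeholders(template: str) -> set:
    """Collect {name} fields the way str.format (and, by default,
    LangChain's PromptTemplate) parses them."""
    return {field for _, field, _, _ in string.Formatter().parse(template)
            if field}

template = "{context}\n{history}\nQuestion={question}\n\nHelpful Answer:"
input_variables = ["history", "context", "question"]

# The two must agree, or some fields go unfilled at query time.
assert placeholders(template) == set(input_variables)
```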


aabalke33 commented 1 year ago

In another issue thread, someone reported the same thing; it appears the prompt method is not as simple as I thought. Is your pipeline temperature set to 0? And are you using the "TheBloke/Llama-2-7B-Chat-GGML" model? I also had to remove memory/history from mine because it was causing other errors; you may want to try that as well.
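Dropping the memory, as suggested above, means removing {history} from the template string and from input_variables together, since the two must stay in sync. The helper below is a hypothetical sketch of that paired change; the surrounding RetrievalQA wiring in localGPT is omitted.

```python
# Hypothetical helper: remove the history placeholder and its input
# variable in one step so the template and the PromptTemplate
# arguments cannot drift apart.

def drop_history(template: str, input_variables: list):
    """Return (template, input_variables) with history removed."""
    new_template = "\n".join(
        line for line in template.splitlines()
        if line.strip() != "{history}"
    )
    new_vars = [v for v in input_variables if v != "history"]
    return new_template, new_vars

template = "{context}\n{history}\nQuestion={question}\n\nHelpful Answer:"
new_template, new_vars = drop_history(template,
                                      ["history", "context", "question"])
```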

tunisha-factacy commented 1 year ago

Thanks. Yes, the temperature is set to 0, and I am using the vicuna-hf model.